
[DOCS] Document ingest pipelines for Fleet and Elastic Agent (#70907)

James Rodewig 4 years ago · commit 27abdd9f2a

BIN  docs/reference/images/ingest/custom-logs-pipeline.png
BIN  docs/reference/images/ingest/custom-logs.png
+149 -0  docs/reference/ingest.asciidoc

@@ -271,6 +271,155 @@ Use the <<index-final-pipeline,`index.final_pipeline`>> index setting to set a
 final pipeline. {es} applies this pipeline after the request or default
 pipeline, even if neither is specified.
 
+[discrete]
+[[pipelines-for-fleet-elastic-agent]]
+=== Pipelines for {fleet} and {agent}
+
+{fleet-guide}/index.html[{fleet}] automatically adds ingest pipelines for its
+integrations. {fleet} applies these pipelines using <<index-templates,index
+templates>> that include <<set-default-pipeline,pipeline index settings>>. {es}
+matches these templates to your {fleet} data streams based on the
+{fleet-guide}/data-streams.html#data-streams-naming-scheme[stream's naming
+scheme].
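+
+For example, you can use the get pipeline API to list the pipelines {fleet}
+has installed for your log data streams. The `logs-*` wildcard below is
+illustrative; adjust it to match your integrations.
+
+[source,console]
+----
+GET _ingest/pipeline/logs-*
+----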
+
+WARNING: Do not change {fleet}'s ingest pipelines or use custom pipelines for
+your {fleet} integrations. Doing so can break your {fleet} data streams.
+
+{fleet} doesn't provide an ingest pipeline for the **Custom logs** integration.
+You can safely specify a pipeline for this integration in one of two ways: an
+<<pipeline-custom-logs-index-template,index template>> or a
+<<pipeline-custom-logs-configuration,custom configuration>>.
+
+[[pipeline-custom-logs-index-template]]
+**Option 1: Index template**
+
+// tag::create-name-custom-logs-pipeline[]
+. <<create-manage-ingest-pipelines,Create>> and <<test-pipeline,test>> your
+ingest pipeline. Name your pipeline `logs-<dataset-name>-default`. This makes
+tracking the pipeline for your integration easier.
++
+--
+For example, the following request creates a pipeline for the `my_app` dataset.
+The pipeline's name is `logs-my_app-default`.
+
+[source,console]
+----
+PUT _ingest/pipeline/logs-my_app-default
+{
+  "description": "Pipeline for `my_app` dataset",
+  "processors": [ ... ]
+}
+----
+// TEST[s/\.\.\./{"lowercase": {"field":"my-keyword-field"}}/]
+--
+// end::create-name-custom-logs-pipeline[]
+
+. Create an <<index-templates,index template>> that includes your pipeline in
+the <<index-default-pipeline,`index.default_pipeline`>> or
+<<index-final-pipeline,`index.final_pipeline`>> index setting. Ensure the
+template is <<create-a-data-stream-template,data stream enabled>>. The
+template's index pattern should match `logs-<dataset-name>-*`.
++
+--
+You can create this template using {kib}'s <<manage-index-templates,**Index
+Management**>> feature or the <<indices-put-template,create index template
+API>>.
+
+For example, the following request creates a template matching `logs-my_app-*`.
+The template uses a component template that contains the
+`index.default_pipeline` index setting.
+
+[source,console]
+----
+# Creates a component template for index settings
+PUT _component_template/logs-my_app-settings
+{
+  "template": {
+    "settings": {
+      "index.default_pipeline": "logs-my_app-default",
+      "index.lifecycle.name": "logs"
+    }
+  }
+}
+
+# Creates an index template matching `logs-my_app-*`
+PUT _index_template/logs-my_app-template
+{
+  "index_patterns": ["logs-my_app-*"],
+  "data_stream": { },
+  "priority": 500,
+  "composed_of": ["logs-my_app-settings", "logs-my_app-mappings"]
+}
+----
+// TEST[continued]
+// TEST[s/, "logs-my_app-mappings"//]
+--
+// tag::name-custom-logs-dataset[]
+. When adding or editing your **Custom logs** integration in {fleet},
+click **Configure integration > Custom log file > Advanced options**.
+
+. In **Dataset name**, specify your dataset's name. {fleet} will add new data
+for the integration to the resulting `logs-<dataset-name>-default` data stream.
++
+For example, if your dataset's name is `my_app`, {fleet} adds new data to the
+`logs-my_app-default` data stream.
+// end::name-custom-logs-dataset[]
++
+[role="screenshot"]
+image::images/ingest/custom-logs.png[Set up custom log integration in Fleet,align="center"]
+
+. Use the <<indices-rollover-index,rollover API>> to roll over your data stream.
+This ensures {es} applies the index template and its pipeline settings to any
+new data for the integration.
++
+--
+////
+[source,console]
+----
+PUT _data_stream/logs-my_app-default
+----
+// TEST[continued]
+////
+
+[source,console]
+----
+POST logs-my_app-default/_rollover/
+----
+// TEST[continued]
+
+////
+[source,console]
+----
+DELETE _data_stream/*
+DELETE _index_template/*
+----
+// TEST[continued]
+////
+--
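+
+To verify that the data stream's new write index picked up the pipeline
+setting, you can retrieve the stream's index settings. The `filter_path`
+query parameter below is optional; it only trims the response.
+
+[source,console]
+----
+GET logs-my_app-default/_settings?filter_path=*.settings.index.default_pipeline
+----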
+
+[[pipeline-custom-logs-configuration]]
+**Option 2: Custom configuration**
+
+include::ingest.asciidoc[tag=create-name-custom-logs-pipeline]
+
+include::ingest.asciidoc[tag=name-custom-logs-dataset]
+
+. In **Custom Configurations**, specify your pipeline in the `pipeline` policy
+setting.
++
+[role="screenshot"]
+image::images/ingest/custom-logs-pipeline.png[Custom pipeline configuration for custom log integration,align="center"]
+
+**{agent} standalone**
+
+If you run {agent} standalone, you can apply pipelines using an
+<<index-templates,index template>> that includes the
+<<index-default-pipeline,`index.default_pipeline`>> or
+<<index-final-pipeline,`index.final_pipeline`>> index setting. Alternatively,
+you can specify the `pipeline` policy setting in your `elastic-agent.yml`
+configuration. See {fleet-guide}/run-elastic-agent-standalone.html[Run {agent}
+standalone].
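+
+As a rough sketch only, the `pipeline` setting sits under an input in
+`elastic-agent.yml`. The input type, paths, and dataset name below are
+hypothetical placeholders; see the standalone {agent} documentation for the
+full configuration reference.
+
+[source,yaml]
+----
+inputs:
+  - type: logfile                   # hypothetical input type
+    data_stream:
+      dataset: my_app               # placeholder dataset name
+    paths:
+      - /var/log/my_app/*.log       # placeholder log path
+    pipeline: logs-my_app-default   # ingest pipeline to apply
+----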
+
 [discrete]
 [[access-source-fields]]
 === Access source fields in a processor