[[use-a-data-stream]]
== Use a data stream

After you <<set-up-a-data-stream,set up a data stream>>, you can do
the following:

* <<add-documents-to-a-data-stream>>
* <<search-a-data-stream>>
* <<manually-roll-over-a-data-stream>>
* <<reindex-with-a-data-stream>>
* <<update-delete-docs-in-a-data-stream>>

////
[source,console]
----
PUT /_index_template/logs_data_stream
{
  "index_patterns": [ "logs*" ],
  "data_stream": {
    "timestamp_field": "@timestamp"
  },
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        }
      }
    }
  }
}

PUT /_data_stream/logs
----
////

[discrete]
[[add-documents-to-a-data-stream]]
=== Add documents to a data stream

You can add documents to a data stream using the following requests:

* An <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> set to `create`. Specify the data
stream's name in place of an index name.
+
--
NOTE: The `op_type` parameter defaults to `create` when adding new documents.

.*Example: Index API request*
[%collapsible]
====
The following index API request adds a new document to the `logs` data
stream.

[source,console]
----
POST /logs/_doc/
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----
// TEST[continued]
====
--

* A <<docs-bulk,bulk API>> request using the `create` action. Specify the data
stream's name in place of an index name.
+
--
NOTE: Data streams do not support other bulk actions, such as `index`.

.*Example: Bulk API request*
[%collapsible]
====
The following bulk API request adds several new documents to
the `logs` data stream.
Note that only the `create` action is used.

[source,console]
----
PUT /logs/_bulk?refresh
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z", "user": { "id": "l7gk7f82" }, "message": "Logout successful" }
----
// TEST[continued]
====
--

[discrete]
[[search-a-data-stream]]
=== Search a data stream

The following search APIs support data streams:

* <<search-search, Search>>
* <<async-search, Async search>>
* <<search-multi-search, Multi search>>
* <<search-field-caps, Field capabilities>>
////
* <<eql-search-api, EQL search>>
////

.*Example*
[%collapsible]
====
The following <<search-search,search API>> request searches the `logs` data
stream for documents with a timestamp between today and yesterday that also
have a `message` value of `login successful`.

[source,console]
----
GET /logs/_search
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "@timestamp": {
            "gte": "now-1d/d",
            "lt": "now/d"
          }
        }
      },
      "should": {
        "match": {
          "message": "login successful"
        }
      }
    }
  }
}
----
// TEST[continued]
====

[discrete]
[[manually-roll-over-a-data-stream]]
=== Manually roll over a data stream

A rollover creates a new backing index for a data stream. This new backing index
becomes the stream's <<data-stream-write-index,write index>> and increments
the stream's <<data-streams-generation,generation>>.

In most cases, we recommend using <<index-lifecycle-management,{ilm-init}>> to
automate rollovers for data streams. This lets you automatically roll over the
current write index when it meets specified criteria, such as a maximum age or
size.

However, you can also use the <<indices-rollover-index,rollover API>> to
manually perform a rollover.
This can be useful if you want to
<<data-streams-change-mappings-and-settings,apply mapping or setting changes>>
to the stream's write index after updating a data stream's template.

.*Example*
[%collapsible]
====
The following <<indices-rollover-index,rollover API>> request submits a manual
rollover request for the `logs` data stream.

[source,console]
----
POST /logs/_rollover/
{
  "conditions": {
    "max_docs":   "1"
  }
}
----
// TEST[continued]
====

[discrete]
[[reindex-with-a-data-stream]]
=== Reindex with a data stream

You can use the <<docs-reindex,reindex API>> to copy documents to a data stream
from an existing index, index alias, or data stream.

A reindex copies documents from a _source_ to a _destination_. The source and
destination can be any pre-existing index, index alias, or data stream. However,
the source and destination must be different. You cannot reindex a data stream
into itself.

Because data streams are <<data-streams-append-only,append-only>>, a reindex
request to a data stream destination must have an `op_type` of `create`. This
means a reindex can only add new documents to a data stream. It cannot update
existing documents in the data stream destination.

A reindex can be used to:

* Convert an existing index alias and collection of time-based indices into a
  data stream.
* Apply a new or updated <<create-a-data-stream-template,composable template>>
  by reindexing an existing data stream into a new one. This applies mapping
  and setting changes in the template to each document and backing index of the
  data stream destination. See
  <<data-streams-use-reindex-to-change-mappings-settings>>.

TIP: If you only want to update the mappings or settings of a data stream's
write index, we recommend you update the <<create-a-data-stream-template,data
stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `archive` index alias to
the existing `logs` data stream.
Because the destination is a data stream, the
request's `op_type` is `create`.

////
[source,console]
----
PUT /_bulk?refresh=wait_for
{"create":{"_index" : "archive_1"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }

POST /_aliases
{
  "actions" : [
    { "add" : { "index" : "archive_1", "alias" : "archive" } },
    { "add" : { "index" : "archive_2", "alias" : "archive", "is_write_index" : true} }
  ]
}
----
// TEST[continued]
////

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "archive"
  },
  "dest": {
    "index": "logs",
    "op_type": "create"
  }
}
----
// TEST[continued]
====

You can also reindex documents from a data stream to an index, index
alias, or data stream.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `logs` data stream
to the existing `archive` index alias. Because the destination is not a data
stream, the `op_type` does not need to be specified.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "logs"
  },
  "dest": {
    "index": "archive"
  }
}
----
// TEST[continued]
====

[discrete]
[[update-delete-docs-in-a-data-stream]]
=== Update or delete documents in a data stream

Data streams are designed to be <<data-streams-append-only,append-only>>. This
means you cannot send update or deletion requests for existing documents to a
data stream.
However, you can send update or deletion requests to the backing
index containing the document.

To delete or update a document in a data stream, you first need to get:

* The <<mapping-id-field,document ID>>
* The name of the backing index that contains the document

If you want to update a document, you must also get its current
<<optimistic-concurrency-control,sequence number and primary term>>.

You can use a <<search-a-data-stream,search request>> to retrieve this
information.

.*Example*
[%collapsible]
====
////
[source,console]
----
PUT /logs/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "yWIumJd7"
  },
  "message": "Login successful"
}
----
// TEST[continued]
////

The following search request retrieves documents in the `logs` data stream with
a `user.id` of `yWIumJd7`. By default, this search returns the document ID and
backing index for any matching documents.

The request includes a `"seq_no_primary_term": true` argument. This means the
search also returns the sequence number and primary term for any matching
documents.

[source,console]
----
GET /logs/_search
{
  "seq_no_primary_term": true,
  "query": {
    "match": {
      "user.id": "yWIumJd7"
    }
  }
}
----
// TEST[continued]

The API returns the following response.
The `hits.hits` property contains
information for any documents matching the search.

[source,console-result]
----
{
  "took": 20,
  "timed_out": false,
  "_shards": {
    "total": 2,
    "successful": 2,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 1,
      "relation": "eq"
    },
    "max_score": 0.2876821,
    "hits": [
      {
        "_index": ".ds-logs-000002",                <1>
        "_id": "bfspvnIBr7VVZlfp2lqX",              <2>
        "_seq_no": 4,                               <3>
        "_primary_term": 1,                         <4>
        "_score": 0.2876821,
        "_source": {
          "@timestamp": "2020-12-07T11:06:07.000Z",
          "user": {
            "id": "yWIumJd7"
          },
          "message": "Login successful"
        }
      }
    ]
  }
}
----
// TESTRESPONSE[s/"took": 20/"took": $body.took/]

<1> Backing index containing the matching document
<2> Document ID for the document
<3> Current sequence number for the document
<4> Primary term for the document
====

You can use an <<docs-index_,index API>> request to update an individual
document. To prevent an accidental overwrite, this request must include valid
`if_seq_no` and `if_primary_term` arguments.

.*Example*
[%collapsible]
====
The following index API request updates an existing document in the `logs` data
stream. The request targets document ID `bfspvnIBr7VVZlfp2lqX` in the
`.ds-logs-000002` backing index.

The request also includes the current sequence number and primary term in the
respective `if_seq_no` and `if_primary_term` query parameters. The request body
contains a new JSON source for the document.

[source,console]
----
PUT /.ds-logs-000002/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=4&if_primary_term=1
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----
// TEST[continued]
====

You use the <<docs-delete,delete API>> to delete individual documents.
Deletion
requests do not require a sequence number or primary term.

.*Example*
[%collapsible]
====
The following delete API request deletes an existing document in the `logs`
data stream. The request targets document ID `bfspvnIBr7VVZlfp2lqX` in the
`.ds-logs-000002` backing index.

[source,console]
----
DELETE /.ds-logs-000002/_doc/bfspvnIBr7VVZlfp2lqX
----
// TEST[continued]
====

You can use the <<docs-bulk,bulk API>> to delete or update multiple documents in
one request using `delete`, `index`, or `update` actions.

If the action type is `index`, the action must include valid
<<bulk-optimistic-concurrency-control,`if_seq_no` and `if_primary_term`>>
arguments.

.*Example*
[%collapsible]
====
////
[source,console]
----
PUT /logs/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "yWIumJd7"
  },
  "message": "Login successful"
}
----
// TEST[continued]
////

The following bulk API request uses an `index` action to update an existing
document in the `logs` data stream.

The `index` action targets document ID `bfspvnIBr7VVZlfp2lqX` in the
`.ds-logs-000002` backing index. The action also includes the current sequence
number and primary term in the respective `if_seq_no` and `if_primary_term`
parameters.

[source,console]
----
PUT /_bulk?refresh
{ "index": { "_index": ".ds-logs-000002", "_id": "bfspvnIBr7VVZlfp2lqX", "if_seq_no": 4, "if_primary_term": 1 } }
{ "@timestamp": "2020-12-07T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
----
// TEST[continued]
====

////
[source,console]
----
DELETE /_data_stream/logs

DELETE /_index_template/logs_data_stream
----
// TEST[continued]
////
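
As a recap of the append-only rules above, the bulk body that a data stream
accepts can be sketched in a few lines of client-side code. The following
Python sketch (the `bulk_create_body` helper is a hypothetical name, not part
of any Elasticsearch client) builds the newline-delimited request body using
only `create` actions, since actions such as `index` without a backing-index
target are rejected.

[source,python]
----
import json

def bulk_create_body(stream, docs):
    """Build an NDJSON bulk body for a data stream.

    Data streams are append-only, so every action line uses `create`
    and targets the stream name in place of an index name.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"create": {"_index": stream}}))
        lines.append(json.dumps(doc))
    # The bulk API requires a trailing newline after the last line.
    return "\n".join(lines) + "\n"

body = bulk_create_body("logs", [
    {"@timestamp": "2020-12-08T11:04:05.000Z",
     "user": {"id": "vlb44hny"},
     "message": "Login attempt failed"},
])
----

The resulting string can be sent as the body of a `PUT /logs/_bulk?refresh`
request, matching the bulk example earlier in this page.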
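
The search-then-update flow from the last section can likewise be sketched as
a small helper that pulls the backing index, document ID, sequence number, and
primary term out of one `hits.hits` entry and turns them into the target path
and concurrency-control parameters for the follow-up index request. The
`update_target` helper below is a hypothetical illustration, assuming the
sample response shown above.

[source,python]
----
def update_target(hit):
    """Given one entry from `hits.hits`, return the (path, params) pair
    for a conditional update of that document via its backing index."""
    path = f"/{hit['_index']}/_doc/{hit['_id']}"
    # if_seq_no and if_primary_term guard against accidental overwrites.
    params = {
        "if_seq_no": hit["_seq_no"],
        "if_primary_term": hit["_primary_term"],
    }
    return path, params

# Using the sample hit from the search response above:
hit = {
    "_index": ".ds-logs-000002",
    "_id": "bfspvnIBr7VVZlfp2lqX",
    "_seq_no": 4,
    "_primary_term": 1,
}
path, params = update_target(hit)
# path   == "/.ds-logs-000002/_doc/bfspvnIBr7VVZlfp2lqX"
# params == {"if_seq_no": 4, "if_primary_term": 1}
----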