[[use-a-data-stream]]
== Use a data stream

After you <<set-up-a-data-stream,set up a data stream>>, you can do
the following:

* <<add-documents-to-a-data-stream>>
* <<search-a-data-stream>>
* <<manually-roll-over-a-data-stream>>
* <<reindex-with-a-data-stream>>

////
[source,console]
----
PUT /_index_template/logs_data_stream
{
  "index_patterns": [ "logs*" ],
  "data_stream": {
    "timestamp_field": "@timestamp"
  },
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        }
      }
    }
  }
}

PUT /_data_stream/logs
----
////

[discrete]
[[add-documents-to-a-data-stream]]
=== Add documents to a data stream

You can add documents to a data stream using the following requests:

* An <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> set to `create`. Specify the data
stream's name in place of an index name.
+
--
NOTE: The `op_type` parameter defaults to `create` when adding new documents
without a specified document ID.

.*Example: Index API request*
[%collapsible]
====
The following <<docs-index_,index API>> request adds a new document to the
`logs` data stream.

[source,console]
----
POST /logs/_doc/
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----
// TEST[continued]
====
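
If you want to specify your own document ID, you can use the `_create` endpoint
instead, which implies an `op_type` of `create`. The following is a minimal
sketch; the ID `my-doc-id` is a placeholder, and support for explicit document
IDs on data streams can depend on the version you're running.

[source,console]
----
PUT /logs/_create/my-doc-id
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----
// TEST[skip:illustrative sketch only]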
--

* A <<docs-bulk,bulk API>> request using the `create` action. Specify the data
stream's name in place of an index name.
+
--
NOTE: Data streams do not support other bulk actions, such as `index`.

.*Example: Bulk API request*
[%collapsible]
====
The following <<docs-bulk,bulk API>> request adds several new documents to
the `logs` data stream. Note that only the `create` action is used.

[source,console]
----
PUT /logs/_bulk?refresh
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
{"create":{"_index" : "logs"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z", "user": { "id": "l7gk7f82" }, "message": "Logout successful" }
----
// TEST[continued]
====
--

[discrete]
[[search-a-data-stream]]
=== Search a data stream

The following search APIs support data streams:

* <<search-search, Search>>
* <<async-search, Async search>>
* <<search-multi-search, Multi search>>
* <<search-field-caps, Field capabilities>>
////
* <<eql-search-api, EQL search>>
////

.*Example*
[%collapsible]
====
The following <<search-search,search API>> request searches the `logs` data
stream for documents with a timestamp between today and yesterday that also
have a `message` value of `login successful`.

[source,console]
----
GET /logs/_search
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "@timestamp": {
            "gte": "now-1d/d",
            "lt": "now/d"
          }
        }
      },
      "should": {
        "match": {
          "message": "login successful"
        }
      }
    }
  }
}
----
// TEST[continued]
====
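
The other APIs in the list above accept a data stream name in the same way. For
example, here is a minimal sketch of a field capabilities request against the
`logs` stream; the `fields` value is only an illustration.

[source,console]
----
GET /logs/_field_caps?fields=@timestamp,message
----
// TEST[continued]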

[discrete]
[[manually-roll-over-a-data-stream]]
=== Manually roll over a data stream

A rollover creates a new backing index for a data stream. This new backing index
becomes the stream's <<data-stream-write-index,write index>> and increments
the stream's <<data-streams-generation,generation>>.

In most cases, we recommend using <<index-lifecycle-management,{ilm-init}>> to
automate rollovers for data streams. This lets you automatically roll over the
current write index when it meets specified criteria, such as a maximum age or
size.
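
As a rough sketch, an {ilm-init} policy that rolls the write index over after
seven days or 50GB could look like the following. The policy name `logs_policy`
and the thresholds are placeholders, and the policy still needs to be referenced
from the stream's index template, for example through the
`index.lifecycle.name` setting.

[source,console]
----
PUT /_ilm/policy/logs_policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_age": "7d",
            "max_size": "50gb"
          }
        }
      }
    }
  }
}
----
// TEST[skip:illustrative sketch only]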

However, you can also use the <<indices-rollover-index,rollover API>> to
manually perform a rollover. This can be useful if you want to apply mapping or
setting changes to the stream's write index after updating a data stream's
template.

.*Example*
[%collapsible]
====
The following <<indices-rollover-index,rollover API>> request submits a manual
rollover request for the `logs` data stream.

[source,console]
----
POST /logs/_rollover/
{
  "conditions": {
    "max_docs": "1"
  }
}
----
// TEST[continued]
====
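
After a manual rollover, you can check the result by retrieving the data
stream, which returns its current generation and backing indices. This is a
minimal sketch using the get data stream API:

[source,console]
----
GET /_data_stream/logs
----
// TEST[continued]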

[discrete]
[[reindex-with-a-data-stream]]
=== Reindex with a data stream

You can use the <<docs-reindex,reindex API>> to copy documents to a data stream
from an existing index, index alias, or data stream.

A reindex copies documents from a _source_ to a _destination_. The source and
destination can be any pre-existing index, index alias, or data stream. However,
the source and destination must be different. You cannot reindex a data stream
into itself.

Because data streams are <<data-streams-append-only,append-only>>, a reindex
request to a data stream destination must have an `op_type` of `create`. This
means a reindex can only add new documents to a data stream. It cannot update
existing documents in the data stream destination.

A reindex can be used to:

* Convert an existing index alias and collection of time-based indices into a
data stream.

* Apply a new or updated <<create-a-data-stream-template,composable template>>
by reindexing an existing data stream into a new one. This applies mapping
and setting changes in the template to each document and backing index of the
data stream destination.

TIP: If you only want to update the mappings or settings of a data stream's
write index, we recommend you update the <<create-a-data-stream-template,data
stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `archive` index alias to
the existing `logs` data stream. Because the destination is a data stream, the
request's `op_type` is `create`.

////
[source,console]
----
PUT /_bulk?refresh=wait_for
{"create":{"_index" : "archive_1"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }

POST /_aliases
{
  "actions" : [
    { "add" : { "index" : "archive_1", "alias" : "archive" } },
    { "add" : { "index" : "archive_2", "alias" : "archive", "is_write_index" : true} }
  ]
}
----
// TEST[continued]
////

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "archive"
  },
  "dest": {
    "index": "logs",
    "op_type": "create"
  }
}
----
// TEST[continued]
====
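
If you only want to copy a subset of documents into the data stream, you can
add a query to the reindex source. The following is a minimal sketch, assuming
you only want documents from roughly the last week; the range filter is only an
illustration.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "archive",
    "query": {
      "range": {
        "@timestamp": {
          "gte": "now-7d/d"
        }
      }
    }
  },
  "dest": {
    "index": "logs",
    "op_type": "create"
  }
}
----
// TEST[skip:illustrative sketch only]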

You can also reindex documents from a data stream to an index, index
alias, or data stream.

.*Example*
[%collapsible]
====
The following reindex request copies documents from the `logs` data stream
to the existing `archive` index alias. Because the destination is not a data
stream, the `op_type` does not need to be specified.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "logs"
  },
  "dest": {
    "index": "archive"
  }
}
----
// TEST[continued]
====

////
[source,console]
----
DELETE /_data_stream/logs

DELETE /_index_template/logs_data_stream
----
// TEST[continued]
////