[role="xpack"]
[[use-a-data-stream]]
== Use a data stream

After you <<set-up-a-data-stream,set up a data stream>>, you can do
the following:

* <<add-documents-to-a-data-stream>>
* <<search-a-data-stream>>
* <<get-stats-for-a-data-stream>>
* <<manually-roll-over-a-data-stream>>
* <<open-closed-backing-indices>>
* <<reindex-with-a-data-stream>>
* <<update-docs-in-a-data-stream-by-query>>
* <<delete-docs-in-a-data-stream-by-query>>
* <<update-delete-docs-in-a-backing-index>>
////
[source,console]
----
PUT /_index_template/my-data-stream-template
{
  "index_patterns": [ "my-data-stream*" ],
  "data_stream": { }
}

PUT /_data_stream/my-data-stream

POST /my-data-stream/_rollover/

POST /my-data-stream/_rollover/

PUT /my-data-stream/_create/bfspvnIBr7VVZlfp2lqX?refresh=wait_for
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "yWIumJd7"
  },
  "message": "Login successful"
}

PUT /_data_stream/my-data-stream-alt
----
// TESTSETUP

[source,console]
----
DELETE /_data_stream/*

DELETE /_index_template/*
----
// TEARDOWN
////
[discrete]
[[add-documents-to-a-data-stream]]
=== Add documents to a data stream

You can add documents to a data stream using two types of indexing requests:

* <<data-streams-individual-indexing-requests>>
* <<data-streams-bulk-indexing-requests>>

Adding a document to a data stream adds the document to the stream's current
<<data-stream-write-index,write index>>.

You cannot add new documents to a stream's other backing indices, even by
sending requests directly to the index. This means you cannot submit the
following requests directly to any backing index except the write index:

* An <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> of `create`. The `op_type` parameter
defaults to `create` when adding new documents.

* A <<docs-bulk,bulk API>> request using a `create` action
[discrete]
[[data-streams-individual-indexing-requests]]
==== Individual indexing requests

You can use an <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> of `create` to add individual documents
to a data stream.

NOTE: The `op_type` parameter defaults to `create` when adding new documents.

The following index API request adds a new document to `my-data-stream`.

[source,console]
----
POST /my-data-stream/_doc/
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----

IMPORTANT: You cannot add new documents to a data stream using the index API's
`PUT /<target>/_doc/<_id>` request format. To specify a document ID, use the
`PUT /<target>/_create/<_id>` format instead.
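
If you're scripting these calls, the choice between the two request formats can be captured in a small helper. The following Python sketch is illustrative only; the `doc_url` helper is hypothetical and not part of any Elasticsearch client library:

```python
def doc_url(target, doc_id=None):
    """Return the (HTTP method, path) pair for adding a document to a data stream.

    Without an ID, POST /<target>/_doc/ lets Elasticsearch assign one.
    With an ID, PUT /<target>/_doc/<id> is rejected for data streams,
    so the request must use the _create endpoint instead.
    """
    if doc_id is None:
        return "POST", f"/{target}/_doc/"
    return "PUT", f"/{target}/_create/{doc_id}"

print(doc_url("my-data-stream"))
print(doc_url("my-data-stream", "bfspvnIBr7VVZlfp2lqX"))
```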
[discrete]
[[data-streams-bulk-indexing-requests]]
==== Bulk indexing requests

You can use the <<docs-bulk,bulk API>> to add multiple documents to a data
stream in a single request. Each action in the bulk request must use the
`create` action.

NOTE: Data streams do not support other bulk actions, such as `index`.

The following bulk API request adds several new documents to
`my-data-stream`. Only the `create` action is used.

[source,console]
----
PUT /my-data-stream/_bulk?refresh
{"create":{ }}
{ "@timestamp": "2020-12-08T11:04:05.000Z", "user": { "id": "vlb44hny" }, "message": "Login attempt failed" }
{"create":{ }}
{ "@timestamp": "2020-12-08T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
{"create":{ }}
{ "@timestamp": "2020-12-09T11:07:08.000Z", "user": { "id": "l7gk7f82" }, "message": "Logout successful" }
----
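
The bulk body is newline-delimited JSON: a `create` action line followed by the document source, with a trailing newline. A minimal Python sketch of building such a body (the `bulk_create_body` helper is hypothetical, not a client API):

```python
import json

def bulk_create_body(docs):
    """Serialize documents as a bulk request body for a data stream.

    Every action must be "create"; each action line is followed by the
    document source, and the body must end with a newline.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"create": {}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

body = bulk_create_body([
    {"@timestamp": "2020-12-08T11:04:05.000Z", "message": "Login attempt failed"},
])
print(body)
```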
[discrete]
[[data-streams-index-with-an-ingest-pipeline]]
==== Index with an ingest pipeline

You can use an <<ingest,ingest pipeline>> with an indexing request to
pre-process data before it's indexed to a data stream.

The following <<put-pipeline-api,put pipeline API>> request creates the
`lowercase_message_field` ingest pipeline. The pipeline uses the
<<lowercase-processor,`lowercase` ingest processor>> to change the `message`
field value to lowercase before indexing.

[source,console]
----
PUT /_ingest/pipeline/lowercase_message_field
{
  "description" : "Lowercases the message field value",
  "processors" : [
    {
      "lowercase" : {
        "field" : "message"
      }
    }
  ]
}
----
// TEST[continued]

The following index API request adds a new document to `my-data-stream`.

The request includes a `?pipeline=lowercase_message_field` query parameter.
This parameter indicates {es} should use the `lowercase_message_field` pipeline
to pre-process the document before indexing it.

During pre-processing, the pipeline changes the letter case of the document's
`message` field value from `LOGIN Successful` to `login successful`.

[source,console]
----
POST /my-data-stream/_doc?pipeline=lowercase_message_field
{
  "@timestamp": "2020-12-08T11:12:01.000Z",
  "user": {
    "id": "I1YBEOxJ"
  },
  "message": "LOGIN Successful"
}
----
// TEST[continued]

////
[source,console]
----
DELETE /_ingest/pipeline/lowercase_message_field
----
// TEST[continued]
////
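
For reference, the transformation the `lowercase` processor applies to this document can be modeled in a few lines of Python. This is a simplified, illustrative model; real ingest processors run inside {es} and support additional options:

```python
def lowercase_field(doc, field):
    """Model of the lowercase ingest processor: lowercase one field's value.

    Pre-processing happens before indexing, so the original document is
    left unmodified and a transformed copy is returned.
    """
    doc = dict(doc)
    doc[field] = doc[field].lower()
    return doc

doc = {"@timestamp": "2020-12-08T11:12:01.000Z", "message": "LOGIN Successful"}
print(lowercase_field(doc, "message")["message"])  # login successful
```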
[discrete]
[[search-a-data-stream]]
=== Search a data stream

The following search APIs support data streams:

* <<search-search, Search>>
* <<async-search, Async search>>
* <<search-multi-search, Multi search>>
* <<search-field-caps, Field capabilities>>
* <<eql-search-api, EQL search>>

The following <<search-search,search API>> request searches `my-data-stream`
for documents with a timestamp between yesterday and today that also have a
`message` value of `login successful`.

[source,console]
----
GET /my-data-stream/_search
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "@timestamp": {
            "gte": "now-1d/d",
            "lt": "now/d"
          }
        }
      },
      "should": {
        "match": {
          "message": "login successful"
        }
      }
    }
  }
}
----

You can use a comma-separated list or wildcard (`*`) expression to search
multiple data streams, indices, and index aliases in the same request.

The following request searches `my-data-stream` and `my-data-stream-alt`,
which are specified as a comma-separated list in the request path.

[source,console]
----
GET /my-data-stream,my-data-stream-alt/_search
{
  "query": {
    "match": {
      "user.id": "8a4f500d"
    }
  }
}
----

The following request uses the `my-data-stream*` wildcard expression to search any data
stream, index, or index alias beginning with `my-data-stream`.

[source,console]
----
GET /my-data-stream*/_search
{
  "query": {
    "match": {
      "user.id": "vlb44hny"
    }
  }
}
----

The following search request omits a target in the request path. The request
searches all data streams and indices in the cluster.

[source,console]
----
GET /_search
{
  "query": {
    "match": {
      "user.id": "l7gk7f82"
    }
  }
}
----
[discrete]
[[get-stats-for-a-data-stream]]
=== Get statistics for a data stream

You can use the <<data-stream-stats-api,data stream stats API>> to retrieve
statistics for one or more data streams. These statistics include:

* A count of the stream's backing indices
* The total store size of all shards for the stream's backing indices
* The highest `@timestamp` value for the stream

.*Example*
[%collapsible]
====
The following data stream stats API request retrieves statistics for
`my-data-stream`.

[source,console]
----
GET /_data_stream/my-data-stream/_stats?human=true
----

The API returns the following response.

[source,console-result]
----
{
  "_shards": {
    "total": 6,
    "successful": 3,
    "failed": 0
  },
  "data_stream_count": 1,
  "backing_indices": 3,
  "total_store_size": "624b",
  "total_store_size_bytes": 624,
  "data_streams": [
    {
      "data_stream": "my-data-stream",
      "backing_indices": 3,
      "store_size": "624b",
      "store_size_bytes": 624,
      "maximum_timestamp": 1607339167000
    }
  ]
}
----
// TESTRESPONSE[s/"total_store_size": "624b"/"total_store_size": $body.total_store_size/]
// TESTRESPONSE[s/"total_store_size_bytes": 624/"total_store_size_bytes": $body.total_store_size_bytes/]
// TESTRESPONSE[s/"store_size": "624b"/"store_size": $body.data_streams.0.store_size/]
// TESTRESPONSE[s/"store_size_bytes": 624/"store_size_bytes": $body.data_streams.0.store_size_bytes/]
====
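
Note that `maximum_timestamp` is reported in epoch milliseconds. A Python sketch of pulling the per-stream fields out of a stats response and converting the timestamp to ISO 8601 (the `summarize_stats` helper is hypothetical):

```python
from datetime import datetime, timezone

def summarize_stats(stats):
    """Extract per-stream statistics from a data stream stats response,
    converting maximum_timestamp (epoch milliseconds) to an ISO 8601 string."""
    out = {}
    for ds in stats["data_streams"]:
        ts = datetime.fromtimestamp(ds["maximum_timestamp"] / 1000, tz=timezone.utc)
        out[ds["data_stream"]] = {
            "backing_indices": ds["backing_indices"],
            "store_size_bytes": ds["store_size_bytes"],
            # keep millisecond precision, e.g. 2020-12-07T11:06:07.000Z
            "maximum_timestamp": ts.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
        }
    return out

response = {
    "data_streams": [
        {"data_stream": "my-data-stream", "backing_indices": 3,
         "store_size_bytes": 624, "maximum_timestamp": 1607339167000}
    ]
}
print(summarize_stats(response))
```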
[discrete]
[[manually-roll-over-a-data-stream]]
=== Manually roll over a data stream

A rollover creates a new backing index for a data stream. This new backing index
becomes the stream's <<data-stream-write-index,write index>> and increments
the stream's <<data-streams-generation,generation>>.

In most cases, we recommend using <<index-lifecycle-management,{ilm-init}>> to
automate rollovers for data streams. This lets you automatically roll over the
current write index when it meets specified criteria, such as a maximum age or
size.

However, you can also use the <<indices-rollover-index,rollover API>> to
manually perform a rollover. This can be useful if you want to
<<data-streams-change-mappings-and-settings,apply mapping or setting changes>>
to the stream's write index after updating a data stream's template.

The following <<indices-rollover-index,rollover API>> request submits a manual
rollover request for `my-data-stream`.

[source,console]
----
POST /my-data-stream/_rollover/
----
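
The backing index names shown throughout this page (`.ds-my-data-stream-000001`, `.ds-my-data-stream-000002`, and so on) follow a `.ds-<stream>-<generation>` pattern with a six-digit, zero-padded generation. The following sketch is illustrative only; the actual name is assigned by {es} at rollover time:

```python
def next_write_index(stream, generation):
    """Name of the backing index a rollover creates for a stream currently
    at the given generation, under the .ds-<stream>-<generation> convention
    used in this page's examples (generation zero-padded to six digits)."""
    return f".ds-{stream}-{generation + 1:06d}"

# A stream at generation 2 rolls over to its third backing index.
print(next_write_index("my-data-stream", 2))
```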
[discrete]
[[open-closed-backing-indices]]
=== Open closed backing indices

You may <<indices-close,close>> one or more of a data stream's backing indices
as part of its {ilm-init} lifecycle or another workflow. A closed backing index
cannot be searched, even for searches targeting its data stream. You also can't
<<update-docs-in-a-data-stream-by-query,update>> or
<<delete-docs-in-a-data-stream-by-query,delete>> documents in a closed index.

You can re-open individual backing indices by sending an
<<indices-open-close,open request>> directly to the index.

You can also conveniently re-open all closed backing indices for a data stream
by sending an open request directly to the stream.

The following <<cat-indices,cat indices>> API request retrieves the status for
`my-data-stream`'s backing indices.

////
[source,console]
----
POST /.ds-my-data-stream-000001,.ds-my-data-stream-000002/_close/
----
////

[source,console]
----
GET /_cat/indices/my-data-stream?v&s=index&h=index,status
----
// TEST[continued]

The API returns the following response. The response indicates
`my-data-stream` contains two closed backing indices:
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

[source,txt]
----
index                     status
.ds-my-data-stream-000001 close
.ds-my-data-stream-000002 close
.ds-my-data-stream-000003 open
----
// TESTRESPONSE[non_json]

The following <<indices-open-close,open API>> request re-opens any closed
backing indices for `my-data-stream`, including
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002`.

[source,console]
----
POST /my-data-stream/_open/
----
// TEST[continued]

You can resubmit the original cat indices API request to verify
`.ds-my-data-stream-000001` and `.ds-my-data-stream-000002` were re-opened.

[source,console]
----
GET /_cat/indices/my-data-stream?v&s=index&h=index,status
----
// TEST[continued]

The API returns the following response.

[source,txt]
----
index                     status
.ds-my-data-stream-000001 open
.ds-my-data-stream-000002 open
.ds-my-data-stream-000003 open
----
// TESTRESPONSE[non_json]
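
Cat API responses like the one above are plain text. If you need to act on them programmatically, a sketch of parsing the two-column output into a dictionary (the `parse_cat_indices` helper is hypothetical, not part of any client):

```python
def parse_cat_indices(text):
    """Parse the text response of GET /_cat/indices?v&...&h=index,status
    into an {index: status} dict. Assumes the ?v header row is present."""
    lines = [line.split() for line in text.strip().splitlines()]
    header, rows = lines[0], lines[1:]
    assert header == ["index", "status"]
    return {index: status for index, status in rows}

cat_output = """\
index                     status
.ds-my-data-stream-000001 open
.ds-my-data-stream-000002 open
.ds-my-data-stream-000003 open
"""
# List any backing indices still closed after the open request.
closed = [i for i, s in parse_cat_indices(cat_output).items() if s == "close"]
print(closed)
```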
[discrete]
[[reindex-with-a-data-stream]]
=== Reindex with a data stream

You can use the <<docs-reindex,reindex API>> to copy documents to a data stream
from an existing index, index alias, or data stream.

A reindex copies documents from a _source_ to a _destination_. The source and
destination can be any pre-existing index, index alias, or data stream. However,
the source and destination must be different. You cannot reindex a data stream
into itself.

Because data streams are <<data-streams-append-only,append-only>>, a reindex
request to a data stream destination must have an `op_type` of `create`. This
means a reindex can only add new documents to a data stream. It cannot update
existing documents in the data stream destination.

A reindex can be used to:

* Convert an existing index alias and collection of time-based indices into a
data stream.

* Apply a new or updated <<create-a-data-stream-template,index template>>
by reindexing an existing data stream into a new one. This applies mapping
and setting changes in the template to each document and backing index of the
data stream destination. See
<<data-streams-use-reindex-to-change-mappings-settings>>.

TIP: If you only want to update the mappings or settings of a data stream's
write index, we recommend you update the <<create-a-data-stream-template,data
stream's template>> and perform a <<manually-roll-over-a-data-stream,rollover>>.

The following reindex request copies documents from the `archive` index alias to
`my-data-stream`. Because the destination is a data
stream, the request's `op_type` is `create`.

////
[source,console]
----
PUT /_bulk?refresh=wait_for
{"create":{"_index" : "archive_1"}}
{ "@timestamp": "2020-12-08T11:04:05.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-08T11:06:07.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }
{"create":{"_index" : "archive_2"}}
{ "@timestamp": "2020-12-09T11:07:08.000Z" }

POST /_aliases
{
  "actions" : [
    { "add" : { "index" : "archive_1", "alias" : "archive" } },
    { "add" : { "index" : "archive_2", "alias" : "archive", "is_write_index" : true} }
  ]
}
----
////

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "archive"
  },
  "dest": {
    "index": "my-data-stream",
    "op_type": "create"
  }
}
----
// TEST[continued]

You can also reindex documents from a data stream to an index, index
alias, or data stream.

The following reindex request copies documents from `my-data-stream`
to the existing `archive` index alias. Because the destination is not a
data stream, the `op_type` does not need to be specified.

[source,console]
----
POST /_reindex
{
  "source": {
    "index": "my-data-stream"
  },
  "dest": {
    "index": "archive"
  }
}
----
// TEST[continued]
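
The only difference between the two requests is the `op_type` on the destination. A Python sketch of building the request body for either case (the `reindex_body` helper is hypothetical; a real client would also need to know whether the destination is a data stream):

```python
import json

def reindex_body(source, dest, dest_is_data_stream):
    """Build a _reindex request body. A data stream destination is
    append-only, so op_type must be "create"; for a plain index or
    index alias the parameter can be omitted."""
    dest_clause = {"index": dest}
    if dest_is_data_stream:
        dest_clause["op_type"] = "create"
    return json.dumps({"source": {"index": source}, "dest": dest_clause})

print(reindex_body("archive", "my-data-stream", dest_is_data_stream=True))
print(reindex_body("my-data-stream", "archive", dest_is_data_stream=False))
```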
[discrete]
[[update-docs-in-a-data-stream-by-query]]
=== Update documents in a data stream by query

You cannot send indexing or update requests for existing documents directly to a
data stream. These prohibited requests include:

* An <<docs-index_,index API>> request with an
<<docs-index-api-op_type,`op_type`>> of `index`. The `op_type` parameter
defaults to `index` for existing documents.

* A <<docs-bulk,bulk API>> request using the `index` or `update`
action.

Instead, you can use the <<docs-update-by-query,update by query API>> to update
documents in a data stream that match a provided query.

The following update by query request updates documents in `my-data-stream`
with a `user.id` of `l7gk7f82`. The request uses a
<<modules-scripting-using,script>> to assign matching documents a new `user.id`
value of `XgdX0NoX`.

[source,console]
----
POST /my-data-stream/_update_by_query
{
  "query": {
    "match": {
      "user.id": "l7gk7f82"
    }
  },
  "script": {
    "source": "ctx._source.user.id = params.new_id",
    "params": {
      "new_id": "XgdX0NoX"
    }
  }
}
----
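
The Painless script above assigns `params.new_id` to each matching document's `user.id`. The same logic, modeled on plain Python dictionaries for illustration (a simplified model of what update by query does server-side, not a client call):

```python
def update_by_query(docs, match_id, new_id):
    """Model of the request above: for every document whose user.id
    matches, apply the equivalent of
    ctx._source.user.id = params.new_id."""
    updated = 0
    for doc in docs:
        if doc["user"]["id"] == match_id:
            doc["user"]["id"] = new_id
            updated += 1
    return updated  # analogous to the "updated" count in the API response

docs = [
    {"user": {"id": "l7gk7f82"}, "message": "Logout successful"},
    {"user": {"id": "8a4f500d"}, "message": "Login successful"},
]
print(update_by_query(docs, "l7gk7f82", "XgdX0NoX"))  # 1
```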
[discrete]
[[delete-docs-in-a-data-stream-by-query]]
=== Delete documents in a data stream by query

You cannot send document deletion requests directly to a data stream. These
prohibited requests include:

* A <<docs-delete,delete API>> request

* A <<docs-bulk,bulk API>> request using the `delete` action.

Instead, you can use the <<docs-delete-by-query,delete by query API>> to delete
documents in a data stream that match a provided query.

The following delete by query request deletes documents in `my-data-stream`
with a `user.id` of `vlb44hny`.

[source,console]
----
POST /my-data-stream/_delete_by_query
{
  "query": {
    "match": {
      "user.id": "vlb44hny"
    }
  }
}
----
[discrete]
[[update-delete-docs-in-a-backing-index]]
=== Update or delete documents in a backing index

Alternatively, you can update or delete documents in a data stream by sending
the update or deletion request to the backing index containing the document. To
do this, you first need to get:

* The <<mapping-id-field,document ID>>
* The name of the backing index that contains the document

If you want to update a document, you must also get its current
<<optimistic-concurrency-control,sequence number and primary term>>.

You can use a <<search-a-data-stream,search request>> to retrieve this
information.

The following search request retrieves documents in `my-data-stream`
with a `user.id` of `yWIumJd7`. By default, this search returns the
document ID and backing index for any matching documents.

The request includes a `"seq_no_primary_term": true` argument. This means the
search also returns the sequence number and primary term for any matching
documents.

[source,console]
----
GET /my-data-stream/_search
{
  "seq_no_primary_term": true,
  "query": {
    "match": {
      "user.id": "yWIumJd7"
    }
  }
}
----

The API returns the following response. The `hits.hits` property contains
information for any documents matching the search.

[source,console-result]
----
{
  "took": 20,
  "timed_out": false,
  "_shards": {
    "total": 3,
    "successful": 3,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 1,
      "relation": "eq"
    },
    "max_score": 0.2876821,
    "hits": [
      {
        "_index": ".ds-my-data-stream-000003",      <1>
        "_id": "bfspvnIBr7VVZlfp2lqX",              <2>
        "_seq_no": 0,                               <3>
        "_primary_term": 1,                         <4>
        "_score": 0.2876821,
        "_source": {
          "@timestamp": "2020-12-07T11:06:07.000Z",
          "user": {
            "id": "yWIumJd7"
          },
          "message": "Login successful"
        }
      }
    ]
  }
}
----
// TESTRESPONSE[s/"took": 20/"took": $body.took/]
// TESTRESPONSE[s/"max_score": 0.2876821/"max_score": $body.hits.max_score/]
// TESTRESPONSE[s/"_score": 0.2876821/"_score": $body.hits.hits.0._score/]

<1> Backing index containing the matching document
<2> Document ID for the document
<3> Current sequence number for the document
<4> Primary term for the document
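
A client can pull these four values out of each hit to build the concurrency-safe follow-up request shown next. A Python sketch (the `concurrency_params` helper is hypothetical):

```python
def concurrency_params(hit):
    """Extract the backing index, document ID, and the sequence number /
    primary term needed for a safe update from a search hit."""
    return {
        "index": hit["_index"],
        "id": hit["_id"],
        "if_seq_no": hit["_seq_no"],
        "if_primary_term": hit["_primary_term"],
    }

hit = {"_index": ".ds-my-data-stream-000003", "_id": "bfspvnIBr7VVZlfp2lqX",
       "_seq_no": 0, "_primary_term": 1}
p = concurrency_params(hit)
print(f"PUT /{p['index']}/_doc/{p['id']}"
      f"?if_seq_no={p['if_seq_no']}&if_primary_term={p['if_primary_term']}")
```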
You can use an <<docs-index_,index API>> request to update an individual
document. To prevent an accidental overwrite, this request must include valid
`if_seq_no` and `if_primary_term` arguments.

The following index API request updates an existing document in
`my-data-stream`. The request targets document ID
`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

The request also includes the current sequence number and primary term in the
respective `if_seq_no` and `if_primary_term` query parameters. The request body
contains a new JSON source for the document.

[source,console]
----
PUT /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX?if_seq_no=0&if_primary_term=1
{
  "@timestamp": "2020-12-07T11:06:07.000Z",
  "user": {
    "id": "8a4f500d"
  },
  "message": "Login successful"
}
----

You can use the <<docs-delete,delete API>> to delete individual documents.
Deletion requests do not require a sequence number or primary term.

The following delete API request deletes an existing document in
`my-data-stream`. The request targets document ID
`bfspvnIBr7VVZlfp2lqX` in the `.ds-my-data-stream-000003` backing index.

[source,console]
----
DELETE /.ds-my-data-stream-000003/_doc/bfspvnIBr7VVZlfp2lqX
----

You can use the <<docs-bulk,bulk API>> to delete or update multiple documents in
one request using `delete`, `index`, or `update` actions.

If the action type is `index`, the action must include valid
<<bulk-optimistic-concurrency-control,`if_seq_no` and `if_primary_term`>>
arguments.

The following bulk API request uses an `index` action to update an existing
document in `my-data-stream`.

The `index` action targets document ID `bfspvnIBr7VVZlfp2lqX` in the
`.ds-my-data-stream-000003` backing index. The action also includes the current
sequence number and primary term in the respective `if_seq_no` and
`if_primary_term` parameters.

[source,console]
----
PUT /_bulk?refresh
{ "index": { "_index": ".ds-my-data-stream-000003", "_id": "bfspvnIBr7VVZlfp2lqX", "if_seq_no": 0, "if_primary_term": 1 } }
{ "@timestamp": "2020-12-07T11:06:07.000Z", "user": { "id": "8a4f500d" }, "message": "Login successful" }
----
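
As with `create`, each bulk `index` action is a pair of NDJSON lines: the action metadata (here carrying `if_seq_no` and `if_primary_term`) followed by the new document source. A Python sketch of building such a pair (the `bulk_index_action` helper is hypothetical):

```python
import json

def bulk_index_action(index, doc_id, seq_no, primary_term, source):
    """Build the two NDJSON lines for a concurrency-checked bulk index
    action targeting a specific backing index."""
    action = {"index": {"_index": index, "_id": doc_id,
                        "if_seq_no": seq_no, "if_primary_term": primary_term}}
    return json.dumps(action) + "\n" + json.dumps(source) + "\n"

lines = bulk_index_action(".ds-my-data-stream-000003", "bfspvnIBr7VVZlfp2lqX",
                          0, 1, {"message": "Login successful"})
print(lines)
```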