
[[downsampling]]
=== Downsampling a time series data stream

preview::[]

Downsampling provides a method to reduce the footprint of your <<tsds,time
series data>> by storing it at reduced granularity.

Metrics solutions collect large amounts of time series data that grow over time.
As that data ages, it becomes less relevant to the current state of the system.

The downsampling process rolls up documents within a fixed time interval into a
single summary document. Each summary document includes statistical
representations of the original data: the `min`, `max`, `sum`, `value_count`,
and `average` for each metric. Data stream <<time-series-dimension,time series
dimensions>> are stored unchanged.
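For example, an hour of samples for a single time series might be rolled up into a single summary document along these lines (the `host.name` dimension and `network.tx` metric are hypothetical, shown only as a sketch of the document shape):

```
{
  "@timestamp": "2023-03-09T13:00:00.000Z",
  "host.name": "my-host",
  "network.tx": {
    "min": 1000,
    "max": 9000,
    "sum": 105000,
    "value_count": 360
  }
}
```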
Downsampling, in effect, lets you trade data resolution and precision for
storage size. You can include it in an <<index-lifecycle-management,{ilm}
({ilm-init})>> policy to automatically manage the volume and associated cost of
your metrics data as it ages.

Check the following sections to learn more:

* <<how-downsampling-works>>
* <<running-downsampling>>
* <<querying-downsampled-indices>>
* <<downsampling-restrictions>>
* <<try-out-downsampling>>
[discrete]
[[how-downsampling-works]]
=== How it works

A <<time-series,time series>> is a sequence of observations taken over time for
a specific entity. The observed samples can be represented as a continuous
function, where the time series dimensions remain constant and the time series
metrics change over time.

//.Sampling a continuous function
image::images/data-streams/time-series-function.png[align="center"]

In an Elasticsearch index, a single document is created for each timestamp,
containing the immutable time series dimensions, together with the metric names
and the changing metric values. For a single timestamp, several time series
dimensions and metrics may be stored.

//.Metric anatomy
image::images/data-streams/time-series-metric-anatomy.png[align="center"]

For your most current and relevant data, the metrics series typically has a low
sampling time interval, so it's optimized for queries that require a high data
resolution.

.Original metrics series
image::images/data-streams/time-series-original.png[align="center"]
Downsampling works on older, less frequently accessed data by replacing the
original time series with both a data stream of a higher sampling interval and
statistical representations of that data. Where the original metric samples may
have been taken, for example, every ten seconds, as the data ages you may choose
to reduce the sample granularity to hourly or daily. You may choose to reduce
the granularity of `cold` archival data to monthly or less.

.Downsampled metrics series
image::images/data-streams/time-series-downsampled.png[align="center"]
[discrete]
[[running-downsampling]]
=== Running downsampling on time series data

To downsample a time series index, use the
<<indices-downsample-data-stream,Downsample API>> and set `fixed_interval` to
the level of granularity that you'd like:

```
POST /<source_index>/_downsample/<new_index>
{
  "fixed_interval": "1d"
}
```
To downsample time series data as part of ILM, include a
<<ilm-downsample,Downsample action>> in your ILM policy and set `fixed_interval`
to the level of granularity that you'd like:

```
PUT _ilm/policy/my_policy
{
  "policy": {
    "phases": {
      "warm": {
        "actions": {
          "downsample": {
            "fixed_interval": "1h"
          }
        }
      }
    }
  }
}
```
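For the policy to take effect, it must be attached to the data stream. As a sketch, this could be done through an index template for the stream; the template name, index pattern, and `host.name` routing path below are hypothetical placeholders:

```
PUT _index_template/my-metrics-template
{
  "index_patterns": ["metrics-mysvc-*"],
  "data_stream": {},
  "template": {
    "settings": {
      "index.mode": "time_series",
      "index.routing_path": ["host.name"],
      "index.lifecycle.name": "my_policy"
    }
  }
}
```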
[discrete]
[[querying-downsampled-indices]]
=== Querying downsampled indices

You can use the <<search-search,`_search`>> and <<async-search,`_async_search`>>
endpoints to query a downsampled index. Multiple raw data and downsampled
indices can be queried in a single request, and a single request can include
downsampled indices at different granularities (different bucket timespans). That
is, you can query data streams that contain downsampled indices with multiple
downsampling intervals (for example, `15m`, `1h`, `1d`).

The result of a time-based histogram aggregation has a uniform bucket size, and
each downsampled index returns data ignoring the downsampling time interval. For
example, if you run a `date_histogram` aggregation with `"fixed_interval": "1m"`
on a downsampled index that has been downsampled at an hourly resolution
(`"fixed_interval": "1h"`), the query returns one bucket with all of the data at
minute 0, then 59 empty buckets, and then a bucket with data again for the next
hour.
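A query like the following illustrates this behavior (the data stream name is hypothetical):

```
GET /my-metrics-stream/_search
{
  "size": 0,
  "aggs": {
    "per_minute": {
      "date_histogram": {
        "field": "@timestamp",
        "fixed_interval": "1m"
      }
    }
  }
}
```

Run against an hourly downsampled index, only the bucket at the top of each hour contains data; the intervening minute buckets come back empty.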
[NOTE]
====
There are a few things to note when querying downsampled indices:

* When you run queries in {kib} and through Elastic solutions, a normal
response is returned without notification that some of the queried indices are
downsampled.
* For
<<search-aggregations-bucket-datehistogram-aggregation,date histogram aggregations>>,
only `fixed_intervals` (and not calendar-aware intervals) are supported.
* Only Coordinated Universal Time (UTC) date-times are supported.
====
[discrete]
[[downsampling-restrictions]]
=== Restrictions and limitations

The following restrictions and limitations apply for downsampling:

* Only indices in a <<tsds,time series data stream>> are supported.
* Data is downsampled based on the time dimension only. All other dimensions are
copied to the new index without any modification.
* Within a data stream, a downsampled index replaces the original index and the
original index is deleted. Only one index can exist for a given time period.
* A source index must be in read-only mode for the downsampling process to
succeed. Check the <<downsampling-manual,Run downsampling manually>> example for
details.
* Downsampling data for the same period many times (downsampling of a
downsampled index) is supported. The downsampling interval must be a multiple of
the interval of the downsampled index.
* Downsampling is provided as an ILM action. See <<ilm-downsample,Downsample>>.
* The new, downsampled index is created on the data tier of the original index
and it inherits its settings (for example, the number of shards and replicas).
* Only the numeric `gauge` and `counter` <<mapping-field-meta,metric types>> are
supported.
* The downsampling configuration is extracted from the time series data stream
<<tsds-create-mappings-component-template,index mapping>>. The only additional
required setting is the downsampling `fixed_interval`.
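As the list above notes, the source index must be read-only before downsampling. When running the process manually, one way to achieve this is to add a write block to the source index first (the index name is illustrative):

```
PUT /my-time-series-index/_block/write
```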
[discrete]
[[try-out-downsampling]]
=== Try it out

To take downsampling for a test run, try our example of
<<downsampling-manual,running downsampling manually>>.

Downsampling can easily be added to your ILM policy. To learn how, try our
<<downsampling-ilm,Run downsampling with ILM>> example.