[[esql-process-data-with-dissect-and-grok]]
=== Data processing with DISSECT and GROK
++++
<titleabbrev>Data processing with DISSECT and GROK</titleabbrev>
++++

Your data may contain unstructured strings that you want to structure. This
makes it easier to analyze the data. For example, log messages may contain IP
addresses that you want to extract so you can find the most active IP addresses.

image::images/esql/unstructured-data.png[align="center",width=75%]

{es} can structure your data at index time or query time. At index time, you can
use the <<dissect-processor,Dissect>> and <<grok-processor,Grok>> ingest
processors, or the {ls} {logstash-ref}/plugins-filters-dissect.html[Dissect] and
{logstash-ref}/plugins-filters-grok.html[Grok] filters. At query time, you can
use the {esql} <<esql-dissect>> and <<esql-grok>> commands.

[[esql-grok-or-dissect]]
==== `DISSECT` or `GROK`? Or both?

`DISSECT` works by breaking up a string using a delimiter-based pattern. `GROK`
works similarly, but uses regular expressions. This makes `GROK` more powerful,
but generally also slower. `DISSECT` works well when data is reliably repeated.
`GROK` is a better choice when you really need the power of regular expressions,
for example when the structure of your text varies from row to row.

You can use both `DISSECT` and `GROK` for hybrid use cases: for example, when a
section of the line is reliably repeated, but the entire line is not. `DISSECT`
can deconstruct the section of the line that is repeated, and `GROK` can process
the remaining field values using regular expressions.
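
As a minimal sketch of this hybrid approach (the `message` value, column names,
and patterns below are illustrative and not taken from this guide), you could
`DISSECT` the fixed prefix of a log line and then `GROK` the free-form remainder:

[source,esql]
----
ROW message = "1.2.3.4 [2023-01-23T12:15:00.000Z] Disconnected: error 123"
| DISSECT message "%{clientip} [%{@timestamp}] %{rest}" // delimiter-based prefix
| GROK rest "%{WORD:event}.*%{NUMBER:code:int}"         // regex-based remainder
----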
[[esql-process-data-with-dissect]]
==== Process data with `DISSECT`

The <<esql-dissect>> processing command matches a string against a
delimiter-based pattern, and extracts the specified keys as columns.

For example, the following pattern:

[source,txt]
----
%{clientip} [%{@timestamp}] %{status}
----

matches a log line of this format:

[source,txt]
----
1.2.3.4 [2023-01-23T12:15:00.000Z] Connected
----

and results in adding the following columns to the input table:

[%header.monospaced.styled,format=dsv,separator=|]
|===
clientip:keyword | @timestamp:keyword | status:keyword
1.2.3.4 | 2023-01-23T12:15:00.000Z | Connected
|===
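
A complete query applying this pattern to an inline row might look like the
following sketch (the `message` value is made up for illustration):

[source,esql]
----
ROW message = "1.2.3.4 [2023-01-23T12:15:00.000Z] Connected"
| DISSECT message "%{clientip} [%{@timestamp}] %{status}"
| KEEP clientip, @timestamp, status
----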
[[esql-dissect-patterns]]
===== Dissect patterns

include::../ingest/processors/dissect.asciidoc[tag=intro-example-explanation]

An empty key `%{}` or a <<esql-named-skip-key,named skip key>> can be used to
match values, but exclude the value from the output.

All matched values are output as keyword string data types. Use the
<<esql-type-conversion-functions>> to convert to another data type.
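
For example, a sketch of converting the extracted `@timestamp` keyword value to
a `datetime` (reusing the same illustrative row as above):

[source,esql]
----
ROW message = "1.2.3.4 [2023-01-23T12:15:00.000Z] Connected"
| DISSECT message "%{clientip} [%{@timestamp}] %{status}"
| EVAL @timestamp = TO_DATETIME(@timestamp)
----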
Dissect also supports <<esql-dissect-key-modifiers,key modifiers>> that can
change dissect's default behavior. For example, you can instruct dissect to
ignore certain fields, append fields, skip over padding, etc.

[[esql-dissect-terminology]]
===== Terminology

dissect pattern::
the set of fields and delimiters describing the textual
format. Also known as a dissection.
The dissection is described using a set of `%{}` sections:
`%{a} - %{b} - %{c}`

field::
the text from `%{` to `}` inclusive.

delimiter::
the text between `}` and the next `%{` characters.
Any set of characters other than `%{`, `'not }'`, or `}` is a delimiter.

key::
+
--
the text between the `%{` and `}`, exclusive of the `?`, `+`, `&` prefixes
and the ordinal suffix.

Examples:

* `%{?aaa}` - the key is `aaa`
* `%{+bbb/3}` - the key is `bbb`
* `%{&ccc}` - the key is `ccc`
--

[[esql-dissect-examples]]
===== Examples

include::processing-commands/dissect.asciidoc[tag=examples]

[[esql-dissect-key-modifiers]]
===== Dissect key modifiers

include::../ingest/processors/dissect.asciidoc[tag=dissect-key-modifiers]

[[esql-dissect-key-modifiers-table]]
.Dissect key modifiers
[options="header",role="styled"]
|======
| Modifier | Name | Position | Example | Description | Details
| `->` | Skip right padding | (far) right | `%{keyname1->}` | Skips any repeated characters to the right | <<esql-dissect-modifier-skip-right-padding,link>>
| `+` | Append | left | `%{+keyname} %{+keyname}` | Appends two or more fields together | <<esql-append-modifier,link>>
| `+` with `/n` | Append with order | left and right | `%{+keyname/2} %{+keyname/1}` | Appends two or more fields together in the order specified | <<esql-append-order-modifier,link>>
| `?` | Named skip key | left | `%{?ignoreme}` | Skips the matched value in the output. Same behavior as `%{}` | <<esql-named-skip-key,link>>
| `*` and `&` | Reference keys | left | `%{*r1} %{&r1}` | Sets the output key as value of `*` and output value of `&` | <<esql-reference-keys,link>>
|======
[[esql-dissect-modifier-skip-right-padding]]
====== Right padding modifier (`->`)

include::../ingest/processors/dissect.asciidoc[tag=dissect-modifier-skip-right-padding]
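
As a small {esql} sketch (the log line is made up, and this assumes the padding
behavior described above), the `->` modifier lets the repeated spaces after the
timestamp be skipped so that `level` contains only `WARN`:

[source,esql]
----
ROW message = "1998-08-10T17:15:42          WARN"
| DISSECT message "%{ts->} %{level}"
----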
[[esql-append-modifier]]
====== Append modifier (`+`)

include::../ingest/processors/dissect.asciidoc[tag=append-modifier]

[[esql-append-order-modifier]]
====== Append with order modifier (`+` and `/n`)

include::../ingest/processors/dissect.asciidoc[tag=append-order-modifier]

[[esql-named-skip-key]]
====== Named skip key (`?`)

include::../ingest/processors/dissect.asciidoc[tag=named-skip-key]

[[esql-reference-keys]]
====== Reference keys (`*` and `&`)

include::../ingest/processors/dissect.asciidoc[tag=reference-keys]

[[esql-process-data-with-grok]]
==== Process data with `GROK`

The <<esql-grok>> processing command matches a string against a pattern based on
regular expressions, and extracts the specified keys as columns.

For example, the following pattern:

[source,txt]
----
%{IP:ip} \[%{TIMESTAMP_ISO8601:@timestamp}\] %{GREEDYDATA:status}
----

matches a log line of this format:

[source,txt]
----
1.2.3.4 [2023-01-23T12:15:00.000Z] Connected
----

and results in adding the following columns to the input table:

[%header.monospaced.styled,format=dsv,separator=|]
|===
@timestamp:keyword | ip:keyword | status:keyword
2023-01-23T12:15:00.000Z | 1.2.3.4 | Connected
|===
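
A complete query using this pattern might look like the sketch below (the
`message` row is illustrative). Note that inside an {esql} string literal each
`\` in the pattern has to be written as `\\`, as explained in <<esql-grok-regex>>:

[source,esql]
----
ROW message = "1.2.3.4 [2023-01-23T12:15:00.000Z] Connected"
| GROK message "%{IP:ip} \\[%{TIMESTAMP_ISO8601:@timestamp}\\] %{GREEDYDATA:status}"
----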
[[esql-grok-patterns]]
===== Grok patterns

The syntax for a grok pattern is `%{SYNTAX:SEMANTIC}`.

The `SYNTAX` is the name of the pattern that matches your text. For example,
`3.44` is matched by the `NUMBER` pattern and `55.3.244.1` is matched by the
`IP` pattern. The syntax is how you match.

The `SEMANTIC` is the identifier you give to the piece of text being matched.
For example, `3.44` could be the duration of an event, so you could call it
simply `duration`. Further, a string `55.3.244.1` might identify the `client`
making a request.

By default, matched values are output as keyword string data types. To convert a
semantic's data type, suffix it with the target data type. For example
`%{NUMBER:num:int}`, which converts the `num` semantic from a string to an
integer. Currently the only supported conversions are `int` and `float`. For
other types, use the <<esql-type-conversion-functions>>.
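
For example, the hypothetical query below extracts a number and stores it
directly as an integer; the log line and column name are made up for
illustration:

[source,esql]
----
ROW message = "took 42 ms"
| GROK message "took %{NUMBER:duration:int} ms"
----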
For an overview of the available patterns, refer to
{es-repo}/blob/{branch}/libs/grok/src/main/resources/patterns[GitHub]. You can
also retrieve a list of all patterns using a <<grok-processor-rest-get,REST
API>>.

[[esql-grok-regex]]
===== Regular expressions

Grok is based on regular expressions. Any regular expressions are valid in grok
as well. Grok uses the Oniguruma regular expression library. Refer to
https://github.com/kkos/oniguruma/blob/master/doc/RE[the Oniguruma GitHub
repository] for the full supported regexp syntax.

[NOTE]
====
Special regex characters like `[` and `]` need to be escaped with a `\`. For
example, in the earlier pattern:

[source,txt]
----
%{IP:ip} \[%{TIMESTAMP_ISO8601:@timestamp}\] %{GREEDYDATA:status}
----

In {esql} queries, the backslash character itself is a special character that
needs to be escaped with another `\`. For this example, the corresponding {esql}
query becomes:

[source.merge.styled,esql]
----
include::{esql-specs}/docs.csv-spec[tag=grokWithEscape]
----
====
[[esql-custom-patterns]]
===== Custom patterns

If grok doesn't have a pattern you need, you can use the Oniguruma syntax for
named capture which lets you match a piece of text and save it as a column:

[source,txt]
----
(?<field_name>the pattern here)
----

For example, postfix logs have a `queue id` that is a 10 or 11-character
hexadecimal value. This can be captured to a column named `queue_id` with:

[source,txt]
----
(?<queue_id>[0-9A-F]{10,11})
----
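
For example, the hypothetical query below (the log line is made up) captures
such a value into a `queue_id` column:

[source,esql]
----
ROW postfix_message = "BEF25A72965: message-id=<123@example.com>"
| GROK postfix_message "(?<queue_id>[0-9A-F]{10,11})"
----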
[[esql-grok-examples]]
===== Examples

include::processing-commands/grok.asciidoc[tag=examples]

[[esql-grok-debugger]]
===== Grok debugger

To write and debug grok patterns, you can use the
{kibana-ref}/xpack-grokdebugger.html[Grok Debugger]. It provides a UI for
testing patterns against sample data. Under the covers, it uses the same engine
as the `GROK` command.

[[esql-grok-limitations]]
===== Limitations

The `GROK` command does not support configuring <<custom-patterns,custom
patterns>>, or <<trace-match,multiple patterns>>. The `GROK` command is not
subject to <<grok-watchdog,Grok watchdog settings>>.