
[[grok-processor]]
=== Grok processor
++++
<titleabbrev>Grok</titleabbrev>
++++

Extracts structured fields out of a single text field within a document. You choose which field to
extract matched fields from, as well as the grok pattern you expect will match. A grok pattern is like a regular
expression that supports aliased expressions that can be reused.

This processor comes packaged with many
https://github.com/elastic/elasticsearch/blob/{branch}/libs/grok/src/main/resources/patterns[reusable patterns].

If you need help building patterns to match your logs, you will find the
{kibana-ref}/xpack-grokdebugger.html[Grok Debugger] tool quite useful! The
https://grokconstructor.appspot.com[Grok Constructor] is also a useful tool.
[[using-grok]]
==== Using the Grok Processor in a Pipeline

[[grok-options]]
.Grok Options
[options="header"]
|======
| Name | Required | Default | Description
| `field` | yes | - | The field to use for grok expression parsing
| `patterns` | yes | - | An ordered list of grok expressions to match and extract named captures with. Returns on the first expression in the list that matches.
| `pattern_definitions` | no | - | A map of pattern-name and pattern tuples defining custom patterns to be used by the current processor. Patterns matching existing names will override the pre-existing definition.
| `trace_match` | no | false | When `true`, `_ingest._grok_match_index` will be inserted into your matched document's metadata with the index into the pattern found in `patterns` that matched.
| `ignore_missing` | no | false | If `true` and `field` does not exist or is `null`, the processor quietly exits without modifying the document.
include::common-options.asciidoc[]
|======
Here is an example of using the provided patterns to extract out and name structured fields from a string field in
a document.

[source,console]
--------------------------------------------------
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description" : "...",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes:int} %{NUMBER:duration:double}"]
        }
      }
    ]
  },
  "docs":[
    {
      "_source": {
        "message": "55.3.244.1 GET /index.html 15824 0.043"
      }
    }
  ]
}
--------------------------------------------------
This pipeline will insert these named captures as new fields within the document, like so:

[source,console-result]
--------------------------------------------------
{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_id": "_id",
        "_source" : {
          "duration" : 0.043,
          "request" : "/index.html",
          "method" : "GET",
          "bytes" : 15824,
          "client" : "55.3.244.1",
          "message" : "55.3.244.1 GET /index.html 15824 0.043"
        },
        "_ingest": {
          "timestamp": "2016-11-08T19:43:03.850+0000"
        }
      }
    }
  ]
}
--------------------------------------------------
// TESTRESPONSE[s/2016-11-08T19:43:03.850\+0000/$body.docs.0.doc._ingest.timestamp/]
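Under the hood, each `%{SYNTAX:SEMANTIC}` construct expands to a regular expression with a named capture group, and the optional third part (`:int`, `:double`) requests a type conversion on the captured string. The following Python sketch illustrates that idea; the regexes here are simplified stand-ins for the real `IP`, `WORD`, `URIPATHPARAM`, and `NUMBER` definitions, and the `grok_match` helper is illustrative, not an Elasticsearch API.

[source,python]
```python
import re

# Simplified stand-ins for the packaged pattern definitions (not the real ones).
pattern = re.compile(
    r"(?P<client>\d{1,3}(?:\.\d{1,3}){3}) "  # %{IP:client} (simplified IPv4)
    r"(?P<method>\w+) "                      # %{WORD:method}
    r"(?P<request>\S+) "                     # %{URIPATHPARAM:request} (simplified)
    r"(?P<bytes>\d+) "                       # %{NUMBER:bytes:int}
    r"(?P<duration>\d+\.\d+)"                # %{NUMBER:duration:double}
)

# The :int / :double suffixes map to type conversions applied after matching.
conversions = {"bytes": int, "duration": float}

def grok_match(message):
    m = pattern.match(message)
    if m is None:
        return None
    fields = m.groupdict()
    for name, convert in conversions.items():
        fields[name] = convert(fields[name])
    return fields

print(grok_match("55.3.244.1 GET /index.html 15824 0.043"))
# {'client': '55.3.244.1', 'method': 'GET', 'request': '/index.html',
#  'bytes': 15824, 'duration': 0.043}
```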
[[custom-patterns]]
==== Custom Patterns

The Grok processor comes pre-packaged with a base set of patterns. These patterns may not always have
what you are looking for. Patterns have a very basic format: each entry has a name and the pattern itself.

You can add your own patterns to a processor definition under the `pattern_definitions` option.
Here is an example of a pipeline specifying custom pattern definitions:

[source,js]
--------------------------------------------------
{
  "description" : "...",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["my %{FAVORITE_DOG:dog} is colored %{RGB:color}"],
        "pattern_definitions" : {
          "FAVORITE_DOG" : "beagle",
          "RGB" : "RED|GREEN|BLUE"
        }
      }
    }
  ]
}
--------------------------------------------------
// NOTCONSOLE
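Conceptually, `pattern_definitions` adds entries to the pattern dictionary before the grok expression is compiled: each `%{NAME:field}` reference is replaced by its definition, wrapped in a named capture group. A hypothetical Python sketch of that expansion step (the `expand` helper is illustrative only, and handles just the simple `%{NAME:field}` form):

[source,python]
```python
import re

# Custom definitions, as in the pattern_definitions example above.
pattern_definitions = {
    "FAVORITE_DOG": "beagle",
    "RGB": "RED|GREEN|BLUE",
}

def expand(grok_pattern, definitions):
    """Rewrite each %{NAME:field} as a named capture group over its definition."""
    def substitute(match):
        name, field = match.group(1), match.group(2)
        return "(?P<%s>%s)" % (field, definitions[name])
    return re.sub(r"%\{(\w+):(\w+)\}", substitute, grok_pattern)

regex = expand("my %{FAVORITE_DOG:dog} is colored %{RGB:color}", pattern_definitions)
m = re.match(regex, "my beagle is colored RED")
print(m.groupdict())  # {'dog': 'beagle', 'color': 'RED'}
```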
[[trace-match]]
==== Providing Multiple Match Patterns

Sometimes one pattern is not enough to capture the potential structure of a field. Suppose we
want to match all messages that contain your favorite pet breeds, whether cats or dogs. One way to accomplish
this is to provide two distinct patterns that can be matched, instead of one really complicated expression capturing
the same `or` behavior.

Here is an example of such a configuration executed against the simulate API:

[source,console]
--------------------------------------------------
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description" : "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{FAVORITE_DOG:pet}", "%{FAVORITE_CAT:pet}"],
          "pattern_definitions" : {
            "FAVORITE_DOG" : "beagle",
            "FAVORITE_CAT" : "burmese"
          }
        }
      }
    ]
  },
  "docs":[
    {
      "_source": {
        "message": "I love burmese cats!"
      }
    }
  ]
}
--------------------------------------------------
Response:

[source,console-result]
--------------------------------------------------
{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_id": "_id",
        "_source": {
          "message": "I love burmese cats!",
          "pet": "burmese"
        },
        "_ingest": {
          "timestamp": "2016-11-08T19:43:03.850+0000"
        }
      }
    }
  ]
}
--------------------------------------------------
// TESTRESPONSE[s/2016-11-08T19:43:03.850\+0000/$body.docs.0.doc._ingest.timestamp/]
Both patterns will set the field `pet` with the appropriate match, but what if we want to trace which of our
patterns matched and populated our fields? We can do this with the `trace_match` parameter. Here is the output of
that same pipeline, but with `"trace_match": true` configured:

////
Hidden setup for example:
[source,console]
--------------------------------------------------
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description" : "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{FAVORITE_DOG:pet}", "%{FAVORITE_CAT:pet}"],
          "trace_match": true,
          "pattern_definitions" : {
            "FAVORITE_DOG" : "beagle",
            "FAVORITE_CAT" : "burmese"
          }
        }
      }
    ]
  },
  "docs":[
    {
      "_source": {
        "message": "I love burmese cats!"
      }
    }
  ]
}
--------------------------------------------------
////
[source,console-result]
--------------------------------------------------
{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_id": "_id",
        "_source": {
          "message": "I love burmese cats!",
          "pet": "burmese"
        },
        "_ingest": {
          "_grok_match_index": "1",
          "timestamp": "2016-11-08T19:43:03.850+0000"
        }
      }
    }
  ]
}
--------------------------------------------------
// TESTRESPONSE[s/2016-11-08T19:43:03.850\+0000/$body.docs.0.doc._ingest.timestamp/]

In the above response, you can see that the index of the pattern that matched was `"1"`. That is, it was the
second pattern in `patterns` to match (the index starts at zero).

This trace metadata enables debugging which of the patterns matched. This information is stored in the ingest
metadata and will not be indexed.
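The first-match semantics and the recorded index can be sketched in a few lines of Python, with `FAVORITE_DOG` and `FAVORITE_CAT` already expanded to their plain-regex definitions (the `first_match` helper is illustrative, not an Elasticsearch API):

[source,python]
```python
import re

# Ordered patterns, tried one at a time; FAVORITE_DOG and FAVORITE_CAT expanded.
patterns = ["beagle", "burmese"]

def first_match(message, patterns):
    """Return the first pattern's capture plus the matched pattern's index."""
    for index, pattern in enumerate(patterns):
        m = re.search(pattern, message)
        if m is not None:
            # trace_match records the index as a string in _ingest metadata.
            return {"pet": m.group(0), "_grok_match_index": str(index)}
    return None

print(first_match("I love burmese cats!", patterns))
# {'pet': 'burmese', '_grok_match_index': '1'}
```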
[[grok-processor-rest-get]]
==== Retrieving patterns from REST endpoint

The Grok processor comes packaged with its own REST endpoint for retrieving the patterns it ships with.

[source,console]
--------------------------------------------------
GET _ingest/processor/grok
--------------------------------------------------

The above request will return a response body containing a key-value representation of the built-in patterns dictionary.

[source,js]
--------------------------------------------------
{
  "patterns" : {
    "BACULA_CAPACITY" : "%{INT}{1,3}(,%{INT}{3})*",
    "PATH" : "(?:%{UNIXPATH}|%{WINPATH})",
    ...
  }
}
--------------------------------------------------
// NOTCONSOLE

By default, the API returns patterns in the order they are read from disk. This
sort order preserves groupings of related patterns. For example, all patterns
related to parsing Linux syslog lines stay grouped together.

You can use the optional boolean `s` query parameter to sort returned patterns
by key name instead.
[source,console]
--------------------------------------------------
GET _ingest/processor/grok?s
--------------------------------------------------

The API returns the following response.

[source,js]
--------------------------------------------------
{
  "patterns" : {
    "BACULA_CAPACITY" : "%{INT}{1,3}(,%{INT}{3})*",
    "BACULA_DEVICE" : "%{USER}",
    "BACULA_DEVICEPATH" : "%{UNIXPATH}",
    ...
  }
}
--------------------------------------------------
// NOTCONSOLE

This can be useful to reference as the built-in patterns change across versions.
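The difference between the two orderings amounts to sorting the pattern dictionary by key name; a small Python sketch using a few of the patterns shown above:

[source,python]
```python
# Default ordering preserves read order (related patterns grouped together);
# the ?s parameter instead sorts the dictionary by key name.
patterns = {
    "PATH": "(?:%{UNIXPATH}|%{WINPATH})",
    "BACULA_CAPACITY": "%{INT}{1,3}(,%{INT}{3})*",
    "BACULA_DEVICE": "%{USER}",
}

sorted_patterns = dict(sorted(patterns.items()))
print(list(sorted_patterns))  # ['BACULA_CAPACITY', 'BACULA_DEVICE', 'PATH']
```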
[[grok-watchdog]]
==== Grok watchdog

Grok expressions that take too long to execute are interrupted and
the grok processor then fails with an exception. The grok
processor has a watchdog thread that determines when evaluation of
a grok expression takes too long; it is controlled by the following
settings:

[[grok-watchdog-options]]
.Grok watchdog settings
[options="header"]
|======
| Name | Default | Description
| `ingest.grok.watchdog.interval` | 1s | How often to check whether there are grok evaluations that take longer than the maximum allowed execution time.
| `ingest.grok.watchdog.max_execution_time` | 1s | The maximum allowed execution time of a grok expression evaluation.
|======
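The watchdog idea can be sketched as a periodic check of in-flight evaluations against the maximum allowed execution time. This Python sketch is purely illustrative; all names in it are hypothetical, and the real watchdog interrupts the thread evaluating the expression rather than merely setting a flag:

[source,python]
```python
import time

# Stands in for ingest.grok.watchdog.max_execution_time (1s by default);
# shortened here so the example runs quickly.
MAX_EXECUTION_TIME = 0.1

class GrokEvaluation:
    """Tracks one in-flight grok expression evaluation (hypothetical)."""
    def __init__(self):
        self.started = time.monotonic()
        self.interrupted = False

def watchdog_check(active_evaluations):
    """Flag any evaluation that has exceeded the maximum allowed time.
    The real watchdog runs this every ingest.grok.watchdog.interval."""
    now = time.monotonic()
    for evaluation in active_evaluations:
        if now - evaluation.started > MAX_EXECUTION_TIME:
            evaluation.interrupted = True

evaluation = GrokEvaluation()
time.sleep(0.2)           # simulate a grok expression that runs too long
watchdog_check([evaluation])
print(evaluation.interrupted)  # True
```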
[[grok-debugging]]
==== Grok debugging

It is advised to use the {kibana-ref}/xpack-grokdebugger.html[Grok Debugger] to debug grok patterns. From there you can test one or more
patterns in the UI against sample data. Under the covers it uses the same engine as the ingest processor.

Additionally, it is recommended to enable debug logging for Grok so that any additional messages may also be seen in the Elasticsearch
server log.

[source,js]
--------------------------------------------------
PUT _cluster/settings
{
  "transient": {
    "logger.org.elasticsearch.ingest.common.GrokProcessor": "debug"
  }
}
--------------------------------------------------
// NOTCONSOLE