[role="xpack"]
[testenv="basic"]
[[inference-processor]]
=== {infer-cap} Processor

Uses a pre-trained {dfanalytics} model to infer against the data that is being
ingested in the pipeline.

[[inference-options]]
.{infer-cap} Options
[options="header"]
|======
| Name | Required | Default | Description
| `model_id` | yes | - | (String) The ID of the model to load and infer against.
| `target_field` | no | `ml.inference.<processor_tag>` | (String) Field added to incoming documents to contain results objects.
| `field_map` | no | The model's default field map, if defined | (Object) Maps the document field names to the known field names of the model. This mapping takes precedence over any default mappings provided in the model configuration.
| `inference_config` | no | The default settings defined in the model | (Object) Contains the inference type and its options. There are two types: <<inference-processor-regression-opt,`regression`>> and <<inference-processor-classification-opt,`classification`>>.
include::common-options.asciidoc[]
|======

[source,js]
--------------------------------------------------
{
  "inference": {
    "model_id": "flight_delay_regression-1571767128603",
    "target_field": "FlightDelayMin_prediction_infer",
    "field_map": {
      "your_field": "my_field"
    },
    "inference_config": { "regression": {} }
  }
}
--------------------------------------------------
// NOTCONSOLE
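
The snippet above configures the processor itself. As a minimal sketch of how
it might be wired into an ingest pipeline (the pipeline name and description
are placeholders chosen for this example):

[source,js]
--------------------------------------------------
PUT _ingest/pipeline/flight_delay_predictions
{
  "description": "Adds flight delay predictions to incoming documents",
  "processors": [
    {
      "inference": {
        "model_id": "flight_delay_regression-1571767128603",
        "inference_config": { "regression": {} }
      }
    }
  ]
}
--------------------------------------------------
// NOTCONSOLE

Documents indexed with this pipeline (for example, via the `pipeline` index
request parameter) are enriched with the model's predictions under the
`target_field` object.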

[discrete]
[[inference-processor-regression-opt]]
==== {regression-cap} configuration options

Regression configuration for inference.

`results_field`::
(Optional, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-results-field-processor]

`num_top_feature_importance_values`::
(Optional, integer)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-regression-num-top-feature-importance-values]

[discrete]
[[inference-processor-classification-opt]]
==== {classification-cap} configuration options

Classification configuration for inference.

`num_top_classes`::
(Optional, integer)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-classification-num-top-classes]

`num_top_feature_importance_values`::
(Optional, integer)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-classification-num-top-feature-importance-values]

`results_field`::
(Optional, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-results-field-processor]

`top_classes_results_field`::
(Optional, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-classification-top-classes-results-field]

`prediction_field_type`::
(Optional, string)
include::{es-repo-dir}/ml/ml-shared.asciidoc[tag=inference-config-classification-prediction-field-type]

[discrete]
[[inference-processor-config-example]]
==== `inference_config` examples

[source,js]
--------------------------------------------------
{
  "inference_config": {
    "regression": {
      "results_field": "my_regression"
    }
  }
}
--------------------------------------------------
// NOTCONSOLE

This configuration specifies a `regression` inference and the results are
written to the `my_regression` field contained in the `target_field` results
object.
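
For illustration only, assuming the processor's `target_field` is `ml.inference`
and the model predicts a value of `4.92` (both assumptions of this sketch), the
enriched document would contain something like:

[source,js]
--------------------------------------------------
{
  "ml": {
    "inference": {
      "my_regression": 4.92
    }
  }
}
--------------------------------------------------
// NOTCONSOLE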

[source,js]
--------------------------------------------------
{
  "inference_config": {
    "classification": {
      "num_top_classes": 2,
      "results_field": "prediction",
      "top_classes_results_field": "probabilities"
    }
  }
}
--------------------------------------------------
// NOTCONSOLE

This configuration specifies a `classification` inference. The number of
categories for which the predicted probabilities are reported is 2
(`num_top_classes`). The result is written to the `prediction` field and the top
classes to the `probabilities` field. Both fields are contained in the
`target_field` results object.
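
As a sketch only, assuming a `target_field` of `ml.inference` and a model whose
classes are `delayed` and `on_time` (the class names, probabilities, and the
exact keys of each top classes entry are illustrative and depend on the model),
the results object could look similar to:

[source,js]
--------------------------------------------------
{
  "ml": {
    "inference": {
      "prediction": "delayed",
      "probabilities": [
        { "class_name": "delayed", "class_probability": 0.87 },
        { "class_name": "on_time", "class_probability": 0.13 }
      ]
    }
  }
}
--------------------------------------------------
// NOTCONSOLE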

[discrete]
[[inference-processor-feature-importance]]
==== {feat-imp-cap} object mapping

To take full advantage of aggregating and searching for
{ml-docs}/ml-feature-importance.html[{feat-imp}], update your index mapping of
the {feat-imp} result field as follows:

[source,js]
--------------------------------------------------
"ml.inference.feature_importance": {
  "type": "nested",
  "dynamic": true,
  "properties": {
    "feature_name": {
      "type": "keyword"
    },
    "importance": {
      "type": "double"
    }
  }
}
--------------------------------------------------
// NOTCONSOLE

The mapping field name for {feat-imp} is compounded as follows:

`<ml.inference.target_field>`.`<inference.tag>`.`feature_importance`

If `inference.tag` is not provided in the processor definition, it is not part
of the field path. The `<ml.inference.target_field>` defaults to `ml.inference`.

For example, if you provide the tag `foo` in the processor definition:

[source,js]
--------------------------------------------------
{
  "tag": "foo",
  ...
}
--------------------------------------------------
// NOTCONSOLE

The {feat-imp} value is written to the `ml.inference.foo.feature_importance`
field.

You can also specify a target field as follows:

[source,js]
--------------------------------------------------
{
  "tag": "foo",
  "target_field": "my_field"
}
--------------------------------------------------
// NOTCONSOLE

In this case, {feat-imp} is exposed in the
`my_field.foo.feature_importance` field.
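
If you use such a non-default path, the mapping update shown earlier must follow
it. A minimal sketch, assuming an index named `my-index` (a placeholder):

[source,js]
--------------------------------------------------
PUT my-index/_mapping
{
  "properties": {
    "my_field": {
      "properties": {
        "foo": {
          "properties": {
            "feature_importance": {
              "type": "nested",
              "dynamic": true,
              "properties": {
                "feature_name": { "type": "keyword" },
                "importance": { "type": "double" }
              }
            }
          }
        }
      }
    }
  }
}
--------------------------------------------------
// NOTCONSOLE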