[role="xpack"]
[testenv="basic"]
[[get-inference]]
=== Get {infer} trained model API
[subs="attributes"]
++++
<titleabbrev>Get {infer} trained model</titleabbrev>
++++

Retrieves configuration information for a trained {infer} model.

experimental[]
[[ml-get-inference-request]]
==== {api-request-title}

`GET _ml/inference/` +
`GET _ml/inference/<model_id>` +
`GET _ml/inference/_all` +
`GET _ml/inference/<model_id1>,<model_id2>` +
`GET _ml/inference/<model_id_pattern*>`
[[ml-get-inference-prereq]]
==== {api-prereq-title}

If the {es} {security-features} are enabled, you must have the following
privileges:

* cluster: `monitor_ml`

For more information, see <<security-privileges>> and <<built-in-roles>>.
[[ml-get-inference-desc]]
==== {api-description-title}

You can get information for multiple trained models in a single API request by
using a comma-separated list of model IDs or a wildcard expression.
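
For example, the following requests retrieve two specific models and then every
model whose ID starts with `flight-`. The model IDs shown here are hypothetical
placeholders; substitute the IDs of models in your cluster:

[source,console]
--------------------------------------------------
GET _ml/inference/regression-model-1,classification-model-2

GET _ml/inference/flight-*
--------------------------------------------------
// TEST[skip:hypothetical model IDs]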
[[ml-get-inference-path-params]]
==== {api-path-parms-title}

`<model_id>`::
(Optional, string)
include::{docdir}/ml/ml-shared.asciidoc[tag=model-id]
[[ml-get-inference-query-params]]
==== {api-query-parms-title}

`allow_no_match`::
(Optional, boolean)
include::{docdir}/ml/ml-shared.asciidoc[tag=allow-no-match]

`decompress_definition`::
(Optional, boolean)
Specifies whether the included model definition should be returned as a JSON map
(`true`) or in a custom compressed format (`false`). Defaults to `true`.

`from`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=from]

`include_model_definition`::
(Optional, boolean)
Specifies whether the model definition is returned in the response. Defaults to
`false`. When `true`, only a single model must match the ID patterns provided.
Otherwise, a bad request is returned.

`size`::
(Optional, integer)
include::{docdir}/ml/ml-shared.asciidoc[tag=size]

`tags`::
(Optional, string)
include::{docdir}/ml/ml-shared.asciidoc[tag=tags]
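
For example, the following request pages through the model configurations ten at
a time, and the second request fetches the definition of a single model in its
compressed form. The model ID `regression-model-1` is a hypothetical placeholder:

[source,console]
--------------------------------------------------
GET _ml/inference/_all?from=0&size=10

GET _ml/inference/regression-model-1?include_model_definition=true&decompress_definition=false
--------------------------------------------------
// TEST[skip:hypothetical model ID]

Because `include_model_definition` requires the ID pattern to match exactly one
model, the second request must name a single model rather than use `_all` or a
wildcard.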
[role="child_attributes"]
[[ml-get-inference-results]]
==== {api-response-body-title}

`trained_model_configs`::
(array)
An array of trained model resources, which are sorted by the `model_id` value in
ascending order.
+
.Properties of trained model resources
[%collapsible%open]
====
`created_by`:::
(string)
Information on the creator of the trained model.

`create_time`:::
(<<time-units,time units>>)
The time when the trained model was created.

`default_field_map`:::
(object)
A string to string object that contains the default field map to use when
inferring against the model. For example, data frame analytics may train the
model on a specific multi-field `foo.keyword`. The analytics job would then
supply a default field map entry for `"foo" : "foo.keyword"`.
+
Any field map described in the inference configuration takes precedence.

`estimated_heap_memory_usage_bytes`:::
(integer)
The estimated heap usage in bytes to keep the trained model in memory.

`estimated_operations`:::
(integer)
The estimated number of operations to use the trained model.

`license_level`:::
(string)
The license level of the trained model.

`metadata`:::
(object)
An object containing metadata about the trained model. For example, models
created by {dfanalytics} contain `analysis_config` and `input` objects.

`model_id`:::
(string)
Identifier for the trained model.

`tags`:::
(string)
A comma delimited string of tags. An {infer} model can have many tags, or none.

`version`:::
(string)
The {es} version number in which the trained model was created.
====
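
An abbreviated response for a single model might look like the following. All
values here are illustrative placeholders, not output from a real cluster:

[source,js]
--------------------------------------------------
{
  "trained_model_configs": [
    {
      "model_id": "regression-model-1",
      "created_by": "_xpack",
      "version": "7.6.0",
      "create_time": 1575774400612,
      "estimated_heap_memory_usage_bytes": 1053992,
      "estimated_operations": 39629,
      "license_level": "platinum"
    }
  ]
}
--------------------------------------------------
// NOTCONSOLE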
[[ml-get-inference-response-codes]]
==== {api-response-codes-title}

`400`::
If `include_model_definition` is `true`, this code indicates that more than one
model matches the ID pattern.

`404` (Missing resources)::
If `allow_no_match` is `false`, this code indicates that there are no resources
that match the request or only partial matches for the request.
[[ml-get-inference-example]]
==== {api-examples-title}

The following example gets configuration information for all the trained models:

[source,console]
--------------------------------------------------
GET _ml/inference/
--------------------------------------------------
// TEST[skip:TBD]