
[DOCS] Changes model_id path param to inference_id (#106719)

István Zoltán Szabó, 1 year ago
parent
commit
a3d96b9333

+ 4 - 4
docs/reference/inference/delete-inference.asciidoc

@@ -16,9 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[delete-inference-api-request]]
 ==== {api-request-title}
 
-`DELETE /_inference/<model_id>`
+`DELETE /_inference/<inference_id>`
 
-`DELETE /_inference/<task_type>/<model_id>`
+`DELETE /_inference/<task_type>/<inference_id>`
 
 [discrete]
 [[delete-inference-api-prereqs]]
@@ -32,9 +32,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[delete-inference-api-path-params]]
 ==== {api-path-parms-title}
 
-<model_id>::
+<inference_id>::
 (Required, string)
-The unique identifier of the {infer} model to delete.
+The unique identifier of the {infer} endpoint to delete.
 
 <task_type>::
 (Optional, string)

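To illustrate the renamed path parameter, a delete request now addresses the endpoint by its `inference_id`; a minimal sketch, reusing the `my-elser-model` endpoint name from the examples later in this change:

[source,console]
------------------------------------------------------------
DELETE /_inference/sparse_embedding/my-elser-model
------------------------------------------------------------
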
+ 5 - 5
docs/reference/inference/get-inference.asciidoc

@@ -18,11 +18,11 @@ own model, use the <<ml-df-trained-models-apis>>.
 
 `GET /_inference/_all`
 
-`GET /_inference/<model_id>`
+`GET /_inference/<inference_id>`
 
 `GET /_inference/<task_type>/_all`
 
-`GET /_inference/<task_type>/<model_id>`
+`GET /_inference/<task_type>/<inference_id>`
 
 [discrete]
 [[get-inference-api-prereqs]]
@@ -47,9 +47,9 @@ and a wildcard expression,
 [[get-inference-api-path-params]]
 ==== {api-path-parms-title}
 
-`<model_id>`::
+`<inference_id>`::
 (Optional, string)
-The unique identifier of the {infer} model.
+The unique identifier of the {infer} endpoint.
 
 
 `<task_type>`::
@@ -77,7 +77,7 @@ The API returns the following response:
 [source,console-result]
 ------------------------------------------------------------
 {
-  "model_id": "my-elser-model",
+  "inference_id": "my-elser-model",
   "task_type": "sparse_embedding",
   "service": "elser",
   "service_settings": {

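With the rename applied, a get request also addresses the endpoint by its `inference_id`, and the response opens with the renamed field (`"inference_id": "my-elser-model"` in the example response above); a minimal sketch:

[source,console]
------------------------------------------------------------
GET /_inference/sparse_embedding/my-elser-model
------------------------------------------------------------
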
+ 6 - 6
docs/reference/inference/post-inference.asciidoc

@@ -16,9 +16,9 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[post-inference-api-request]]
 ==== {api-request-title}
 
-`POST /_inference/<model_id>`
+`POST /_inference/<inference_id>`
 
-`POST /_inference/<task_type>/<model_id>`
+`POST /_inference/<task_type>/<inference_id>`
 
 
 [discrete]
@@ -32,8 +32,8 @@ own model, use the <<ml-df-trained-models-apis>>.
 [[post-inference-api-desc]]
 ==== {api-description-title}
 
-The perform {infer} API enables you to use {infer} models to perform specific
-tasks on data that you provide as an input. The API returns a response with the
+The perform {infer} API enables you to use {ml} models to perform specific tasks
+on data that you provide as an input. The API returns a response with the
 results of the tasks. The {infer} model you use can perform one specific task
 that has been defined when the model was created with the <<put-inference-api>>.
 
@@ -42,9 +42,9 @@ that has been defined when the model was created with the <<put-inference-api>>.
 [[post-inference-api-path-params]]
 ==== {api-path-parms-title}
 
-`<model_id>`::
+`<inference_id>`::
 (Required, string)
-The unique identifier of the {infer} model.
+The unique identifier of the {infer} endpoint.
 
 
 `<task_type>`::

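A minimal perform-inference request under the new naming, assuming a `sparse_embedding` endpoint named `my-elser-model` already exists (the input string is an arbitrary example):

[source,console]
------------------------------------------------------------
POST /_inference/sparse_embedding/my-elser-model
{
  "input": "The quick brown fox jumps over the lazy dog."
}
------------------------------------------------------------
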
+ 10 - 10
docs/reference/inference/put-inference.asciidoc

@@ -33,7 +33,7 @@ or if you want to use non-NLP models, use the <<ml-df-trained-models-apis>>.
 [[put-inference-api-desc]]
 ==== {api-description-title}
 
-The create {infer} API enables you to create and configure an {infer} model to
+The create {infer} API enables you to create and configure a {ml} model to
 perform a specific {infer} task.
 
 The following services are available through the {infer} API:
@@ -50,9 +50,9 @@ The following services are available through the {infer} API:
 ==== {api-path-parms-title}
 
 
-`<model_id>`::
+`<inference_id>`::
 (Required, string)
-The unique identifier of the model.
+The unique identifier of the {infer} endpoint.
 
 `<task_type>`::
 (Required, string)
@@ -246,7 +246,7 @@ This section contains example API calls for every service type.
 [[inference-example-cohere]]
 ===== Cohere service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `cohere_embeddings` to perform a `text_embedding` task type.
 
 [source,console]
@@ -268,7 +268,7 @@ PUT _inference/text_embedding/cohere-embeddings
 [[inference-example-e5]]
 ===== E5 via the elasticsearch service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-e5-model` to perform a `text_embedding` task type.
 
 [source,console]
@@ -293,7 +293,7 @@ further details, refer to the {ml-docs}/ml-nlp-e5.html[E5 model documentation].
 [[inference-example-elser]]
 ===== ELSER service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-elser-model` to perform a `sparse_embedding` task type.
 
 [source,console]
@@ -315,7 +315,7 @@ Example response:
 [source,console-result]
 ------------------------------------------------------------
 {
-  "model_id": "my-elser-model",
+  "inference_id": "my-elser-model",
   "task_type": "sparse_embedding",
   "service": "elser",
   "service_settings": {
@@ -332,7 +332,7 @@ Example response:
 [[inference-example-hugging-face]]
 ===== Hugging Face service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `hugging-face_embeddings` to perform a `text_embedding` task type.
 
 [source,console]
@@ -362,7 +362,7 @@ after the endpoint initialization has been finished.
 [[inference-example-eland]]
 ===== Models uploaded by Eland via the elasticsearch service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `my-msmarco-minilm-model` to perform a `text_embedding` task type.
 
 [source,console]
@@ -387,7 +387,7 @@ been
 [[inference-example-openai]]
 ===== OpenAI service
 
-The following example shows how to create an {infer} model called
+The following example shows how to create an {infer} endpoint called
 `openai_embeddings` to perform a `text_embedding` task type.
 
 [source,console]

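Tying the rename together, a create request supplies the `inference_id` as the last path segment; a sketch for the ELSER service matching the example response above (the `num_allocations` and `num_threads` values are illustrative, not recommendations):

[source,console]
------------------------------------------------------------
PUT _inference/sparse_embedding/my-elser-model
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}
------------------------------------------------------------
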
+ 2 - 2
docs/reference/search/search-your-data/semantic-search-inference.asciidoc

@@ -23,9 +23,9 @@ include::{es-repo-dir}/tab-widgets/inference-api/infer-api-requirements-widget.a
 
 [discrete]
 [[infer-text-embedding-task]]
-==== Create the inference task
+==== Create an inference endpoint
 
-Create the {infer} task by using the <<put-inference-api>>:
+Create an {infer} endpoint by using the <<put-inference-api>>:
 
 include::{es-repo-dir}/tab-widgets/inference-api/infer-api-task-widget.asciidoc[]
 

+ 4 - 4
docs/reference/tab-widgets/inference-api/infer-api-ingest-pipeline.asciidoc

@@ -28,8 +28,8 @@ PUT _ingest/pipeline/cohere_embeddings
   ]
 }
 --------------------------------------------------
-<1> The name of the inference configuration you created by using the
-<<put-inference-api>>.
+<1> The name of the inference endpoint you created by using the
+<<put-inference-api>>; it is referred to as the `inference_id` in that step.
 <2> Configuration object that defines the `input_field` for the {infer} process
 and the `output_field` that will contain the {infer} results.
 
@@ -55,8 +55,8 @@ PUT _ingest/pipeline/openai_embeddings
   ]
 }
 --------------------------------------------------
-<1> The name of the inference configuration you created by using the
-<<put-inference-api>>.
+<1> The name of the inference endpoint you created by using the
+<<put-inference-api>>; it is referred to as the `inference_id` in that step.
 <2> Configuration object that defines the `input_field` for the {infer} process
 and the `output_field` that will contain the {infer} results.
 

+ 2 - 2
docs/reference/tab-widgets/inference-api/infer-api-search.asciidoc

@@ -8,7 +8,7 @@ GET cohere-embeddings/_search
     "field": "content_embedding",
     "query_vector_builder": {
       "text_embedding": {
-        "model_id": "cohere_embeddings",
+        "inference_id": "cohere_embeddings",
         "model_text": "Muscles in human body"
       }
     },
@@ -83,7 +83,7 @@ GET openai-embeddings/_search
     "field": "content_embedding",
     "query_vector_builder": {
       "text_embedding": {
-        "model_id": "openai_embeddings",
+        "inference_id": "openai_embeddings",
         "model_text": "Calculate fuel cost"
       }
     },

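For context, the full `knn` search these snippets come from looks roughly like the following; the `k` and `num_candidates` values are illustrative:

[source,console]
------------------------------------------------------------
GET cohere-embeddings/_search
{
  "knn": {
    "field": "content_embedding",
    "query_vector_builder": {
      "text_embedding": {
        "inference_id": "cohere_embeddings",
        "model_text": "Muscles in human body"
      }
    },
    "k": 10,
    "num_candidates": 100
  }
}
------------------------------------------------------------
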
+ 4 - 2
docs/reference/tab-widgets/inference-api/infer-api-task.asciidoc

@@ -13,7 +13,8 @@ PUT _inference/text_embedding/cohere_embeddings <1>
 }
 ------------------------------------------------------------
 // TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path, and the `inference_id`, the
+unique identifier of the {infer} endpoint, is `cohere_embeddings`.
 <2> The API key of your Cohere account. You can find your API keys in your
 Cohere dashboard under the
 https://dashboard.cohere.com/api-keys[API keys section]. You need to provide
@@ -46,7 +47,8 @@ PUT _inference/text_embedding/openai_embeddings <1>
 }
 ------------------------------------------------------------
 // TEST[skip:TBD]
-<1> The task type is `text_embedding` in the path.
+<1> The task type is `text_embedding` in the path, and the `inference_id`, the
+unique identifier of the {infer} endpoint, is `openai_embeddings`.
 <2> The API key of your OpenAI account. You can find your OpenAI API keys in
 your OpenAI account under the
 https://platform.openai.com/api-keys[API keys section]. You need to provide