[8.x] [DOCS] Update service-openai.asciidoc (#125491)

* Update service-openai.asciidoc (#125419)

Many customers want to use our OpenAI {infer} endpoint against OpenAI-compatible APIs they have written themselves, or against Ollama or the NVIDIA Triton OpenAI API front end. I had heard that this was the intent of the OpenAI {infer} endpoint, but we do not state it directly. Can we validate that this is OK with the Search PM and include it?
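As context for the one-line doc change below: the `openai` service accepts a `url` service setting, which is what lets the endpoint target an OpenAI-compatible server instead of api.openai.com. A minimal sketch of such a request follows; the model name `llama3` and the local Ollama URL are illustrative assumptions, and the API key can be any placeholder value for backends that do not validate it:

```console
PUT _inference/completion/openai-compatible-completion
{
  "service": "openai",
  "service_settings": {
    "api_key": "placeholder",         <1>
    "model_id": "llama3",             <2>
    "url": "http://localhost:11434/v1/chat/completions"  <3>
  }
}
```
<1> A key is required by the service definition even if the backend ignores it.
<2> Assumed model name served by the compatible backend.
<3> Assumed local Ollama address; substitute your own OpenAI-compatible endpoint.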

Co-authored-by: István Zoltán Szabó <istvan.szabo@elastic.co>

* Update docs/reference/inference/service-openai.asciidoc

---------

Co-authored-by: Brad Quarry <38725582+bradquarry@users.noreply.github.com>
István Zoltán Szabó committed 7 months ago · commit 1ba6ed2a35
1 changed file with 1 addition and 1 deletion

+1 -1
docs/reference/inference/service-openai.asciidoc

@@ -7,7 +7,7 @@
 For the most up-to-date API details, refer to {api-es}/group/endpoint-inference[{infer-cap} APIs].
 --
 
-Creates an {infer} endpoint to perform an {infer} task with the `openai` service.
+Creates an {infer} endpoint to perform an {infer} task with the `openai` service or `openai` compatible APIs. 
 
 
 [discrete]