
clearer documentation on accessing web UI and chatgpt-api

Alex Cheema 9 months ago
parent
commit
5ac6b6a717
1 file changed with 3 additions and 1 deletion

+ 3 - 1
README.md

@@ -109,7 +109,9 @@ That's it! No configuration required - exo will automatically discover the other
 
 The native way to access models running on exo is using the exo library with peer handles. See how in [this example for Llama 3](examples/llama3_distributed.py).
 
-exo also starts a ChatGPT-compatible API endpoint on http://localhost:8000. Note: this is currently only supported by tail nodes (i.e. nodes selected to be at the end of the ring topology). Example request:
+exo starts a ChatGPT-like WebUI (powered by [tinygrad tinychat](https://github.com/tinygrad/tinygrad/tree/master/examples/tinychat)) on http://localhost:8000
+
+For developers, exo also starts a ChatGPT-compatible API endpoint on http://localhost:8000/v1/chat/completions. Example with curl:
 
 ```sh
 curl http://localhost:8000/v1/chat/completions \