
Update main.py: Default timeout 90->900

On slow setups (~1 token per second), an average response of ~600-1000 tokens takes far longer than 90 seconds, so most requests hit the timeout and surface as a network error even though the network is fine. Raising the default reduces these spurious exceptions. Anyone tuning for better performance, and who understands the impact, can still lower the value; the new default simply works for most cases.
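
A rough sanity check of those figures (a sketch: the ~1 token/s rate and the 600-1000 token range are the numbers quoted above, not measurements):

    # Back-of-the-envelope check using the figures from the commit message:
    # at ~1 token/s, a 600-1000 token response needs 600-1000 s of wall time.
    TOKENS_PER_SECOND = 1.0          # assumed slow-setup generation rate
    OLD_TIMEOUT_S, NEW_TIMEOUT_S = 90, 900

    for response_tokens in (600, 1000):
        elapsed_s = response_tokens / TOKENS_PER_SECOND
        print(
            f"{response_tokens} tokens -> ~{elapsed_s:.0f}s: "
            f"old {OLD_TIMEOUT_S}s timeout {'holds' if elapsed_s <= OLD_TIMEOUT_S else 'fires'}, "
            f"new {NEW_TIMEOUT_S}s timeout {'holds' if elapsed_s <= NEW_TIMEOUT_S else 'fires'}"
        )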
FFAMax, 10 months ago
Commit dbf40d7837
1 file changed, 1 insertion(+), 1 deletion(-)

exo/main.py (+1, -1)

@@ -43,7 +43,7 @@ parser.add_argument("--discovery-timeout", type=int, default=30, help="Discovery
 parser.add_argument("--discovery-config-path", type=str, default=None, help="Path to discovery config json file")
 parser.add_argument("--wait-for-peers", type=int, default=0, help="Number of peers to wait to connect to before starting")
 parser.add_argument("--chatgpt-api-port", type=int, default=8000, help="ChatGPT API port")
-parser.add_argument("--chatgpt-api-response-timeout", type=int, default=90, help="ChatGPT API response timeout in seconds")
+parser.add_argument("--chatgpt-api-response-timeout", type=int, default=900, help="ChatGPT API response timeout in seconds")
 parser.add_argument("--max-generate-tokens", type=int, default=10000, help="Max tokens to generate in each request")
 parser.add_argument("--inference-engine", type=str, default=None, help="Inference engine to use (mlx, tinygrad, or dummy)")
 parser.add_argument("--disable-tui", action=argparse.BooleanOptionalAction, help="Disable TUI")