update readme with PyTorch inference engine and llama.cpp link to issue

Alex Cheema, 7 months ago
commit 0120891c35
1 changed file with 2 additions and 2 deletions

README.md (+2 −2)

@@ -224,11 +224,11 @@ exo supports the following inference engines:
 
 - ✅ [MLX](exo/inference/mlx/sharded_inference_engine.py)
 - ✅ [tinygrad](exo/inference/tinygrad/inference.py)
-- 🚧 [llama.cpp](TODO)
+- 🚧 [PyTorch](https://github.com/exo-explore/exo/pull/139)
+- 🚧 [llama.cpp](https://github.com/exo-explore/exo/issues/167)
 
 ## Networking Modules
 
 - ✅ [GRPC](exo/networking/grpc)
 - 🚧 [Radio](TODO)
 - 🚧 [Bluetooth](TODO)
-