Ludovic Leroux 939411300a
Bump vLLM version + more options when loading models in vLLM (#1782)
* Bump vLLM version to 0.3.2

* Add vLLM model loading options

* Remove transformers-exllama

* Fix install exllama
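The new loading options are surfaced through the model's YAML definition. A minimal sketch of such a config, assuming a LocalAI-style model file; the option names below (`quantization`, `gpu_memory_utilization`, `trust_remote_code`, `max_model_len`) mirror vLLM's engine arguments and are assumptions about the schema this commit introduced, not a verified reference:

```yaml
# Hypothetical LocalAI model definition using the vLLM backend.
name: my-vllm-model
backend: vllm
parameters:
  model: mistralai/Mistral-7B-Instruct-v0.2
# Loading options passed through to the vLLM engine (names assumed):
quantization: awq                # vLLM supports e.g. awq/gptq quantization
gpu_memory_utilization: 0.9      # fraction of GPU memory vLLM may claim
trust_remote_code: true          # needed for some Hugging Face models
max_model_len: 4096              # cap the context length
```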
2024-03-01 22:48:53 +01:00

Creating a separate environment for the coqui project:

make coqui

Testing the gRPC server:

make test