# LocalAI/backend/python/exllama

To create a separate environment for the exllama project, run:

```sh
make exllama
```
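As a sketch of what the Make target wraps (assuming the standard LocalAI Python-backend layout, where each backend directory ships `install.sh`, `run.sh`, and `test.sh` driven by a shared bash library), the manual equivalent would look roughly like this; the exact steps are defined by the Makefile and scripts in this directory, so treat the commands below as illustrative, not authoritative:

```sh
# Run from this backend directory (backend/python/exllama).
# install.sh is assumed to create the backend's isolated Python
# environment and install the pinned requirements*.txt files.
bash install.sh

# run.sh is assumed to launch the gRPC server implemented in backend.py;
# test.sh exercises it. Both are thin wrappers over the shared bash library.
bash run.sh
```

This per-directory script convention keeps each Python backend's dependencies isolated from the others, which is why the project provides a dedicated `make exllama` target rather than a single shared environment.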