LocalAI/backend/python/autogptq
Latest commit: chore(cli): be consistent between workers and expose ExtraLLamaCPPArgs to both (#3428)
Author: Ettore Di Giacinto, commit 11d960b2a6, 2024-08-30 00:10:17 +02:00
Fixes: https://github.com/mudler/LocalAI/issues/3427
Also bumps grpcio.

File | Last commit | Date
backend.py | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00
install.sh | fix: add missing openvino/optimum/etc libraries for Intel, fixes #2289 (#2292) | 2024-05-12 09:01:45 +02:00
Makefile | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00
README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00
requirements-cublas11.txt | fix: ensure correct version of torch is always installed based on BUILD_TYPE (#2890) | 2024-08-05 16:38:33 +00:00
requirements-cublas12.txt | fix: ensure correct version of torch is always installed based on BUILD_TYPE (#2890) | 2024-08-05 16:38:33 +00:00
requirements-hipblas.txt | fix: install pytorch from proper index for hipblas builds (#2413) | 2024-05-26 18:05:52 +00:00
requirements-intel.txt | chore(deps): Bump setuptools from 70.3.0 to 72.1.0 in /backend/python/autogptq (#3048) | 2024-07-29 21:45:52 +00:00
requirements.txt | chore(cli): be consistent between workers and expose ExtraLLamaCPPArgs to both (#3428) | 2024-08-30 00:10:17 +02:00
run.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00
test.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00
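
The requirements-cublas11/cublas12/hipblas/intel files are alternatives layered on top of the base requirements.txt, picked according to how LocalAI is built (see the BUILD_TYPE commits above). A minimal sketch of that selection, assuming BUILD_TYPE values that simply mirror the file suffixes; the real logic lives in the shared bash library used by install.sh and may differ:

# Assumption: BUILD_TYPE names here mirror the requirements-*.txt suffixes;
# the actual install script may use different values or extra checks.
case "${BUILD_TYPE:-}" in
  cublas11) EXTRA_REQS="requirements-cublas11.txt" ;;
  cublas12) EXTRA_REQS="requirements-cublas12.txt" ;;
  hipblas)  EXTRA_REQS="requirements-hipblas.txt" ;;
  intel)    EXTRA_REQS="requirements-intel.txt" ;;
  *)        EXTRA_REQS="" ;;
esac
pip install -r requirements.txt ${EXTRA_REQS:+-r "$EXTRA_REQS"}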

To create a separate Python environment for the autogptq backend, run:

make autogptq
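
The remaining helper scripts listed above are the usual entry points once the environment exists. A minimal sketch, assuming run.sh launches backend.py as the gRPC service LocalAI connects to and test.sh exercises it; exact flags and environment variables are assumptions and may differ:

bash ./test.sh   # run the backend's test suite inside the environment created above
bash ./run.sh    # start backend.py, the gRPC backend that LocalAI talks to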