# LocalAI/backend/python/vllm
| File | Last commit | Date |
| --- | --- | --- |
| backend.py | feat(vllm): add support for embeddings (#3440) | 2024-09-02 |
| install.sh | fix: add missing openvino/optimum/etc libraries for Intel, fixes #2289 (#2292) | 2024-05-12 |
| Makefile | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 |
| requirements-after.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 |
| requirements-cpu.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 |
| requirements-cublas11-after.txt | feat(venv): shared env (#3195) | 2024-08-07 |
| requirements-cublas11.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 |
| requirements-cublas12-after.txt | feat(venv): shared env (#3195) | 2024-08-07 |
| requirements-cublas12.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 |
| requirements-hipblas.txt | fix(python): move vllm to after deps, drop diffusers main deps | 2024-08-07 |
| requirements-install.txt | feat: migrate python backends from conda to uv (#2215) | 2024-05-10 |
| requirements-intel.txt | chore(deps): Bump setuptools from 70.3.0 to 75.1.0 in /backend/python/vllm (#3580) | 2024-09-17 |
| requirements.txt | chore(deps): bump grpcio to 1.66.2 (#3690) | 2024-09-30 |
| run.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 |
| test.py | feat(vllm): add support for embeddings (#3440) | 2024-09-02 |
| test.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 |
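The `Makefile`, `install.sh`, `run.sh`, and `test.sh` wire this backend into LocalAI's shared bash library for installing, running, and testing Python backends (#2286). A rough sketch of how the pieces are typically driven from this directory; the `run` and `test` target names are assumptions inferred from the scripts listed above, not confirmed by this listing:

```bash
# Build the backend's own virtualenv and install the pinned
# requirements (this target is documented below).
make vllm

# Assumed convenience targets wrapping run.sh and test.sh; the scripts
# exist in this directory, but these target names are an assumption.
make run   # start the backend gRPC server
make test  # run test.py against the backend
```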

## Creating a separate environment for the vllm project

```bash
make vllm
```
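Once the environment is built, LocalAI can route a model through this backend via a model definition file. A minimal sketch, assuming LocalAI's standard model YAML layout; the file name and the model id below are hypothetical placeholders:

```bash
# Hypothetical model definition: "backend: vllm" tells LocalAI to serve
# this model through the vllm gRPC backend built above. The model id is
# an example placeholder, not a recommendation.
cat > models/vllm-example.yaml <<'EOF'
name: vllm-example
backend: vllm
parameters:
  model: facebook/opt-125m
EOF
```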