ExternalVendorCode / LocalAI
Mirror of https://github.com/mudler/LocalAI.git (synced 2025-02-08 03:50:15 +00:00)
LocalAI / backend / python / vllm

History
Latest commit: 949da7792d by Ettore Di Giacinto, 2024-01-06 13:32:28 +01:00
deps(conda): use transformers-env with vllm,exllama(2) (#1554)
* deps(conda): use transformers with vllm
* join vllm, exllama, exllama2, split petals
| File | Last commit | Date |
| --- | --- | --- |
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| backend_pb2.py | feat(diffusers): update, add autopipeline, controlnet (#1432) | 2023-12-13 19:20:22 +01:00 |
| backend_vllm.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| Makefile | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| test_backend_vllm.py | feat(conda): share envs with transformer-based backends (#1465) | 2023-12-21 08:35:15 +01:00 |
| test.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
README.md

# Creating a separate environment for the vllm project

```
make vllm
```
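The files listed above suggest the usual workflow for this backend: build the dedicated conda environment with the documented `make vllm` target, then launch the backend via `run.sh`. A minimal sketch of that sequence; note that invoking `run.sh` with no arguments is an assumption here, since its interface is not shown on this page:

```shell
# Build the dedicated conda environment for the vllm backend
# (the `vllm` target is defined in this directory's Makefile).
make vllm

# Start the backend. run.sh exists in this directory, but whether
# it requires arguments is an assumption, not documented here.
bash run.sh
```

The matching `test.sh` and `test_backend_vllm.py` in the listing indicate the backend also ships its own test entry point.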