Mirror of https://github.com/mudler/LocalAI.git, synced 2025-02-22 10:00:47 +00:00
LocalAI / backend / python / exllama
Latest commit: 949da7792d by Ettore Di Giacinto, 2024-01-06 13:32:28 +01:00
deps(conda): use transformers-env with vllm,exllama(2) (#1554)
* deps(conda): use transformers with vllm
* join vllm, exllama, exllama2, split petals
| File | Last commit | Date |
|---|---|---|
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| backend_pb2.py | feat(diffusers): update, add autopipeline, controlnet (#1432) | 2023-12-13 19:20:22 +01:00 |
| exllama.py | exllama(v2): fix exllamav1, add exllamav2 (#1384) | 2023-12-05 08:15:37 +01:00 |
| exllama.yml | exllama(v2): fix exllamav1, add exllamav2 (#1384) | 2023-12-05 08:15:37 +01:00 |
| install.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| Makefile | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
README.md
Creating a separate environment for the exllama project
make exllama
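
As a rough usage sketch (not part of the upstream README): assuming a working conda installation, and assuming the `exllama` Makefile target in this directory drives install.sh and the exllama.yml environment file listed above, the environment would be created from a LocalAI checkout roughly like this:

```sh
# Sketch only: the clone URL and directory path come from this page; conda
# being the environment manager is an assumption based on exllama.yml and
# install.sh in the file listing.
git clone https://github.com/mudler/LocalAI.git
cd LocalAI/backend/python/exllama
make exllama   # creates the dedicated environment for the exllama backend
```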