ExternalVendorCode / LocalAI
Mirror of https://github.com/mudler/LocalAI.git, synced 2025-02-20 17:32:47 +00:00
LocalAI / backend / python / petals
Latest commit: 949da7792d deps(conda): use transformers-env with vllm,exllama(2) (#1554)
Ettore Di Giacinto, 2024-01-06 13:32:28 +01:00
* deps(conda): use transformers with vllm
* join vllm, exllama, exllama2, split petals
File                 Last commit                                                      Date
backend_pb2_grpc.py  feat(petals): add backend (#1350)                                2023-11-28 09:01:46 +01:00
backend_pb2.py       feat(diffusers): update, add autopipeline, controlnet (#1432)    2023-12-13 19:20:22 +01:00
backend_petals.py    feat(petals): add backend (#1350)                                2023-11-28 09:01:46 +01:00
Makefile             deps(conda): use transformers-env with vllm,exllama(2) (#1554)   2024-01-06 13:32:28 +01:00
petals.yml           fix(piper): pin petals, phonemize and espeak (#1393)             2023-12-07 22:58:41 +01:00
run.sh               deps(conda): use transformers-env with vllm,exllama(2) (#1554)   2024-01-06 13:32:28 +01:00
test_petals.py       feat(conda): share envs with transformer-based backends (#1465)  2023-12-21 08:35:15 +01:00
test.sh              deps(conda): use transformers-env with vllm,exllama(2) (#1554)   2024-01-06 13:32:28 +01:00