Mirror of https://github.com/mudler/LocalAI.git
Latest commit (squashed):

* Use CUDA in transformers if available; TensorFlow probably needs a different check (see the sketch below). (Erich Schubert)
* feat: expose CUDA at the top level. (Ettore Di Giacinto)
* tests: add tests and create a workflow for the Python extra backends.
* doc: update the note on how to use core images.
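The first item above is the core of the change: a transformers model should be placed on CUDA only when a GPU is actually present. A minimal sketch of that pattern, assuming a torch-backed model; the model id and prompt here are placeholders, not taken from the backend:

```python
# Illustrative sketch (not the backend's actual code): load a transformers model
# and move it to the GPU only when CUDA is available, otherwise stay on CPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

model_id = "gpt2"  # placeholder model id, not taken from the backend
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

# Inputs must live on the same device as the model weights.
inputs = tokenizer("Hello from LocalAI", return_tensors="pt").to(device)
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The `torch.cuda.is_available()` check is specific to PyTorch, which is why the commit notes that a TensorFlow-based path would probably need a different detection mechanism.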
Directory contents:

* backend_pb2_grpc.py
* backend_pb2.py
* Makefile
* README.md
* run.sh
* test_transformers_server.py
* test.sh
* transformers_server.py
* transformers.yml
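Judging from the listing, transformers_server.py implements the backend as a gRPC server built on the generated backend_pb2 / backend_pb2_grpc stubs. A rough, hypothetical sketch of how such a server might be wired up is below; the `BackendServicer` base class, the `Health` RPC, the `Reply` message, and the listen address are all assumptions about the generated code, not confirmed from the repository:

```python
# Hypothetical sketch of a transformers gRPC backend bootstrap.
# Assumes the generated stubs expose BackendServicer / add_BackendServicer_to_server
# and a Health RPC returning a Reply message; these names are guesses.
from concurrent import futures

import grpc

import backend_pb2
import backend_pb2_grpc


class TransformersBackend(backend_pb2_grpc.BackendServicer):  # assumed base class name
    def Health(self, request, context):                       # assumed RPC name
        # Report that the backend process is alive.
        return backend_pb2.Reply(message=b"OK")                # assumed message type/field


def serve(address: str = "localhost:50051") -> None:          # address is a placeholder
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    backend_pb2_grpc.add_BackendServicer_to_server(TransformersBackend(), server)
    server.add_insecure_port(address)
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve()
```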
To create a separate environment for the transformers project, run:
make transformers
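Once the environment exists, a quick way to confirm it is usable is to import the key dependencies and check whether CUDA is visible. This is a hypothetical verification snippet, assuming torch, transformers, and grpcio are installed by the Makefile target:

```python
# Hypothetical sanity check to run inside the environment created by `make transformers`;
# it only confirms the main dependencies import and reports whether CUDA is visible.
import grpc
import torch
import transformers

print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("grpcio:", grpc.__version__)
print("CUDA available:", torch.cuda.is_available())
```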