LocalAI/backend/python/transformers
commit 887b3dff04
Author: Ettore Di Giacinto
Date: 2023-12-08 15:45:04 +01:00

feat: cuda transformers (#1401)

* Use cuda in transformers if available

  tensorflow probably needs a different check.

  Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>

* feat: expose CUDA at top level

  Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* tests: add to tests and create workflow for py extra backends

* doc: update note on how to use core images

---------

Signed-off-by: Erich Schubert <kno10@users.noreply.github.com>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Erich Schubert <kno10@users.noreply.github.com>
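The "use cuda in transformers if available" behavior described in the commit can be sketched as follows. This is a minimal illustration, not the exact LocalAI implementation: it selects the CUDA device when PyTorch reports a GPU and falls back to CPU otherwise (as the commit notes, TensorFlow would need a different check).

```python
import torch

# Select CUDA when a GPU is visible to PyTorch, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"loading model on {device}")
```

A transformers model would then be placed on the selected device with `model.to(device)` before inference.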
| File | Last commit | Date |
| --- | --- | --- |
| backend_pb2_grpc.py | feat(transformers): add embeddings with Automodel (#1308) | 2023-11-20 21:21:17 +01:00 |
| backend_pb2.py | Feat: new backend: transformers-musicgen (#1387) | 2023-12-08 10:01:02 +01:00 |
| Makefile | fix/docs: Python backend dependencies (#1360) | 2023-11-30 17:46:55 +01:00 |
| README.md | feat(transformers): add embeddings with Automodel (#1308) | 2023-11-20 21:21:17 +01:00 |
| run.sh | fix: rename transformers.py to avoid circular import (#1337) | 2023-11-26 08:49:43 +01:00 |
| test_transformers_server.py | feat: cuda transformers (#1401) | 2023-12-08 15:45:04 +01:00 |
| test.sh | fix: rename transformers.py to avoid circular import (#1337) | 2023-11-26 08:49:43 +01:00 |
| transformers_server.py | feat: cuda transformers (#1401) | 2023-12-08 15:45:04 +01:00 |
| transformers.yml | feat(transformers): add embeddings with Automodel (#1308) | 2023-11-20 21:21:17 +01:00 |

Creating a separate environment for the transformers project:

```sh
make transformers
```