LocalAI/backend
Latest commit: 36da11a0ee by Koen Farell, 2024-04-10 11:25:26 +00:00
deps: Update version of vLLM to add support for the Cohere Command_R model in vLLM inference (#1975)

* Update the vLLM version to add support for Command_R
* fix: Fix the vLLM version in the requirements file
* chore: Update transformers-rocm.yml
* chore: Update the vLLM version in transformers.yml

Signed-off-by: Koen Farell <hellios.dt@gmail.com>
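The PR above bumps the vLLM dependency so the Python backend can serve Cohere's Command_R model. As a rough illustration only (not code from this repository), a startup check like the following could confirm that the installed vLLM is new enough; the minimum version used here is a hypothetical placeholder, not the exact pin introduced by #1975:

    # Illustrative sketch only: confirm the installed vLLM meets a minimum
    # version before trying to load a Command_R model. The threshold below
    # is a hypothetical placeholder, not the version pinned by PR #1975.
    from importlib.metadata import PackageNotFoundError, version
    from packaging.version import Version

    MIN_VLLM_FOR_COMMAND_R = Version("0.4.0")  # hypothetical minimum

    try:
        installed = Version(version("vllm"))
    except PackageNotFoundError:
        raise SystemExit("vllm is not installed in this environment")

    if installed < MIN_VLLM_FOR_COMMAND_R:
        raise SystemExit(
            f"vllm {installed} predates Command_R support; "
            f"upgrade to >= {MIN_VLLM_FOR_COMMAND_R}"
        )
    print(f"vllm {installed} should be able to serve Command_R")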
cpp test/fix: OSX Test Repair (#1843) 2024-03-18 19:19:43 +01:00
go feat(stores): Vector store backend (#1795) 2024-03-22 21:14:04 +01:00
python deps: Update version of vLLM to add support for the Cohere Command_R model in vLLM inference (#1975) 2024-04-10 11:25:26 +00:00
backend_grpc.pb.go transformers: correctly load automodels (#1643) 2024-01-26 00:13:21 +01:00
backend.proto feat(stores): Vector store backend (#1795) 2024-03-22 21:14:04 +01:00