LocalAI/backend/python/vllm
Latest commit a1bc2e9771 by dependabot[bot]:
chore(deps): Bump grpcio from 1.65.0 to 1.65.1 in /backend/python/vllm (#2964)
Bumps [grpcio](https://github.com/grpc/grpc) from 1.65.0 to 1.65.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.65.0...v1.65.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 00:08:22 +00:00
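
The change itself is a one-line version bump in this backend's requirements file. A minimal sketch of the affected pin in `backend/python/vllm/requirements.txt` after the update (the file's other entries are omitted here):

```
# Excerpt only; the remaining pins in requirements.txt are not shown.
grpcio==1.65.1
```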
| File | Last commit | Date |
|---|---|---|
| backend.py | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00 |
| install.sh | fix: add missing openvino/optimum/etc libraries for Intel, fixes #2289 (#2292) | 2024-05-12 09:01:45 +02:00 |
| Makefile | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| requirements-cublas.txt | feat: migrate python backends from conda to uv (#2215) | 2024-05-10 15:08:08 +02:00 |
| requirements-hipblas.txt | fix: install pytorch from proper index for hipblas builds (#2413) | 2024-05-26 18:05:52 +00:00 |
| requirements-install.txt | feat: migrate python backends from conda to uv (#2215) | 2024-05-10 15:08:08 +02:00 |
| requirements-intel.txt | chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in /backend/python/vllm (#2820) | 2024-07-13 06:45:29 +00:00 |
| requirements.txt | chore(deps): Bump grpcio from 1.65.0 to 1.65.1 in /backend/python/vllm (#2964) | 2024-07-23 00:08:22 +00:00 |
| run.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00 |
| test.py | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00 |
| test.sh | feat: create bash library to handle install/run/test of python backends (#2286) | 2024-05-11 18:32:46 +02:00 |

# Creating a separate environment for the vllm project

```
make vllm
```
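
Beyond creating the environment, the helper scripts listed above follow the shared bash-library pattern for install/run/test. A hedged sketch of the typical workflow from this directory (exact script behavior and options are assumptions; check the scripts themselves):

```sh
# Run from backend/python/vllm. These are the argument-less entry points;
# the exact wiring between the Makefile and the scripts may differ.
make vllm        # create the separate environment (presumably via install.sh and the uv-managed requirements files)
bash run.sh      # start the gRPC backend implemented in backend.py
bash test.sh     # execute test.py against the running backend
```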