feat: migrate python backends from conda to uv (#2215)

* feat: migrate diffusers backend from conda to uv

  - replace conda with UV for diffusers install (prototype for all
    extras backends)
  - add ability to build docker with one/some/all extras backends
    instead of all or nothing

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
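The "one/some/all extras backends" selection described above comes down to a bash substring test against an EXTRA_BACKENDS build argument: a backend is built when EXTRA_BACKENDS names it, or when EXTRA_BACKENDS is empty (the build-everything default). A minimal standalone sketch of that test (the `should_build` helper name is illustrative, not part of the change):

```shell
#!/bin/bash
# Sketch of the backend-selection logic: build a backend when it is named in
# EXTRA_BACKENDS, or when EXTRA_BACKENDS is unset/empty (select all).
should_build() {
    local backend="$1"
    # Quoted =~ right-hand side means literal substring match in bash.
    [[ "${EXTRA_BACKENDS}" =~ "${backend}" || -z "${EXTRA_BACKENDS}" ]]
}

EXTRA_BACKENDS="coqui diffusers"
should_build coqui && echo "coqui: build"
should_build vllm  || echo "vllm: skip"

EXTRA_BACKENDS=""
should_build vllm && echo "vllm: build (empty selects all)"
```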

* feat: migrate autogptq bark coqui from conda to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: convert exllama over to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate exllama2 to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate mamba to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate parler to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate petals to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: fix tests

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate rerankers to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate sentencetransformers to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: install uv for tests-linux

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: make sure file exists before installing on intel images

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate transformers backend to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate transformers-musicgen to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate vall-e-x to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: migrate vllm to uv

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: add uv install to the rest of test-extra.yml

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: adjust file perms on all install/run/test scripts

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: add missing accelerate dependencies

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: add some more missing dependencies to python backends

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: parler tests venv py dir fix

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: correct filename for transformers-musicgen tests

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: adjust the pwd for valle tests

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: cleanup and optimization work for uv migration

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: add setuptools to requirements-install for mamba

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: more size optimization work

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* feat: make installs and tests more consistent, cleanup some deps

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: cleanup

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: mamba backend is cublas only

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

* fix: uncomment lines in makefile

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>

---------

Signed-off-by: Chris Jowett <421501+cryptk@users.noreply.github.com>
cryptk 2024-05-10 08:08:08 -05:00 committed by GitHub
parent e6768097f4
commit 28a421cb1d
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
101 changed files with 988 additions and 740 deletions


@@ -6,6 +6,11 @@ examples/chatbot-ui/models
 examples/rwkv/models
 examples/**/models
 Dockerfile*
+__pycache__
 # SonarQube
 .scannerwork
+
+# backend virtual environments
+**/venv
+backend/python/**/source


@@ -25,22 +25,14 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test transformers
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/transformers
           make --jobs=5 --output-sync=target -C backend/python/transformers test
@@ -55,22 +47,14 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test sentencetransformers
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/sentencetransformers
           make --jobs=5 --output-sync=target -C backend/python/sentencetransformers test
@@ -86,22 +70,14 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test rerankers
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/rerankers
           make --jobs=5 --output-sync=target -C backend/python/rerankers test
@@ -115,23 +91,14 @@ jobs:
       - name: Dependencies
         run: |
           sudo apt-get update
-          sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          sudo apt-get install -y build-essential ffmpeg
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test diffusers
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/diffusers
           make --jobs=5 --output-sync=target -C backend/python/diffusers test
@@ -146,22 +113,14 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test parler-tts
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/parler-tts
           make --jobs=5 --output-sync=target -C backend/python/parler-tts test
@@ -176,22 +135,14 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test transformers-musicgen
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/transformers-musicgen
           make --jobs=5 --output-sync=target -C backend/python/transformers-musicgen test
@@ -208,22 +159,14 @@ jobs:
 #        run: |
 #          sudo apt-get update
 #          sudo apt-get install build-essential ffmpeg
-#          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-#          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-#          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-#          sudo apt-get update && \
-#          sudo apt-get install -y conda
+#          # Install UV
+#          curl -LsSf https://astral.sh/uv/install.sh | sh
 #          sudo apt-get install -y ca-certificates cmake curl patch python3-pip
 #          sudo apt-get install -y libopencv-dev
 #          pip install --user grpcio-tools==1.63.0
-#          sudo rm -rfv /usr/bin/conda || true
 #      - name: Test petals
 #        run: |
-#          export PATH=$PATH:/opt/conda/bin
 #          make --jobs=5 --output-sync=target -C backend/python/petals
 #          make --jobs=5 --output-sync=target -C backend/python/petals test
@@ -280,22 +223,14 @@ jobs:
 #        run: |
 #          sudo apt-get update
 #          sudo apt-get install build-essential ffmpeg
-#          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-#          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-#          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-#          sudo apt-get update && \
-#          sudo apt-get install -y conda
+#          # Install UV
+#          curl -LsSf https://astral.sh/uv/install.sh | sh
 #          sudo apt-get install -y ca-certificates cmake curl patch python3-pip
 #          sudo apt-get install -y libopencv-dev
 #          pip install --user grpcio-tools==1.63.0
-#          sudo rm -rfv /usr/bin/conda || true
 #      - name: Test bark
 #        run: |
-#          export PATH=$PATH:/opt/conda/bin
 #          make --jobs=5 --output-sync=target -C backend/python/bark
 #          make --jobs=5 --output-sync=target -C backend/python/bark test
@@ -313,20 +248,13 @@ jobs:
 #        run: |
 #          sudo apt-get update
 #          sudo apt-get install build-essential ffmpeg
-#          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-#          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-#          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-#          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-#          sudo apt-get update && \
-#          sudo apt-get install -y conda
+#          # Install UV
+#          curl -LsSf https://astral.sh/uv/install.sh | sh
 #          sudo apt-get install -y ca-certificates cmake curl patch python3-pip
 #          sudo apt-get install -y libopencv-dev
 #          pip install --user grpcio-tools==1.63.0
-#          sudo rm -rfv /usr/bin/conda || true
 #      - name: Test vllm
 #        run: |
-#          export PATH=$PATH:/opt/conda/bin
 #          make --jobs=5 --output-sync=target -C backend/python/vllm
 #          make --jobs=5 --output-sync=target -C backend/python/vllm test
   tests-vallex:
@@ -340,20 +268,13 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake curl patch python3-pip
           sudo apt-get install -y libopencv-dev
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test vall-e-x
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/vall-e-x
           make --jobs=5 --output-sync=target -C backend/python/vall-e-x test
@@ -368,19 +289,11 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install build-essential ffmpeg
-          curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-          sudo install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-          gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list' && \
-          sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
-          sudo apt-get update && \
-          sudo apt-get install -y conda
           sudo apt-get install -y ca-certificates cmake curl patch espeak espeak-ng python3-pip
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           pip install --user grpcio-tools==1.63.0
-          sudo rm -rfv /usr/bin/conda || true
       - name: Test coqui
         run: |
-          export PATH=$PATH:/opt/conda/bin
           make --jobs=5 --output-sync=target -C backend/python/coqui
           make --jobs=5 --output-sync=target -C backend/python/coqui test


@@ -78,6 +78,8 @@ jobs:
           sudo /bin/bash -c 'echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list' && \
           sudo apt-get update && \
           sudo apt-get install -y conda
+          # Install UV
+          curl -LsSf https://astral.sh/uv/install.sh | sh
           sudo apt-get install -y ca-certificates cmake patch python3-pip unzip
           sudo apt-get install -y libopencv-dev

.gitignore

@@ -47,3 +47,6 @@ prepare
 # SonarQube
 .scannerwork
+
+# backend virtual environments
+**/venv


@@ -76,26 +76,16 @@ RUN test -n "$TARGETARCH" \
 # The requirements-extras target is for any builds with IMAGE_TYPE=extras. It should not be placed in this target unless every IMAGE_TYPE=extras build will use it
 FROM requirements-core AS requirements-extras
-RUN apt-get update && \
-    apt-get install -y --no-install-recommends gpg && \
-    curl https://repo.anaconda.com/pkgs/misc/gpgkeys/anaconda.asc | gpg --dearmor > conda.gpg && \
-    install -o root -g root -m 644 conda.gpg /usr/share/keyrings/conda-archive-keyring.gpg && \
-    gpg --keyring /usr/share/keyrings/conda-archive-keyring.gpg --no-default-keyring --fingerprint 34161F5BF5EB1D4BFBBB8F0A8AEB4F8B29D82806 && \
-    echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" > /etc/apt/sources.list.d/conda.list && \
-    echo "deb [arch=amd64 signed-by=/usr/share/keyrings/conda-archive-keyring.gpg] https://repo.anaconda.com/pkgs/misc/debrepo/conda stable main" | tee -a /etc/apt/sources.list.d/conda.list && \
-    apt-get update && \
-    apt-get install -y --no-install-recommends \
-        conda && \
-    apt-get clean && \
-    rm -rf /var/lib/apt/lists/*
+RUN curl -LsSf https://astral.sh/uv/install.sh | sh
 ENV PATH="/root/.cargo/bin:${PATH}"
 RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
 RUN apt-get update && \
     apt-get install -y --no-install-recommends \
     espeak-ng \
-    espeak && \
+    espeak \
+    python3-dev \
+    python3-venv && \
     apt-get clean && \
     rm -rf /var/lib/apt/lists/*
@@ -246,6 +236,7 @@ ARG FFMPEG
 ARG BUILD_TYPE
 ARG TARGETARCH
 ARG IMAGE_TYPE=extras
+ARG EXTRA_BACKENDS
 ARG MAKEFLAGS
 
 ENV BUILD_TYPE=${BUILD_TYPE}
@@ -257,7 +248,6 @@ ARG CUDA_MAJOR_VERSION=11
 ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
 ENV NVIDIA_REQUIRE_CUDA="cuda>=${CUDA_MAJOR_VERSION}.0"
 ENV NVIDIA_VISIBLE_DEVICES=all
-ENV PIP_CACHE_PURGE=true
 
 # Add FFmpeg
 RUN if [ "${FFMPEG}" = "true" ]; then \
@@ -290,51 +280,58 @@ COPY --from=builder /build/sources/go-piper/piper-phonemize/pi/lib/* /usr/lib/
 # do not let stablediffusion rebuild (requires an older version of absl)
 COPY --from=builder /build/backend-assets/grpc/stablediffusion ./backend-assets/grpc/stablediffusion
-## Duplicated from Makefile to avoid having a big layer that's hard to push
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/autogptq \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/bark \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/diffusers \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/vllm \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/mamba \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/sentencetransformers \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/rerankers \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/transformers \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/vall-e-x \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/exllama \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/exllama2 \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/petals \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/transformers-musicgen \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/parler-tts \
-  ; fi
-RUN if [ "${IMAGE_TYPE}" = "extras" ]; then \
-    make -C backend/python/coqui \
-  ; fi
+# Change the shell to bash so we can use [[ tests below
+SHELL ["/bin/bash", "-c"]
+# We try to strike a balance between individual layer size (as that affects total push time) and total image size
+# Splitting the backends into more groups with fewer items results in a larger image, but a smaller size for the largest layer
+# Splitting the backends into fewer groups with more items results in a smaller image, but a larger size for the largest layer
+
+RUN if [[ ( "${EXTRA_BACKENDS}" =~ "coqui" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/coqui \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "parler-tts" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/parler-tts \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "diffusers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/diffusers \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "transformers-musicgen" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/transformers-musicgen \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "exllama1" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/exllama \
+    ; fi
+
+RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vall-e-x" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/vall-e-x \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "petals" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/petals \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "sentencetransformers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/sentencetransformers \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "exllama2" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/exllama2 \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "transformers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/transformers \
+    ; fi
+
+RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/vllm \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/autogptq \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/bark \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/rerankers \
+    ; fi && \
+    if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
+        make -C backend/python/mamba \
+    ; fi
 # Make sure the models directory exists


@@ -62,8 +62,8 @@ grpc-server: llama.cpp llama.cpp/examples/grpc-server
 	@echo "Building grpc-server with $(BUILD_TYPE) build type and $(CMAKE_ARGS)"
 ifneq (,$(findstring sycl,$(BUILD_TYPE)))
 	bash -c "source $(ONEAPI_VARS); \
-		cd llama.cpp && mkdir -p build && cd build && cmake .. $(CMAKE_ARGS) && cmake --build . --config Release"
+		cd llama.cpp && mkdir -p build && cd build && cmake .. $(CMAKE_ARGS) && $(MAKE)"
 else
-	cd llama.cpp && mkdir -p build && cd build && cmake .. $(CMAKE_ARGS) && cmake --build . --config Release
+	cd llama.cpp && mkdir -p build && cd build && cmake .. $(CMAKE_ARGS) && $(MAKE)
 endif
 	cp llama.cpp/build/bin/grpc-server .


@@ -1,6 +1,6 @@
.PHONY: autogptq
autogptq: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh
.PHONY: protogen
protogen: backend_pb2_grpc.py backend_pb2.py
@@ -11,3 +11,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv


@@ -0,0 +1,34 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge
fi
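Each backend's `install.sh` follows the same layering: base `requirements.txt`, then an optional `requirements-${BUILD_TYPE}.txt`, then `requirements-intel.txt` on Intel images. A sketch of just the file-selection logic (the `pick_requirements` helper is hypothetical and does not invoke uv):

```shell
# Hypothetical helper mirroring the requirement-file layering of install.sh above.
pick_requirements() {
    local dir="$1" build_type="$2"
    local files="requirements.txt"
    # build-time deps go first when present (see requirements-install.txt)
    if [ -f "${dir}/requirements-install.txt" ]; then files="requirements-install.txt ${files}"; fi
    # per-BUILD_TYPE extras, e.g. requirements-cublas.txt
    if [ -f "${dir}/requirements-${build_type}.txt" ]; then files="${files} requirements-${build_type}.txt"; fi
    # Intel images are detected by the presence of /opt/intel
    if [ -d /opt/intel ] && [ -f "${dir}/requirements-intel.txt" ]; then files="${files} requirements-intel.txt"; fi
    echo "${files}"
}

dir="$(mktemp -d)"
touch "${dir}/requirements.txt" "${dir}/requirements-cublas.txt"
pick_requirements "${dir}" cublas   # requirements.txt requirements-cublas.txt
```

Because the `BUILD_TYPE`-specific file is looked up by name, an unset `BUILD_TYPE` simply means `requirements-.txt` is checked and (normally) not found, so CPU-only installs fall through to the base file.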


@@ -0,0 +1,7 @@
accelerate
auto-gptq==0.7.1
grpcio==1.63.0
protobuf
torch
certifi
transformers


@@ -1,14 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the autogptq server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/autogptq.py $@
+## A bash script wrapper that runs the autogptq server
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/autogptq.py $@

backend/python/autogptq/test.sh Executable file

@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi


@@ -1,6 +1,6 @@
.PHONY: ttsbark
ttsbark: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh
.PHONY: run
run: protogen
@@ -23,3 +23,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

backend/python/bark/install.sh Executable file

@@ -0,0 +1,34 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge
fi


@@ -0,0 +1,6 @@
accelerate
bark==0.1.5
grpcio==1.63.0
protobuf
certifi
transformers


@@ -1,14 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the ttsbark server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/ttsbark.py $@
+## A bash script wrapper that runs the ttsbark server
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/ttsbark.py $@

backend/python/bark/test.sh Normal file → Executable file

@@ -1,11 +1,16 @@
#!/bin/bash
##
-## A bash script wrapper that runs the bark server with conda
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python -m unittest $DIR/test.py
+## A bash script wrapper that runs python unittests
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

backend/python/common-env/transformers/install.sh Normal file → Executable file

@@ -1,6 +1,6 @@
.PHONY: coqui
coqui: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh
.PHONY: run
run: protogen
@@ -23,3 +23,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

backend/python/coqui/install.sh Executable file

@@ -0,0 +1,34 @@
#!/bin/bash
set -ex
BUILD_ISOLATION_FLAG=""
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
uv venv ${MY_DIR}/venv
source ${MY_DIR}/venv/bin/activate
if [ -f "requirements-install.txt" ]; then
# If we have a requirements-install.txt, it means that a package does not properly declare its build-time
# dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
# the package without build isolation
BUILD_ISOLATION_FLAG="--no-build-isolation"
uv pip install --requirement ${MY_DIR}/requirements-install.txt
fi
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
fi
if [ -d "/opt/intel" ]; then
# Intel GPU: If the directory exists, we assume we are using the Intel image
# https://github.com/intel/intel-extension-for-pytorch/issues/538
if [ -f "requirements-intel.txt" ]; then
uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
fi
fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
pip cache purge
fi


@@ -0,0 +1,6 @@
accelerate
TTS==0.22.0
grpcio==1.63.0
protobuf
certifi
transformers


@@ -1,14 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the ttsbark server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/coqui_server.py $@
+## A bash script wrapper that runs the coqui server
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/coqui_server.py $@

backend/python/coqui/test.sh Normal file → Executable file

@@ -1,11 +1,16 @@
#!/bin/bash
##
-## A bash script wrapper that runs the bark server with conda
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python -m unittest $DIR/test.py
+## A bash script wrapper that runs python unittests
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi


@@ -13,8 +13,7 @@ endif
.PHONY: diffusers
diffusers: protogen
-	@echo "Installing $(CONDA_ENV_PATH)..."
-	bash install.sh $(CONDA_ENV_PATH)
+	bash install.sh
.PHONY: run
run: protogen
@@ -34,3 +33,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv


@@ -1,65 +0,0 @@
name: diffusers
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- bzip2=1.0.8=h7b6447c_0
- ca-certificates=2023.08.22=h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.4.4=h6a678d5_0
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libstdcxx-ng=11.2.0=h1234567_1
- libuuid=1.41.5=h5eee18b_0
- ncurses=6.4=h6a678d5_0
- openssl=3.0.11=h7f8727e_2
- pip=23.2.1=py311h06a4308_0
- python=3.11.5=h955ad1f_0
- readline=8.2=h5eee18b_0
- setuptools=68.0.0=py311h06a4308_0
- sqlite=3.41.2=h5eee18b_0
- tk=8.6.12=h1ccaba5_0
- tzdata=2023c=h04d1e81_0
- wheel=0.41.2=py311h06a4308_0
- xz=5.4.2=h5eee18b_0
- zlib=1.2.13=h5eee18b_0
- pip:
- --pre
- --extra-index-url https://download.pytorch.org/whl/nightly/
- accelerate>=0.11.0
- certifi==2023.7.22
- charset-normalizer==3.3.0
- compel==2.0.2
- diffusers==0.24.0
- filelock==3.12.4
- fsspec==2023.9.2
- grpcio==1.63.0
- huggingface-hub>=0.19.4
- idna==3.4
- importlib-metadata==6.8.0
- jinja2==3.1.2
- markupsafe==2.1.3
- mpmath==1.3.0
- networkx==3.1
- numpy==1.26.0
- omegaconf
- packaging==23.2
- pillow==10.0.1
- protobuf==4.24.4
- psutil==5.9.5
- pyparsing==3.1.1
- pyyaml==6.0.1
- regex==2023.10.3
- requests==2.31.0
- safetensors==0.4.0
- sympy==1.12
- tqdm==4.66.1
- transformers>=4.25.1
- triton==2.1.0
- typing-extensions==4.8.0
- urllib3==2.0.6
- zipp==3.17.0
- torch
- opencv-python
prefix: /opt/conda/envs/diffusers


@@ -1,75 +0,0 @@
name: diffusers
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- bzip2=1.0.8=h7b6447c_0
- ca-certificates=2023.08.22=h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.4.4=h6a678d5_0
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libstdcxx-ng=11.2.0=h1234567_1
- libuuid=1.41.5=h5eee18b_0
- ncurses=6.4=h6a678d5_0
- openssl=3.0.11=h7f8727e_2
- pip=23.2.1=py311h06a4308_0
- python=3.11.5=h955ad1f_0
- readline=8.2=h5eee18b_0
- setuptools=68.0.0=py311h06a4308_0
- sqlite=3.41.2=h5eee18b_0
- tk=8.6.12=h1ccaba5_0
- tzdata=2023c=h04d1e81_0
- wheel=0.41.2=py311h06a4308_0
- xz=5.4.2=h5eee18b_0
- zlib=1.2.13=h5eee18b_0
- pip:
- accelerate>=0.11.0
- certifi==2023.7.22
- charset-normalizer==3.3.0
- compel==2.0.2
- diffusers==0.24.0
- filelock==3.12.4
- fsspec==2023.9.2
- grpcio==1.63.0
- huggingface-hub>=0.19.4
- idna==3.4
- importlib-metadata==6.8.0
- jinja2==3.1.2
- markupsafe==2.1.3
- mpmath==1.3.0
- networkx==3.1
- numpy==1.26.0
- nvidia-cublas-cu12==12.1.3.1
- nvidia-cuda-cupti-cu12==12.1.105
- nvidia-cuda-nvrtc-cu12==12.1.105
- nvidia-cuda-runtime-cu12==12.1.105
- nvidia-cudnn-cu12==8.9.2.26
- nvidia-cufft-cu12==11.0.2.54
- nvidia-curand-cu12==10.3.2.106
- nvidia-cusolver-cu12==11.4.5.107
- nvidia-cusparse-cu12==12.1.0.106
- nvidia-nccl-cu12==2.18.1
- nvidia-nvjitlink-cu12==12.2.140
- nvidia-nvtx-cu12==12.1.105
- omegaconf
- packaging==23.2
- pillow==10.0.1
- protobuf==4.24.4
- psutil==5.9.5
- pyparsing==3.1.1
- pyyaml==6.0.1
- regex==2023.10.3
- requests==2.31.0
- safetensors==0.4.0
- sympy==1.12
- torch==2.1.0
- tqdm==4.66.1
- transformers>=4.25.1
- triton==2.1.0
- typing-extensions==4.8.0
- urllib3==2.0.6
- zipp==3.17.0
- opencv-python
prefix: /opt/conda/envs/diffusers


@@ -1,50 +1,34 @@
#!/bin/bash
set -ex
-SKIP_CONDA=${SKIP_CONDA:-0}
-# Check if environment exist
-conda_env_exists(){
-    ! conda list --name "${@}" >/dev/null 2>/dev/null
-}
-if [ $SKIP_CONDA -eq 1 ]; then
-    echo "Skipping conda environment installation"
-else
-    export PATH=$PATH:/opt/conda/bin
-    if conda_env_exists "diffusers" ; then
-        echo "Creating virtual environment..."
-        conda env create --name diffusers --file $1
-        echo "Virtual environment created."
-    else
-        echo "Virtual environment already exists."
-    fi
-fi
+BUILD_ISOLATION_FLAG=""
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build-time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
if [ -d "/opt/intel" ]; then
    # Intel GPU: If the directory exists, we assume we are using the Intel image
    # https://github.com/intel/intel-extension-for-pytorch/issues/538
-    pip install torch==2.1.0a0 \
-        torchvision==0.16.0a0 \
-        torchaudio==2.1.0a0 \
-        intel-extension-for-pytorch==2.1.10+xpu \
-        --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-    pip install google-api-python-client \
-        grpcio==1.63.0 \
-        grpcio-tools==1.63.0 \
-        diffusers==0.24.0 \
-        transformers>=4.25.1 \
-        accelerate \
-        compel==2.0.2 \
-        Pillow
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
-    if [ $SKIP_CONDA -ne 1 ]; then
-        # Activate conda environment
-        source activate diffusers
-    fi
    pip cache purge
fi


@@ -0,0 +1,3 @@
intel-extension-for-pytorch
torchaudio
torchvision


@@ -0,0 +1,10 @@
accelerate
compel
diffusers
grpcio==1.63.0
opencv-python
pillow
protobuf
torch
transformers
certifi


@@ -1,19 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the diffusers server with conda
-if [ -d "/opt/intel" ]; then
-    # Assumes we are using the Intel oneAPI container image
-    # https://github.com/intel/intel-extension-for-pytorch/issues/538
-    export XPU=1
-else
-    export PATH=$PATH:/opt/conda/bin
-    # Activate conda environment
-    source activate diffusers
-fi
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/backend_diffusers.py $@
+## A bash script wrapper that runs the GRPC backend
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/backend_diffusers.py $@

backend/python/diffusers/test.sh Normal file → Executable file

@@ -1,14 +1,16 @@
#!/bin/bash
##
-## A bash script wrapper that runs the diffusers server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate diffusers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test.py
+## A bash script wrapper that runs python unittests
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

backend/python/exllama/.gitignore vendored Normal file

@@ -0,0 +1 @@
source


@@ -19,3 +19,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	$(RM) -r venv source


@@ -1,31 +1,36 @@
#!/bin/bash
set -ex
-export PATH=$PATH:/opt/conda/bin
+BUILD_ISOLATION_FLAG=""
+
if [ "$BUILD_TYPE" != "cublas" ]; then
    echo "[exllama] Attention!!! Nvidia GPU is required - skipping installation"
    exit 0
fi
-# Check if environment exist
-conda_env_exists(){
-    ! conda list --name "${@}" >/dev/null 2>/dev/null
-}
-if conda_env_exists "exllama" ; then
-    echo "Creating virtual environment..."
-    conda env create --name exllama --file $1
-    echo "Virtual environment created."
-else
-    echo "Virtual environment already exists."
-fi
-source activate exllama
-
-git clone https://github.com/turboderp/exllama $CONDA_PREFIX/exllama && pushd $CONDA_PREFIX/exllama && pip install -r requirements.txt && popd
-
-cp -rfv $CONDA_PREFIX/exllama/* ./
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build-time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+git clone https://github.com/turboderp/exllama $MY_DIR/source
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
+
+cp -rfv ./*py $MY_DIR/source/
if [ "$PIP_CACHE_PURGE" = true ] ; then
    pip cache purge


@@ -0,0 +1,6 @@
grpcio==1.63.0
protobuf
torch
transformers
certifi
setuptools


@@ -1,15 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the exllama server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate exllama
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-cd $DIR
-python $DIR/exllama.py $@
+## A bash script wrapper that runs the exllama server with uv
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/source/exllama.py $@

backend/python/exllama/test.sh Executable file

@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi

backend/python/exllama2/.gitignore vendored Normal file

@@ -0,0 +1 @@
source


@@ -1,6 +1,5 @@
.PHONY: exllama2
exllama2: protogen
-	$(MAKE) -C ../common-env/transformers
	bash install.sh
.PHONY: run
@@ -18,3 +17,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	$(RM) -r venv source


@@ -1,57 +0,0 @@
name: exllama2
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- bzip2=1.0.8=h7b6447c_0
- ca-certificates=2023.08.22=h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.4.4=h6a678d5_0
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libstdcxx-ng=11.2.0=h1234567_1
- libuuid=1.41.5=h5eee18b_0
- ncurses=6.4=h6a678d5_0
- openssl=3.0.11=h7f8727e_2
- pip=23.2.1=py311h06a4308_0
- python=3.11.5=h955ad1f_0
- readline=8.2=h5eee18b_0
- setuptools=68.0.0=py311h06a4308_0
- sqlite=3.41.2=h5eee18b_0
- tk=8.6.12=h1ccaba5_0
- tzdata=2023c=h04d1e81_0
- wheel=0.41.2=py311h06a4308_0
- xz=5.4.2=h5eee18b_0
- zlib=1.2.13=h5eee18b_0
- pip:
- filelock==3.12.4
- fsspec==2023.9.2
- grpcio==1.63.0
- markupsafe==2.1.3
- mpmath==1.3.0
- networkx==3.1
- protobuf==4.24.4
- nvidia-cublas-cu12==12.1.3.1
- nvidia-cuda-cupti-cu12==12.1.105
- nvidia-cuda-nvrtc-cu12==12.1.105
- nvidia-cuda-runtime-cu12==12.1.105
- nvidia-cudnn-cu12==8.9.2.26
- nvidia-cufft-cu12==11.0.2.54
- nvidia-curand-cu12==10.3.2.106
- nvidia-cusolver-cu12==11.4.5.107
- nvidia-cusparse-cu12==12.1.0.106
- nvidia-nccl-cu12==2.18.1
- nvidia-nvjitlink-cu12==12.2.140
- nvidia-nvtx-cu12==12.1.105
- pandas
- numpy
- ninja
- fastparquet
- torch>=2.1.0
- safetensors>=0.3.2
- sentencepiece>=0.1.97
- pygments
- websockets
- regex
prefix: /opt/conda/envs/exllama2


@@ -2,30 +2,42 @@
set -e
##
## A bash script installs the required dependencies of VALL-E-X and prepares the environment
-export SHA=c0ddebaaaf8ffd1b3529c2bb654e650bce2f790f
+EXLLAMA2_VERSION=c0ddebaaaf8ffd1b3529c2bb654e650bce2f790f
+BUILD_ISOLATION_FLAG=""
if [ "$BUILD_TYPE" != "cublas" ]; then
-    echo "[exllamav2] Attention!!! Nvidia GPU is required - skipping installation"
+    echo "[exllama] Attention!!! Nvidia GPU is required - skipping installation"
    exit 0
fi
-export PATH=$PATH:/opt/conda/bin
-source activate transformers
-echo $CONDA_PREFIX
-
-git clone https://github.com/turboderp/exllamav2 $CONDA_PREFIX/exllamav2
-
-pushd $CONDA_PREFIX/exllamav2
-git checkout -b build $SHA
-
-# TODO: this needs to be pinned within the conda environments
-pip install -r requirements.txt
-popd
-
-cp -rfv $CONDA_PREFIX/exllamav2/* ./
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build-time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+git clone https://github.com/turboderp/exllamav2 $MY_DIR/source
+pushd ${MY_DIR}/source && git checkout -b build ${EXLLAMA2_VERSION} && popd
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
+# This installs exllamav2 in JIT mode so it will compile the appropriate torch extension at runtime
+EXLLAMA_NOCOMPILE= uv pip install ${BUILD_ISOLATION_FLAG} ${MY_DIR}/source/
+
+cp -rfv ./*py $MY_DIR/source/
if [ "$PIP_CACHE_PURGE" = true ] ; then
    pip cache purge
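The `EXLLAMA_NOCOMPILE= uv pip install …` line above uses a per-command environment assignment: the variable is set (to the empty string) only for that one command, which exllamav2's setup reads to skip ahead-of-time extension builds, as the in-script comment notes. The assignment semantics in isolation (`DEMO_FLAG` is an illustrative name):

```shell
# VAR= cmd exports VAR (empty but set) into cmd's environment only;
# the calling shell's environment is left untouched.
unset DEMO_FLAG
DEMO_FLAG= sh -c 'echo "inner:${DEMO_FLAG+set}"'   # inner:set
echo "outer:${DEMO_FLAG+set}"                      # outer:
```

`${VAR+set}` expands to `set` whenever the variable exists, even when empty, which is exactly the condition a setup script tests with an "is this variable defined" check.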


@@ -0,0 +1,4 @@
# This is here to trigger the install script to add --no-build-isolation to the uv pip install commands
# exllama2 does not specify its build requirements per PEP-517, so we need to provide some things ourselves
wheel
setuptools
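As the comment in this file explains, packages without PEP 517 `build-system.requires` metadata need their build tools seeded into the venv first; `install.sh` then disables build isolation for the remaining installs. A stubbed sketch of that two-phase ordering (commands are echoed rather than executed, so no uv is needed; `install_backend` is hypothetical):

```shell
# Two-phase install order used when requirements-install.txt exists.
# Real commands are echoed instead of executed, for illustration only.
install_backend() {
    local dir="$1" flag=""
    if [ -f "${dir}/requirements-install.txt" ]; then
        # Phase 1: seed the venv with build tools (wheel, setuptools, ...)
        flag="--no-build-isolation"
        echo "uv pip install --requirement ${dir}/requirements-install.txt"
    fi
    # Phase 2: install the backend's requirements, now without build isolation
    echo "uv pip install ${flag} --requirement ${dir}/requirements.txt"
}

dir="$(mktemp -d)"
touch "${dir}/requirements-install.txt" "${dir}/requirements.txt"
install_backend "${dir}"
```

The ordering matters: with isolation disabled, pip no longer builds sdists in a throwaway environment, so anything the package's `setup.py` imports must already be importable from the venv.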


@@ -0,0 +1,7 @@
accelerate
grpcio==1.63.0
protobuf
certifi
torch
wheel
setuptools


@@ -1,16 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the exllama server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-cd $DIR
-python $DIR/exllama2_backend.py $@
+## A bash script wrapper that runs the exllama2 server
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/source/exllama2_backend.py $@

backend/python/exllama2/test.sh Executable file

@@ -0,0 +1,16 @@
#!/bin/bash
##
## A bash script wrapper that runs python unittests
MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source $MY_DIR/venv/bin/activate
if [ -f "${MY_DIR}/test.py" ]; then
pushd ${MY_DIR}
python -m unittest test.py
popd
else
echo "ERROR: No tests defined for backend!"
exit 1
fi


@@ -1,6 +1,5 @@
.PHONY: mamba
mamba: protogen
-	$(MAKE) -C ../common-env/transformers
	bash install.sh
.PHONY: run
@@ -24,3 +23,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	$(RM) -r venv


@@ -1,21 +1,38 @@
#!/bin/bash
-set -e
-##
-## A bash script installs the required dependencies of VALL-E-X and prepares the environment
+set -ex
if [ "$BUILD_TYPE" != "cublas" ]; then
    echo "[mamba] Attention!!! nvcc is required - skipping installation"
    exit 0
fi
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-echo $CONDA_PREFIX
-
-pip install causal-conv1d==1.0.0 mamba-ssm==1.0.1
+BUILD_ISOLATION_FLAG=""
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build-time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
if [ "$PIP_CACHE_PURGE" = true ] ; then
    pip cache purge


@@ -0,0 +1,7 @@
# mamba does not specify its build dependencies per PEP-517, so we need to disable build isolation
# this also means that we need to install the basic build dependencies into the venv ourselves
# https://github.com/Dao-AILab/causal-conv1d/issues/24
packaging
setuptools
wheel
torch==2.2.0


@@ -0,0 +1,6 @@
causal-conv1d==1.2.0.post2
mamba-ssm==1.2.0.post1
grpcio==1.63.0
protobuf
certifi
transformers


@@ -1,14 +1,10 @@
#!/bin/bash
##
-## A bash script wrapper that runs the diffusers server with conda
-export PATH=$PATH:/opt/conda/bin
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/backend_mamba.py $@
+## A bash script wrapper that runs the GRPC server
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/backend_mamba.py $@


@@ -20,7 +20,7 @@ class TestBackendServicer(unittest.TestCase):
    This class contains methods to test the startup and shutdown of the gRPC service.
    """
    def setUp(self):
-        self.service = subprocess.Popen(["python", "backend_vllm.py", "--addr", "localhost:50051"])
+        self.service = subprocess.Popen(["python", "backend_mamba.py", "--addr", "localhost:50051"])
        time.sleep(10)

    def tearDown(self) -> None:

backend/python/mamba/test.sh Normal file → Executable file

@@ -1,11 +1,16 @@
#!/bin/bash
##
-## A bash script wrapper that runs the transformers server with conda
-# Activate conda environment
-source activate transformers
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python -m unittest $DIR/test_backend_mamba.py
+## A bash script wrapper that runs python unittests
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi


@@ -37,3 +37,7 @@ protogen-clean:
backend_pb2_grpc.py backend_pb2.py:
	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	$(RM) -r venv

@@ -1,39 +1,39 @@
 #!/bin/bash
 set -ex

-SKIP_CONDA=${SKIP_CONDA:-0}
-
-# Check if environment exist
-conda_env_exists(){
-    ! conda list --name "${@}" >/dev/null 2>/dev/null
-}
-
-if [ $SKIP_CONDA -eq 1 ]; then
-    echo "Skipping conda environment installation"
-else
-    export PATH=$PATH:/opt/conda/bin
-    if conda_env_exists "parler" ; then
-        echo "Creating virtual environment..."
-        conda env create --name parler --file $1
-        echo "Virtual environment created."
-    else
-        echo "Virtual environment already exists."
-    fi
-fi
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi

-if [ $SKIP_CONDA -ne 1 ]; then
-    # Activate conda environment
-    source activate parler
 # https://github.com/descriptinc/audiotools/issues/101
 # incompatible protobuf versions.
-    curl -L https://raw.githubusercontent.com/protocolbuffers/protobuf/main/python/google/protobuf/internal/builder.py -o $CONDA_PREFIX/lib/python3.11/site-packages/google/protobuf/internal/builder.py
-fi
+PYDIR=$(ls $MY_DIR/venv/lib)
+curl -L https://raw.githubusercontent.com/protocolbuffers/protobuf/main/python/google/protobuf/internal/builder.py -o $MY_DIR/venv/lib/$PYDIR/site-packages/google/protobuf/internal/builder.py

 if [ "$PIP_CACHE_PURGE" = true ] ; then
-    if [ $SKIP_CONDA -ne 1 ]; then
-        # Activate conda environment
-        source activate parler
-    fi
     pip cache purge
 fi
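The protobuf builder.py workaround above no longer has $CONDA_PREFIX to lean on, so it derives the site-packages path by listing venv/lib, which contains exactly one pythonX.Y directory. A minimal sketch of that resolution (the python3.11 directory here is fabricated for illustration):

```shell
# Simulate a venv layout and resolve its site-packages dir the way install.sh does.
TMP=$(mktemp -d)
mkdir -p "${TMP}/venv/lib/python3.11/site-packages/google/protobuf/internal"

PYDIR=$(ls "${TMP}/venv/lib")                      # the single pythonX.Y entry
SITE_PACKAGES="${TMP}/venv/lib/${PYDIR}/site-packages"
echo "would overwrite: ${SITE_PACKAGES}/google/protobuf/internal/builder.py"

rm -rf "${TMP}"
```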

@@ -0,0 +1,7 @@
+accelerate
+grpcio==1.63.0
+protobuf
+torch
+git+https://github.com/huggingface/parler-tts.git@10016fb0300c0dc31a0fb70e26f3affee7b62f16
+certifi
+transformers

backend/python/parler-tts/run.sh  Normal file → Executable file

@@ -1,16 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the parler-tts server with conda
+## A bash script wrapper that runs the GRPC backend

-echo "Launching gRPC server for parler-tts"
-
-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate parler
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/parler_tts_server.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/parler_tts_server.py $@

backend/python/parler-tts/test.sh  Normal file → Executable file

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate parler
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test_parler.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

@@ -25,3 +25,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

backend/python/petals/install.sh  Normal file → Executable file

@@ -1,5 +1,34 @@
 #!/bin/bash
+set -ex

-export PATH=$PATH:/opt/conda/bin
-conda env create --name petals --file $1
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi
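The same install recipe now repeats across backends, and its one piece of state is BUILD_ISOLATION_FLAG: the mere presence of a requirements-install.txt switches every subsequent uv pip install to --no-build-isolation. The file-presence logic in isolation (no uv is invoked; a temp dir stands in for the backend dir):

```shell
# Reproduce the flag selection from install.sh without running uv.
BACKEND_DIR=$(mktemp -d)
touch "${BACKEND_DIR}/requirements-install.txt"

BUILD_ISOLATION_FLAG=""
if [ -f "${BACKEND_DIR}/requirements-install.txt" ]; then
  # A package in this backend omits its PEP 517 build requirements,
  # so isolation is turned off for every later install.
  BUILD_ISOLATION_FLAG="--no-build-isolation"
fi
echo "flag: ${BUILD_ISOLATION_FLAG}"

rm -rf "${BACKEND_DIR}"
```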

@@ -0,0 +1,3 @@
+git+https://github.com/bigscience-workshop/petals
+certifi
+transformers

@@ -1,23 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the exllama server with conda
+## A bash script wrapper that runs the GRPC backend

-export PATH=$PATH:/opt/conda/bin
-CONDA_ENV=petals
-
-# Activate conda environment
-# if source is available use it, or use conda
-#
-if [ -f /opt/conda/bin/activate ]; then
-    source activate $CONDA_ENV
-else
-    eval "$(conda shell.bash hook)"
-    conda activate $CONDA_ENV
-fi
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/backend_petals.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/backend_petals.py $@

backend/python/petals/test.sh  Normal file → Executable file

@@ -1,20 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-CONDA_ENV=petals
-# if source is available use it, or use conda
-#
-if [ -f /opt/conda/bin/activate ]; then
-    source activate $CONDA_ENV
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
 else
-    eval "$(conda shell.bash hook)"
-    conda activate $CONDA_ENV
+    echo "ERROR: No tests defined for backend!"
+    exit 1
 fi
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python -m unittest $DIR/test_petals.py

@@ -1,7 +1,6 @@
 .PHONY: rerankers
 rerankers: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh

 .PHONY: run
 run: protogen
@@ -25,3 +24,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

@@ -0,0 +1,34 @@
+#!/bin/bash
+set -ex
+
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi
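The Intel branch in these scripts is driven purely by a directory probe: /opt/intel existing is taken as proof of the oneAPI image, and only then is the XPU wheel index consulted. A sketch of the probe under a fake root (nothing is actually installed):

```shell
# Simulate the /opt/intel probe that selects the Intel XPU wheel index.
FAKE_ROOT=$(mktemp -d)
mkdir -p "${FAKE_ROOT}/opt/intel"

INDEX_URL=""
if [ -d "${FAKE_ROOT}/opt/intel" ]; then
  # install.sh would pass this via --index-url to uv pip install
  INDEX_URL="https://pytorch-extension.intel.com/release-whl/stable/xpu/us/"
fi
echo "index: ${INDEX_URL:-default PyPI}"

rm -rf "${FAKE_ROOT}"
```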

@@ -0,0 +1,6 @@
+accelerate
+rerankers[transformers]
+grpcio==1.63.0
+protobuf
+certifi
+transformers

@@ -1,14 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the reranker server with conda
+## A bash script wrapper that runs the GRPC backend

-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/reranker.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/reranker.py $@

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the reranker server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test_reranker.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

@@ -1,6 +1,6 @@
 .PHONY: sentencetransformers
 sentencetransformers: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash ./install.sh

 .PHONY: run
@@ -25,3 +25,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

@@ -0,0 +1,34 @@
+#!/bin/bash
+set -ex
+
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi

@@ -0,0 +1,6 @@
+accelerate
+sentence-transformers==2.5.1
+transformers
+grpcio==1.63.0
+protobuf
+certifi

@@ -1,14 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the sentencetransformers server with conda
+## A bash script wrapper that runs the GRPC backend

-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/sentencetransformers.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/sentencetransformers.py $@

backend/python/sentencetransformers/test.sh  Normal file → Executable file

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the sentencetransformers server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test_sentencetransformers.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

@@ -1,6 +1,6 @@
 .PHONY: transformers-musicgen
 transformers-musicgen: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh

 .PHONY: run
 run: protogen
@@ -23,3 +23,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

@@ -0,0 +1,34 @@
+#!/bin/bash
+set -ex
+
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi

@@ -0,0 +1,7 @@
+accelerate
+transformers
+grpcio==1.63.0
+protobuf
+torch
+scipy==1.13.0
+certifi

backend/python/transformers-musicgen/run.sh  Normal file → Executable file

@@ -1,16 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers-musicgen server with conda
+## A bash script wrapper that runs the GRPC backend

-echo "Launching gRPC server for transformers-musicgen"
-
-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/transformers_server.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/transformers_server.py $@

backend/python/transformers-musicgen/test.sh  Normal file → Executable file

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test_transformers.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

@@ -1,6 +1,6 @@
 .PHONY: transformers
 transformers: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh

 .PHONY: run
 run: protogen
@@ -24,3 +24,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

@@ -0,0 +1,34 @@
+#!/bin/bash
+set -ex
+
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi

@@ -0,0 +1,6 @@
+accelerate
+transformers
+grpcio==1.63.0
+protobuf
+torch
+certifi

@@ -1,20 +1,17 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers server with conda
+## A bash script wrapper that runs the GRPC backend

 if [ -d "/opt/intel" ]; then
     # Assumes we are using the Intel oneAPI container image
     # https://github.com/intel/intel-extension-for-pytorch/issues/538
     export XPU=1
-else
-    export PATH=$PATH:/opt/conda/bin
-    # Activate conda environment
-    source activate transformers
 fi

-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/transformers_server.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/transformers_server.py $@

backend/python/transformers/test.sh  Normal file → Executable file

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the transformers server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test_transformers_server.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

backend/python/vall-e-x/.gitignore  vendored Normal file

@@ -0,0 +1 @@
+source

@@ -4,7 +4,6 @@ endif

 .PHONY: ttsvalle
 ttsvalle: protogen
-	$(MAKE) -C ../common-env/transformers
 	bash install.sh

 .PHONY: run
@@ -28,3 +27,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf source venv

backend/python/vall-e-x/install.sh  Normal file → Executable file

@@ -1,21 +1,39 @@
 #!/bin/bash
+set -ex

-##
-## A bash script installs the required dependencies of VALL-E-X and prepares the environment
-export SHA=3faaf8ccadb154d63b38070caf518ce9309ea0f4
-
-SKIP_CONDA=${SKIP_CONDA:-0}
-
-if [ $SKIP_CONDA -ne 1 ]; then
-    source activate transformers
-else
-    export PATH=$PATH:/opt/conda/bin
-    CONDA_PREFIX=$PWD
-fi
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi

-git clone https://github.com/Plachtaa/VALL-E-X.git $CONDA_PREFIX/vall-e-x && pushd $CONDA_PREFIX/vall-e-x && git checkout -b build $SHA && popd
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi

-cp -rfv $CONDA_PREFIX/vall-e-x/* ./
+git clone https://github.com/Plachtaa/VALL-E-X.git $MY_DIR/source
+pushd $MY_DIR/source && git checkout -b build $VALL_E_X_VERSION && popd
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/source/requirements.txt
+
+cp -rfv ./*py $MY_DIR/source/

 if [ "$PIP_CACHE_PURGE" = true ] ; then
     pip cache purge
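vall-e-x is the odd one out: besides the venv, install.sh vendors the upstream VALL-E-X checkout into source/, pinned to a recorded commit on a throwaway build branch. The clone-and-pin step in isolation (a local stand-in repo replaces the real GitHub remote, and the SHA is generated on the fly):

```shell
# Reproduce the pinned checkout pattern against a throwaway local "upstream".
ROOT=$(mktemp -d)
git init -q "${ROOT}/upstream"
git -C "${ROOT}/upstream" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "init"
PINNED_SHA=$(git -C "${ROOT}/upstream" rev-parse HEAD)

# As in install.sh: clone, then check out the pin onto a local "build" branch.
git clone -q "${ROOT}/upstream" "${ROOT}/source"
git -C "${ROOT}/source" checkout -q -b build "${PINNED_SHA}"

BRANCH=$(git -C "${ROOT}/source" rev-parse --abbrev-ref HEAD)
echo "pinned branch: ${BRANCH} @ ${PINNED_SHA}"
rm -rf "${ROOT}"
```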

@@ -0,0 +1,4 @@
+accelerate
+grpcio==1.63.0
+protobuf
+certifi

@@ -1,15 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the ttsvalle server with conda
+## A bash script wrapper that runs the GRPC backend

-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-cd $DIR
-python $DIR/ttsvalle.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+pushd $MY_DIR/source && python ttsvalle.py $@

backend/python/vall-e-x/test.sh  Normal file → Executable file

@@ -1,11 +1,16 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the ttsvalle server with conda
+## A bash script wrapper that runs python unittests

-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-
-python -m unittest $DIR/test.py
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+if [ -f "${MY_DIR}/test.py" ]; then
+    pushd ${MY_DIR}/source
+    python -m unittest test.py
+    popd
+else
+    echo "ERROR: No tests defined for backend!"
+    exit 1
+fi

@@ -1,6 +1,6 @@
 .PHONY: vllm
 vllm: protogen
-	$(MAKE) -C ../common-env/transformers
+	bash install.sh

 .PHONY: run
 run: protogen
@@ -23,3 +23,7 @@ protogen-clean:
 backend_pb2_grpc.py backend_pb2.py:
 	python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
+
+.PHONY: clean
+clean: protogen-clean
+	rm -rf venv

backend/python/vllm/install.sh  Executable file

@@ -0,0 +1,34 @@
+#!/bin/bash
+set -ex
+
+BUILD_ISOLATION_FLAG=""
+
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+uv venv ${MY_DIR}/venv
+source ${MY_DIR}/venv/bin/activate
+
+if [ -f "requirements-install.txt" ]; then
+    # If we have a requirements-install.txt, it means that a package does not properly declare its build time
+    # dependencies per PEP-517, so we have to set up the proper build environment ourselves, and then install
+    # the package without build isolation
+    BUILD_ISOLATION_FLAG="--no-build-isolation"
+    uv pip install --requirement ${MY_DIR}/requirements-install.txt
+fi
+
+uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements.txt
+
+if [ -f "requirements-${BUILD_TYPE}.txt" ]; then
+    uv pip install ${BUILD_ISOLATION_FLAG} --requirement ${MY_DIR}/requirements-${BUILD_TYPE}.txt
+fi
+
+if [ -d "/opt/intel" ]; then
+    # Intel GPU: If the directory exists, we assume we are using the Intel image
+    # https://github.com/intel/intel-extension-for-pytorch/issues/538
+    if [ -f "requirements-intel.txt" ]; then
+        uv pip install ${BUILD_ISOLATION_FLAG} --index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/ --requirement ${MY_DIR}/requirements-intel.txt
+    fi
+fi
+
+if [ "$PIP_CACHE_PURGE" = true ] ; then
+    pip cache purge
+fi

@@ -0,0 +1 @@
+flash-attn

@@ -0,0 +1,6 @@
+# mamba does not specify its build dependencies per PEP 517, so we need to disable build isolation
+# this also means that we need to install the basic build dependencies into the venv ourselves
+# https://github.com/Dao-AILab/causal-conv1d/issues/24
+packaging
+setuptools
+wheel

@@ -0,0 +1,7 @@
+accelerate
+vllm
+grpcio==1.63.0
+protobuf
+certifi
+transformers
+setuptools

@@ -1,14 +1,10 @@
 #!/bin/bash
 ##
-## A bash script wrapper that runs the diffusers server with conda
+## A bash script wrapper that runs the GRPC backend

-export PATH=$PATH:/opt/conda/bin
-
-# Activate conda environment
-source activate transformers
-
-# get the directory where the bash script is located
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
-python $DIR/backend_vllm.py $@
+MY_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
+
+source $MY_DIR/venv/bin/activate
+
+python $MY_DIR/backend_vllm.py $@
