From b82d3052822af0b781ae1238d65c139806f0cd98 Mon Sep 17 00:00:00 2001
From: Jayant
Date: Tue, 7 Jan 2025 12:20:51 +0100
Subject: [PATCH] readme : add docker instructions (#2711)

I found the Docker instructions in the README.md useful, along with the
notes on the differences between the Docker variants, such as ffmpeg and
CUDA support. However, this section was removed in v1.7.4, and I would
vote to bring it back.

This is a pull request to add that section back.
---
 README.md | 32 ++++++++++++++++++++++++++++++++
 1 file changed, 32 insertions(+)

diff --git a/README.md b/README.md
index 93e50d16..62afad47 100644
--- a/README.md
+++ b/README.md
@@ -360,6 +360,38 @@ Run the inference examples as usual, for example:
 - If you have trouble with Ascend NPU device, please create a issue with **[CANN]** prefix/tag.
 - If you run successfully with your Ascend NPU device, please help update the table `Verified devices`.
 
+## Docker
+
+### Prerequisites
+
+- Docker must be installed and running on your system.
+- Create a folder to store big models & intermediate files (e.g. `/whisper/models`)
+
+### Images
+
+We have two Docker images available for this project:
+
+1. `ghcr.io/ggerganov/whisper.cpp:main`: This image includes the main executable file, as well as `curl` and `ffmpeg`. (platforms: `linux/amd64`, `linux/arm64`)
+2. `ghcr.io/ggerganov/whisper.cpp:main-cuda`: Same as `main`, but compiled with CUDA support. (platforms: `linux/amd64`)
+
+### Usage
+
+```shell
+# download a model and persist it in a local folder
+docker run -it --rm \
+  -v path/to/models:/models \
+  whisper.cpp:main "./models/download-ggml-model.sh base /models"
+# transcribe an audio file
+docker run -it --rm \
+  -v path/to/models:/models \
+  -v path/to/audios:/audios \
+  whisper.cpp:main "./main -m /models/ggml-base.bin -f /audios/jfk.wav"
+# transcribe an audio file in the samples folder
+docker run -it --rm \
+  -v path/to/models:/models \
+  whisper.cpp:main "./main -m /models/ggml-base.bin -f ./samples/jfk.wav"
+```
+
 ## Installing with Conan
 
 You can install pre-built binaries for whisper.cpp or build it from source using [Conan](https://conan.io/). Use the following command:
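
Note that the usage examples in the patch invoke the image by the short name `whisper.cpp:main`, which only resolves if the image has been pulled and retagged locally. A minimal sketch of the same flow using the fully qualified GHCR names from the Images section (assuming the `main` and `main-cuda` tags are published and `/whisper/models` is the local model folder):

```shell
# pull the prebuilt CPU image (use :main-cuda instead for the CUDA variant)
docker pull ghcr.io/ggerganov/whisper.cpp:main

# download the base model into the mounted folder
docker run -it --rm \
  -v /whisper/models:/models \
  ghcr.io/ggerganov/whisper.cpp:main "./models/download-ggml-model.sh base /models"

# transcribe the bundled sample with the downloaded model
docker run -it --rm \
  -v /whisper/models:/models \
  ghcr.io/ggerganov/whisper.cpp:main "./main -m /models/ggml-base.bin -f ./samples/jfk.wav"
```

Because the model folder is bind-mounted with `-v`, the downloaded `ggml-base.bin` persists across container runs.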