diff --git a/docs/content/features/embeddings.md b/docs/content/features/embeddings.md
index 58992c30..09151af8 100644
--- a/docs/content/features/embeddings.md
+++ b/docs/content/features/embeddings.md
@@ -61,23 +61,23 @@ curl http://localhost:8080/embeddings -X POST -H "Content-Type: application/json
 
 ## Huggingface embeddings
 
-To use `sentence-formers` and models in `huggingface` you can use the `huggingface` embedding backend.
+To use `sentence-transformers` and models in `huggingface`, you can use the `sentencetransformers` embedding backend.
 
 ```yaml
 name: text-embedding-ada-002
-backend: huggingface-embeddings
+backend: sentencetransformers
 embeddings: true
 parameters:
   model: all-MiniLM-L6-v2
 ```
 
-The `huggingface` backend uses Python [sentence-transformers](https://github.com/UKPLab/sentence-transformers). For a list of all pre-trained models available see here: https://github.com/UKPLab/sentence-transformers#pre-trained-models
+The `sentencetransformers` backend uses Python [sentence-transformers](https://github.com/UKPLab/sentence-transformers). For a list of all available pre-trained models, see: https://github.com/UKPLab/sentence-transformers#pre-trained-models
 
 {{% notice note %}}
-- The `huggingface` backend is an optional backend of LocalAI and uses Python. If you are running `LocalAI` from the containers you are good to go and should be already configured for use. If you are running `LocalAI` manually you must install the python dependencies (`pip install -r /path/to/LocalAI/extra/requirements`) and specify the extra backend in the `EXTERNAL_GRPC_BACKENDS` environment variable ( `EXTERNAL_GRPC_BACKENDS="huggingface-embeddings:/path/to/LocalAI/extra/grpc/huggingface/huggingface.py"` ) .
-- The `huggingface` backend does support only embeddings of text, and not of tokens. If you need to embed tokens you can use the `bert` backend or `llama.cpp`.
-- No models are required to be downloaded before using the `huggingface` backend. The models will be downloaded automatically the first time the API is used.
+- The `sentencetransformers` backend is an optional backend of LocalAI and uses Python. If you are running `LocalAI` from the containers, it is already configured for use. If you are running `LocalAI` manually, you must install the Python dependencies (`pip install -r /path/to/LocalAI/extra/requirements`) and specify the extra backend in the `EXTERNAL_GRPC_BACKENDS` environment variable (`EXTERNAL_GRPC_BACKENDS="sentencetransformers:/path/to/LocalAI/backend/python/sentencetransformers/sentencetransformers.py"`).
+- The `sentencetransformers` backend only supports embeddings of text, not of tokens. If you need to embed tokens you can use the `bert` backend or `llama.cpp`.
+- No models are required to be downloaded before using the `sentencetransformers` backend. The models will be downloaded automatically the first time the API is used.
 {{% /notice %}}
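Once a model is configured with the `sentencetransformers` backend as in the patch above, it can be queried through LocalAI's OpenAI-compatible `/embeddings` endpoint. A minimal sketch of such a request, assuming LocalAI is running on `localhost:8080` with the `text-embedding-ada-002` model configured as shown:

```shell
# Request an embedding from a locally running LocalAI instance.
# Assumes the server listens on localhost:8080 and the model
# "text-embedding-ada-002" is configured with the sentencetransformers backend;
# the model weights are fetched automatically on first use.
curl http://localhost:8080/embeddings -X POST \
  -H "Content-Type: application/json" \
  -d '{
        "input": "Your text string goes here",
        "model": "text-embedding-ada-002"
      }'
```

The response follows the OpenAI embeddings schema: a `data` array whose entries carry an `embedding` vector (384 floats for `all-MiniLM-L6-v2`).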