add chatbot-ui example

Marc R Kellerman 2023-04-25 08:48:43 -07:00
parent 4e2061636e
commit 325bc78acc
3 changed files with 52 additions and 0 deletions

examples/README.md Normal file

@@ -0,0 +1,11 @@
# Examples
Here is a list of projects that can easily be integrated with the LocalAI backend.
## Projects
- [chatbot-ui](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui/) (by [@mkellerman](https://github.com/mkellerman))
## Want to contribute?
Create an issue, and put `Example: <description>` in the title! We will post your examples here.

examples/chatbot-ui/README.md Normal file

@@ -0,0 +1,18 @@
# chatbot-ui
## Setup
- Set the `services > api > volumes` parameter in the `docker-compose.yaml` file so that it points to the local folder containing your models.
- Copy or rename your model file to `gpt-3.5-turbo` (without any `.bin` file extension).
- Run `docker compose up` to start the API and the Web UI (a setup sketch follows this list).
- Open http://localhost:3000 for the Web UI.
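For example, a minimal setup could look like the following sketch, assuming this example lives in `examples/chatbot-ui` of the LocalAI repository and your model file sits at `~/models/ggml-gpt4all-j.bin` (the path and file name are placeholders):

```bash
# Clone the repository and enter this example's folder (paths are illustrative).
git clone https://github.com/go-skynet/LocalAI
cd LocalAI/examples/chatbot-ui

# Create the models folder mounted by docker-compose.yaml and copy your model
# into it under the name the UI expects (no .bin extension).
mkdir -p models
cp ~/models/ggml-gpt4all-j.bin models/gpt-3.5-turbo

# Start the LocalAI API and the chatbot-ui frontend.
docker compose up
```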
## Known issues
- The model can't be selected from the UI; it appears to be hardcoded to `gpt-3.5-turbo`.
- If your machine is slow, the UI may time out on its request to the API. You can check the API directly with the sketch below.
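To rule out the UI when debugging, you can query the API on its own. A minimal sketch, assuming the API container is up on port 8080 (as in the docker-compose.yaml below) and exposes the usual OpenAI-compatible endpoints:

```bash
# List the models LocalAI discovered in the mounted /models folder.
curl http://localhost:8080/v1/models

# Send a small chat completion request to the renamed model.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```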
### Links
- [mckaywrigley/chatbot-ui](https://github.com/mckaywrigley/chatbot-ui)

examples/chatbot-ui/docker-compose.yaml Normal file

@@ -0,0 +1,23 @@
version: '3.6'

services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - 8080:8080
    environment:
      - THREADS=4
      - CONTEXT_SIZE=512
      - DEBUG=true
      - MODELS_PATH=/models
    volumes:
      - ./models:/models:cached
    command: ["/usr/bin/local-ai"]

  chatgpt:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    ports:
      - 3000:3000
    environment:
      - 'OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXX'
      - 'OPENAI_API_HOST=http://host.docker.internal:8080'
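Note: `host.docker.internal` resolves out of the box on Docker Desktop (macOS/Windows). On plain Linux it may not; one common workaround (an assumption about your setup, and `host-gateway` needs Docker 20.10 or newer) is to add an `extra_hosts` entry to the `chatgpt` service:

```yaml
    # Map host.docker.internal to the host so the UI can reach the API on port 8080.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```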