# chatbot-ui

Example of integration with [mckaywrigley/chatbot-ui](https://github.com/mckaywrigley/chatbot-ui).
## Setup

```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/chatbot-ui

# (Optional) Checkout a specific LocalAI tag
# git checkout -b build <TAG>

# Download gpt4all-j to models/
wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j

# Start with docker-compose
docker-compose up -d --pull always
# or you can build the images with:
# docker-compose up -d --build
```
Then browse to http://localhost:3000 to view the Web UI.
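If the Web UI comes up but cannot get a response, a quick way to check the backend on its own is to query LocalAI's OpenAI-compatible API directly. A minimal sketch, assuming the default port `8080` from the compose file and the `ggml-gpt4all-j` model downloaded above:

```shell
# List the models LocalAI has loaded; ggml-gpt4all-j should appear in the output.
curl http://localhost:8080/v1/models

# Send a minimal chat completion request against the downloaded model.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "How are you?"}],
        "temperature": 0.7
      }'
```

If these requests succeed but the UI still fails, the problem is likely in how chatbot-ui reaches LocalAI rather than in LocalAI itself.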
## Pointing chatbot-ui to a separately managed LocalAI service

If you want to use the chatbot-ui example with an externally managed LocalAI service, you can alter the `docker-compose.yaml` file so that it looks like the example below. You will notice the file is smaller, because we have removed the section that would normally start the LocalAI service. Take care to update the IP address (or FQDN) that the chatbot-ui service tries to access (marked `<<LOCALAI_IP>>` below):
```yaml
version: '3.6'

services:
  chatgpt:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    ports:
      - 3000:3000
    environment:
      - 'OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXX'
      - 'OPENAI_API_HOST=http://<<LOCALAI_IP>>:8080'
```
Once you've edited the `docker-compose.yaml`, you can start it with `docker compose up`, then browse to http://localhost:3000 to view the Web UI.
## Accessing chatbot-ui

Open http://localhost:3000 for the Web UI.