# Telegram bot
This example uses a fork of chatgpt-telegram-bot to deploy a Telegram bot backed by LocalAI instead of OpenAI.
```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/telegram-bot

# Clone the bot fork and copy the example docker-compose file into it
git clone https://github.com/mudler/chatgpt_telegram_bot
cp -rf docker-compose.yml chatgpt_telegram_bot
cd chatgpt_telegram_bot

# Create the config files from the shipped examples
mv config/config.example.yml config/config.yml
mv config/config.example.env config/config.env

# Edit config/config.yml to set the Telegram bot token (see the sketch below)
vim config/config.yml

# Run the bot together with LocalAI
docker-compose --env-file config/config.env up --build
```
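The bot token is set in `config/config.yml`. A rough sketch of that edit, assuming the fork keeps the upstream key names (check `config/config.example.yml` for the authoritative layout; `telegram_token` and the placeholder value below are assumptions, not taken from this example):

```yaml
# config/config.yml (sketch; key name assumed from the upstream example config)
telegram_token: "123456:ABC-your-bot-token"   # token obtained from @BotFather
# Leave the remaining keys as they appear in config.example.yml;
# pointing the bot at LocalAI instead of OpenAI is handled by the
# docker-compose.yml copied into the fork above.
```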
Note: LocalAI is configured to download `gpt4all-j` (in place of `gpt-3.5-turbo`) and `stablediffusion` for image generation at the first start. The download size is >6GB; if your network connection is slow, adapt the healthcheck section of the `docker-compose.yml` file accordingly (for instance, replace `20m` with `1h`).
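For orientation, the healthcheck referred to above lives under the LocalAI service in `docker-compose.yml`. A minimal sketch of what such a section can look like (service name, endpoint, and values are illustrative, not necessarily the exact ones shipped in the example):

```yaml
services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]
      interval: 1m
      timeout: 20m      # raise this (e.g. to 1h) on slow connections
      retries: 20
```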
To configure models manually, comment out the `PRELOAD_MODELS` environment variable in the `docker-compose.yml` file and see, for instance, the `chatbot-ui-manual` example's `models` directory.
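For reference, `PRELOAD_MODELS` sits in the `environment` block of the LocalAI service in `docker-compose.yml`; commenting it out looks roughly like this (the exact gallery entries used by the example may differ):

```yaml
services:
  api:
    environment:
      # Comment out PRELOAD_MODELS to skip the automatic download and
      # populate the models directory by hand instead:
      # - PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/gpt4all-j.yaml", "name": "gpt-3.5-turbo"}]
```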