From fd1b7b3f22ae557dd6f5a6da75c923a8ebe8d2f2 Mon Sep 17 00:00:00 2001
From: Ettore Di Giacinto
Date: Tue, 28 Nov 2023 23:14:16 +0100
Subject: [PATCH] docs: Add docker instructions, add community projects section
 in README (#1359)

docs: Add docker instructions
---
 README.md                                 | 12 ++++++--
 docs/content/getting_started/_index.en.md | 34 ++++++++++++++++++++---
 2 files changed, 40 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index dbda44de..c974c186 100644
--- a/README.md
+++ b/README.md
@@ -109,9 +109,17 @@ Hot topics:
 
 Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.
 
-### 💡 Example: Use Luna-AI Llama model
+### Community
 
-See the [documentation](https://localai.io/basics/getting_started)
+WebUI
+- https://github.com/Jirubizu/localai-admin
+- https://github.com/go-skynet/LocalAI-frontend
+
+Model galleries
+- https://github.com/go-skynet/model-gallery
+
+Other:
+- Helm chart https://github.com/go-skynet/helm-charts
 
 ### 🔗 Resources
 
diff --git a/docs/content/getting_started/_index.en.md b/docs/content/getting_started/_index.en.md
index 4677ef6b..fe94ae7e 100644
--- a/docs/content/getting_started/_index.en.md
+++ b/docs/content/getting_started/_index.en.md
@@ -6,13 +6,36 @@ weight = 1
 url = '/basics/getting_started/'
 +++
 
-`LocalAI` is available as a container image and binary. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+`LocalAI` is available as a container image and binary. It can be used with Docker, Podman, Kubernetes, or any other container engine. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+
+See also our [How to]({{%relref "howtos" %}}) section for end-to-end guided examples curated by the community.
+
 
 ### How to get started
 
-For a always up to date step by step how to of setting up LocalAI, Please see our [How to]({{%relref "howtos" %}}) page.
 
-### Fast Setup
-The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})). The following example uses `docker compose`:
+The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})).
+
+{{< tabs >}}
+{{% tab name="Docker" %}}
+
+```bash
+# Prepare the models in the `models` directory
+mkdir models
+# Copy your models into it
+cp your-model.bin models/
+# Run the LocalAI container
+docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --context-size 700 --threads 4
+# Try the endpoint with curl
+curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
+     "model": "your-model.bin",
+     "prompt": "A long time ago in a galaxy far, far away",
+     "temperature": 0.7
+   }'
+```
+
+{{% /tab %}}
+{{% tab name="Docker compose" %}}
+
 
 ```bash
@@ -44,6 +67,9 @@ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
     "temperature": 0.7
   }'
 ```
+{{% /tab %}}
+
+{{< /tabs >}}
 
 ### Example: Use luna-ai-llama2 model with `docker compose`
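The curl invocation added by this patch targets LocalAI's OpenAI-compatible `/v1/completions` endpoint, so the same request can be issued from any HTTP client. Below is a minimal Python sketch using only the standard library; the host, port, and `your-model.bin` name are carried over from the docker example above and are assumptions, not fixed values:

```python
import json
import urllib.request


def build_request(prompt, model="your-model.bin", base="http://localhost:8080"):
    """Build the same POST request the curl example in the docs sends."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "temperature": 0.7,
    }).encode("utf-8")
    return urllib.request.Request(
        base + "/v1/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )


def complete(prompt, **kwargs):
    """Send the request to a running LocalAI instance and return the first completion."""
    with urllib.request.urlopen(build_request(prompt, **kwargs)) as resp:
        return json.load(resp)["choices"][0]["text"]
```

With the container from the Docker tab running, `complete("A long time ago in a galaxy far, far away")` should return the generated continuation.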