Mirror of https://github.com/mudler/LocalAI.git (synced 2025-05-15 15:03:16 +00:00)

docs: Add docker instructions, add community projects section in README (#1359)

parent 687730a7f5
commit fd1b7b3f22

README.md | 12
@@ -109,9 +109,17 @@ Hot topics:
 
 Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.
 
-### 💡 Example: Use Luna-AI Llama model
+### Community
+
+See the [documentation](https://localai.io/basics/getting_started)
+
+WebUI
+- https://github.com/Jirubizu/localai-admin
+- https://github.com/go-skynet/LocalAI-frontend
+
+Model galleries
+- https://github.com/go-skynet/model-gallery
+
+Other:
+- Helm chart https://github.com/go-skynet/helm-charts
 
 ### 🔗 Resources
@@ -6,13 +6,36 @@ weight = 1
 url = '/basics/getting_started/'
 +++
 
-`LocalAI` is available as a container image and binary. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+`LocalAI` is available as a container image and binary. It can be used with Docker, Podman, Kubernetes, and any other container engine. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+
+See also our [How to]({{%relref "howtos" %}}) section for end-to-end guided examples curated by the community.
 
 ### How to get started
 
+For an always up-to-date, step-by-step guide to setting up LocalAI, please see our [How to]({{%relref "howtos" %}}) page.
+
+### Fast Setup
+
-The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})).
+The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})). The following examples use `docker` and `docker compose`:
+
+{{< tabs >}}
+{{% tab name="Docker" %}}
+
+```bash
+# Prepare the models in the `models` directory
+mkdir models
+# copy your models to it
+cp your-model.bin models/
+# run the LocalAI container
+docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --context-size 700 --threads 4
+# Try the endpoint with curl
+curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
+     "model": "your-model.bin",
+     "prompt": "A long time ago in a galaxy far, far away",
+     "temperature": 0.7
+   }'
+```
+
+{{% /tab %}}
+{{% tab name="Docker compose" %}}
 
 ```bash
@@ -44,6 +67,9 @@ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d
      "temperature": 0.7
    }'
 ```
+
+{{% /tab %}}
+{{< /tabs >}}
 
 ### Example: Use luna-ai-llama2 model with `docker compose`
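The body of the `Docker compose` tab lies outside the lines changed in this commit, so the compose file itself does not appear above. Purely as an illustration (not the repository's actual file), a minimal compose service mirroring the flags of the `docker run` example might look like the sketch below; the service name and compose version are assumptions:

```yaml
# Hypothetical minimal compose file; mirrors the `docker run` flags above.
version: "3.6"
services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - "8080:8080"
    volumes:
      - ./models:/models
    command: ["--models-path", "/models", "--context-size", "700", "--threads", "4"]
```

With `docker compose up -d`, this would start the same server the Docker tab reaches on port 8080.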
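As a small aside on the curl call in the examples (not part of the commit): the JSON payload can be kept in a file and checked for well-formedness before the container is even running. A sketch, assuming only `python3` is available locally; `request.json` is an arbitrary name:

```shell
# Save the completion request body from the example above to a file.
cat > request.json <<'EOF'
{
  "model": "your-model.bin",
  "prompt": "A long time ago in a galaxy far, far away",
  "temperature": 0.7
}
EOF

# Verify the payload parses as JSON and echo the model name.
python3 -c 'import json; print(json.load(open("request.json"))["model"])'
# prints: your-model.bin
```

Once the server is up, the same file can be posted with `curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d @request.json`.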