feat: rebrand - LocalAGI and LocalRecall join the LocalAI stack family (#5159)

* wip

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* docs

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Update lotusdocs and hugo

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* rephrasing

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fixups

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Latest fixups

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Adjust readme section

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Ettore Di Giacinto authored 2025-04-15 17:51:24 +02:00, committed by GitHub
parent 04d74ac648
commit 4f239bac89
44 changed files with 976 additions and 196 deletions

@ -1,7 +1,6 @@
<h1 align="center">
<br>
<img height="300" src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"> <br>
LocalAI
<img height="300" src="./core/http/static/logo.png"> <br>
<br>
</h1>
@ -48,25 +47,58 @@
[![tests](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml)[![Build and Release](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml)[![build container images](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)[![Bump dependencies](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml)[![Artifact Hub](https://img.shields.io/endpoint?url=https://artifacthub.io/badge/repository/localai)](https://artifacthub.io/packages/search?repo=localai)
**LocalAI** is the free, Open Source OpenAI alternative. LocalAI act as a drop-in replacement REST API thats compatible with OpenAI (Elevenlabs, Anthropic... ) API specifications for local AI inferencing. It allows you to run LLMs, generate images, audio (and not only) locally or on-prem with consumer grade hardware, supporting multiple model families. Does not require GPU. It is created and maintained by [Ettore Di Giacinto](https://github.com/mudler).
**LocalAI** is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI (Elevenlabs, Anthropic...) API specifications for local AI inferencing. It allows you to run LLMs, generate images and audio (and much more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU. It is created and maintained by [Ettore Di Giacinto](https://github.com/mudler).
## 📚🆕 Local Stack Family
🆕 LocalAI is now part of a comprehensive suite of AI tools designed to work together:
<table>
<tr>
<td width="50%" valign="top">
<a href="https://github.com/mudler/LocalAGI">
<img src="https://raw.githubusercontent.com/mudler/LocalAGI/refs/heads/main/webui/react-ui/public/logo_2.png" width="300" alt="LocalAGI Logo">
</a>
</td>
<td width="50%" valign="top">
<h3><a href="https://github.com/mudler/LocalAGI">LocalAGI</a></h3>
<p>A powerful local AI agent management platform that serves as a drop-in replacement for OpenAI's Responses API, enhanced with advanced agentic capabilities (see the example request below this table).</p>
</td>
</tr>
<tr>
<td width="50%" valign="top">
<a href="https://github.com/mudler/LocalRecall">
<img src="https://raw.githubusercontent.com/mudler/LocalRecall/refs/heads/main/static/localrecall_horizontal.png" width="300" alt="LocalRecall Logo">
</a>
</td>
<td width="50%" valign="top">
<h3><a href="https://github.com/mudler/LocalRecall">LocalRecall</a></h3>
<p>A RESTful API and knowledge base management system that provides persistent memory and storage capabilities for AI agents.</p>
</td>
</tr>
</table>
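Because LocalAGI positions itself as a drop-in replacement for OpenAI's Responses API, an existing client should only need its base URL changed. Below is a minimal `curl` sketch following the Responses API request shape; the host, port and model name are illustrative placeholders, not values taken from the LocalAGI documentation:

```bash
# Hypothetical example: talk to a locally running LocalAGI instance through the
# OpenAI Responses API shape it aims to be compatible with.
# The host/port (localhost:3000) and model name are placeholders -
# check the LocalAGI README for the actual defaults.
curl http://localhost:3000/v1/responses \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local-agent",
    "input": "Summarize what the LocalAI stack family consists of."
  }'
```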
## Screenshots
| Talk Interface | Generate Audio |
| --- | --- |
| ![Screenshot 2025-03-31 at 12-01-36 LocalAI - Talk](https://github.com/user-attachments/assets/9841b1ee-88af-4b96-8ec0-41b17364efa7) | ![Screenshot 2025-03-31 at 12-01-29 LocalAI - Generate audio with voice-en-us-ryan-low](https://github.com/user-attachments/assets/d729f6f4-0621-4715-bda3-35fe6e159524) |
| ![Screenshot 2025-03-31 at 12-01-36 LocalAI - Talk](./docs/assets/images/screenshots/screenshot_tts.png) | ![Screenshot 2025-03-31 at 12-01-29 LocalAI - Generate audio with voice-en-us-ryan-low](./docs/assets/images/screenshots/screenshot_tts.png) |
| Models Overview | Generate Images |
| --- | --- |
| ![Screenshot 2025-03-31 at 12-01-20 LocalAI - Models](https://github.com/user-attachments/assets/3cf0b918-ba8e-498a-a3cd-485db5984325) | ![Screenshot 2025-03-31 at 12-31-41 LocalAI - Generate images with flux 1-dev](https://github.com/user-attachments/assets/6753d23d-218b-4e07-94b8-9e6c5a4f2311) |
| ![Screenshot 2025-03-31 at 12-01-20 LocalAI - Models](./docs/assets/images/screenshots/screenshot_gallery.png) | ![Screenshot 2025-03-31 at 12-31-41 LocalAI - Generate images with flux 1-dev](./docs/assets/images/screenshots/screenshot_image.png) |
| Chat Interface | API Overview |
| Chat Interface | Home |
| --- | --- |
| ![Screenshot 2025-03-31 at 11-57-44 LocalAI - Chat with localai-functioncall-qwen2 5-7b-v0 5](https://github.com/user-attachments/assets/048eab31-0f0c-4d52-a920-3715233f9bf3) | ![Screenshot 2025-03-31 at 11-57-23 LocalAI API - c2a39e3 (c2a39e3639227cfd94ffffe9f5691239acc275a8)](https://github.com/user-attachments/assets/2540e8ce-1a2c-4c12-800c-763bd9be247f) |
| ![Screenshot 2025-03-31 at 11-57-44 LocalAI - Chat with localai-functioncall-qwen2 5-7b-v0 5](./docs/assets/images/screenshots/screenshot_chat.png) | ![Screenshot 2025-03-31 at 11-57-23 LocalAI API - c2a39e3 (c2a39e3639227cfd94ffffe9f5691239acc275a8)](./docs/assets/images/screenshots/screenshot_home.png) |
| Login | Swarm |
| --- | --- |
|![Screenshot 2025-03-31 at 12-09-59 ](https://github.com/user-attachments/assets/5af681b0-dd8e-4fe8-a234-a22f8a040547) | ![Screenshot 2025-03-31 at 12-10-39 LocalAI - P2P dashboard](https://github.com/user-attachments/assets/b9527176-63d6-4d2e-8ed1-7fde13a9b0ad) |
|![Screenshot 2025-03-31 at 12-09-59 ](./docs/assets/images/screenshots/screenshot_login.png) | ![Screenshot 2025-03-31 at 12-10-39 LocalAI - P2P dashboard](./docs/assets/images/screenshots/screenshot_p2p.png) |
## Quickstart
## 💻 Quickstart
Run the installer script:
@ -108,10 +140,11 @@ local-ai run https://gist.githubusercontent.com/.../phi-2.yaml
local-ai run oci://localai/phi-2:latest
```
[💻 Getting started](https://localai.io/basics/getting_started/index.html)
For more information, see [💻 Getting started](https://localai.io/basics/getting_started/index.html)
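Once the API is up (it listens on port 8080 by default), any OpenAI-compatible client can talk to it. A minimal `curl` sketch, assuming a model was installed as shown above and is registered under the name `phi-2` (use whatever `local-ai models list` reports on your system):

```bash
# Chat completion against the local OpenAI-compatible endpoint (default port 8080).
# Replace "phi-2" with the name of a model you have actually installed.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "phi-2",
    "messages": [{"role": "user", "content": "Write a haiku about running AI locally."}]
  }'
```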
## 📰 Latest project news
- Apr 2025: [LocalAGI](https://github.com/mudler/LocalAGI) and [LocalRecall](https://github.com/mudler/LocalRecall) join the LocalAI stack family.
- Apr 2025: WebUI overhaul, AIO image updates
- Feb 2025: Backend cleanup, breaking changes, new backends (kokoro, OuteTTS, faster-whisper), Nvidia L4T images
- Jan 2025: LocalAI model release: https://huggingface.co/mudler/LocalAI-functioncall-phi-4-v0.3, SANA support in diffusers: https://github.com/mudler/LocalAI/pull/4603
@ -127,19 +160,6 @@ local-ai run oci://localai/phi-2:latest
Roadmap items: [List of issues](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap)
## 🔥🔥 Hot topics (looking for help):
- Multimodal with vLLM and Video understanding: https://github.com/mudler/LocalAI/pull/3729
- Realtime API https://github.com/mudler/LocalAI/issues/3714
- WebUI improvements: https://github.com/mudler/LocalAI/issues/2156
- Backends v2: https://github.com/mudler/LocalAI/issues/1126
- Improving UX v2: https://github.com/mudler/LocalAI/issues/1373
- Assistant API: https://github.com/mudler/LocalAI/issues/1273
- Vulkan: https://github.com/mudler/LocalAI/issues/1647
- Anthropic API: https://github.com/mudler/LocalAI/issues/1808
If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
## 🚀 [Features](https://localai.io/features/)
- 📖 [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `transformers`, `vllm` ... [:book: and more](https://localai.io/model-compatibility/index.html#model-compatibility-table))
@ -153,12 +173,10 @@ If you want to help and contribute, issues up for grabs: https://github.com/mudl
- 🥽 [Vision API](https://localai.io/features/gpt-vision/)
- 📈 [Reranker API](https://localai.io/features/reranker/)
- 🆕🖧 [P2P Inferencing](https://localai.io/features/distribute/)
- [Agentic capabilities](https://github.com/mudler/LocalAGI)
- 🔊 Voice activity detection (Silero-VAD support)
- 🌍 Integrated WebUI!
## 💻 Usage
Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.
### 🔗 Community and integrations

@ -142,9 +142,9 @@ func API(application *application.Application) (*fiber.App, error) {
httpFS := http.FS(embedDirStatic)
router.Use(favicon.New(favicon.Config{
URL: "/favicon.ico",
URL: "/favicon.svg",
FileSystem: httpFS,
File: "static/favicon.ico",
File: "static/favicon.svg",
}))
router.Use("/static", filesystem.New(filesystem.Config{

@ -29,9 +29,9 @@ func Explorer(db *explorer.Database) *fiber.App {
httpFS := http.FS(embedDirStatic)
app.Use(favicon.New(favicon.Config{
URL: "/favicon.ico",
URL: "/favicon.svg",
FileSystem: httpFS,
File: "static/favicon.ico",
File: "static/favicon.svg",
}))
app.Use("/static", filesystem.New(filesystem.Config{

Binary and vendored assets under core/http/static updated, including a new logo.png.

@ -12,7 +12,7 @@
<div class="max-w-md w-full bg-gray-800/90 border border-gray-700/50 rounded-xl overflow-hidden shadow-xl">
<div class="animation-container">
<div class="text-overlay">
<!-- <i class="fas fa-circle-nodes text-5xl text-blue-400 mb-2"></i> -->
<img src="static/logo.png" alt="LocalAI Logo" class="h-32">
</div>
</div>

@ -3,7 +3,7 @@
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{{.Title}}</title>
<base href="{{.BaseURL}}" />
<link rel="icon" type="image/x-icon" href="favicon.ico" />
<link rel="shortcut icon" href="static/favicon.svg" type="image/svg">
<link rel="stylesheet" href="static/assets/highlightjs.css" />
<script defer src="static/assets/highlightjs.js"></script>
<script defer src="static/assets/alpine.js"></script>

@ -4,10 +4,9 @@
<div class="flex items-center">
<!-- Logo Image -->
<a href="./" class="flex items-center group">
<img src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"
<img src="static/logo_horizontal.png"
alt="LocalAI Logo"
class="h-10 mr-3 rounded-lg border border-blue-600/30 shadow-md transition-all duration-300 group-hover:shadow-blue-500/20 group-hover:border-blue-500/50">
<span class="text-white text-xl font-bold bg-clip-text text-transparent bg-gradient-to-r from-blue-400 to-indigo-400">LocalAI</span>
class="h-14 mr-3 brightness-110 transition-all duration-300 group-hover:brightness-125">
</a>
</div>

@ -4,10 +4,9 @@
<div class="flex items-center">
<!-- Logo Image -->
<a href="./" class="flex items-center group">
<img src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"
<img src="static/logo_horizontal.png"
alt="LocalAI Logo"
class="h-10 mr-3 rounded-lg border border-blue-600/30 shadow-md transition-all duration-300 group-hover:shadow-blue-500/20 group-hover:border-blue-500/50">
<span class="text-white text-xl font-bold bg-clip-text text-transparent bg-gradient-to-r from-blue-400 to-indigo-400">LocalAI</span>
</a>
</div>

Several binary image assets added (logos and documentation screenshots).

@ -3,7 +3,7 @@
"baseUrl": ".",
"paths": {
"*": [
"../../../../.cache/hugo_cache/modules/filecache/modules/pkg/mod/github.com/gohugoio/hugo-mod-jslibs-dist/popperjs/v2@v2.21100.20000/package/dist/cjs/popper.js/*",
"../../../../.cache/hugo_cache/modules/filecache/modules/pkg/mod/github.com/gohugoio/hugo-mod-jslibs-dist/popperjs/v2@v2.21100.20000/package/dist/cjs/*",
"../../../../.cache/hugo_cache/modules/filecache/modules/pkg/mod/github.com/twbs/bootstrap@v5.3.2+incompatible/js/*"
]
}

@ -48,9 +48,9 @@ defaultContentLanguage = 'en'
[params.docs] # Parameters for the /docs 'template'
logo = "https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"
logo_text = "LocalAI"
title = "LocalAI documentation" # default html title for documentation pages/sections
logo = "https://raw.githubusercontent.com/mudler/LocalAI/refs/heads/master/core/http/static/logo.png"
logo_text = ""
title = "LocalAI" # default html title for documentation pages/sections
pathName = "docs" # path name for documentation site | default "docs"
@ -108,6 +108,7 @@ defaultContentLanguage = 'en'
# indexName = "" # Index Name to perform search on (or set env variable HUGO_PARAM_DOCSEARCH_indexName)
[params.analytics] # Parameters for Analytics (Google, Plausible)
# google = "G-XXXXXXXXXX" # Replace with your Google Analytics ID
# plausibleURL = "/docs/s" # (or set via env variable HUGO_PARAM_ANALYTICS_plausibleURL)
# plausibleAPI = "/docs/s" # optional - (or set via env variable HUGO_PARAM_ANALYTICS_plausibleAPI)
# plausibleDomain = "" # (or set via env variable HUGO_PARAM_ANALYTICS_plausibleDomain)

@ -13,6 +13,8 @@ LocalAI supports two modes of distributed inferencing via p2p:
- **Federated Mode**: Requests are shared between the cluster and routed to a single worker node in the network based on the load balancer's decision.
- **Worker Mode** (aka "model sharding" or "splitting weights"): Requests are processed by all the workers, which contribute to the final inference result (by sharing the model weights).
A list of global instances shared by the community is available at [explorer.localai.io](https://explorer.localai.io).
## Usage
Starting LocalAI with `--p2p` generates a shared token for connecting multiple instances, and that's all you need to create AI clusters, eliminating the need for intricate network setups.
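A minimal sketch of that flow (only the `--p2p` flag is taken from this page; see the distributed-inference documentation for the exact commands used to join additional nodes):

```bash
# Start a LocalAI instance with p2p enabled.
# A shared token is generated and printed in the startup logs.
local-ai run --p2p

# Additional instances or workers then join the cluster by reusing that token
# (the exact worker/federated commands are covered in the distributed-inference docs).
```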

@ -18,14 +18,45 @@ To access the WebUI with an API_KEY, browser extensions such as [Requestly](http
{{% /alert %}}
## Using the Bash Installer
## Quickstart
Install LocalAI easily using the bash installer with the following command:
```sh
### Using the Bash Installer
```bash
curl https://localai.io/install.sh | sh
```
### Run with Docker:
```bash
# CPU only image:
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu
# Nvidia GPU:
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12
# CPU and GPU image (bigger size):
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
# AIO images (it will pre-download a set of models ready for use, see https://localai.io/basics/container/)
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
```
### Load models:
```bash
# From the model gallery (see available models with `local-ai models list`, in the WebUI from the model tab, or visiting https://models.localai.io)
local-ai run llama-3.2-1b-instruct:q4_k_m
# Start LocalAI with the phi-2 model directly from huggingface
local-ai run huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
# Install and run a model from the Ollama OCI registry
local-ai run ollama://gemma:2b
# Run a model from a configuration file
local-ai run https://gist.githubusercontent.com/.../phi-2.yaml
# Install and run a model from a standard OCI registry (e.g., Docker Hub)
local-ai run oci://localai/phi-2:latest
```
For a full list of options, refer to the [Installer Options]({{% relref "docs/advanced/installer" %}}) documentation.
Binaries can also be [manually downloaded]({{% relref "docs/reference/binaries" %}}).
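After any of the installation methods above, the API listens on port 8080 by default. A quick sanity check, assuming the default port was not changed:

```bash
# List the models currently available on the LocalAI instance
# via the OpenAI-compatible models endpoint.
curl http://localhost:8080/v1/models
```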

@ -1,4 +1,3 @@
+++
title = "Overview"
weight = 1
@ -7,162 +6,96 @@ description = "What is LocalAI?"
tags = ["Beginners"]
categories = [""]
author = "Ettore Di Giacinto"
# This allows to overwrite the landing page
url = '/'
icon = "info"
+++
<p align="center">
<a href="https://localai.io"><img width=512 src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"></a>
</p >
# Welcome to LocalAI
<p align="center">
<a href="https://github.com/go-skynet/LocalAI/fork" target="blank">
<img src="https://img.shields.io/github/forks/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI forks"/>
</a>
<a href="https://github.com/go-skynet/LocalAI/stargazers" target="blank">
<img src="https://img.shields.io/github/stars/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI stars"/>
</a>
<a href="https://github.com/go-skynet/LocalAI/pulls" target="blank">
<img src="https://img.shields.io/github/issues-pr/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI pull-requests"/>
</a>
<a href='https://github.com/go-skynet/LocalAI/releases'>
<img src='https://img.shields.io/github/release/go-skynet/LocalAI?&label=Latest&style=for-the-badge'>
</a>
</p>
LocalAI is your complete AI stack for running AI models locally. It's designed to be simple, efficient, and accessible, providing a drop-in replacement for OpenAI's API while keeping your data private and secure.
<p align="center">
<a href="https://hub.docker.com/r/localai/localai" target="blank">
<img src="https://img.shields.io/badge/dockerhub-images-important.svg?logo=Docker" alt="LocalAI Docker hub"/>
</a>
<a href="https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest" target="blank">
<img src="https://img.shields.io/badge/quay.io-images-important.svg?" alt="LocalAI Quay.io"/>
</a>
</p>
## Why LocalAI?
<p align="center">
<a href="https://trendshift.io/repositories/5539" target="_blank"><img src="https://trendshift.io/api/badge/repositories/5539" alt="mudler%2FLocalAI | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
</p>
In today's AI landscape, privacy, control, and flexibility are paramount. LocalAI addresses these needs by:
<p align="center">
<a href="https://twitter.com/LocalAI_API" target="blank">
<img src="https://img.shields.io/twitter/follow/LocalAI_API?label=Follow: LocalAI_API&style=social" alt="Follow LocalAI_API"/>
</a>
<a href="https://discord.gg/uJAeKSAGDy" target="blank">
<img src="https://dcbadge.vercel.app/api/server/uJAeKSAGDy?style=flat-square&theme=default-inverted" alt="Join LocalAI Discord Community"/>
</a>
</p>
- **Privacy First**: Your data never leaves your machine
- **Complete Control**: Run models on your terms, with your hardware
- **Open Source**: MIT licensed and community-driven
- **Flexible Deployment**: From laptops to servers, with or without GPUs
- **Extensible**: Add new models and features as needed
> 💡 Get help - [❓FAQ](https://localai.io/faq/) [💭Discussions](https://github.com/go-skynet/LocalAI/discussions) [💭Discord](https://discord.gg/uJAeKSAGDy)
>
> [💻 Quickstart](https://localai.io/basics/getting_started/) [🖼️ Models](https://models.localai.io/) [🚀 Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap) [🥽 Demo](https://demo.localai.io) [🌍 Explorer](https://explorer.localai.io) [🛫 Examples](https://github.com/go-skynet/LocalAI/tree/master/examples/)
## Core Components
LocalAI is more than just a single tool - it's a complete ecosystem:
**LocalAI** is the free, Open Source OpenAI alternative. LocalAI act as a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images, audio (and not only) locally or on-prem with consumer grade hardware, supporting multiple model families and architectures. Does not require GPU. It is created and maintained by [Ettore Di Giacinto](https://github.com/mudler).
1. **[LocalAI Core](https://github.com/mudler/LocalAI)**
- OpenAI-compatible API
- Multiple model support (LLMs, image, audio)
- No GPU required
- Fast inference with native bindings
- [Github repository](https://github.com/mudler/LocalAI)
2. **[LocalAGI](https://github.com/mudler/LocalAGI)**
- Autonomous AI agents
- No coding required
- WebUI and REST API support
- Extensible agent framework
- [Github repository](https://github.com/mudler/LocalAGI)
## Start LocalAI
3. **[LocalRecall](https://github.com/mudler/LocalRecall)**
- Semantic search
- Memory management
- Vector database
- Perfect for AI applications
- [Github repository](https://github.com/mudler/LocalRecall)
Start the image with Docker to have a functional clone of OpenAI! 🚀:
## Getting Started
```bash
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
# Do you have a Nvidia GPUs? Use this instead
# CUDA 11
# docker run -p 8080:8080 --gpus all --name local-ai -ti localai/localai:latest-aio-gpu-nvidia-cuda-11
# CUDA 12
# docker run -p 8080:8080 --gpus all --name local-ai -ti localai/localai:latest-aio-gpu-nvidia-cuda-12
```
Or just use the bash installer:
The fastest way to get started is with our one-line installer:
```bash
curl https://localai.io/install.sh | sh
```
See the [💻 Quickstart](https://localai.io/basics/getting_started/) for all the options and way you can run LocalAI!
Or use Docker for a quick start:
## What is LocalAI?
```bash
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
```
In a nutshell:
For more detailed installation options and configurations, see our [Getting Started guide](/basics/getting_started/).
- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access is required either
- Optional, GPU Acceleration is available. See also the [build section](https://localai.io/basics/build/index.html).
- Supports multiple models
- 🏃 Once loaded the first time, it keep models loaded in memory for faster inference
- ⚡ Doesn't shell-out, but uses bindings for a faster inference and better performance.
## Key Features
LocalAI is focused on making the AI accessible to anyone. Any contribution, feedback and PR is welcome!
- **Text Generation**: Run various LLMs locally
- **Image Generation**: Create images with stable diffusion
- **Audio Processing**: Text-to-speech and speech-to-text
- **Vision API**: Image understanding and analysis
- **Embeddings**: Vector database support (see the example request after this list)
- **Functions**: OpenAI-compatible function calling
- **P2P**: Distributed inference capabilities
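For instance, embeddings are served through the OpenAI-compatible `/v1/embeddings` endpoint. A minimal sketch, assuming an embedding-capable model is already installed; the model name below is a placeholder, not a default shipped with LocalAI:

```bash
# Request embeddings from the local OpenAI-compatible endpoint (default port 8080).
# "my-embedding-model" is a placeholder - use the name of an embedding model you installed.
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-embedding-model",
    "input": "LocalAI keeps your data on your own hardware."
  }'
```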
Note that this started just as a fun weekend project by [mudler](https://github.com/mudler) in order to try to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
## Community and Support
### 🚀 Features
LocalAI is a community-driven project. You can:
- 📖 [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `gpt4all.cpp`, ... [:book: and more](https://localai.io/model-compatibility/index.html#model-compatibility-table))
- 🗣 [Text to Audio](https://localai.io/features/text-to-audio/)
- 🔈 [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
- 🎨 [Image generation with stable diffusion](https://localai.io/features/image-generation)
- 🔥 [OpenAI functions](https://localai.io/features/openai-functions/) 🆕
- 🧠 [Embeddings generation for vector databases](https://localai.io/features/embeddings/)
- ✍️ [Constrained grammars](https://localai.io/features/constrained_grammars/)
- 🖼️ [Download Models directly from Huggingface ](https://localai.io/models/)
- 🥽 [Vision API](https://localai.io/features/gpt-vision/)
- 💾 [Stores](https://localai.io/stores)
- 📈 [Reranker](https://localai.io/features/reranker/)
- 🆕🖧 [P2P Inferencing](https://localai.io/features/distribute/)
- Join our [Discord community](https://discord.gg/uJAeKSAGDy)
- Check out our [GitHub repository](https://github.com/mudler/LocalAI)
- Contribute to the project
- Share your use cases and examples
## Contribute and help
## Next Steps
To help the project you can:
Ready to dive in? Here are some recommended next steps:
- If you have technological skills and want to contribute to development, have a look at the open issues. If you are new you can have a look at the [good-first-issue](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) and [help-wanted](https://github.com/go-skynet/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) labels.
1. [Install LocalAI](/basics/getting_started/)
2. [Explore available models](https://models.localai.io)
3. [Model compatibility](/model-compatibility/)
4. [Try out examples](https://github.com/mudler/LocalAI-examples)
5. [Join the community](https://discord.gg/uJAeKSAGDy)
6. [Check the LocalAI Github repository](https://github.com/mudler/LocalAI)
7. [Check the LocalAGI Github repository](https://github.com/mudler/LocalAGI)
- If you don't have technological skills you can still help improving documentation or [add examples](https://github.com/go-skynet/LocalAI/tree/master/examples) or share your user-stories with our community, any help and contribution is welcome!
## 🌟 Star history
## License
[![LocalAI Star history Chart](https://api.star-history.com/svg?repos=mudler/LocalAI&type=Date)](https://star-history.com/#mudler/LocalAI&Date)
## ❤️ Sponsors
> Do you find LocalAI useful?
Support the project by becoming [a backer or sponsor](https://github.com/sponsors/mudler). Your logo will show up here with a link to your website.
A huge thank you to our generous sponsors who support this project covering CI expenses, and our [Sponsor list](https://github.com/sponsors/mudler):
<p align="center">
<a href="https://www.spectrocloud.com/" target="blank">
<img width=200 src="https://github.com/user-attachments/assets/72eab1dd-8b93-4fc0-9ade-84db49f24962">
</a>
<a href="https://www.premai.io/" target="blank">
<img width=200 src="https://github.com/mudler/LocalAI/assets/2420543/42e4ca83-661e-4f79-8e46-ae43689683d6"> <br>
</a>
</p>
## 📖 License
LocalAI is a community-driven project created by [Ettore Di Giacinto](https://github.com/mudler/).
MIT - Author Ettore Di Giacinto
## 🙇 Acknowledgements
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp
- https://github.com/rhasspy/piper
## 🤗 Contributors
This is a community project, a special thanks to our contributors! 🤗
<a href="https://github.com/go-skynet/LocalAI/graphs/contributors">
<img src="https://contrib.rocks/image?repo=go-skynet/LocalAI" />
</a>
LocalAI is MIT licensed, created and maintained by [Ettore Di Giacinto](https://github.com/mudler).

@ -2,38 +2,212 @@
# Hero
hero:
enable: false
enable: true
weight: 10
template: hero
backgroundImage:
path: "images/templates/hero"
filename:
desktop: "gradient-desktop.webp"
mobile: "gradient-mobile.webp"
badge:
text: "⭐ 31.7k+ stars on GitHub!"
color: primary
pill: false
soft: true
titleLogo:
path: "images/logos"
filename: "logo.png"
alt: "LocalAI Logo"
height: 540px
title: ""
subtitle: |
**The free alternative to OpenAI and Anthropic. Your all-in-one, complete AI stack** - Run powerful language models, autonomous agents, and document intelligence **locally** on your hardware.
**No cloud, no limits, no compromise.**
image:
path: "images"
filename: "localai_screenshot.png"
alt: "LocalAI Screenshot"
boxShadow: true
rounded: true
ctaButton:
icon: rocket_launch
btnText: "Get Started"
url: "/basics/getting_started/"
cta2Button:
icon: code
btnText: "View on GitHub"
url: "https://github.com/mudler/LocalAI"
info: |
**Drop-in replacement for the OpenAI API** - a modular suite of tools that work seamlessly together or independently.
Start with **[LocalAI](https://localai.io)**'s OpenAI-compatible API, extend with **[LocalAGI](https://github.com/mudler/LocalAGI)**'s autonomous agents, and enhance with **[LocalRecall](https://github.com/mudler/LocalRecall)**'s semantic search - all running locally on your hardware.
**Open Source** MIT Licensed.
# Feature Grid
featureGrid:
enable: false
enable: true
weight: 20
template: feature grid
title: Why choose LocalAI?
subtitle: |
**OpenAI API Compatible** - Run AI models locally with our modular ecosystem. From language models to autonomous agents and semantic search, build your complete AI stack without the cloud.
items:
- title: LLM Inferencing
icon: memory_alt
description: LocalAI is a free, **Open Source** OpenAI alternative. Run **LLMs**, generate **images**, **audio** and more **locally** with consumer-grade hardware.
ctaLink:
text: learn more
url: /basics/getting_started/
- title: Agentic-first
icon: smart_toy
description: |
Extend LocalAI with LocalAGI, an autonomous AI agent platform that runs locally, no coding required.
Build and deploy autonomous agents with ease. Interact with REST APIs or use the WebUI.
ctaLink:
text: learn more
url: https://github.com/mudler/LocalAGI
- title: Memory and Knowledge base
icon: psychology
description:
Extend LocalAI with LocalRecall, a local REST API for semantic search and memory management. Perfect for AI applications.
ctaLink:
text: learn more
url: https://github.com/mudler/LocalRecall
- title: OpenAI Compatible
icon: api
description: Drop-in replacement for OpenAI API. Compatible with existing applications and libraries.
ctaLink:
text: learn more
url: /basics/getting_started/
- title: No GPU Required
icon: memory
description: Run on consumer grade hardware. No need for expensive GPUs or cloud services.
ctaLink:
text: learn more
url: /basics/getting_started/
- title: Multiple Models
icon: hub
description: |
Support for various model families including LLMs, image generation, and audio models.
Supports multiple backends for inferencing, including vLLM, llama.cpp, and more.
You can switch between them as needed and install them from the Web interface or the CLI.
ctaLink:
text: learn more
url: /model-compatibility
- title: Privacy Focused
icon: security
description: Keep your data local. No data leaves your machine, ensuring complete privacy.
ctaLink:
text: learn more
url: /basics/container/
- title: Easy Setup
icon: settings
description: Simple installation and configuration. Get started in minutes with binaries, Docker, Podman, Kubernetes, or a local installation.
ctaLink:
text: learn more
url: /basics/getting_started/
- title: Community Driven
icon: groups
description: Active community support and regular updates. Contribute and help shape the future of LocalAI.
ctaLink:
text: learn more
url: https://github.com/mudler/LocalAI
- title: Extensible
icon: extension
description: Easy to extend and customize. Add new models and features as needed.
ctaLink:
text: learn more
url: /docs/integrations/
- title: Peer 2 Peer
icon: hub
description: |
LocalAI is designed for decentralized LLM inference, powered by a peer-to-peer system based on libp2p.
It can be used on a local or remote network and is compatible with any LLM model.
It works either in federated mode or by splitting model weights across workers.
ctaLink:
text: learn more
url: /features/distribute/
- title: Open Source
icon: code
description: MIT licensed. Free to use, modify, and distribute. Community contributions welcome.
ctaLink:
text: learn more
url: https://github.com/mudler/LocalAI
imageText:
enable: true
weight: 25
template: image text
title: LocalAI
subtitle: The Free, Open Source OpenAI Alternative
title: Run AI models locally with ease
subtitle: |
LocalAI makes it simple to run various AI models on your own hardware. From text generation to image creation, autonomous agents to semantic search - all orchestrated through a unified API.
list:
- text: Optimized, fast inference
icon: speed
- text: OpenAI API compatibility
icon: api
- text: Comprehensive support for many model architectures
icon: area_chart
- text: Multiple model support
icon: hub
- text: Easy to deploy with Docker
icon: accessibility
- text: Image understanding
icon: image
- text: Image generation
icon: image
- text: Audio generation
icon: music_note
- text: Voice activity detection
icon: mic
- text: Speech recognition
icon: mic
- text: Video generation
icon: movie
- text: Privacy focused
icon: security
- text: Autonomous agents with [LocalAGI](https://github.com/mudler/LocalAGI)
icon: smart_toy
- text: Semantic search with [LocalRecall](https://github.com/mudler/LocalRecall)
icon: psychology
- text: Agent orchestration
icon: hub
image:
path: "images/logos"
filename: "logo.png"
alt: "LocalAI logo" # Optional but recommended
path: "images"
filename: "imagen.png"
alt: "LocalAI Image generation"
imgOrder:
desktop: 2
@ -41,10 +215,62 @@ imageText:
ctaButton:
text: Learn more
url: "/docs/"
url: "/basics/getting_started/"
# Image compare
imageCompare:
enable: false
weight: 30
template: image compare
title: LocalAI in Action
subtitle: See how LocalAI can transform your local AI experience with various models and capabilities.
items:
- title: Text Generation
config: {
startingPoint: 50,
addCircle: true,
addCircleBlur: false,
showLabels: true,
labelOptions: {
before: 'Dark',
after: 'Light',
onHover: false
}
}
imagePath: "images/screenshots"
imageBefore: "text_generation_input.webp"
imageAfter: "text_generation_output.webp"
- title: Image Generation
config: {
startingPoint: 50,
addCircle: true,
addCircleBlur: true,
showLabels: true,
labelOptions: {
before: 'Prompt',
after: 'Result',
onHover: true
}
}
imagePath: "images/screenshots"
imageBefore: "imagen_before.webp"
imageAfter: "imagen_after.webp"
- title: Audio Generation
config: {
startingPoint: 50,
addCircle: true,
addCircleBlur: false,
showLabels: true,
labelOptions: {
before: 'Text',
after: 'Audio',
onHover: false
}
}
imagePath: "images/screenshots"
imageBefore: "audio_generation_text.webp"
imageAfter: "audio_generation_waveform.webp"

@ -82,7 +82,7 @@
</span>
</button>
{{ end -}}
{{ if .Site.IsMultiLingual }}
{{ if hugo.IsMultilingual }}
<div class="dropdown">
<button class="btn btn-link btn-default dropdown-toggle ps-2" type="button" data-bs-toggle="dropdown" aria-expanded="false">
{{ site.Language.Lang | upper }}

@ -18,10 +18,10 @@
<!-- Custom CSS -->
{{- $options := dict "enableSourceMap" true }}
{{- if hugo.IsProduction}}
{{- $options := dict "enableSourceMap" false "outputStyle" "compressed" }}
{{- $options = dict "enableSourceMap" false "outputStyle" "compressed" }}
{{- end }}
{{- $style := resources.Get "/scss/style.scss" }}
{{- $style = $style | resources.ExecuteAsTemplate "/scss/style.scss" . | resources.ToCSS $options }}
{{- $style = $style | resources.ExecuteAsTemplate "/scss/style.scss" . | css.Sass $options }}
{{- if hugo.IsProduction }}
{{- $style = $style | minify | fingerprint "sha384" }}
{{- end -}}
@ -39,7 +39,7 @@
<!-- Image Compare Viewer -->
{{ if ($.Scratch.Get "image_compare_enabled") }}
{{ $imagecompare := resources.Get "js/image-compare-viewer.min.js" }}
{{- if not .Site.IsServer }}
{{- if not hugo.IsDevelopment }}
{{- $js := (slice $imagecompare) | resources.Concat "/js/image-compare.js" | minify | fingerprint "sha384" }}
<script type="text/javascript" src="{{ $js.Permalink }}" integrity="{{ $js.Data.Integrity }}"></script>
{{- else }}
@ -48,14 +48,14 @@
{{- end }}
{{- end }}
<!-- Plausible Analytics Config -->
{{- if not .Site.IsServer }}
{{- if not hugo.IsDevelopment }}
{{ if and (.Site.Params.plausible.scriptURL) (.Site.Params.plausible.dataDomain) -}}
{{- partialCached "head/plausible" . }}
{{- end -}}
{{- end -}}
<!-- Google Analytics v4 Config -->
{{- if not .Site.IsServer }}
{{- if .Site.GoogleAnalytics }}
{{- if not hugo.IsDevelopment }}
{{- if .Site.Params.analytics.google }}
{{- template "_internal/google_analytics.html" . -}}
{{- end -}}
{{- end -}}

@ -0,0 +1,57 @@
<!-- Navbar Start -->
<header id="topnav">
<div class="container d-flex justify-content-between align-items-center">
<!-- Logo container-->
<a class="logo" aria-label="Home" href='{{ relLangURL "" }}'>
</a>
<!-- End Logo container-->
<div class="d-flex align-items-center">
<div id="navigation">
<!-- Navigation Menu -->
<ul class="navigation-menu nav-right">
{{- range .Site.Menus.primary }}
<li><a href="{{ relLangURL .URL }}">{{ .Name }}</a></li>
{{ end }}
</ul><!--end navigation menu-->
</div><!--end navigation-->
<!-- Social Links Start -->
{{ with $.Scratch.Get "social_list" }}
<ul class="social-link d-flex list-inline mb-0">
{{ range . }}
{{ $path := printf "images/social/%s.%s" . "svg" }}
<li class="list-inline-item mb-0">
<a href="{{ if eq . `rss` }} {{ `index.xml` | absURL }} {{ else if eq . `bluesky` }} https://bsky.app/profile/{{ index site.Params.social . }} {{ else }} https://{{ . }}.com/{{ index site.Params.social . }} {{ end }}" alt="{{ . }}" rel="noopener noreferrer" target="_blank">
<div class="btn btn-icon btn-landing border-0">
{{ with resources.Get $path }}
{{ .Content | safeHTML }}
{{ end }}
</div>
</a>
</li>
{{ end }}
</ul>
{{ end }}
<!-- Social Links End -->
<div class="menu-extras ms-3 me-2">
<div class="menu-item">
<!-- Mobile menu toggle-->
<button class="navbar-toggle btn btn-icon btn-soft-light" id="isToggle" aria-label="toggleMenu" onclick="toggleMenu()">
<div class="lines">
<span></span>
<span></span>
<span></span>
</div>
</button>
<!-- End mobile menu toggle-->
</div>
</div>
</div>
</div><!--end container-->
</header><!--end header-->
<!-- Navbar End -->

@ -1 +1 @@
<a href="https://localai.io"><img src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"></a>
<a href="https://localai.io"><img src="https://raw.githubusercontent.com/mudler/LocalAI/refs/heads/master/core/http/static/logo.png"></a>

@ -1,4 +1,4 @@
[build]
[build.environment]
HUGO_VERSION = "0.121.2"
HUGO_VERSION = "0.146.3"
GO_VERSION = "1.22.2"

Several existing binary image assets under docs/ replaced with updated versions.

docs/static/favicon.svg added (new vendored file, 171 lines; diff suppressed).

docs/static/site.webmanifest added (new vendored file):

@ -0,0 +1 @@
{"name":"","short_name":"","icons":[{"src":"/android-chrome-192x192.png","sizes":"192x192","type":"image/png"},{"src":"/android-chrome-512x512.png","sizes":"512x512","type":"image/png"}],"theme_color":"#ffffff","background_color":"#ffffff","display":"standalone"}

@ -1 +1 @@
Subproject commit f5785a2399ca09e7fb4e7e3d69b397f85df42a24
Subproject commit 975da91e839cfdb5c20fb66961468e77b8a9f8fd