LocalAI
💡 Get help - ❓ FAQ 💭 Discussions 💬 Discord 📖 Documentation website
LocalAI is a drop-in replacement REST API compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs (and not only LLMs) locally or on-prem with consumer-grade hardware, supporting multiple model families compatible with the ggml format. It does not require a GPU.
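As a quick illustration of the drop-in compatibility, here is a minimal sketch (not an official example) that lists the installed models through the OpenAI-style `/v1/models` endpoint. It assumes a LocalAI instance is already running on `localhost:8080` (adjust the address for your setup):

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Assumes LocalAI is listening on localhost:8080; change the address
	// if your instance is exposed elsewhere.
	resp, err := http.Get("http://localhost:8080/v1/models")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// The response follows the OpenAI "list models" JSON shape.
	fmt.Println(string(body))
}
```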
In a nutshell:
- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access is required either
- Optional: GPU acceleration is available for llama.cpp-compatible LLMs. See also the build section.
- Supports multiple models
- 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
LocalAI was created by Ettore Di Giacinto and is a community-driven project, focused on making AI accessible to anyone. Contributions, feedback and PRs are welcome!
Note that this started just as a fun weekend project to create the necessary pieces for a full AI assistant like ChatGPT: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
🚀 Features
- 📖 Text generation with GPTs (llama.cpp, gpt4all.cpp, ... and more)
- 🗣 Text to Audio
- 🔈 Audio to Text (audio transcription with whisper.cpp)
- 🎨 Image generation with stable diffusion
- 🔥 OpenAI functions 🆕
- 🧠 Embeddings generation for vector databases (see the sketch after this list)
- ✍️ Constrained grammars
- 🖼️ Download models directly from Hugging Face
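The embeddings feature is exposed through an OpenAI-style endpoint. Below is a minimal sketch, assuming LocalAI runs on `localhost:8080` and an embedding-capable model is configured under the hypothetical name `text-embedding-ada-002` (both the address and the model name are assumptions; use whatever model name your setup defines):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Hypothetical model name and address; replace them with the model
	// configured in your models directory and your instance's address.
	payload := []byte(`{"model": "text-embedding-ada-002", "input": "A long time ago in a galaxy far, far away"}`)

	resp, err := http.Post("http://localhost:8080/v1/embeddings", "application/json", bytes.NewBuffer(payload))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// The response mirrors OpenAI's embeddings format: a "data" array whose
	// entries carry an "embedding" vector of floats.
	fmt.Println(string(body))
}
```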
🔥🔥 Hot topics / Roadmap
- Support for embeddings
- Support for audio transcription with https://github.com/ggerganov/whisper.cpp
- Support for text-to-audio
- GPU/CUDA support (https://github.com/go-skynet/LocalAI/issues/69)
- Enable automatic downloading of models from a curated gallery
- Enable automatic downloading of models from HuggingFace
- Upstream our golang bindings to llama.cpp (https://github.com/ggerganov/llama.cpp/issues/351)
- Enable gallery management directly from the webui.
- 🔥 OpenAI functions: https://github.com/go-skynet/LocalAI/issues/588
- 🔥 GPTQ support: https://github.com/go-skynet/LocalAI/issues/796
📖 🎥 Media, Blogs, Social
- Create a slackbot for teams and OSS projects that answer to documentation
- LocalAI meets k8sgpt
- Question Answering on Documents locally with LangChain, LocalAI, Chroma, and GPT4All
- Tutorial to use k8sgpt with LocalAI
💻 Usage
Check out the Getting started section in our documentation.
💡 Example: Use GPT4ALL-J model
See the documentation
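For a flavor of what a request looks like, here is a minimal chat-completion sketch, assuming a running instance on `localhost:8080` with the GPT4All-J model installed under the name `ggml-gpt4all-j` (the model name depends on the file placed in your models directory, so treat it as an assumption):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Model name assumed to match the file in the models directory;
	// adjust the name and the address for your own setup.
	payload := []byte(`{
		"model": "ggml-gpt4all-j",
		"messages": [{"role": "user", "content": "How are you?"}],
		"temperature": 0.9
	}`)

	resp, err := http.Post("http://localhost:8080/v1/chat/completions", "application/json", bytes.NewBuffer(payload))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// The response follows the OpenAI chat-completions JSON shape.
	fmt.Println(string(body))
}
```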
🔗 Resources
❤️ Sponsors
Do you find LocalAI useful?
Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.
A huge thank you to our generous sponsors who support this project:
- Spectro Cloud - kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs!
🌟 Star history
📖 License
LocalAI is a community-driven project created by Ettore Di Giacinto.
MIT - Author Ettore Di Giacinto
🙇 Acknowledgements
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
- llama.cpp
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp
- https://github.com/rhasspy/piper
- https://github.com/cmp-nct/ggllm.cpp
🤗 Contributors
This is a community project, a special thanks to our contributors! 🤗