🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: text, audio, video and image generation, voice cloning, and distributed P2P inference.



LocalAI


💡 Get help - FAQ 💭 Discussions 💬 Discord 📖 Documentation website

💻 Quickstart 🖼️ Models 🚀 Roadmap 🥽 Demo 🌍 Explorer 🛫 Examples


LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI (and ElevenLabs, Anthropic, ...) API specifications for local AI inferencing. It lets you run LLMs and generate images, audio, and more, locally or on-prem on consumer-grade hardware, supporting multiple model families. No GPU is required. It is created and maintained by Ettore Di Giacinto.


Run the installer script:

curl https://localai.io/install.sh | sh

Or run with docker:

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
# Alternative images:
# - if you have an Nvidia GPU:
# docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-aio-gpu-nvidia-cuda-12
# - without preconfigured models
# docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
# - without preconfigured models for Nvidia GPUs
# docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12 
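
Once the container is up, LocalAI speaks the OpenAI wire format on port 8080, so any OpenAI-compatible client can target it. A minimal sketch using only the Python standard library (the model name used here is an assumption for illustration; use whatever model your instance actually has installed):

```python
import json
import urllib.request

# Default base URL for a local instance started with the docker commands above.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model, messages):
    """Build an OpenAI-style chat completion payload."""
    return {"model": model, "messages": messages}

def chat(model, prompt):
    """Send a single-turn chat request to the /v1/chat/completions endpoint."""
    payload = build_chat_request(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage (requires a running LocalAI instance with a model loaded):
#   print(chat("gpt-4", "Hello!"))
```

Because the request and response shapes follow the OpenAI specification, existing SDKs also work by pointing their base URL at the local instance.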

💻 Getting started

🔥🔥 Hot topics / Roadmap

Roadmap

Hot topics (looking for contributors):

If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22

🚀 Features

💻 Usage

Check out the Getting started section in our documentation.

🔗 Community and integrations

Build and deploy custom containers:

WebUIs:

Model galleries

Other:

🔗 Resources

📖 🎥 Media, Blogs, Social

Citation

If you utilize this repository or its data in a downstream project, please consider citing it with:

@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}

❤️ Sponsors

Do you find LocalAI useful?

Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.

A huge thank you to our generous sponsors, who support this project and cover its CI expenses, and to everyone on our Sponsor list:


🌟 Star history

LocalAI Star history Chart

📖 License

LocalAI is a community-driven project created by Ettore Di Giacinto.

MIT - Author Ettore Di Giacinto mudler@localai.io

🙇 Acknowledgements

LocalAI couldn't have been built without the help of great software already available from the community. Thank you!

🤗 Contributors

This is a community project, a special thanks to our contributors! 🤗