# LocalAI functions

Example of using LocalAI functions; see the OpenAI blog post on function calling.
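To make the idea concrete, here is a minimal sketch of the kind of function-calling request this example exercises, written against LocalAI's OpenAI-compatible API with the plain `openai` Python client rather than the bundled `functions-openai.py` script. The endpoint URL, the `gpt-3.5-turbo` alias, and the weather function (borrowed from the OpenAI blog post) are assumptions for illustration; adjust them to match your LocalAI setup.

```python
# Minimal sketch of OpenAI-style function calling against a LocalAI endpoint.
# Assumptions: LocalAI listens on http://localhost:8080/v1 and a model is
# preloaded under the alias "gpt-3.5-turbo"; adjust both for your setup.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # alias assumed to map to the preloaded local model
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    tools=tools,
)

# If the model decides to call the function, the arguments come back as JSON.
message = response.choices[0].message
if message.tool_calls:
    print(message.tool_calls[0].function.name)
    print(message.tool_calls[0].function.arguments)
else:
    print(message.content)
```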
## Run
```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/functions

cp -rfv .env.example .env

# Edit the .env file to set a different model by editing `PRELOAD_MODELS` (see the sketch below).
vim .env

docker-compose run --rm functions
```
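For reference, `PRELOAD_MODELS` takes a JSON array of model definitions. The snippet below is an illustration only, not the shipped `.env.example`: the gallery URL and the alias are assumptions, so check the file's actual defaults before changing anything.

```
# Illustrative sketch only; the gallery URL and alias name are assumptions.
# Keep whatever else .env.example already defines.
PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/openllama_3b.yaml", "name": "gpt-3.5-turbo"}]
```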
Note: The example automatically downloads the `openllama` model as it is under a permissive license.