LocalAI functions
An example of using LocalAI functions; see the OpenAI blog post introducing function calling for background.
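For orientation, the sketch below shows what an OpenAI-style function call against a running LocalAI endpoint can look like. It is not the example's own `functions-openai.py` script; it assumes the `openai` Python package (version 1.0 or later), LocalAI listening on its default port 8080, and a preloaded model exposed under the alias `gpt-3.5-turbo`; adjust these to match your `.env`.

```python
# A minimal, self-contained sketch of OpenAI-style function calling pointed
# at LocalAI. Assumptions (not taken from this example): the `openai` Python
# package (>=1.0) is installed, LocalAI serves its OpenAI-compatible API on
# localhost:8080, and a function-calling capable model is exposed under the
# alias "gpt-3.5-turbo". The API key is a placeholder.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-local")

# Describe a hypothetical function the model is allowed to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and state, e.g. San Francisco, CA",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like in Boston?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model chose to call the function, its name and JSON-encoded
# arguments come back instead of a plain text answer.
print(response.choices[0].message.tool_calls)
```

The same request can also be made with plain HTTP against the `/v1/chat/completions` endpoint; the Python client is used here only for brevity.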
Run
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI
cd LocalAI/examples/functions
cp -rfv .env.example .env
# Edit the .env file to set a different model by editing `PRELOAD_MODELS`.
vim .env
docker-compose run --rm functions
Note: the example automatically downloads the `openllama` model, as it is available under a permissive license.
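Once the model download has finished and the container is serving requests, a quick sanity check is to list the models the endpoint exposes. A minimal sketch, assuming the same `openai` package and local endpoint as above:

```python
# Lists the models currently exposed by LocalAI's OpenAI-compatible API.
# Assumes LocalAI is reachable on localhost:8080; the API key is a
# placeholder, and whether a key is enforced depends on your configuration.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-local")

for model in client.models.list().data:
    print(model.id)
```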