LocalAI functions
An example of using LocalAI functions (OpenAI-compatible function calling); see the OpenAI blog post on function calling for background. The example directory contains .env.example, docker-compose.yaml, Dockerfile, functions-openai.py, and requirements.txt.
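functions-openai.py (which pulls in langchain via requirements.txt) drives the function-calling flow against the LocalAI server started by docker-compose. As a rough illustration of the underlying OpenAI-compatible request, here is a minimal sketch; the endpoint URL, model name, and function schema are assumptions for illustration, not taken from the script:

# Minimal function-calling sketch against a LocalAI server (openai-python >= 1.0 assumed).
from openai import OpenAI

# LocalAI does not check the API key; the base URL assumes the API is exposed on localhost:8080.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

# A single illustrative function definition in the OpenAI function-calling schema.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # the name assigned to the preloaded model via PRELOAD_MODELS
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=functions,
    function_call="auto",
)

# The model should reply with a function call and JSON arguments instead of plain text.
print(response.choices[0].message.function_call)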
Run
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI
cd LocalAI/examples/functions
cp -rfv .env.example .env
# Edit the .env file to pick a different model by changing `PRELOAD_MODELS` (an example entry is shown below).
vim .env
docker-compose run --rm functions
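For reference, PRELOAD_MODELS is a JSON list of model-gallery entries that LocalAI downloads and configures on startup. A hypothetical entry (the gallery URL and model name here are illustrative and may differ from the shipped .env.example):

PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/openllama_3b.yaml", "name": "gpt-3.5-turbo"}]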
Note: The example automatically downloads the openllama model, as it is released under a permissive license.
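Once the containers are up, you can check that the model was preloaded by querying the OpenAI-compatible models endpoint (this assumes the compose file publishes the API on host port 8080):

curl http://localhost:8080/v1/models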