# k8sgpt example

This example shows how to use LocalAI with [k8sgpt](https://k8sgpt.ai).
## Create the cluster locally with Kind (optional)

If you want to test this locally without a remote Kubernetes cluster, you can use [kind](https://kind.sigs.k8s.io/).

Install kind and create a cluster:

```bash
kind create cluster
```
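Optionally, verify the cluster is reachable before installing anything. kind names its kubeconfig context `kind-<cluster-name>`, so with the default cluster name this is:

```bash
# Confirm the API server answers and the node is Ready
kubectl cluster-info --context kind-kind
kubectl get nodes
```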
## Setup LocalAI

We will use [helm](https://helm.sh/docs/intro/install/):

```bash
helm repo add go-skynet https://go-skynet.github.io/helm-charts/
helm repo update

# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/k8sgpt

# Modify the preload_models setting in values.yaml with the models you want to install.
# CHANGE the URL to a model in huggingface (see the illustrative sketch below).
helm install local-ai go-skynet/local-ai --create-namespace --namespace local-ai --values values.yaml
```
## Setup K8sGPT

```bash
# Install the k8sgpt-operator
helm repo add k8sgpt https://charts.k8sgpt.ai/
helm repo update
helm install release k8sgpt/k8sgpt-operator -n k8sgpt-operator-system --create-namespace
```
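Before applying the configuration, you can check that the operator pod is running:

```bash
kubectl get pods -n k8sgpt-operator-system
```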
Apply the k8sgpt-operator configuration:

```bash
kubectl apply -f - << EOF
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-local-ai
  namespace: default
spec:
  backend: localai
  baseUrl: http://local-ai.local-ai.svc.cluster.local:8080/v1
  model: gpt-3.5-turbo
  noCache: false
  version: v0.3.0
  enableAI: true
EOF
```
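A couple of notes on the fields above: `baseUrl` is the in-cluster DNS name of the LocalAI service installed earlier (`<service>.<namespace>.svc.cluster.local` on port 8080), and `model` should match a model name LocalAI actually serves, here the preloaded `gpt-3.5-turbo`. To confirm the operator picked the resource up, a couple of generic kubectl queries (nothing version-specific) help:

```bash
# The K8sGPT custom resource we just created
kubectl get k8sgpt -n default
# The operator usually spawns a k8sgpt workload; look for it across namespaces
kubectl get deployments -A | grep -i k8sgpt
```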
## Test

Apply a broken pod:

```bash
kubectl apply -f broken-pod.yaml
```
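`broken-pod.yaml` in this directory deploys a pod that is intentionally misconfigured so the analyzer has something to report. The operator publishes its findings as `Result` custom resources; the namespace they land in can vary with the operator version, so a broad query is the safest way to look (these are plain kubectl commands, not example-specific tooling):

```bash
# Wait a bit for an analysis cycle, then list the findings
kubectl get results -A
# Inspect the details of the findings
kubectl get results -A -o yaml
```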