docs(mixtral): add mixtral example (#1449)
parent 2f7beb6744 · commit 1c286c3c2f
@@ -64,4 +64,21 @@ wget https://huggingface.co/mys/ggml_bakllava-1/resolve/main/mmproj-model-f16.gg
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
    "model": "llava",
    "messages": [{"role": "user", "content": [{"type":"text", "text": "What is in the image?"}, {"type": "image_url", "image_url": {"url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg" }}], "temperature": 0.9}]}'
```

### Mixtral

```
cp -r examples/configurations/mixtral/* models/
wget https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q2_K.gguf -O models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf
```
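
If LocalAI is not already running, one way to start it against this `models` directory is the container image. This is a minimal sketch: the image tag, port and thread count below are assumptions, so adjust them (and add GPU flags) to match your setup.

```
# Start the LocalAI API and point it at the models directory populated above
# (image tag and flag values are assumptions; adjust for your install)
docker run -p 8080:8080 -v $PWD/models:/models -ti --rm \
  quay.io/go-skynet/local-ai:latest --models-path /models --threads 11
```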
#### Test it out
```
curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
    "model": "mixtral",
    "prompt": "How fast is light?",
    "temperature": 0.1
}'
```
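
The `mixtral.yaml` added below also wires up a chat template (`mixtral-chat`), so the same model can be exercised through the chat endpoint as well. A minimal sketch, mirroring the llava request shown earlier:

```
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
    "model": "mixtral",
    "messages": [{"role": "user", "content": "How fast is light?"}],
    "temperature": 0.1
}'
```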
1 examples/configurations/mixtral/mixtral Normal file
@@ -0,0 +1 @@
[INST] {{.Input}} [/INST]
1 examples/configurations/mixtral/mixtral-chat Normal file
@@ -0,0 +1 @@
[INST] {{.Input}} [/INST]
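
Both template files are Go text/template snippets: `{{.Input}}` is substituted with the prompt (or the formatted chat history) before the text is handed to the model, so the completion request above would reach Mixtral roughly as:

```
[INST] How fast is light? [/INST]
```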
16 examples/configurations/mixtral/mixtral.yaml Executable file
@@ -0,0 +1,16 @@
context_size: 512
f16: true
threads: 11
gpu_layers: 90
name: mixtral
mmap: true
parameters:
  model: mixtral-8x7b-instruct-v0.1.Q2_K.gguf
  temperature: 0.2
  top_k: 40
  top_p: 0.95
  batch: 512
  tfz: 1.0
template:
  chat: mixtral-chat
  completion: mixtral
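
Once the configuration files are copied and the weights are downloaded, a quick sanity check is to list the models the API exposes; this assumes the server is reachable on `localhost:8080` as in the examples above.

```
# "mixtral" should show up in the returned model list
curl http://localhost:8080/v1/models
```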