chore(model gallery): add falcon3-1b-instruct (#4423)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
commit f52c6e3a31
parent 0b4bb7a562

gallery/falcon3.yaml (new file, 40 lines added)
@@ -0,0 +1,40 @@
---
name: "falcon3"

config_file: |
  mmap: true
  template:
    chat_message: |
      <|{{ .RoleName }}|>
      {{ if .FunctionCall -}}
      Function call:
      {{ else if eq .RoleName "tool" -}}
      Function response:
      {{ end -}}
      {{ if .Content -}}
      {{.Content }}
      {{ end -}}
      {{ if .FunctionCall -}}
      {{toJson .FunctionCall}}
      {{ end -}}
      {{ if eq .RoleName "assistant" }}<|endoftext|>{{ end }}
    function: |
      <|system|>
      You are a function calling AI model. You are provided with functions to execute. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools:
      {{range .Functions}}
      {'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
      {{end}}
      For each function call return a json object with function name and arguments
      {{.Input }}
      <|im_start|>assistant
    chat: |
      {{.Input }}
      <|im_start|>assistant
    completion: |
      {{.Input}}
  context_size: 4096
  f16: true
  stopwords:
    - '<|endoftext|>'
    - '<dummy32000>'
    - '</s>'
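
A note on the file above: chat_message is a plain Go text/template. Below is a minimal sketch of how it renders a single assistant turn; the Message struct and the toJson helper are illustrative stand-ins for whatever LocalAI actually wires into the template, so treat those names as assumptions.

package main

import (
	"encoding/json"
	"os"
	"text/template"
)

// chatMessage copies the chat_message template from the YAML above.
const chatMessage = `<|{{ .RoleName }}|>
{{ if .FunctionCall -}}
Function call:
{{ else if eq .RoleName "tool" -}}
Function response:
{{ end -}}
{{ if .Content -}}
{{.Content }}
{{ end -}}
{{ if .FunctionCall -}}
{{toJson .FunctionCall}}
{{ end -}}
{{ if eq .RoleName "assistant" }}<|endoftext|>{{ end }}`

// Message is a hypothetical stand-in for the data LocalAI passes to the template.
type Message struct {
	RoleName     string
	Content      string
	FunctionCall map[string]any
}

func main() {
	tmpl := template.Must(template.New("chat_message").Funcs(template.FuncMap{
		// Assumed behaviour of the toJson helper: marshal the value to JSON.
		"toJson": func(v any) string {
			b, _ := json.Marshal(v)
			return string(b)
		},
	}).Parse(chatMessage))

	// Render an assistant turn with plain text content and no function call.
	if err := tmpl.Execute(os.Stdout, Message{RoleName: "assistant", Content: "Hello!"}); err != nil {
		panic(err)
	}
}

With no function call set, this prints "<|assistant|>", the content, and the trailing "<|endoftext|>" token, which is also the first entry in the stopwords list above.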
@@ -1,4 +1,29 @@
---
- &falcon3
  name: "falcon3-1b-instruct"
  url: "github:mudler/LocalAI/gallery/falcon3.yaml@master"
  icon: https://huggingface.co/datasets/tiiuae/documentation-images/resolve/main/general/falco3-logo.png
  urls:
    - https://huggingface.co/tiiuae/Falcon3-1B-Instruct
    - https://huggingface.co/bartowski/Falcon3-1B-Instruct-GGUF
  description: |
    Falcon3 family of Open Foundation Models is a set of pretrained and instruct LLMs ranging from 1B to 10B parameters.

    This repository contains the Falcon3-1B-Instruct. It achieves strong results on reasoning, language understanding, instruction following, code and mathematics tasks. Falcon3-1B-Instruct supports 4 languages (English, French, Spanish, Portuguese) and a context length of up to 8K.
  overrides:
    parameters:
      model: Falcon3-1B-Instruct-Q4_K_M.gguf
  files:
    - filename: Falcon3-1B-Instruct-Q4_K_M.gguf
      sha256: d351a6506b7d21221f3858b04d98c8b1b7b108b85acde2b13b69d9cb06e2a7e9
      uri: huggingface://bartowski/Falcon3-1B-Instruct-GGUF/Falcon3-1B-Instruct-Q4_K_M.gguf
  tags:
    - llm
    - gguf
    - gpu
    - cpu
    - falcon
  license: falcon-llm
- &intellect1
  name: "intellect-1-instruct"
  url: "github:mudler/LocalAI/gallery/llama3.1-instruct.yaml@master"
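
Once this gallery entry is published, a running LocalAI instance can install the model by id. The sketch below assumes LocalAI's documented model-gallery endpoint (POST /models/apply with a JSON body containing the entry id) and a server listening on http://localhost:8080; verify both against the documentation for your LocalAI version.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Ask the gallery to install the entry added in this commit.
	payload, _ := json.Marshal(map[string]string{"id": "falcon3-1b-instruct"})
	resp, err := http.Post("http://localhost:8080/models/apply",
		"application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("install request status:", resp.Status)
}

The entry's files section pairs the download uri with a sha256 checksum for the Falcon3-1B-Instruct-Q4_K_M.gguf artifact fetched from the bartowski GGUF repository listed under urls.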