docs: update function docs
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
@@ -80,6 +80,28 @@ When running the python script, be sure to:
## Advanced
### Use functions without grammars
Function calls are automatically mapped to grammars, which are currently supported only by llama.cpp. However, it is possible to turn off the use of grammars and instead extract the tool arguments from the LLM response by setting `no_grammar` and a regex to match the response in the model's YAML file:
```yaml
function:
  no_grammar: true
  response_regex: "..."
```
The response regex has to be a regex with named parameters that allow scanning the function name and the arguments. For instance, consider:
```
(?P<function>\w+)\s*\((?P<arguments>.*)\)
```
will catch
```
function_name({ "foo": "bar"})
```
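
For illustration, the pieces above can be combined into a single model configuration. The quoting is an assumption here, not taken from the original docs: a YAML single-quoted scalar is used so the backslashes in the regex stay literal.

```yaml
# Sketch only: disable grammars and extract tool calls from plain-text
# responses shaped like `function_name({ ... })`.
function:
  no_grammar: true
  # the named groups `function` and `arguments` are scanned from the response
  response_regex: '(?P<function>\w+)\s*\((?P<arguments>.*)\)'
```

With such a configuration, a response like `function_name({ "foo": "bar"})` is split into the function name `function_name` and the argument payload `{ "foo": "bar"}`.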
### Parallel tool calls
This feature is experimental and has to be enabled in the model's YAML file by setting `function.parallel_calls`:
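
As a minimal sketch, assuming the same nested `function` block layout used earlier on this page, enabling it might look like:

```yaml
# Sketch: turn on experimental parallel tool calls (layout assumed)
function:
  parallel_calls: true
```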