LocalAI/core/config
Latest commit by Ettore Di Giacinto (3b4c1a7054): Load wrapper clients
Testing with:

```yaml
name: gpt-4o
pipeline:
  tts: voice-it-riccardo_fasol-x-low
  transcription: whisper-base-q5_1
  llm: llama-3.2-1b-instruct:q4_k_m
```

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-12-23 16:00:23 +01:00
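
A pipeline model like the one in the commit message can be exercised through LocalAI's OpenAI-compatible chat endpoint. The sketch below is illustrative only: the base URL (LocalAI's default `localhost:8080`) and the request shape are assumptions based on the OpenAI-style API, not something this commit defines.

```go
// Minimal sketch: send a chat request to the "gpt-4o" pipeline model
// defined in the YAML config above, via LocalAI's OpenAI-compatible API.
// The endpoint and port are assumptions (LocalAI defaults), not part of this commit.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Request body targeting the pipeline model from the config above.
	body, _ := json.Marshal(map[string]any{
		"model": "gpt-4o",
		"messages": []map[string]string{
			{"role": "user", "content": "Hello!"},
		},
	})

	resp, err := http.Post("http://localhost:8080/v1/chat/completions",
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response for inspection.
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```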
| File | Last commit | Date |
|------|-------------|------|
| application_config.go | feat: allow to disable '/metrics' endpoints for local stats (#3945) | 2024-10-23 15:34:32 +02:00 |
| backend_config_filter.go | groundwork: ListModels Filtering Upgrade (#2773) | 2024-10-01 18:55:46 +00:00 |
| backend_config_loader.go | chore(refactor): imply modelpath (#4208) | 2024-11-20 18:06:35 +01:00 |
| backend_config_test.go | groundwork: ListModels Filtering Upgrade (#2773) | 2024-10-01 18:55:46 +00:00 |
| backend_config.go | Load wrapper clients | 2024-12-23 16:00:23 +01:00 |
| config_suite_test.go | dependencies(grpcio): bump to fix CI issues (#2362) | 2024-05-21 14:33:47 +02:00 |
| config_test.go | feat(llama.cpp): guess model defaults from file (#2522) | 2024-06-08 22:13:02 +02:00 |
| gallery.go | refactor: gallery inconsistencies (#2647) | 2024-06-24 17:32:12 +02:00 |
| guesser.go | feat(template): read jinja templates from gguf files (#4332) | 2024-12-08 13:50:33 +01:00 |