LocalAI/pkg/startup
Ettore Di Giacinto b6b8ab6c21
feat(models): pull models from urls (#2750)
* feat(models): pull models from urls

When using `run` now we can point directly to hf models via URL, for
instance:

```bash
local-ai run huggingface://TheBloke/TinyLlama-1.1B-Chat-v0.3-GGUF/tinyllama-1.1b-chat-v0.3.Q2_K.gguf
```

This will pull the gguf model and place it in the models folder. Of course,
this depends on the gguf file being automatically detected by our guesser
mechanism for the feature to take effect.
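
As a rough illustration of the URI handling (not LocalAI's actual code), the sketch below shows how a `huggingface://` reference could be turned into a plain HTTPS download URL plus the file name to store in the models folder. The helper name `resolveHuggingFaceURI` and the use of Hugging Face's `resolve/main` download path are assumptions made for this example.

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// resolveHuggingFaceURI is an illustrative helper (not LocalAI's actual code):
// it turns "huggingface://owner/repo/file.gguf" into an HTTPS download URL
// and the file name to store under the models folder.
func resolveHuggingFaceURI(uri string) (downloadURL, fileName string, err error) {
	rest, ok := strings.CutPrefix(uri, "huggingface://")
	if !ok {
		return "", "", fmt.Errorf("not a huggingface URI: %s", uri)
	}
	parts := strings.SplitN(rest, "/", 3)
	if len(parts) < 3 {
		return "", "", fmt.Errorf("expected owner/repo/file, got: %s", rest)
	}
	owner, repo, file := parts[0], parts[1], parts[2]
	// Hugging Face serves raw repository files under /resolve/main/<path>.
	downloadURL = fmt.Sprintf("https://huggingface.co/%s/%s/resolve/main/%s", owner, repo, file)
	fileName = filepath.Base(file)
	return downloadURL, fileName, nil
}

func main() {
	url, name, err := resolveHuggingFaceURI(
		"huggingface://TheBloke/TinyLlama-1.1B-Chat-v0.3-GGUF/tinyllama-1.1b-chat-v0.3.Q2_K.gguf")
	if err != nil {
		panic(err)
	}
	fmt.Println(url)  // the https://huggingface.co/.../resolve/main/... download URL
	fmt.Println(name) // tinyllama-1.1b-chat-v0.3.Q2_K.gguf
}
```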

Similarly, galleries can now refer to single files in API requests.
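
For illustration only, a hedged sketch of what such a request could look like from Go, assuming a gallery apply endpoint at `/models/apply` that accepts a JSON body with a `url` field pointing at a single file; the endpoint path and field name are assumptions for this example, not taken from this PR.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Assumed request shape: a "url" field that can point directly to a
	// single file instead of a whole gallery index.
	body, _ := json.Marshal(map[string]string{
		"url": "huggingface://TheBloke/TinyLlama-1.1B-Chat-v0.3-GGUF/tinyllama-1.1b-chat-v0.3.Q2_K.gguf",
	})
	resp, err := http.Post("http://localhost:8080/models/apply", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}
```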

This also changes the download code: `yaml` files are now treated in the
same way, so config files are saved under their appropriate name (and are
no longer hashed).
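
A minimal sketch of the naming behaviour described above, assuming the target name is derived from the base name of the URL path, with a hash kept only as a fallback; `targetFileName` is illustrative and not the PR's actual implementation.

```go
package main

import (
	"crypto/md5"
	"fmt"
	"net/url"
	"path"
)

// targetFileName sketches the change described above (illustrative only):
// instead of storing a downloaded config under a hash of its URL, keep the
// base name of the URL path, e.g. "phi-2.yaml".
func targetFileName(rawURL string) (string, error) {
	u, err := url.Parse(rawURL)
	if err != nil {
		return "", err
	}
	name := path.Base(u.Path)
	if name == "." || name == "/" || name == "" {
		// Fall back to a hash only when no usable file name is present.
		return fmt.Sprintf("%x.yaml", md5.Sum([]byte(rawURL))), nil
	}
	return name, nil
}

func main() {
	n, _ := targetFileName("https://example.com/configs/phi-2.yaml") // hypothetical URL
	fmt.Println(n)                                                   // phi-2.yaml
}
```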

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Adapt tests

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-07-11 15:04:05 +02:00
| File | Last commit | Last commit date |
| --- | --- | --- |
| model_preload_test.go | feat(models): pull models from urls (#2750) | 2024-07-11 15:04:05 +02:00 |
| model_preload.go | feat(models): pull models from urls (#2750) | 2024-07-11 15:04:05 +02:00 |
| startup_suite_test.go | feat: embedded model configurations, add popular model examples, refactoring (#1532) | 2024-01-05 23:16:33 +01:00 |