LocalAI/api/openai
Dave c6bf67f446
feat(llama2): add template for chat messages (#782)
Co-authored-by: Aman Karmani <aman@tmm1.net>

Lays some of the groundwork for LLAMA2 compatibility as well as other future models with complex prompting schemes.

Starts a small refactoring of template loading in pkg/model/loader.go. Template loading currently remains part of ModelLoader, but it should now be easy to add template loading for situations beyond overall prompt templates and the new chat-specific per-message templates.
Adds support for new chat-endpoint-specific, per-message templates as an alternative to the existing `Role: XYZ` sprintf method.
Includes a temporary prompt template as an example, since I have a few questions before we merge in the model-gallery side changes.
Minor debug logging changes.
2023-07-22 11:31:39 -04:00
api.go             fix: make completions endpoint more close to OpenAI specification (#790)   2023-07-22 00:53:52 +02:00
chat.go            feat(llama2): add template for chat messages (#782)                        2023-07-22 11:31:39 -04:00
completion.go      feat(llama2): add template for chat messages (#782)                        2023-07-22 11:31:39 -04:00
edit.go            feat(llama2): add template for chat messages (#782)                        2023-07-22 11:31:39 -04:00
embeddings.go      feat: various refactorings                                                 2023-07-15 01:19:43 +02:00
image.go           feat: various refactorings                                                 2023-07-15 01:19:43 +02:00
inference.go       feat: various refactorings                                                 2023-07-15 01:19:43 +02:00
list.go            feat: various refactorings                                                 2023-07-15 01:19:43 +02:00
request.go         feat: various refactorings                                                 2023-07-15 01:19:43 +02:00
transcription.go   feat: add external grpc and model autoloading                              2023-07-20 22:10:12 +02:00