Mirror of https://github.com/mudler/LocalAI.git (synced 2024-12-26 07:41:05 +00:00)

Commit 2bc4b56a79
* Use pb.Reply instead of []byte with Reply.GetMessage() in the llama gRPC backend, so the proper usage data is available in reply streaming mode at the last [DONE] frame.
* Fix 'hang' on an empty message at the start of the stream; the empty-message marker trick turned out to be unnecessary.

Co-authored-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
Files:

- backend_suite_test.go
- embeddings.go
- image.go
- llm_test.go
- llm.go
- options.go
- rerank.go
- soundgeneration.go
- stores.go
- token_metrics.go
- tokenize.go
- transcript.go
- tts.go