LocalAI/core/backend
mintyleaf 2bc4b56a79
feat: stream tokens usage (#4415)
* Use pb.Reply instead of []byte (via Reply.GetMessage()) in the llama gRPC backend, so the proper usage data is available in reply streaming mode at the last [DONE] frame (see the sketch below)

* Fix 'hang' on an empty message at the start of a stream

It seems the empty-message marker trick was unnecessary

---------

Co-authored-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-12-18 09:48:50 +01:00
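The first change above is about what the streaming callback receives: handing it the full reply object instead of only the message bytes lets the caller read the token-usage counters carried by the final [DONE] frame. A minimal Go sketch of that idea follows; the Reply struct and its Message/Tokens/PromptTokens fields are hypothetical stand-ins for the generated protobuf type, not LocalAI's actual schema.

package main

import "fmt"

// Reply is an illustrative stand-in for the generated protobuf reply type.
// Field names here are assumptions for the sketch, not LocalAI's real schema.
type Reply struct {
	Message      []byte // token text for this streamed chunk
	Tokens       int32  // completion tokens counted by the backend
	PromptTokens int32  // prompt tokens counted by the backend
}

// streamWithUsage drains a channel of replies, forwarding each chunk to the
// callback and returning the usage carried by the last frame. Passing the
// whole reply (instead of just reply.Message) is what makes the usage
// counters visible to the caller.
func streamWithUsage(replies <-chan *Reply, onChunk func(*Reply)) (promptTokens, completionTokens int32) {
	for reply := range replies {
		onChunk(reply)
		// The final frame carries the cumulative usage; keep the latest values.
		promptTokens = reply.PromptTokens
		completionTokens = reply.Tokens
	}
	return promptTokens, completionTokens
}

func main() {
	replies := make(chan *Reply, 3)
	replies <- &Reply{Message: []byte("Hello")}
	replies <- &Reply{Message: []byte(", world")}
	replies <- &Reply{PromptTokens: 12, Tokens: 7} // final, empty frame carries usage
	close(replies)

	prompt, completion := streamWithUsage(replies, func(r *Reply) {
		fmt.Printf("chunk: %q\n", r.Message)
	})
	fmt.Printf("usage: prompt=%d completion=%d\n", prompt, completion)
}

The point of the design is that usage only arrives on the last frame, so a callback limited to the message bytes would have no way to surface it.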
backend_suite_test.go feat: extract output with regexes from LLMs (#3491) 2024-09-13 13:27:36 +02:00
embeddings.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
image.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
llm_test.go feat: extract output with regexes from LLMs (#3491) 2024-09-13 13:27:36 +02:00
llm.go feat: stream tokens usage (#4415) 2024-12-18 09:48:50 +01:00
options.go feat(llama.cpp): expose cache_type_k and cache_type_v for quant of kv cache (#4329) 2024-12-06 10:23:59 +01:00
rerank.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
soundgeneration.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
stores.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
token_metrics.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
tokenize.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
transcript.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00
tts.go chore(refactor): drop unnecessary code in loader (#4096) 2024-11-08 21:54:25 +01:00