Mirror of https://github.com/mudler/LocalAI.git, synced 2024-12-18 12:26:26 +00:00
Commit 2bc4b56a79

* Use pb.Reply instead of []byte with Reply.GetMessage() in llama grpc to get the proper usage data in reply streaming mode at the last [DONE] frame
* Fix 'hang' on empty message from the start. Seems like that empty message marker trick was unnecessary

Co-authored-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
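The commit message describes passing the whole protobuf `Reply` through the streaming callback instead of only the raw message bytes, so the final `[DONE]`-style frame can still carry token-usage counters even though its text payload is empty. A minimal sketch of that pattern, using a hand-written stand-in for the generated `pb.Reply` type (the real struct comes from protoc and its exact fields may differ; `streamReplies` and the frame shape here are illustrative assumptions, not LocalAI's actual API):

```go
package main

import "fmt"

// Reply is a stand-in for the protoc-generated pb.Reply message.
// Field and method names mirror the generated-code convention of
// nil-safe Get* accessors; the real message may have more fields.
type Reply struct {
	Message      []byte
	Tokens       int32
	PromptTokens int32
}

func (r *Reply) GetMessage() []byte {
	if r == nil {
		return nil
	}
	return r.Message
}

func (r *Reply) GetTokens() int32 {
	if r == nil {
		return 0
	}
	return r.Tokens
}

func (r *Reply) GetPromptTokens() int32 {
	if r == nil {
		return 0
	}
	return r.PromptTokens
}

// streamReplies hands each full *Reply to the callback rather than just
// r.GetMessage() bytes, so usage data on the last (empty-text) frame is
// not lost. It returns the latest usage counters seen in the stream.
func streamReplies(frames []*Reply, onToken func(text string)) (promptTokens, completionTokens int32) {
	for _, r := range frames {
		onToken(string(r.GetMessage()))
		// Usage counters typically arrive only on the final frame.
		if r.GetTokens() > 0 {
			completionTokens = r.GetTokens()
			promptTokens = r.GetPromptTokens()
		}
	}
	return promptTokens, completionTokens
}

func main() {
	frames := []*Reply{
		{Message: []byte("Hello")},
		{Message: []byte(", world")},
		// Final frame: empty message, usage data only. With a []byte-only
		// callback these counters would have been dropped.
		{Message: nil, Tokens: 7, PromptTokens: 12},
	}
	var out string
	p, c := streamReplies(frames, func(text string) { out += text })
	fmt.Printf("text=%q prompt=%d completion=%d\n", out, p, c)
}
```

This also illustrates why the empty-message marker became unnecessary: an empty `GetMessage()` on a frame that still carries counters is handled uniformly, with no sentinel needed.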
base/
backend.go
client.go
embed.go
interface.go
server.go