ianarawjo 7eb5aaa26d
Model settings, more models, and visible temperature (#57)
* Model settings forms

* Editable nicknames and emojis

* Saving and loading model settings

* Temperature indicator on LLM items in PromptNodes

* Ensure LLM nicknames are unique (see the de-duplication sketch after this list)

* Detect when PaLM blocks responses and output a standard error message in the response instead (see the PaLM sketch after this list)

* Fix examples/ to use new cache format

* Add helpful 'could not reach server' text when the countQueries request fails (see the fetch sketch after this list)

* Add Dalai model settings

* Rebuild react and update package version
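
For the nickname-uniqueness item above, a minimal TypeScript sketch of one common approach: append an incrementing numeric suffix until no other LLM item in the node uses the name. The helper name and the item shape are illustrative assumptions, not code taken from this commit.

```typescript
// Hypothetical shape of an LLM item shown in a PromptNode list.
interface LLMItem {
  name: string;        // user-editable nickname
  emoji?: string;
  temperature?: number;
}

// De-duplicate a desired nickname against the existing items by appending
// " (2)", " (3)", ... until the name is unused. Helper name is an assumption.
function ensureUniqueName(desired: string, existing: LLMItem[]): string {
  const taken = new Set(existing.map((item) => item.name));
  if (!taken.has(desired)) return desired;
  let i = 2;
  while (taken.has(`${desired} (${i})`)) i += 1;
  return `${desired} (${i})`;
}

// Example: if "GPT-4" and "GPT-4 (2)" already exist, a third copy becomes "GPT-4 (3)".
```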
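For the PaLM item above, a sketch of substituting a standard error message when the API blocks a response. It assumes the PaLM text-generation REST response shape, where a blocked prompt returns an empty `candidates` array and a `filters` list with a block `reason`; the field handling here is an assumption, not code from this commit.

```typescript
// Assumed (partial) shape of a PaLM text-generation response.
interface PaLMResponse {
  candidates?: { output: string }[];
  filters?: { reason: string }[];
}

// Return the model output if present; otherwise return a standard error
// message so downstream nodes still receive a response string to display.
function extractPaLMOutput(resp: PaLMResponse): string {
  if (resp.candidates && resp.candidates.length > 0)
    return resp.candidates[0].output;
  const reason = resp.filters?.map((f) => f.reason).join(", ") ?? "unknown";
  return `[PaLM API blocked this response. Reason: ${reason}]`;
}
```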
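For the 'could not reach server' item above, a sketch of catching a network-level failure on the query-count request and surfacing readable status text instead of failing silently. The endpoint URL, response shape, and status callback are hypothetical, not the repository's actual API.

```typescript
// Count how many LLM queries a prompt node would send by asking the backend.
// On a network-level failure (backend not running/unreachable), show a
// friendly message rather than an unhandled error. All names are assumptions.
async function countQueries(
  payload: object,
  setStatusText: (msg: string) => void,
): Promise<number | undefined> {
  try {
    const resp = await fetch("http://localhost:8000/app/countQueriesRequired", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    });
    const data = await resp.json();
    return data.counts; // hypothetical response field
  } catch (err) {
    setStatusText("Could not reach the server. Is the backend running?");
    return undefined;
  }
}
```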
2023-06-01 15:08:17 -04:00