* Implement autofill backend
* Add autofill to UI
* Add argument to getUID to force recalculation of UIDs on every call
* Add command fill
* Move popover to the right
* Merge autofill-ui into autofill
* Add minimum rows requirement for autofilling
* Rename local variable in autofill system
* Rename autofill.ts to ai.ts
* Implement generate and replace backend function
* Add purple AI button
* Add AI popover
* Add tabs to AI popover
* Cosmetic changes to AI popover
* Move command fill UI to purple button popover
* Add 'creative' toggle to generateAndReplace
* Generate and replace UI
* Call backend for generate and replace
* Change creative to unconventional in generate and replace system
* Fix generate and replace
* Add loading states
* Cosmetic changes
* Use sparkle icon
* Cosmetic changes
* Add a clarifying sentence to the prompt when the user asks for a prompt
* Change to markdown
* Add error handling to AI system
* Improve the prompt used when the user asks for a prompt
* Remove 'suggestions loading' message
* Change 'pattern' to 'generate a list of' and fix a bug where the prompt failed to specify an unordered Markdown list
* Limit output to n in decode()
* Fix bug in error handling
* TEMP: try to fix autofill
* TEMP: disable autofill
* Finally fix autofill's debouncing
* Improve autofill prompt to handle commands
* Fix typo with semicolon
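The debouncing fix above (after two TEMP attempts) suggests a trailing-edge debounce for the autofill trigger. A minimal sketch, assuming a `setTimeout`-based helper; the names and delay are illustrative, not ChainForge's actual code:

```typescript
// Trailing-edge debounce: of a burst of rapid calls, only the last one
// actually fires, after `waitMs` of quiet. Illustrative sketch only.
function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  waitMs: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    // Each new call cancels the pending one, resetting the quiet period.
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}
```

For autofill, this keeps keystrokes from firing one LLM request each; only the pause after typing triggers a suggestion fetch.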
* Refactor the AI Popover into a new component
* Refactor the autofill functionality into two backend files
* Minor refactoring and styling fixes
* Parse markdown using markdown library
* Add no_cache flag support in backend to ignore cache for AI popover
* Trim quotation marks and escape braces in AI autofill
* Add AI Support Tab in Global Settings pane.
* Convert Jinja braces
* Fix typo in AiPopover import
* Handle template variables with Extend and Autocomplete + Check template variable correctness in outputs
* Escape the braces of generate and replace prompts
* Update prompts to strengthen AI support for multiple template variables
* Log the system message
* Reduce minimum rows required to 1 for autocomplete to begin generating
* Reduce min rows to extend to 1 and add warning below 2
* Create a defaultdict utility
* Consider null values as nonexistent in defaultdict
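The defaultdict utility from the two commits above (later dropped in favor of per-field placeholders) might look like this sketch: a `Proxy` that lazily fills missing keys from a factory, treating null/undefined values as absent. Names are illustrative assumptions, not the project's actual code:

```typescript
// Sketch of a Python-defaultdict-style helper: reading a missing key
// (or one holding null/undefined) fills it from `factory` first.
function defaultDict<V>(factory: () => V): Record<string, V> {
  return new Proxy({} as Record<string, V>, {
    get(target, prop) {
      const key = String(prop);
      // Per the commit above: null counts as nonexistent, so it is refilled.
      if (target[key] === undefined || target[key] === null) {
        target[key] = factory();
      }
      return target[key];
    },
  });
}
```

Typical use is accumulating lists without guarding every first write: `d["k"].push(x)` works even when `"k"` was never set.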
* Make placeholders stick to their assigned text field without using defaultdict
* Make placeholder logic more readable
* Cache rendering of text fields to avoid expensive computation
* Calculate whether to refresh suggestions based on expected suggestions instead of previous suggestions
* Fix bug where LLM was returning templates in generate and replace where none was requested
* Force re-render of text fields on Extend
* Add Sean Yang to README
* Add GenAI support to Items Node
* Pass front-end API keys to AI support features
* Escape braces on Items Node outputs
* Update package to 0.2.8
* Disable autosaving if it takes 1 second or longer to save to localStorage
* Skip autosave when browser tab is inactive
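The two autosave guards above might combine like this sketch, using the Page Visibility API for the inactive-tab check; `saveFlow` and the exact budget handling are illustrative assumptions:

```typescript
// Guard autosave two ways, per the commits above: skip while the tab is
// hidden, and disable autosaving entirely once a save takes >= 1 second.
let autosavingEnabled = true;

function autosave(saveFlow: () => void): void {
  if (!autosavingEnabled) return;
  // Page Visibility API: don't burn cycles while the tab is inactive.
  if (typeof document !== "undefined" && document.hidden) return;
  const start = Date.now();
  saveFlow(); // e.g. serialize the flow and write it to localStorage
  if (Date.now() - start >= 1000) {
    autosavingEnabled = false; // too slow for this flow: stop autosaving
  }
}
```

Disabling (rather than merely skipping) after one slow save assumes save time only grows with flow size, so retrying would keep janking the UI.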
* Fetch environment API keys only once upon load
* Check for OpenAI API key in AIPopover. If not present, display Alert.
---------
Co-authored-by: Sean Yang <53060248+shawseanyang@users.noreply.github.com>
* Add LLM scorer node (#107)
* Modularize the LLM list container, extracting it from prompt node
* Working LLM scorer node
* Bug and minor fixes
* Change modals to use percentage-based left positioning.
* Add inspect response footer to LLMEvalNode.
* Make Play buttons light green
* Fix React errors w keys in JSX arrays
* Add Chat Turn node and support for chat history (#108)
* Adds chat_history across backend's cache and querying mechanisms.
* Adds Chat Turn nodes, which allow for continuing a conversation.
* Adds automatic conversions of ChatHistory (in OpenAI format) to Anthropic and Google PaLM's chat formats. Converts chat history to appropriate format and passes it as context in the API call.
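The ChatHistory conversion described above can be sketched for Anthropic's legacy `Human:`/`Assistant:` text-prompt format (the format its completions API used at the time); the interface and handling of system messages here are illustrative assumptions, and the real conversion covers more cases:

```typescript
// Convert an OpenAI-format chat history into Anthropic's legacy
// Human:/Assistant: prompt string. Illustrative sketch only.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function toAnthropicPrompt(history: ChatMessage[], nextUserMsg: string): string {
  let prompt = "";
  for (const m of history) {
    if (m.role === "system") prompt += `${m.content}\n`; // prepend system text
    else if (m.role === "user") prompt += `\n\nHuman: ${m.content}`;
    else prompt += `\n\nAssistant: ${m.content}`;
  }
  // End with an open Assistant: turn so the model continues the conversation.
  return `${prompt}\n\nHuman: ${nextUserMsg}\n\nAssistant:`;
}
```

A PaLM converter would follow the same shape but emit the structured author/content message list that API expects instead of one flat string.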
* Bug fix and error popup when missing past convo in Chat Turn
* Bug squashing to progress in chat turn node
* Bug squashing
* Color false scores bright red in eval inspector
* Fix tooltip when a continued chat is present
* Rebuild React
* Bug fix in LLM eval node
* Add HF chat model support.
* Show multiple response objs in table inspector view
* Fix LLM item deletion bug
* Rebuild react and update package version
* Fix obscure bug when LLM outputs have no 'llm' property (due to prior CF version)
* Fix isLooselyEqual bug
* Update examples so that their cached 'fields' include llm nicknames
* Rebuild React
* Add Chelse to readme
* Share button implementation and testing
* Bug fix for Azure OpenAI (missing `bind` call)
* Change `call_anthropic` to go through server proxy when not running locally
* Cleanup of unused imports
* Fix bug in Azure OpenAI call
* Update README.md with Share, Play link, etc
* Update package version to 0.2.0.2
* Lint Python code with ruff (#60)
* Failure progress on Prompt Nodes
* Change PromptNode preview container color
* Ensure LLM colors are unique and the same across nodes
* Reset LLM colors upon flow load
* Add LLM colors to 3D scatterplot
* Extract inspector internals into separate component.
* Added inspect modal.
* Lower rate of failure for dummy LLM responses
* Fix useEffect bug in LLMResponseInspector
* Fix export to excel bug
* Remove dependence on browser support for regex negative lookbehind
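The lookbehind removal above ties into the brace-escaping commits earlier in this log: `(?<!\\)` (negative lookbehind) was unsupported in Safari for years, so the usual workaround is to match the preceding character explicitly and restore it in the replacement. A sketch, where the doubling-as-escaping rule is an illustrative assumption:

```typescript
// Escape unescaped { and } without regex lookbehind: (^|[^\\]) captures
// "start of string or any non-backslash char" in place of (?<!\\), and
// $1 puts that captured character back. Illustrative sketch only.
function escapeUnescapedBraces(s: string): string {
  return s
    .replace(/(^|[^\\])\{/g, "$1{{")
    .replace(/(^|[^\\])\}/g, "$1}}");
}
```

One caveat of this pattern: because the capture consumes a character, immediately adjacent braces need an extra pass or a callback replacer; the lookbehind-free form trades that edge case for Safari compatibility.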
* Use monospace font in textareas in Safari
* Fix settings modal bug in FireFox
* Change version
* Update README.md
The Harvard HCI website is badly out of date (by multiple years) and my personal page on our lab website is not very informative, so I removed the Harvard HCI link and pointed to the glassmanlab main page, where all our publications are listed.
* Rename main folders
* Preparing for package deployment
* Use absolute paths and imports
* Move react-server into chainforge dir
* Add MANIFEST.in to copy react-server build over to package
* Add include_package_data
* Add manifest.json and icons to /static.
* Update README.md
* Update GUIDE.md