AI for ChainForge BETA: TextFields, Items (#191)

* Implement autofill backend

* Add autofill to ui

* Add argument to getUID to force recalculation of UIDs on every call

* Add command fill

* Move popover to the right

* Merge autofill-ui into autofill

* Add minimum rows requirement for autofilling

* Rename local variable in autofill system

* Rename autofill.ts to ai.ts

* Implement generate and replace backend function

* Add purple AI button

* Add ai popover

* Add tabs to ai popover

* Cosmetic changes to AI popover

* Move command fill UI to purple button popover

* Add 'creative' toggle to generateAndReplace

* Generate and replace UI

* Call backend for generate and replace

* Change creative to unconventional in generate and replace system

* Fix generate and replace

* Add loading states

* Cosmetic changes

* Use sparkle icon

* Cosmetic changes

* Add a clarifying sentence to the prompt when the user asks for a prompt

* Change to markdown

* Add error handling to AI system

* Improve the prompt used for generating prompts

* Remove 'suggestions loading' message

* Change 'pattern' to 'generate a list of' and fix a bug where I forgot to specify an unordered Markdown list

* Limit output to n in decode()

* Fix bug in error handling

* TEMP: try to fix autofill

* TEMP: disable autofill

* Finally fix autofill's debouncing

* Improve autofill prompt to handle commands

* Fix typo with semicolon

* Refactor the AI Popover into a new component

* Refactor the autofill functionality into two backend files

* Minor refactoring and styling fixes

* Parse markdown using markdown library

* Add no_cache flag support in backend to ignore cache for AI popover

* trim quotation marks and escape braces in AI autofill

* Add AI Support Tab in Global Settings pane.

* Convert Jinja braces

* Fix typo in AiPopover import

* Handle template variables with Extend and Autocomplete + Check template variable correctness in outputs

* Escape the braces of generate and replace prompts

* Update prompts to strengthen AI support for multiple template variables

* Log the system message

* Reduce minimum rows required to 1 for autocomplete to begin generating

* Reduce the minimum rows required for Extend to 1 and add a warning below 2

* Create a defaultdict utility (an illustrative sketch appears after this list)

* Consider null values as nonexistent in defaultdict

* Make placeholders stick to their assigned text field without using defaultdict

* Make placeholder logic more readable

* Cache rendering of text fields to avoid expensive computation

* Calculate whether to refresh suggestions based on expected suggestions instead of previous suggestions

* Fix bug where LLM was returning templates in generate and replace where none was requested

* Force re-render of text fields on Extend

* Add Sean Yang to README

* Add GenAI support to Items Node

* Pass front-end API keys to AI support features

* Escape braces on Items Node outputs

* Update package to 0.2.8

* Disable autosaving if it takes 1 second or longer to save to localStorage

* Skip autosave when browser tab is inactive

* Fetch environment API keys only once upon load

* Check for OpenAI API key in AIPopover. If not present, display Alert.
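
For context on the defaultdict utility mentioned above: it is imported in TextFieldsNode.js from './backend/defaultdict', but its source is not among the hunks shown in this diff. Below is a minimal sketch of what such a helper might look like. The Proxy-based approach and the defaultFactory parameter are illustrative assumptions, not the committed implementation; only the "null counts as missing" behavior is taken from the commits listed above.

// Hypothetical sketch of backend/defaultdict.js -- the committed file may differ.
// Returns a Proxy over a plain object so that reading a missing (or null) key
// yields the result of defaultFactory() instead of undefined.
export default class DefaultDict {
  constructor(defaultFactory) {
    return new Proxy({}, {
      get: (target, key) =>
        // Per "Consider null values as nonexistent in defaultdict":
        key in target && target[key] !== null ? target[key] : defaultFactory(),
    });
  }
}

// Illustrative usage: placeholders for text fields default to an empty string.
// const placeholders = new DefaultDict(() => '');
// placeholders.f3; // => '' until a placeholder is explicitly assigned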

---------

Co-authored-by: Sean Yang <53060248+shawseanyang@users.noreply.github.com>
ianarawjo 2023-12-13 11:58:07 -05:00 committed by GitHub
parent bacf61be18
commit ce583a216c
33 changed files with 1834 additions and 131 deletions

View File

@ -98,7 +98,7 @@ For more specific details, see the [Node Guide](https://github.com/ianarawjo/Cha
# Development
ChainForge was created by [Ian Arawjo](http://ianarawjo.com/index.html), a postdoctoral scholar in Harvard HCI's [Glassman Lab](http://glassmanlab.seas.harvard.edu/) with support from the Harvard HCI community. Collaborators include PhD students [Priyan Vaithilingam](https://priyan.info) and [Chelse Swoopes](https://seas.harvard.edu/person/chelse-swoopes) and faculty members [Elena Glassman](http://glassmanlab.seas.harvard.edu/glassman.html) and [Martin Wattenberg](https://www.bewitched.com/about.html).
ChainForge was created by [Ian Arawjo](http://ianarawjo.com/index.html), a postdoctoral scholar in Harvard HCI's [Glassman Lab](http://glassmanlab.seas.harvard.edu/) with support from the Harvard HCI community. Collaborators include PhD students [Priyan Vaithilingam](https://priyan.info) and [Chelse Swoopes](https://seas.harvard.edu/person/chelse-swoopes), Harvard undergraduate [Sean Yang](https://shawsean.com), and faculty members [Elena Glassman](http://glassmanlab.seas.harvard.edu/glassman.html) and [Martin Wattenberg](https://www.bewitched.com/about.html).
This work was partially funded by the NSF grants IIS-2107391, IIS-2040880, and IIS-1955699. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

View File

@ -1,15 +1,15 @@
{
"files": {
"main.css": "/static/css/main.ff59165b.css",
"main.js": "/static/js/main.760fe716.js",
"main.css": "/static/css/main.847ce933.css",
"main.js": "/static/js/main.72d00c7c.js",
"static/js/787.4c72bb55.chunk.js": "/static/js/787.4c72bb55.chunk.js",
"index.html": "/index.html",
"main.ff59165b.css.map": "/static/css/main.ff59165b.css.map",
"main.760fe716.js.map": "/static/js/main.760fe716.js.map",
"main.847ce933.css.map": "/static/css/main.847ce933.css.map",
"main.72d00c7c.js.map": "/static/js/main.72d00c7c.js.map",
"787.4c72bb55.chunk.js.map": "/static/js/787.4c72bb55.chunk.js.map"
},
"entrypoints": [
"static/css/main.ff59165b.css",
"static/js/main.760fe716.js"
"static/css/main.847ce933.css",
"static/js/main.72d00c7c.js"
]
}

View File

@ -1 +1 @@
<!doctype html><html lang="en"><head><meta charset="utf-8"/><script async src="https://www.googletagmanager.com/gtag/js?id=G-RN3FDBLMCR"></script><script>function gtag(){dataLayer.push(arguments)}window.dataLayer=window.dataLayer||[],gtag("js",new Date),gtag("config","G-RN3FDBLMCR")</script><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="A visual programming environment for prompt engineering"/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>ChainForge</title><script defer="defer" src="/static/js/main.760fe716.js"></script><link href="/static/css/main.ff59165b.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>
<!doctype html><html lang="en"><head><meta charset="utf-8"/><script async src="https://www.googletagmanager.com/gtag/js?id=G-RN3FDBLMCR"></script><script>function gtag(){dataLayer.push(arguments)}window.dataLayer=window.dataLayer||[],gtag("js",new Date),gtag("config","G-RN3FDBLMCR")</script><link rel="icon" href="/favicon.ico"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="A visual programming environment for prompt engineering"/><link rel="apple-touch-icon" href="/logo192.png"/><link rel="manifest" href="/manifest.json"/><title>ChainForge</title><script defer="defer" src="/static/js/main.72d00c7c.js"></script><link href="/static/css/main.847ce933.css" rel="stylesheet"></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div></body></html>

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -27,7 +27,7 @@
"@rjsf/core": "^5.7.3",
"@rjsf/utils": "^5.7.3",
"@rjsf/validator-ajv8": "^5.7.3",
"@tabler/icons-react": "^2.17.0",
"@tabler/icons-react": "^2.39.0",
"@testing-library/jest-dom": "^5.16.5",
"@testing-library/react": "^13.4.0",
"@testing-library/user-event": "^13.5.0",
@ -62,6 +62,7 @@
"mantine-react-table": "^1.0.0-beta.8",
"markdown-it": "^13.0.1",
"mathjs": "^11.8.2",
"mdast-util-from-markdown": "^2.0.0",
"net": "^1.0.2",
"net-browserify": "^0.2.4",
"node-fetch": "^2.6.11",
@ -4910,20 +4911,20 @@
}
},
"node_modules/@tabler/icons": {
"version": "2.35.0",
"resolved": "https://registry.npmjs.org/@tabler/icons/-/icons-2.35.0.tgz",
"integrity": "sha512-qW/itKdmFvfGw6mAQ+cZy+2MYTXb0XdGAVhO3obYLJEfsSPMwQRO0S9ckFk1xMQX/Tj7REC3TEmWUBWNi3/o3g==",
"version": "2.39.0",
"resolved": "https://registry.npmjs.org/@tabler/icons/-/icons-2.39.0.tgz",
"integrity": "sha512-iK3j2jIEGIUaJcbYYg5iwyG1Y/m4lzUxAUbxRpvgeXCWP29jvZaH5hajZmU3KaSealddHuJg7PSQislPHpCsoQ==",
"funding": {
"type": "github",
"url": "https://github.com/sponsors/codecalm"
}
},
"node_modules/@tabler/icons-react": {
"version": "2.35.0",
"resolved": "https://registry.npmjs.org/@tabler/icons-react/-/icons-react-2.35.0.tgz",
"integrity": "sha512-jK2zgtMF2+LSjtbjYsfer+ryz72TwyJqv3MsmCfEpQwP37u01xvmTlVhJ+ox3bV+trsgjojsldVDuB05JuXLaw==",
"version": "2.39.0",
"resolved": "https://registry.npmjs.org/@tabler/icons-react/-/icons-react-2.39.0.tgz",
"integrity": "sha512-MyUK1jqtmHPZBnDXqIc1Y5OnfoqG+tGaSB1/gcl0mlY462fJ5f3QB0ZIZzAHMAGYb6K2iJSdFIFavhcgpDDZ7Q==",
"dependencies": {
"@tabler/icons": "2.35.0",
"@tabler/icons": "2.39.0",
"prop-types": "^15.7.2"
},
"funding": {
@ -5499,6 +5500,14 @@
"@types/d3-selection": "*"
}
},
"node_modules/@types/debug": {
"version": "4.1.12",
"resolved": "https://registry.npmjs.org/@types/debug/-/debug-4.1.12.tgz",
"integrity": "sha512-vIChWdVG3LG1SMxEvI/AK+FWJthlrqlTu7fbrlywTkkaONwk/UAGaULXRlf8vkzFBLVm0zkMdCquhL5aOjhXPQ==",
"dependencies": {
"@types/ms": "*"
}
},
"node_modules/@types/eslint": {
"version": "8.44.3",
"resolved": "https://registry.npmjs.org/@types/eslint/-/eslint-8.44.3.tgz",
@ -5676,6 +5685,14 @@
"@types/mdurl": "*"
}
},
"node_modules/@types/mdast": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/@types/mdast/-/mdast-4.0.3.tgz",
"integrity": "sha512-LsjtqsyF+d2/yFOYaN22dHZI1Cpwkrj+g06G8+qtUKlhovPW89YhqSnfKtMbkgmEtYpH2gydRNULd6y8mciAFg==",
"dependencies": {
"@types/unist": "*"
}
},
"node_modules/@types/mdurl": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/@types/mdurl/-/mdurl-1.0.2.tgz",
@ -5691,6 +5708,11 @@
"resolved": "https://registry.npmjs.org/@types/minimatch/-/minimatch-5.1.2.tgz",
"integrity": "sha512-K0VQKziLUWkVKiRVrx4a40iPaxTUefQmjtkQofBkYRcoaaL/8rhwDWww9qWbrgicNOgnpIsMxyNIUM4+n6dUIA=="
},
"node_modules/@types/ms": {
"version": "0.7.34",
"resolved": "https://registry.npmjs.org/@types/ms/-/ms-0.7.34.tgz",
"integrity": "sha512-nG96G3Wp6acyAgJqGasjODb+acrI7KltPiRxzHPXnP3NgI28bpQDRv53olbqGXbfcgF5aiiHmO3xpwEpS5Ld9g=="
},
"node_modules/@types/node": {
"version": "20.6.5",
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.6.5.tgz",
@ -5840,6 +5862,11 @@
"resolved": "https://registry.npmjs.org/@types/trusted-types/-/trusted-types-2.0.4.tgz",
"integrity": "sha512-IDaobHimLQhjwsQ/NMwRVfa/yL7L/wriQPMhw1ZJall0KX6E1oxk29XMDeilW5qTIg5aoiqf5Udy8U/51aNoQQ=="
},
"node_modules/@types/unist": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/@types/unist/-/unist-3.0.2.tgz",
"integrity": "sha512-dqId9J8K/vGi5Zr7oo212BGii5m3q5Hxlkwy3WpYuKPklmBEvsbMYYyLxAQpSffdLl/gdW0XUpKWFvYmyoWCoQ=="
},
"node_modules/@types/uuid": {
"version": "8.3.4",
"resolved": "https://registry.npmjs.org/@types/uuid/-/uuid-8.3.4.tgz",
@ -7815,6 +7842,15 @@
"node": ">=10"
}
},
"node_modules/character-entities": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/character-entities/-/character-entities-2.0.2.tgz",
"integrity": "sha512-shx7oQ0Awen/BRIdkjkvz54PnEEI/EjwXDSIZp86/KKdbafHh1Df/RYGBhn4hbe2+uKC9FnT5UCEdyPz3ai9hQ==",
"funding": {
"type": "github",
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/check-types": {
"version": "11.2.3",
"resolved": "https://registry.npmjs.org/check-types/-/check-types-11.2.3.tgz",
@ -9274,6 +9310,18 @@
"resolved": "https://registry.npmjs.org/decimal.js/-/decimal.js-10.4.3.tgz",
"integrity": "sha512-VBBaLc1MgL5XpzgIP7ny5Z6Nx3UrRkIViUkPUdtl9aya5amy3De1gsUUSB1g3+3sExYNjCAsAznmukyxCb1GRA=="
},
"node_modules/decode-named-character-reference": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/decode-named-character-reference/-/decode-named-character-reference-1.0.2.tgz",
"integrity": "sha512-O8x12RzrUF8xyVcY0KJowWsmaJxQbmy0/EtnNtHRpsOcT7dFk5W598coHqBVpmWo1oQQfsCqfCmkZN5DJrZVdg==",
"dependencies": {
"character-entities": "^2.0.0"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/dedent": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/dedent/-/dedent-0.7.0.tgz",
@ -9470,6 +9518,18 @@
"resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
},
"node_modules/devlop": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/devlop/-/devlop-1.1.0.tgz",
"integrity": "sha512-RWmIqhcFf1lRYBvNmr7qTNuyCt/7/ns2jbpp1+PalgE/rDQcBT0fioSMUpJ93irlUhC5hrg4cYqe6U+0ImW0rA==",
"dependencies": {
"dequal": "^2.0.0"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/didyoumean": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/didyoumean/-/didyoumean-1.2.2.tgz",
@ -16046,6 +16106,41 @@
"safe-buffer": "^5.1.2"
}
},
"node_modules/mdast-util-from-markdown": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/mdast-util-from-markdown/-/mdast-util-from-markdown-2.0.0.tgz",
"integrity": "sha512-n7MTOr/z+8NAX/wmhhDji8O3bRvPTV/U0oTCaZJkjhPSKTPhS3xufVhKGF8s1pJ7Ox4QgoIU7KHseh09S+9rTA==",
"dependencies": {
"@types/mdast": "^4.0.0",
"@types/unist": "^3.0.0",
"decode-named-character-reference": "^1.0.0",
"devlop": "^1.0.0",
"mdast-util-to-string": "^4.0.0",
"micromark": "^4.0.0",
"micromark-util-decode-numeric-character-reference": "^2.0.0",
"micromark-util-decode-string": "^2.0.0",
"micromark-util-normalize-identifier": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0",
"unist-util-stringify-position": "^4.0.0"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/unified"
}
},
"node_modules/mdast-util-to-string": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-4.0.0.tgz",
"integrity": "sha512-0H44vDimn51F0YwvxSJSm0eCDOJTRlmN0R1yBh4HLj9wiV1Dn0QoXGbvFAWj2hSItVTlCmBF1hqKlIyUBVFLPg==",
"dependencies": {
"@types/mdast": "^4.0.0"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/unified"
}
},
"node_modules/mdn-data": {
"version": "2.0.4",
"resolved": "https://registry.npmjs.org/mdn-data/-/mdn-data-2.0.4.tgz",
@ -16106,6 +16201,427 @@
"node": ">= 0.6"
}
},
"node_modules/micromark": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/micromark/-/micromark-4.0.0.tgz",
"integrity": "sha512-o/sd0nMof8kYff+TqcDx3VSrgBTcZpSvYcAHIfHhv5VAuNmisCxjhx6YmxS8PFEpb9z5WKWKPdzf0jM23ro3RQ==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"@types/debug": "^4.0.0",
"debug": "^4.0.0",
"decode-named-character-reference": "^1.0.0",
"devlop": "^1.0.0",
"micromark-core-commonmark": "^2.0.0",
"micromark-factory-space": "^2.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-chunked": "^2.0.0",
"micromark-util-combine-extensions": "^2.0.0",
"micromark-util-decode-numeric-character-reference": "^2.0.0",
"micromark-util-encode": "^2.0.0",
"micromark-util-normalize-identifier": "^2.0.0",
"micromark-util-resolve-all": "^2.0.0",
"micromark-util-sanitize-uri": "^2.0.0",
"micromark-util-subtokenize": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-core-commonmark": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-core-commonmark/-/micromark-core-commonmark-2.0.0.tgz",
"integrity": "sha512-jThOz/pVmAYUtkroV3D5c1osFXAMv9e0ypGDOIZuCeAe91/sD6BoE2Sjzt30yuXtwOYUmySOhMas/PVyh02itA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"decode-named-character-reference": "^1.0.0",
"devlop": "^1.0.0",
"micromark-factory-destination": "^2.0.0",
"micromark-factory-label": "^2.0.0",
"micromark-factory-space": "^2.0.0",
"micromark-factory-title": "^2.0.0",
"micromark-factory-whitespace": "^2.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-chunked": "^2.0.0",
"micromark-util-classify-character": "^2.0.0",
"micromark-util-html-tag-name": "^2.0.0",
"micromark-util-normalize-identifier": "^2.0.0",
"micromark-util-resolve-all": "^2.0.0",
"micromark-util-subtokenize": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-factory-destination": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-factory-destination/-/micromark-factory-destination-2.0.0.tgz",
"integrity": "sha512-j9DGrQLm/Uhl2tCzcbLhy5kXsgkHUrjJHg4fFAeoMRwJmJerT9aw4FEhIbZStWN8A3qMwOp1uzHr4UL8AInxtA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-character": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-factory-label": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-factory-label/-/micromark-factory-label-2.0.0.tgz",
"integrity": "sha512-RR3i96ohZGde//4WSe/dJsxOX6vxIg9TimLAS3i4EhBAFx8Sm5SmqVfR8E87DPSR31nEAjZfbt91OMZWcNgdZw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"devlop": "^1.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-factory-space": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-factory-space/-/micromark-factory-space-2.0.0.tgz",
"integrity": "sha512-TKr+LIDX2pkBJXFLzpyPyljzYK3MtmllMUMODTQJIUfDGncESaqB90db9IAUcz4AZAJFdd8U9zOp9ty1458rxg==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-character": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-factory-title": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-factory-title/-/micromark-factory-title-2.0.0.tgz",
"integrity": "sha512-jY8CSxmpWLOxS+t8W+FG3Xigc0RDQA9bKMY/EwILvsesiRniiVMejYTE4wumNc2f4UbAa4WsHqe3J1QS1sli+A==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-factory-space": "^2.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-factory-whitespace": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-factory-whitespace/-/micromark-factory-whitespace-2.0.0.tgz",
"integrity": "sha512-28kbwaBjc5yAI1XadbdPYHX/eDnqaUFVikLwrO7FDnKG7lpgxnvk/XGRhX/PN0mOZ+dBSZ+LgunHS+6tYQAzhA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-factory-space": "^2.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-character": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/micromark-util-character/-/micromark-util-character-2.0.1.tgz",
"integrity": "sha512-3wgnrmEAJ4T+mGXAUfMvMAbxU9RDG43XmGce4j6CwPtVxB3vfwXSZ6KhFwDzZ3mZHhmPimMAXg71veiBGzeAZw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-chunked": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-chunked/-/micromark-util-chunked-2.0.0.tgz",
"integrity": "sha512-anK8SWmNphkXdaKgz5hJvGa7l00qmcaUQoMYsBwDlSKFKjc6gjGXPDw3FNL3Nbwq5L8gE+RCbGqTw49FK5Qyvg==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-symbol": "^2.0.0"
}
},
"node_modules/micromark-util-classify-character": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-classify-character/-/micromark-util-classify-character-2.0.0.tgz",
"integrity": "sha512-S0ze2R9GH+fu41FA7pbSqNWObo/kzwf8rN/+IGlW/4tC6oACOs8B++bh+i9bVyNnwCcuksbFwsBme5OCKXCwIw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-character": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-combine-extensions": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-combine-extensions/-/micromark-util-combine-extensions-2.0.0.tgz",
"integrity": "sha512-vZZio48k7ON0fVS3CUgFatWHoKbbLTK/rT7pzpJ4Bjp5JjkZeasRfrS9wsBdDJK2cJLHMckXZdzPSSr1B8a4oQ==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-chunked": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-decode-numeric-character-reference": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/micromark-util-decode-numeric-character-reference/-/micromark-util-decode-numeric-character-reference-2.0.1.tgz",
"integrity": "sha512-bmkNc7z8Wn6kgjZmVHOX3SowGmVdhYS7yBpMnuMnPzDq/6xwVA604DuOXMZTO1lvq01g+Adfa0pE2UKGlxL1XQ==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-symbol": "^2.0.0"
}
},
"node_modules/micromark-util-decode-string": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-decode-string/-/micromark-util-decode-string-2.0.0.tgz",
"integrity": "sha512-r4Sc6leeUTn3P6gk20aFMj2ntPwn6qpDZqWvYmAG6NgvFTIlj4WtrAudLi65qYoaGdXYViXYw2pkmn7QnIFasA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"decode-named-character-reference": "^1.0.0",
"micromark-util-character": "^2.0.0",
"micromark-util-decode-numeric-character-reference": "^2.0.0",
"micromark-util-symbol": "^2.0.0"
}
},
"node_modules/micromark-util-encode": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-encode/-/micromark-util-encode-2.0.0.tgz",
"integrity": "sha512-pS+ROfCXAGLWCOc8egcBvT0kf27GoWMqtdarNfDcjb6YLuV5cM3ioG45Ys2qOVqeqSbjaKg72vU+Wby3eddPsA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
]
},
"node_modules/micromark-util-html-tag-name": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-html-tag-name/-/micromark-util-html-tag-name-2.0.0.tgz",
"integrity": "sha512-xNn4Pqkj2puRhKdKTm8t1YHC/BAjx6CEwRFXntTaRf/x16aqka6ouVoutm+QdkISTlT7e2zU7U4ZdlDLJd2Mcw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
]
},
"node_modules/micromark-util-normalize-identifier": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-normalize-identifier/-/micromark-util-normalize-identifier-2.0.0.tgz",
"integrity": "sha512-2xhYT0sfo85FMrUPtHcPo2rrp1lwbDEEzpx7jiH2xXJLqBuy4H0GgXk5ToU8IEwoROtXuL8ND0ttVa4rNqYK3w==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-symbol": "^2.0.0"
}
},
"node_modules/micromark-util-resolve-all": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-resolve-all/-/micromark-util-resolve-all-2.0.0.tgz",
"integrity": "sha512-6KU6qO7DZ7GJkaCgwBNtplXCvGkJToU86ybBAUdavvgsCiG8lSSvYxr9MhwmQ+udpzywHsl4RpGJsYWG1pDOcA==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-sanitize-uri": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-sanitize-uri/-/micromark-util-sanitize-uri-2.0.0.tgz",
"integrity": "sha512-WhYv5UEcZrbAtlsnPuChHUAsu/iBPOVaEVsntLBIdpibO0ddy8OzavZz3iL2xVvBZOpolujSliP65Kq0/7KIYw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"micromark-util-character": "^2.0.0",
"micromark-util-encode": "^2.0.0",
"micromark-util-symbol": "^2.0.0"
}
},
"node_modules/micromark-util-subtokenize": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-subtokenize/-/micromark-util-subtokenize-2.0.0.tgz",
"integrity": "sha512-vc93L1t+gpR3p8jxeVdaYlbV2jTYteDje19rNSS/H5dlhxUYll5Fy6vJ2cDwP8RnsXi818yGty1ayP55y3W6fg==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
],
"dependencies": {
"devlop": "^1.0.0",
"micromark-util-chunked": "^2.0.0",
"micromark-util-symbol": "^2.0.0",
"micromark-util-types": "^2.0.0"
}
},
"node_modules/micromark-util-symbol": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-symbol/-/micromark-util-symbol-2.0.0.tgz",
"integrity": "sha512-8JZt9ElZ5kyTnO94muPxIGS8oyElRJaiJO8EzV6ZSyGQ1Is8xwl4Q45qU5UOg+bGH4AikWziz0iN4sFLWs8PGw==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
]
},
"node_modules/micromark-util-types": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/micromark-util-types/-/micromark-util-types-2.0.0.tgz",
"integrity": "sha512-oNh6S2WMHWRZrmutsRmDDfkzKtxF+bc2VxLC9dvtrDIRFln627VsFP6fLMgTryGDljgLPjkrzQSDcPrjPyDJ5w==",
"funding": [
{
"type": "GitHub Sponsors",
"url": "https://github.com/sponsors/unifiedjs"
},
{
"type": "OpenCollective",
"url": "https://opencollective.com/unified"
}
]
},
"node_modules/micromatch": {
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.5.tgz",
@ -22142,6 +22658,18 @@
"node": ">=8"
}
},
"node_modules/unist-util-stringify-position": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/unist-util-stringify-position/-/unist-util-stringify-position-4.0.0.tgz",
"integrity": "sha512-0ASV06AAoKCDkS2+xw5RXJywruurpbC4JZSm7nr7MOt1ojAzvyyaO+UxZf18j8FCF6kmzCZKcAgN/yu2gm2XgQ==",
"dependencies": {
"@types/unist": "^3.0.0"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/unified"
}
},
"node_modules/universalify": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.0.tgz",

View File

@ -18,7 +18,6 @@
"@mantine/dropzone": "^6.0.19",
"@mantine/form": "^6.0.11",
"@mantine/prism": "^6.0.15",
"reactflow": "^11.0",
"@reactflow/background": "^11.2.0",
"@reactflow/controls": "^11.1.11",
"@reactflow/core": "^11.7.0",
@ -26,7 +25,7 @@
"@rjsf/core": "^5.7.3",
"@rjsf/utils": "^5.7.3",
"@rjsf/validator-ajv8": "^5.7.3",
"@tabler/icons-react": "^2.17.0",
"@tabler/icons-react": "^2.39.0",
"@testing-library/jest-dom": "^5.16.5",
"@testing-library/react": "^13.4.0",
"@testing-library/user-event": "^13.5.0",
@ -81,6 +80,7 @@
"react-edit-text": "^5.1.0",
"react-plotly.js": "^2.6.0",
"react-scripts": "5.0.1",
"reactflow": "^11.0",
"request": "^2.88.2",
"socket.io-client": "^4.6.1",
"stream-browserify": "^3.0.0",

chainforge/react-server/src/AiPopover.js (new file, 174 lines)
View File

@ -0,0 +1,174 @@
import React, { useMemo, useRef } from 'react';
import { Stack, NumberInput, Button, Text, TextInput, Switch, Tabs, Popover, Badge, Textarea, Alert } from "@mantine/core"
import { useState } from 'react';
import { autofill, generateAndReplace, AIError } from './backend/ai';
import { IconSparkles, IconAlertCircle } from '@tabler/icons-react';
import AlertModal from './AlertModal';
import { useStore } from './store';
const zeroGap = {gap: "0rem"};
const popoverShadow ="rgb(38, 57, 77) 0px 10px 30px -14px";
const ROW_CONSTANTS = {
"beginAutofilling": 1,
"warnIfBelow": 2,
}
function AIPopover({
// A list of strings for the Extend feature to use as a basis.
values,
// A function that takes a list of strings that the popover will call to add new values
onAddValues,
// A function that takes a list of strings that the popover will call to replace the existing values
onReplaceValues,
// A boolean that indicates whether the values are in a loading state
areValuesLoading,
// A function that takes a boolean that the popover will call to set whether the values should be loading
setValuesLoading,
// API keys to pass when querying LLMs (only those from front-end settings window)
apiKeys,
}) {
// Command Fill state
const [commandFillNumber, setCommandFillNumber] = useState(3);
const [isCommandFillLoading, setIsCommandFillLoading] = useState(false);
const [didCommandFillError, setDidCommandFillError] = useState(false);
// Generate and Replace state
const [generateAndReplaceNumber, setGenerateAndReplaceNumber] = useState(3);
const [generateAndReplacePrompt, setGenerateAndReplacePrompt] = useState('');
const [generateAndReplaceIsUnconventional, setGenerateAndReplaceIsUnconventional] = useState(false);
const [didGenerateAndReplaceError, setDidGenerateAndReplaceError] = useState(false);
// To check for OpenAI API key
const noOpenAIKeyMessage = useMemo(() => {
if (apiKeys && apiKeys['OpenAI']) return undefined;
else return (
<Alert variant="light" color="grape" title="No OpenAI API key detected." maw={200} fz='xs' icon={<IconAlertCircle />}>
You must set an OpenAI API key before you can use generative AI support features.
</Alert>
);
}, [apiKeys]);
// Alert for errors
const alertModal = useRef(null);
const nonEmptyRows = useMemo(() =>
Object.values(values).filter((row) => row !== '').length,
[values]);
const enoughRowsForSuggestions = useMemo(() =>
nonEmptyRows >= ROW_CONSTANTS.beginAutofilling,
[nonEmptyRows]);
const showWarning = useMemo(() =>
enoughRowsForSuggestions && nonEmptyRows < ROW_CONSTANTS.warnIfBelow,
[enoughRowsForSuggestions, nonEmptyRows]);
const handleCommandFill = () => {
setIsCommandFillLoading(true);
setDidCommandFillError(false);
autofill(
Object.values(values),
commandFillNumber,
apiKeys,
)
.then(onAddValues)
.catch(e => {
if (e instanceof AIError) {
setDidCommandFillError(true);
} else {
if (alertModal.current) alertModal.current.trigger(e?.message);
else console.error(e);
}
}).finally(() => setIsCommandFillLoading(false));
};
const handleGenerateAndReplace = () => {
setDidGenerateAndReplaceError(false);
setValuesLoading(true);
generateAndReplace(
generateAndReplacePrompt,
generateAndReplaceNumber,
generateAndReplaceIsUnconventional,
apiKeys,
)
.then(onReplaceValues)
.catch(e => {
if (e instanceof AIError) {
console.log(e);
setDidGenerateAndReplaceError(true);
} else {
if (alertModal.current) alertModal.current.trigger(e?.message);
else console.error(e);
}
}).finally(() => setValuesLoading(false));
};
const extendUI = useMemo(() => (
<Stack>
{didCommandFillError ?
<Text size="xs" c="red">
Failed to generate. Please try again.
</Text>
: <></>}
<NumberInput label="Items to add" mt={5} min={1} max={10} defaultValue={3} value={commandFillNumber} onChange={setCommandFillNumber}/>
{enoughRowsForSuggestions ? <></>
: <Text size="xs" c="grape" maw={200}>
You must enter at least {ROW_CONSTANTS.beginAutofilling} fields before extending.
</Text>}
{showWarning ?
<Text size="xs" c="grape" maw={200}>
You have less than {ROW_CONSTANTS.warnIfBelow} fields. You may want to add more. Adding more rows typically improves the quality of the suggestions.
</Text>
: <></>}
<Button size="sm" variant="light" color="grape" fullWidth onClick={handleCommandFill} disabled={!enoughRowsForSuggestions} loading={isCommandFillLoading}>Extend</Button>
</Stack>
), [didCommandFillError, enoughRowsForSuggestions, showWarning, isCommandFillLoading, handleCommandFill, setCommandFillNumber, commandFillNumber] );
const replaceUI = useMemo(() => (
<Stack style={zeroGap}>
{didGenerateAndReplaceError ?
<Text size="xs" c="red">
Failed to generate. Please try again.
</Text>
: <></>}
<Textarea label="Generate a list of..." data-autofocus minRows={1} maxRows={4} autosize mt={5} value={generateAndReplacePrompt} onChange={(e) => setGenerateAndReplacePrompt(e.currentTarget.value)}/>
<NumberInput label="Items to generate" size="xs" mb={10} min={1} max={10} defaultValue={3} value={generateAndReplaceNumber} onChange={setGenerateAndReplaceNumber} />
<Switch color="grape" mb={10} size="xs" label="Make outputs unconventional" value={generateAndReplaceIsUnconventional} onChange={(e) => setGenerateAndReplaceIsUnconventional(e.currentTarget.checked)}/>
<Button size="sm" variant="light" color="grape" fullWidth onClick={handleGenerateAndReplace} loading={areValuesLoading}>Replace</Button>
</Stack>
), [didGenerateAndReplaceError, generateAndReplacePrompt, setGenerateAndReplacePrompt, generateAndReplaceNumber, setGenerateAndReplaceNumber, generateAndReplaceIsUnconventional, setGenerateAndReplaceIsUnconventional, handleGenerateAndReplace, areValuesLoading]);
return (
<Popover position="right-start" withArrow shadow={popoverShadow} withinPortal keepMounted trapFocus>
<Popover.Target>
<button className="ai-button nodrag"><IconSparkles size={10} stroke={3}/></button>
</Popover.Target>
<Popover.Dropdown className="nodrag nowheel">
<Stack style={zeroGap}>
<Badge color="grape" variant="light" leftSection={<IconSparkles size={10} stroke={3}/>}>
Generative AI
</Badge>
<Tabs color="grape" defaultValue="replace">
<Tabs.List grow>
<Tabs.Tab value="replace">Replace</Tabs.Tab>
<Tabs.Tab value="extend">Extend</Tabs.Tab>
</Tabs.List>
<Tabs.Panel value="extend" pb="xs">
{noOpenAIKeyMessage ? noOpenAIKeyMessage : extendUI}
</Tabs.Panel>
<Tabs.Panel value="replace" pb="xs">
{noOpenAIKeyMessage ? noOpenAIKeyMessage : replaceUI}
</Tabs.Panel>
</Tabs>
</Stack>
</Popover.Dropdown>
<AlertModal ref={alertModal} />
</Popover>
);
}
export default AIPopover;

View File

@ -40,7 +40,7 @@ import { shallow } from 'zustand/shallow';
import useStore from './store';
import fetch_from_backend from './fetch_from_backend';
import StorageCache from './backend/cache';
import { APP_IS_RUNNING_LOCALLY } from './backend/utils';
import { APP_IS_RUNNING_LOCALLY, browserTabIsActive } from './backend/utils';
// Device / Browser detection
import { isMobile, isChrome, isFirefox, isEdgeChromium, isChromium } from 'react-device-detect';
@ -57,6 +57,7 @@ const selector = (state) => ({
setNodes: state.setNodes,
setEdges: state.setEdges,
resetLLMColors: state.resetLLMColors,
setAPIKeys: state.setAPIKeys,
});
// The initial LLM to use when new flows are created, or upon first load
@ -134,7 +135,7 @@ const App = () => {
// Get nodes, edges, etc. state from the Zustand store:
const { nodes, edges, onNodesChange, onEdgesChange,
onConnect, addNode, setNodes, setEdges, resetLLMColors } = useStore(selector, shallow);
onConnect, addNode, setNodes, setEdges, resetLLMColors, setAPIKeys } = useStore(selector, shallow);
// For saving / loading
const [rfInstance, setRfInstance] = useState(null);
@ -297,6 +298,7 @@ const App = () => {
const saveFlow = useCallback((rf_inst) => {
const rf = rf_inst || rfInstance;
if (!rf) return;
// NOTE: This currently only saves the front-end state. Cache files
// are not pulled or overwritten upon loading from localStorage.
const flow = rf.toObject();
@ -343,7 +345,9 @@ const App = () => {
// Save flow that user loaded to autosave cache, in case they refresh the browser
StorageCache.saveToLocalStorage('chainforge-flow', flow);
StorageCache.saveToLocalStorage('chainforge-state');
// Start auto-saving, if it's not already enabled
if (rf_inst) initAutosaving(rf_inst);
}
};
const autosavedFlowExists = () => {
@ -606,16 +610,50 @@ const App = () => {
}, [rfInstance, nodes, IS_RUNNING_LOCALLY, handleError, clipboard, waitingForShare]);
// Initialize auto-saving
const initAutosaving = (rf_inst) => {
if (autosavingInterval !== null) return; // autosaving interval already set
console.log("Init autosaving!");
// Autosave the flow to localStorage every minute:
const interv = setInterval(() => {
// Check the visibility of the browser tab --if it's not visible, don't autosave
if (!browserTabIsActive()) return;
// Start a timer, in case the saving takes a long time
const startTime = Date.now();
// Save the flow to localStorage
saveFlow(rf_inst);
// Check how long the save took
const duration = Date.now() - startTime;
if (duration > 1500) {
// If the operation took longer than 1.5 seconds, that's not good.
// Although this function is called async inside setInterval,
// calls to localStorage block the UI in JavaScript, freezing the screen.
// We smart-disable autosaving here when we detect it's starting to freeze the UI:
console.warn("Autosaving disabled. The time required to save to localStorage exceeds 1 second. This can happen when there's a lot of data in your flow. Make sure to export frequently to save your work.");
clearInterval(interv);
setAutosavingInterval(null);
}
}, 60000); // 60000 milliseconds = 1 minute
setAutosavingInterval(interv);
};
// Run once upon ReactFlow initialization
const onInit = (rf_inst) => {
setRfInstance(rf_inst);
// Autosave the flow to localStorage every minute:
console.log('set autosaving interval');
const interv = setInterval(() => saveFlow(rf_inst), 60000); // 60000 milliseconds = 1 minute
setAutosavingInterval(interv);
if (!IS_RUNNING_LOCALLY) {
if (IS_RUNNING_LOCALLY) {
// If we're running locally, try to fetch API keys from Python os.environ variables in the locally running Flask backend:
fetch_from_backend('fetchEnvironAPIKeys').then((api_keys) => {
setAPIKeys(api_keys);
}).catch((err) => {
// Soft fail
console.warn('Warning: Could not fetch API key environment variables from Flask server. Error:', err.message);
});
} else {
// Check if there's a shared flow UID in the URL as a GET param
// If so, we need to look it up in the database and attempt to load it:

View File

@ -1,8 +1,8 @@
import React, { useState, forwardRef, useImperativeHandle, useCallback, useEffect } from 'react';
import { TextInput, Button, Group, Box, Modal, Divider, Text, Tabs, useMantineTheme, rem, Flex, Center, Badge, Card } from '@mantine/core';
import { TextInput, Button, Group, Box, Modal, Divider, Text, Tabs, useMantineTheme, rem, Flex, Center, Badge, Card, Switch } from '@mantine/core';
import { useDisclosure } from '@mantine/hooks';
import { useForm } from '@mantine/form';
import { IconUpload, IconBrandPython, IconX } from '@tabler/icons-react';
import { IconUpload, IconBrandPython, IconX, IconSparkles } from '@tabler/icons-react';
import { Dropzone, DropzoneProps } from '@mantine/dropzone';
import useStore, { initLLMProviders } from './store';
import { APP_IS_RUNNING_LOCALLY } from './backend/utils';
@ -102,12 +102,32 @@ const CustomProviderScriptDropzone = ({onError, onSetProviders}) => {
const GlobalSettingsModal = forwardRef((props, ref) => {
const [opened, { open, close }] = useDisclosure(false);
const setAPIKeys = useStore((state) => state.setAPIKeys);
const getFlag = useStore((state) => state.getFlag);
const setFlag = useStore((state) => state.setFlag);
const AvailableLLMs = useStore((state) => state.AvailableLLMs);
const setAvailableLLMs = useStore((state) => state.setAvailableLLMs);
const nodes = useStore((state) => state.nodes);
const setDataPropsForNode = useStore((state) => state.setDataPropsForNode);
const alertModal = props?.alertModal;
const [aiSupportActive, setAISupportActive] = useState(getFlag("aiSupport"));
const handleAISupportChecked = useCallback((e) => {
const checked = e.currentTarget.checked;
setAISupportActive(checked);
setFlag("aiSupport", checked);
if (!checked) { // turn off autocomplete if AI support is not checked
setAIAutocompleteActive(false);
setFlag("aiAutocomplete", false);
}
}, [setFlag, setAISupportActive]);
const [aiAutocompleteActive, setAIAutocompleteActive] = useState(getFlag("aiAutocomplete"));
const handleAIAutocompleteChecked = useCallback((e) => {
const checked = e.currentTarget.checked;
setAIAutocompleteActive(checked);
setFlag("aiAutocomplete", checked);
}, [setFlag, setAIAutocompleteActive]);
const handleError = useCallback((msg) => {
if (alertModal && alertModal.current)
alertModal.current.trigger(msg);
@ -187,15 +207,15 @@ const GlobalSettingsModal = forwardRef((props, ref) => {
return (
<Modal keepMounted opened={opened} onClose={close} title="ChainForge Settings" closeOnClickOutside={false} style={{position: 'relative', 'left': '-5%'}}>
<Box maw={380} mx="auto">
<Box maw={400} mx="auto">
<Tabs defaultValue="api-keys">
<Tabs.List>
<Tabs.Tab value="api-keys" >API Keys</Tabs.Tab>
<Tabs.Tab value="custom-providers" >Custom Model Providers</Tabs.Tab>
<Tabs.Tab value="ai-support" >AI Support (BETA)</Tabs.Tab>
<Tabs.Tab value="custom-providers" >Custom Providers</Tabs.Tab>
</Tabs.List>
<Tabs.Panel value="api-keys" pt="xs">
<Text mb="md" fz="xs" lh={1.15} color='dimmed'>
Note: <b>We do not store your API keys</b> &mdash;not in a cookie, localStorage, or server.
@ -256,8 +276,21 @@ return (
<Button type="submit">Submit</Button>
</Group>
</form>
</Tabs.Panel>
</Tabs.Panel>
<Tabs.Panel value="ai-support" pt="xs">
<Text mb="md" fz="sm" lh={1.3}>
AI support features in ChainForge include purple sparkly buttons <IconSparkles size="10pt" /> and smart autocomplete.
By default, AI support features require OpenAI API access to call GPT3.5 and GPT4 models.
You can hide, disable, or change these features here.
</Text>
<Switch label="AI Support Features" size="sm" description="Adds purple sparkly AI buttons to nodes. Must have OpenAI API key access to use."
checked={aiSupportActive} onChange={handleAISupportChecked} />
{aiSupportActive ? <Group>
<Switch label="Autocomplete" size="sm" mt="sm" disabled={!aiSupportActive} description="Works in background to streamline generation of input data. Press Tab in TextFields Nodes in empty fields to extend input data (currently only works in TextFields). NOTE: This will make OpenAI API calls in the background. We are not responsible for any additional costs incurred."
checked={aiAutocompleteActive} onChange={handleAIAutocompleteChecked} />
</Group>: <></>}
</Tabs.Panel>
{APP_IS_RUNNING_LOCALLY() ?
<Tabs.Panel value="custom-providers" pt="md">
@ -285,6 +318,7 @@ return (
}} />
</Tabs.Panel>
: <></>}
</Tabs>
</Box>
</Modal>

View File

@ -1,19 +1,50 @@
import React, { useState, useEffect, useCallback } from 'react';
import { Text } from '@mantine/core';
import { Skeleton, Text } from '@mantine/core';
import useStore from './store';
import NodeLabel from './NodeLabelComponent'
import { IconForms } from '@tabler/icons-react';
import { Handle } from 'reactflow';
import BaseNode from './BaseNode';
import { processCSV } from "./backend/utils"
import AISuggestionsManager from './backend/aiSuggestionsManager';
import AIPopover from './AiPopover';
import { cleanEscapedBraces, escapeBraces } from './backend/template';
const replaceDoubleQuotesWithSingle = (str) => str.replaceAll('"', "'");
const wrapInQuotesIfContainsComma = (str) => str.includes(",") ? `"${str}"` : str;
const makeSafeForCSLFormat = (str) => wrapInQuotesIfContainsComma(replaceDoubleQuotesWithSingle(str));
const stripWrappingQuotes = (str) => {
if (typeof str === "string" && str.length >= 2 && str.charAt(0) === '"' && str.charAt(str.length-1) === '"')
return str.substring(1, str.length-1);
else
return str;
};
const ItemsNode = ({ data, id }) => {
const setDataPropsForNode = useStore((state) => state.setDataPropsForNode);
const pingOutputNodes = useStore((state) => state.pingOutputNodes);
const apiKeys = useStore((state) => state.apiKeys);
const flags = useStore((state) => state.flags);
const [contentDiv, setContentDiv] = useState(null);
const [isEditing, setIsEditing] = useState(true);
const [csvInput, setCsvInput] = useState(null);
const [countText, setCountText] = useState(null);
// Only if AI autocomplete is enabled.
// TODO: This is harder to implement; see https://codepen.io/2undercover/pen/oNzyYO
const [autocompletePlaceholders, setAutocompletePlaceholders] = useState([]);
// Whether text field is in a loading state
const [isLoading, setIsLoading] = useState(false);
const [aiSuggestionsManager] = useState(new AISuggestionsManager(
// Do nothing when suggestions are simply updated because we are managing the placeholder state manually here.
undefined,
// When suggestions are refreshed, revise placeholders
setAutocompletePlaceholders
));
// initializing
useEffect(() => {
@ -23,9 +54,9 @@ const ItemsNode = ({ data, id }) => {
}, []);
// Handle a change in a text fields' input.
const handleInputChange = useCallback((event) => {
const setFieldsFromText = useCallback((text_val) => {
// Update the data for this text fields' id.
let new_data = { 'text': event.target.value, 'fields': processCSV(event.target.value) };
let new_data = { text: text_val, fields: processCSV(text_val).map(stripWrappingQuotes).map(escapeBraces) };
setDataPropsForNode(id, new_data);
pingOutputNodes(id);
}, [id, pingOutputNodes, setDataPropsForNode]);
@ -56,7 +87,7 @@ const ItemsNode = ({ data, id }) => {
const html = [];
elements.forEach((e, idx) => {
// html.push(<Badge color="orange" size="lg" radius="sm">{e}</Badge>)
html.push(<span key={idx} className="csv-element">{e}</span>);
html.push(<span key={idx} className="csv-element">{cleanEscapedBraces(e)}</span>);
if (idx < elements.length - 1) {
html.push(<span key={idx + 'comma'} className="csv-comma">,</span>);
}
@ -88,14 +119,14 @@ const ItemsNode = ({ data, id }) => {
defaultValue={text_val}
placeholder='Put your comma-separated list here'
onKeyDown={handKeyDown}
onChange={handleInputChange}
onChange={(event) => setFieldsFromText(event.target.value)}
onBlur={handleOnBlur}
autoFocus={true}/>
</div>
);
setContentDiv(null);
setCountText(null);
}, [isEditing, handleInputChange, handleOnBlur, handKeyDown]);
}, [isEditing, setFieldsFromText, handleOnBlur, handKeyDown]);
// when data.text changes, update the content div
useEffect(() => {
@ -108,10 +139,25 @@ const ItemsNode = ({ data, id }) => {
return (
<BaseNode classNames="text-fields-node" nodeId={id}>
<NodeLabel title={data.title || 'Items Node'} nodeId={id} icon={<IconForms size="16px" />} />
{csvInput}
{contentDiv}
{countText ? countText : <></>}
<NodeLabel title={data.title || 'Items Node'}
nodeId={id}
icon={<IconForms size="16px" />}
customButtons={
(flags["aiSupport"] ?
[<AIPopover key='ai-popover'
values={data.fields ?? []}
onAddValues={(vals) => setFieldsFromText(data.text + ", " + vals.map(makeSafeForCSLFormat).join(", "))}
onReplaceValues={(vals) => setFieldsFromText(vals.map(makeSafeForCSLFormat).join(", "))}
areValuesLoading={isLoading}
setValuesLoading={setIsLoading}
apiKeys={apiKeys} />]
: [])
} />
<Skeleton visible={isLoading}>
{csvInput}
{contentDiv}
{countText}
</Skeleton>
<Handle
type="source"
position="right"

View File

@ -6,7 +6,9 @@ import StatusIndicator from './StatusIndicatorComponent';
import AlertModal from './AlertModal';
import AreYouSureModal from './AreYouSureModal';
import { useState, useEffect, useCallback} from 'react';
import { Tooltip } from '@mantine/core';
import { Tooltip, Popover, Badge, Stack } from '@mantine/core';
import { IconSparkles } from '@tabler/icons-react';
export default function NodeLabel({ title, nodeId, icon, onEdit, onSave, editable, status, alertModal, customButtons, handleRunClick, handleRunHover, runButtonTooltip }) {
const setDataPropsForNode = useStore((state) => state.setDataPropsForNode);

View File

@ -1,12 +1,15 @@
import React, { useState, useRef, useEffect, useCallback } from 'react';
import React, { useState, useRef, useEffect, useCallback, useMemo } from 'react';
import { Handle } from 'reactflow';
import { Textarea, Tooltip } from '@mantine/core';
import { Textarea, Tooltip, Skeleton } from '@mantine/core';
import { IconTextPlus, IconEye, IconEyeOff } from '@tabler/icons-react';
import useStore from './store';
import NodeLabel from './NodeLabelComponent';
import TemplateHooks, { extractBracketedSubstrings } from './TemplateHooksComponent';
import BaseNode from './BaseNode';
import { setsAreEqual } from './backend/utils';
import AIPopover from './AiPopover';
import AISuggestionsManager from './backend/aiSuggestionsManager';
import DefaultDict from './backend/defaultdict';
// Helper funcs
const union = (setA, setB) => {
@ -25,19 +28,35 @@ const TextFieldsNode = ({ data, id }) => {
const [templateVars, setTemplateVars] = useState(data.vars || []);
const setDataPropsForNode = useStore((state) => state.setDataPropsForNode);
const pingOutputNodes = useStore((state) => state.pingOutputNodes);
const apiKeys = useStore((state) => state.apiKeys);
const flags = useStore((state) => state.flags);
const [textfieldsValues, setTextfieldsValues] = useState(data.fields || {});
const [fieldVisibility, setFieldVisibility] = useState(data.fields_visibility || {});
const getUID = useCallback(() => {
if (textfieldsValues) {
return 'f' + (1 + Object.keys(textfieldsValues).reduce((acc, key) => (
// Whether the text fields should be in a loading state
const [isLoading, setIsLoading] = useState(false);
const [aiSuggestionsManager] = useState(new AISuggestionsManager(
// Do nothing when suggestions are simply updated because we are managing the placeholder state manually here.
undefined,
// When suggestions are refreshed, throw out existing placeholders.
() => setPlaceholders({}),
() => apiKeys,
));
// Placeholders to show in the textareas. Object keyed by textarea index.
let [placeholders, setPlaceholders] = useState({});
const getUID = useCallback((textFields) => {
if (textFields) {
return 'f' + (1 + Object.keys(textFields).reduce((acc, key) => (
Math.max(acc, parseInt(key.slice(1)))
), 0)).toString();
} else {
return 'f0';
}
}, [textfieldsValues]);
}, []);
// Handle delete text field.
const handleDelete = useCallback((event) => {
@ -49,7 +68,7 @@ const TextFieldsNode = ({ data, id }) => {
delete new_vis[item_id];
// if the new_data is empty, initialize it with one empty field
if (Object.keys(new_fields).length === 0) {
new_fields[getUID()] = "";
new_fields[getUID(textfieldsValues)] = "";
}
setTextfieldsValues(new_fields);
setFieldVisibility(new_vis);
@ -61,7 +80,7 @@ const TextFieldsNode = ({ data, id }) => {
useEffect(() => {
if (!textfieldsValues || Object.keys(textfieldsValues).length === 0) {
let init_fields = {};
init_fields[getUID()] = "";
init_fields[getUID(textfieldsValues)] = "";
setTextfieldsValues(init_fields);
setDataPropsForNode(id, { fields: init_fields });
}
@ -70,11 +89,18 @@ const TextFieldsNode = ({ data, id }) => {
// Add a text field
const handleAddField = useCallback(() => {
let new_fields = {...textfieldsValues};
new_fields[getUID()] = "";
new_fields[getUID(textfieldsValues)] = "";
setTextfieldsValues(new_fields);
setDataPropsForNode(id, { fields: new_fields });
pingOutputNodes(id);
}, [textfieldsValues, id, setDataPropsForNode, pingOutputNodes]);
// Cycle suggestions when new field is created
//aiSuggestionsManager.cycleSuggestions();
// Ping AI suggestions to generate autocomplete options
if (flags["aiAutocomplete"])
aiSuggestionsManager.update(Object.values(textfieldsValues));
}, [textfieldsValues, id, flags, setDataPropsForNode, pingOutputNodes]);
// Disable/hide a text field temporarily
const handleDisableField = useCallback((field_id) => {
@ -85,17 +111,7 @@ const TextFieldsNode = ({ data, id }) => {
pingOutputNodes(id);
}, [fieldVisibility, setDataPropsForNode, pingOutputNodes]);
// Save the state of a textfield when it changes and update hooks
const handleTextFieldChange = useCallback((field_id, val) => {
// Update the value of the controlled Textarea component
let new_fields = {...textfieldsValues};
new_fields[field_id] = val;
setTextfieldsValues(new_fields);
// Update the data for the ReactFlow node
let new_data = { 'fields': new_fields };
const updateTemplateVars = useCallback((new_data) => {
// TODO: Optimize this check.
let all_found_vars = new Set();
const new_field_ids = Object.keys(new_data.fields);
@ -111,13 +127,25 @@ const TextFieldsNode = ({ data, id }) => {
if (!setsAreEqual(all_found_vars, past_vars)) {
const new_vars_arr = Array.from(all_found_vars);
new_data.vars = new_vars_arr;
setTemplateVars(new_vars_arr);
}
return new_data;
}, [templateVars]);
// Save the state of a textfield when it changes and update hooks
const handleTextFieldChange = useCallback((field_id, val) => {
// Update the value of the controlled Textarea component
let new_fields = {...textfieldsValues};
new_fields[field_id] = val;
setTextfieldsValues(new_fields);
// Update the data for the ReactFlow node
let new_data = updateTemplateVars({ 'fields': new_fields });
if (new_data.vars) setTemplateVars(new_data.vars);
setDataPropsForNode(id, new_data);
pingOutputNodes(id);
}, [textfieldsValues, templateVars, id]);
}, [textfieldsValues, templateVars, updateTemplateVars, id]);
// Dynamically update the textareas and position of the template hooks
const ref = useRef(null);
@ -154,37 +182,120 @@ const TextFieldsNode = ({ data, id }) => {
}
}, [data, id, pingOutputNodes]);
// Handle keydown events for the text fields
function handleTextAreaKeyDown(event, placeholder, textareaIndex) {
// Insert the AI suggested text if:
// (1) the user presses the Tab key
// (2) the user has not typed anything in the textarea
// (3) the suggestions are loaded
if (
event.key === 'Tab' &&
textfieldsValues[textareaIndex] === '' &&
!aiSuggestionsManager.areSuggestionsLoading()
) {
event.preventDefault();
// Insert the suggestion corresponding to the text field that was tabbed into by index.
aiSuggestionsManager.removeSuggestion(placeholder);
handleTextFieldChange(textareaIndex, placeholder);
}
};
// Add the entire list of `fields` to `textfieldsValues`
function addMultipleFields(fields) {
// Unpack the object to force a re-render
const buffer = {...textfieldsValues};
for (const field of fields) {
const uid = getUID(buffer);
buffer[uid] = field;
}
setTextfieldsValues(buffer);
setDataPropsForNode(id, { fields: buffer });
pingOutputNodes(id);
}
// Replace the entirety of `textfieldsValues` with the given `fields`
function replaceFields(fields) {
const buffer = {};
for (const field of fields) {
const uid = getUID(buffer);
buffer[uid] = field;
}
setTextfieldsValues(buffer);
let new_data = updateTemplateVars({ 'fields': buffer });
if (new_data.vars) setTemplateVars(new_data.vars);
setDataPropsForNode(id, { fields: buffer });
pingOutputNodes(id);
}
// Whether a placeholder is needed for the text field with id `i`.
function placeholderNeeded(i) {
return !textfieldsValues[i] && !placeholders[i] && flags["aiAutocomplete"];
}
// Load a placeholder into placeholders for the text field with id `i` if needed.
function loadPlaceholderIfNeeded(i) {
if (placeholderNeeded(i) && !aiSuggestionsManager.areSuggestionsLoading()) {
placeholders[i] = aiSuggestionsManager.popSuggestion();
}
}
// Cache the rendering of the text fields.
const textFields = useMemo(() =>
Object.keys(textfieldsValues).map(i => {
loadPlaceholderIfNeeded(i);
const placeholder = placeholders[i];
return (
<div className="input-field" key={i}>
<Textarea id={i} name={i}
className="text-field-fixed nodrag nowheel"
autosize
minRows="2"
maxRows="8"
value={textfieldsValues[i]}
placeholder={flags["aiAutocomplete"] ? placeholder : undefined}
disabled={fieldVisibility[i] === false}
onChange={(event) => handleTextFieldChange(i, event.currentTarget.value)}
onKeyDown={(event) => handleTextAreaKeyDown(event, placeholder, i)} />
{Object.keys(textfieldsValues).length > 1 ? (
<div style={{display: 'flex', flexDirection: 'column'}}>
<Tooltip label='remove field' position='right' withArrow arrowSize={10} withinPortal>
<button id={delButtonId + i} className="remove-text-field-btn nodrag" onClick={handleDelete} style={{flex: 1}}>X</button>
</Tooltip>
<Tooltip label={(fieldVisibility[i] === false ? 'enable' : 'disable') + ' field'} position='right' withArrow arrowSize={10} withinPortal>
<button id={visibleButtonId + i} className="remove-text-field-btn nodrag" onClick={() => handleDisableField(i)} style={{flex: 1}}>
{fieldVisibility[i] === false ?
<IconEyeOff size='14pt' pointerEvents='none' />
: <IconEye size='14pt' pointerEvents='none' />
}
</button>
</Tooltip>
</div>
) : <></>}
</div>)}),
// Update the text fields only when their values or their placeholders change.
[textfieldsValues, placeholders]);
return (
<BaseNode classNames="text-fields-node" nodeId={id}>
<NodeLabel title={data.title || 'TextFields Node'} nodeId={id} icon={<IconTextPlus size="16px" />} />
<div ref={setRef}>
{Object.keys(textfieldsValues).map(i => (
<div className="input-field" key={i}>
<Textarea id={i} name={i}
className="text-field-fixed nodrag nowheel"
autosize
minRows="2"
maxRows="8"
value={textfieldsValues[i]}
disabled={fieldVisibility[i] === false}
onChange={(event) => handleTextFieldChange(i, event.currentTarget.value)} />
{Object.keys(textfieldsValues).length > 1 ? (
<div style={{display: 'flex', flexDirection: 'column'}}>
<Tooltip label='remove field' position='right' withArrow arrowSize={10} withinPortal>
<button id={delButtonId + i} className="remove-text-field-btn nodrag" onClick={handleDelete} style={{flex: 1}}>X</button>
</Tooltip>
<Tooltip label={(fieldVisibility[i] === false ? 'enable' : 'disable') + ' field'} position='right' withArrow arrowSize={10} withinPortal>
<button id={visibleButtonId + i} className="remove-text-field-btn nodrag" onClick={() => handleDisableField(i)} style={{flex: 1}}>
{fieldVisibility[i] === false ?
<IconEyeOff size='14pt' pointerEvents='none' />
: <IconEye size='14pt' pointerEvents='none' />
}
</button>
</Tooltip>
</div>
) : <></>}
</div>))}
</div>
<NodeLabel title={data.title || 'TextFields Node'}
nodeId={id}
icon={<IconTextPlus size="16px" />}
customButtons={
(flags["aiSupport"] ?
[<AIPopover key='ai-popover'
values={textfieldsValues}
onAddValues={addMultipleFields}
onReplaceValues={replaceFields}
areValuesLoading={isLoading}
setValuesLoading={setIsLoading}
apiKeys={apiKeys} />]
: [])
} />
<Skeleton visible={isLoading}>
<div ref={setRef}>
{textFields}
</div>
</Skeleton>
<Handle
type="source"
position="right"

View File

@ -0,0 +1,25 @@
import { autofill, generateAndReplace } from "../ai";
describe("autofill", () => {
it("should return an array of n rows", async () => {
const input = ["1", "2", "3", "4", "5"];
const n = 3;
const result = await autofill(input, n);
expect(result).toHaveLength(n);
result.forEach((row) => {
expect(typeof row).toBe("string");
});
});
});
describe("generateAndReplace", () => {
it("should return an array of n rows", async () => {
const prompt = "animals";
const n = 3;
const result = await generateAndReplace(prompt, n);
expect(result).toHaveLength(n);
result.forEach((row) => {
expect(typeof row).toBe("string");
});
});
});

View File

@ -0,0 +1,82 @@
import AISuggestionsManager from "../aiSuggestionsManager";
import { Row } from "../ai";
import * as AI from "../ai";
describe("AISuggestionsManager", () => {
let suggestionsManager: AISuggestionsManager;
let mockRows: string[];
beforeEach(() => {
suggestionsManager = new AISuggestionsManager((suggestions: Row[]) => {});
mockRows = [
'one',
'two',
'three',
];
});
describe("update", () => {
it("should clear suggestions if necessary", () => {
jest.useFakeTimers()
suggestionsManager.suggestions = [...mockRows];
suggestionsManager.update(["one", "", ""]);
jest.runAllTimers();
expect(suggestionsManager.suggestions).toEqual([]);
});
});
describe("peekSuggestions", () => {
it("should return the current suggestions", () => {
suggestionsManager.suggestions = [...mockRows];
expect(suggestionsManager.peekSuggestions()).toEqual(mockRows);
});
});
describe("popSuggestion", () => {
it("should return and remove the first suggestion by default", () => {
suggestionsManager.suggestions = [...mockRows];
const firstSuggestion = mockRows[0];
expect(suggestionsManager.popSuggestion()).toEqual(firstSuggestion);
expect(suggestionsManager.suggestions).toEqual(mockRows.slice(1));
});
it("should return and remove the suggestion at the given index", () => {
suggestionsManager.suggestions = [...mockRows];
const secondSuggestion = mockRows[1];
expect(suggestionsManager.popSuggestion(1)).toEqual(secondSuggestion);
expect(suggestionsManager.suggestions).toEqual(
mockRows.slice(0, 1).concat(mockRows.slice(2))
);
});
});
describe("removeSuggestion", () => {
it("should remove the given suggestion", () => {
suggestionsManager.suggestions = [...mockRows];
const secondSuggestion = mockRows[1];
suggestionsManager.removeSuggestion(secondSuggestion);
expect(suggestionsManager.suggestions).toEqual(
mockRows.slice(0, 1).concat(mockRows.slice(2))
);
});
});
describe("areSuggestionsLoading", () => {
it("should return the current loading state", () => {
expect(suggestionsManager.areSuggestionsLoading()).toBe(false);
suggestionsManager.isLoading = true;
expect(suggestionsManager.areSuggestionsLoading()).toBe(true);
});
});
describe("cycleSuggestions", () => {
it("should deterministically reorder the suggestions", () => {
suggestionsManager.suggestions = [...mockRows];
expect(suggestionsManager.peekSuggestions()).toEqual(mockRows);
suggestionsManager.cycleSuggestions();
// Expect a recombination: not equal but set-equal
expect(suggestionsManager.peekSuggestions()).not.toEqual(mockRows);
expect(new Set(suggestionsManager.peekSuggestions())).toEqual(new Set(mockRows));
});
});
});

View File

@ -0,0 +1,21 @@
import DefaultDict from '../defaultdict';
describe('DefaultDict', () => {
it('should return the default value when a key is not found', () => {
const dict = new DefaultDict(() => 0);
expect(dict['a']).toBe(0);
expect(dict['b']).toBe(0);
});
it('should return the value set for a key', () => {
const dict = new DefaultDict(() => 0);
dict['a'] = 1;
expect(dict['a']).toBe(1);
});
it('should return the default value for a key set to null', () => {
const dict = new DefaultDict(() => 0);
dict['a'] = null;
expect(dict['a']).toBe(0);
});
});

View File

@ -0,0 +1,107 @@
import { union, isSubset, isExtension, isExtensionIgnoreEmpty, isEqual } from "../setUtils";
describe("setUtils", () => {
describe("isEqual", () => {
it("returns true if two sets are equal", () => {
const setA = new Set([1, 2, 3]);
const setB = new Set([1, 2, 3]);
expect(isEqual(setA, setB)).toBe(true);
});
it("returns false if two sets are not equal", () => {
const setA = new Set([1, 2, 3]);
const setB = new Set([1, 2, 4]);
expect(isEqual(setA, setB)).toBe(false);
});
});
describe("union", () => {
it("returns the union of two sets", () => {
const setA = new Set([1, 2, 3]);
const setB = new Set([2, 3, 4]);
const expected = new Set([1, 2, 3, 4]);
expect(union(setA, setB)).toEqual(expected);
});
});
describe("isSubset", () => {
it("returns true if A is a subset of B", () => {
const setA = new Set([1, 2]);
const setB = new Set([1, 2, 3]);
expect(isSubset(setA, setB)).toBe(true);
});
it("returns false if A is not a subset of B", () => {
const setA = new Set([1, 2, 3]);
const setB = new Set([1, 2]);
expect(isSubset(setA, setB)).toBe(false);
});
});
describe("isExtension", () => {
it("returns true if A is an extension of B and C", () => {
const setA = new Set([1, 2, 3]);
const setB = new Set([1, 2]);
const setC = new Set([3, 4]);
expect(isExtension(setA, setB, setC)).toBe(true);
});
it("returns false if A is not an extension of B and C", () => {
const setA = new Set([1, 3, 4]);
const setB = new Set([1, 2]);
const setC = new Set([3, 4]);
expect(isExtension(setA, setB, setC)).toBe(false);
});
});
describe("isExtensionIgnoreEmpty", () => {
it("returns true if A is an extension of B and C, ignoring empty strings", () => {
const setA = ["", "1", "2", "", "3"];
const setB = ["", "1", "2", ""];
const setC = ["3", "4", ""];
expect(isExtensionIgnoreEmpty(setA, setB, setC)).toBe(true);
});
it("returns false if A is not an extension of B and C, ignoring empty strings", () => {
const setA = ["", "1", "3", "", "4"];
const setB = ["", "1", "2", ""];
const setC = ["3", "4", ""];
expect(isExtensionIgnoreEmpty(setA, setB, setC)).toBe(false);
});
it(("return true on this real-life color example"), () => {
const setA = [
"Red",
"Sky Blue",
"Deep Purple",
"Sunshine Yellow",
"Midnight Black",
"Emerald Green",
"Electric Pink",
"Arctic White",
"",
""
];
const setB = [
"Red",
"Sky Blue",
"Deep Purple",
"Sunshine Yellow",
"Midnight Black",
"Emerald Green",
"Electric Pink",
"",
"",
""
];
const setC = [
"Arctic White",
"Ocean Blue",
"Fiery Orange",
"Lavender Purple",
"Goldenrod Yellow"
];
expect(isExtensionIgnoreEmpty(setA, setB, setC)).toBe(true);
});
});
});

View File

@ -0,0 +1,210 @@
/**
* Business logic for the AI-generated features.
*/
import { ConsoleView } from "react-device-detect";
import { queryLLM } from "./backend";
import { StringTemplate, escapeBraces, containsSameTemplateVariables } from "./template";
import { ChatHistoryInfo, Dict } from "./typing";
import { fromMarkdown } from "mdast-util-from-markdown";
export class AIError extends Error {
constructor(message: string) {
super(message);
this.name = "AIError";
}
}
// Inputs and outputs of autofill are both rows of strings.
export type Row = string;
// LLM to use for AI features.
const LLM = "gpt-3.5-turbo";
/**
* Flattens markdown AST to text
*/
function compileTextFromMdAST(md: Dict): string {
if (md?.type === "text")
return md.value ?? "";
else if (md?.children?.length > 0)
return md.children.map(compileTextFromMdAST).join("\n");
return "";
}
/**
* Removes trailing quotation marks
*/
function trimQuotationMarks(s: string): string {
if (s.length <= 1) return s;
const [c0, c1] = [s.charAt(0), s.charAt(s.length-1)];
if ((c0 === '"' && c1 === '"') || (c0 === "'" && c1 === "'"))
return s.slice(1, s.length-1);
return s;
}
/**
* Converts any double-brace variables like {{this}} to single-braces, like {this}
*/
function convertDoubleToSingleBraces(s: string): string {
// Use a regular expression to find all double-brace template variables
const regex = /{{(.*?)}}/g;
// Replace each double-brace variable with single braces
return s.replace(regex, '{$1}');
}
/**
* A message to instruct the LLM to handle template variables properly, mentioning the given variables.
*/
function templateVariableMessage(vars: string[]): string {
const stringed = vars.map(v => `{${v}}`).join(", ") ?? "";
const varMessage = vars.length > 0 ? `Each item must use all of these variables: ${stringed}` : "";
return `Your output is a template in Jinja format, with single braces {} around the masked variables. ${varMessage}`;
}
/**
* Generate the system message used for autofilling.
* @param n number of rows to generate
*/
function autofillSystemMessage(n: number, templateVariables?: string[]): string {
return `Here is a list of commands or items. Say what the pattern seems to be in a single sentence. Then, generate ${n} more commands or items following the pattern, as an unordered markdown list. ${templateVariables && templateVariables.length > 0 ? templateVariableMessage(templateVariables) : ""}`;
}
/**
* Generate the system message used for generate and replace (GAR).
*/
function GARSystemMessage(n: number, creative?: boolean, generatePrompts?: boolean): string {
return `Generate a list of exactly ${n} items. Format your response as an unordered markdown list using "-". Do not ever repeat anything. ${creative ? "Be unconventional with your outputs." : ""} ${generatePrompts ? "Your outputs should be commands that can be given to an AI chat assistant." : ""} If the user has specified items or inputs to their command, generate a template in Jinja format, with single braces {} around the masked variables.`;
}
/**
* Returns a string representing the given rows as a markdown list
* @param rows to encode
*/
function encode(rows: Row[]): string {
return escapeBraces(rows.map(row => `- ${row}`).join('\n'));
}
/**
* Returns a list of items that appears in the given markdown text. Throws an AIError if the string is not in markdown list format.
* @param mdText raw text to decode (in markdown format)
*/
function decode(mdText: string): Row[] {
let result: Row[] = [];
// Parse string as markdown
const md = fromMarkdown(mdText);
if (md?.children.length > 0 && md.children.some(c => c.type === 'list')) {
// Find the first list that appears in the markdown text, if any
const md_list = md.children.filter(c => c.type === 'list')[0];
// Extract and iterate over the list items, converting them to text
const md_list_items = "children" in md_list ? md_list.children : [];
for (const item of md_list_items) {
const text = trimQuotationMarks(
compileTextFromMdAST(item).trim());
if (text && text.length > 0)
result.push(text);
}
}
if (result.length === 0)
throw new AIError(`Failed to decode output: ${mdText}`);
// Convert any double-brace template variables to single-braces:
result = result.map(convertDoubleToSingleBraces);
return result;
}
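// A minimal round-trip sketch (illustrative values, not from this diff), assuming the
// escapeBraces and trimQuotationMarks helpers behave as defined above:
//   encode(["alpha", "beta {x}"])       // => "- alpha\n- beta \{x\}"
//   decode('- "alpha"\n- {{x}} beta')   // => ["alpha", "{x} beta"]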
/**
* Uses an LLM to interpret the pattern in the given rows and return new rows following that pattern.
* @param input rows for the autofilling system
* @param n number of results to return
*/
export async function autofill(input: Row[], n: number, apiKeys?: Dict): Promise<Row[]> {
// Serialize the arguments to get a unique id
let id = JSON.stringify([input, n]);
let encoded = encode(input);
let templateVariables = [...new Set(new StringTemplate(input.join('\n')).get_vars())];
console.log("System message: ", autofillSystemMessage(n, templateVariables));
let history: ChatHistoryInfo[] = [{
messages: [{
"role": "system",
"content": autofillSystemMessage(n, templateVariables)
}],
fill_history: {},
}];
let result = await queryLLM(
/*id=*/ id,
/*llm=*/ LLM,
/*n=*/ 1,
/*prompt=*/ encoded,
/*vars=*/ {},
/*chat_history=*/ history,
/*api_keys=*/ apiKeys,
/*no_cache=*/ true);
if (result.errors && Object.keys(result.errors).length > 0)
throw new Error(Object.values(result.errors)[0].toString());
const output = result.responses[0].responses[0];
console.log("LLM said: ", output);
if (!containsSameTemplateVariables(input.join('\n'), output))
throw new AIError(`Generated output does not use template variables properly with respect to the input. Output: ${output}`);
const new_items = decode(output);
return new_items.slice(0, n);
}
/**
* Uses an LLM to generate `n` new rows based on the pattern explained in `prompt`.
* @param prompt
* @param n
* @param templateVariables list of template variables to use
*/
export async function generateAndReplace(prompt: string, n: number, creative?: boolean, apiKeys?: Dict): Promise<Row[]> {
// Serialize the arguments to get a unique id
let id = JSON.stringify([prompt, n]);
// True if `prompt` contains the word 'prompt'
let generatePrompts = prompt.toLowerCase().includes('prompt');
let history: ChatHistoryInfo[] = [{
messages: [{
"role": "system",
"content": GARSystemMessage(n, creative, generatePrompts),
}],
fill_history: {},
}];
let input = `Generate a list of ${escapeBraces(prompt)}`;
const result = await queryLLM(
/*id=*/ id,
/*llm=*/ LLM,
/*n=*/ 1,
/*prompt=*/ input,
/*vars=*/ {},
/*chat_history=*/ history,
/*api_keys=*/ apiKeys,
/*no_cache=*/ true);
if (result.errors && Object.keys(result.errors).length > 0)
throw new Error(Object.values(result.errors)[0].toString());
console.log("LLM said: ", result.responses[0].responses[0]);
const new_items = decode(result.responses[0].responses[0]);
return new_items.slice(0, n);
}

View File

@ -0,0 +1,178 @@
import { Row, autofill, AIError } from "./ai";
import { debounce } from "lodash";
import { isExtensionIgnoreEmpty } from "./setUtils";
import { Dict } from "./typing";
const DEBOUNCE_MILLISECONDS = 1000;
const MIN_ROWS_FOR_SUGGESTIONS = 1;
const NUM_SUGGESTIONS_TO_CACHE = 5;
/**
* Helper Functions
*/
// Returns whether there are enough non-empty rows to generate suggestions.
function enoughRows(rows: Row[]): boolean {
return rows.filter(row => row !== "").length >= MIN_ROWS_FOR_SUGGESTIONS;
}
// Returns whether suggestions should be completely cleared.
function shouldClearSuggestions(rows: Row[]): boolean {
// If there aren't enough rows to generate suggestions, clear.
return !enoughRows(rows);
}
// Consumes AI errors but throws other errors up.
function consumeAIErrors(e: Error) {
if (e instanceof AIError) {
console.log('Encountered but suppressed an error while generating suggestions:', e);
} else {
throw e;
}
}
/**
* Holds a cache of suggestions generated by AI.
*/
class AISuggestionsManager {
// The values that the current suggestions are based on.
base: Row[] = [];
// A cache of suggestions.
suggestions: Row[] = [];
// Suggestions that should now be in the base if the user accepts the suggestions.
expectedSuggestions: Row[] = [];
// Callback to call when the suggestions change.
onSuggestionsChanged: (suggestions: Row[]) => void;
// Callback to call when the suggestions are completely refreshed.
onSuggestionsRefreshed: (suggestions: Row[]) => void;
// Fetches API keys from front-end
getAPIKeys: () => Dict;
// Whether the suggestions are loading.
isLoading: boolean = false;
constructor(
onSuggestionsChanged?: (suggestions: Row[]) => void,
onSuggestionsRefreshed?: (suggestions: Row[]) => void,
getAPIKeys?: () => Dict,
) {
this.onSuggestionsChanged = onSuggestionsChanged
? onSuggestionsChanged
: () => {};
this.onSuggestionsRefreshed = onSuggestionsRefreshed
? onSuggestionsRefreshed
: () => {};
this.getAPIKeys = getAPIKeys;
}
/**
* Private Functions
*/
// Helper to set the suggestions and previousSuggestions together and notify the callback.
private setSuggestions(suggestions: Row[]) {
this.suggestions = suggestions;
this.onSuggestionsChanged(this.suggestions);
}
// Returns whether suggestions should be updated based on the current state and the new base.
private shouldUpdateSuggestions(newBase: Row[]): boolean {
// If there are no more suggestions, always update.
if (this.suggestions.length === 0) return true;
// Otherwise, update if all of the following are true:
// (1) Suggestions aren't already loading.
// (2) There are enough rows to generate suggestions.
// (3) The new base is different from the old base.
// (4) The new base isn't an "extension" of the old base.
if (
!this.isLoading &&
enoughRows(newBase) &&
this.base !== newBase &&
!isExtensionIgnoreEmpty(newBase, this.base, this.expectedSuggestions)
) {
return true;
}
return false;
}
// Clears the suggestions.
private clearSuggestions() {
this.setSuggestions([]);
}
// Updates the suggestions by querying the LLM.
private updateSuggestions() {
this.isLoading = true;
// Query LLM.
autofill(this.base, NUM_SUGGESTIONS_TO_CACHE, this.getAPIKeys())
// Update suggestions.
.then((suggestions) => {
this.setSuggestions(suggestions);
this.expectedSuggestions = suggestions;
this.onSuggestionsRefreshed(this.suggestions);
})
.catch(consumeAIErrors)
.finally(() => {
this.isLoading = false;
});
}
/**
* Public API
*/
// Update what the suggestions are based on. Debounced.
update: (newBase: Row[]) => void
= debounce((newBase) => {
// Clear suggestions if necessary.
if (shouldClearSuggestions(newBase)) {
this.clearSuggestions();
return;
}
// Update suggestions if necessary.
if (this.shouldUpdateSuggestions(newBase)) {
this.base = newBase;
this.updateSuggestions();
}
// If the new base is an extension of the old base, update the base to reflect the extension.
if (isExtensionIgnoreEmpty(newBase, this.base, this.expectedSuggestions)) {
this.base = newBase;
}
}, DEBOUNCE_MILLISECONDS);
// Returns the suggestions.
peekSuggestions(): Row[] {
return this.suggestions;
}
// Returns the suggestion and removes it from the list. Defaults to the first one if no index.
popSuggestion(index?: number): Row {
const i = index ? index : 0;
const popped = this.suggestions[i];
const leftHalf = this.suggestions.slice(0, i);
const rightHalf = this.suggestions.slice(i + 1);
this.setSuggestions(leftHalf.concat(rightHalf));
return popped;
}
// Removes a suggestion from the list.
removeSuggestion(suggestion: Row): void {
const i = this.suggestions.indexOf(suggestion);
this.popSuggestion(i);
}
// Returns whether suggestions are loading.
areSuggestionsLoading(): boolean {
return this.isLoading;
}
// Deterministically reorders the list of suggestions
cycleSuggestions(): void {
// Move the current suggestion to the end of the list
const first = this.suggestions[0];
const rest = this.suggestions.slice(1);
this.setSuggestions(rest.concat([first]));
}
}
export default AISuggestionsManager;
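// A minimal usage sketch (illustrative; setPlaceholders and apiKeys are hypothetical names)
// showing how a node might wire the manager in, based on the public API above:
//   const manager = new AISuggestionsManager(
//     (suggestions) => setPlaceholders(suggestions),   // onSuggestionsChanged
//     undefined,                                       // onSuggestionsRefreshed
//     () => apiKeys);                                  // getAPIKeys
//   manager.update(Object.values(textfieldsValues));   // debounced; may query the LLM
//   const next = manager.areSuggestionsLoading() ? undefined : manager.popSuggestion();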

View File

@ -1,8 +1,8 @@
import markdownIt from "markdown-it";
import { Dict, StringDict, LLMResponseError, LLMResponseObject, StandardizedLLMResponse, ChatHistoryInfo, isEqualChatHistory } from "./typing";
import { LLM, NativeLLM, getEnumName } from "./models";
import { APP_IS_RUNNING_LOCALLY, set_api_keys, FLASK_BASE_URL, call_flask_backend, filterDict, deepcopy } from "./utils";
import { LLM, getEnumName } from "./models";
import { APP_IS_RUNNING_LOCALLY, set_api_keys, FLASK_BASE_URL, call_flask_backend } from "./utils";
import StorageCache from "./cache";
import { PromptPipeline } from "./query";
import { PromptPermutationGenerator, PromptTemplate } from "./template";
@ -137,17 +137,6 @@ function get_cache_keys_related_to_id(cache_id: string, include_basefile: boolea
}
async function setAPIKeys(api_keys: StringDict): Promise<void> {
if (APP_IS_RUNNING_LOCALLY()) {
// Try to fetch API keys from os.environ variables in the locally running Flask backend:
try {
const api_keys = await fetchEnvironAPIKeys();
set_api_keys(api_keys);
} catch (err) {
console.warn('Warning: Could not fetch API key environment variables from Flask server. Error:', err.message);
// Soft fail
}
}
if (api_keys !== undefined)
set_api_keys(api_keys);
}
@ -583,12 +572,16 @@ export async function queryLLM(id: string,
llm = llm as (Array<string> | Array<Dict>);
await setAPIKeys(api_keys);
if (api_keys !== undefined)
set_api_keys(api_keys);
// Get the storage keys of any cache files for specific models + settings
const llms = llm;
let cache: Dict = StorageCache.get(`${id}.json`) || {}; // returns {} if 'id' is not in the storage cache yet
// Ignore cache if no_cache is present
if (no_cache) cache = {};
let llm_to_cache_filename = {};
let past_cache_files = {};
if (typeof cache === 'object' && cache.cache_files !== undefined) {
@ -623,7 +616,8 @@ export async function queryLLM(id: string,
}
// Store the overall cache file for this id:
StorageCache.store(`${id}.json`, cache);
if (!no_cache)
StorageCache.store(`${id}.json`, cache);
// Create a Proxy object to 'listen' for changes to a variable (see https://stackoverflow.com/a/50862441)
// and then stream those changes back to a provided callback used to update progress bars.
@ -665,7 +659,7 @@ export async function queryLLM(id: string,
// Create an object to query the LLM, passing a storage key for cache'ing responses
const cache_filepath = llm_to_cache_filename[llm_key];
const prompter = new PromptPipeline(prompt, cache_filepath);
const prompter = new PromptPipeline(prompt, no_cache ? undefined : cache_filepath);
// Prompt the LLM with all permutations of the input prompt template:
// NOTE: If the responses are already cache'd, this just loads them (no LLM is queried, saving $$$)
@ -760,10 +754,11 @@ export async function queryLLM(id: string,
cache_filenames[filename] = llm_spec;
});
StorageCache.store(`${id}.json`, {
cache_files: cache_filenames,
responses_last_run: res,
});
if (!no_cache)
StorageCache.store(`${id}.json`, {
cache_files: cache_filenames,
responses_last_run: res,
});
// Return all responses for all LLMs
return {
@ -927,7 +922,8 @@ export async function evalWithLLM(id: string,
response_ids = [ response_ids ];
response_ids = response_ids as Array<string>;
if (api_keys) setAPIKeys(api_keys);
if (api_keys !== undefined)
set_api_keys(api_keys);
// Load all responses with the given ID:
let all_evald_responses: StandardizedLLMResponse[] = [];

View File

@ -0,0 +1,19 @@
/**
* A dictionary that uses a default value when a key is not found.
*
* Example usage:
* const dict = new DefaultDict(() => 0);
* dict["a"] = 1;
* console.log(dict["a"]); // 1
* console.log(dict["b"]); // 0
*/
class DefaultDict {
constructor(defaultFactory) {
return new Proxy({}, {
get: (target, name) => (name in target && target[name] != null) ? target[name] : defaultFactory()
})
}
}
export default DefaultDict;

View File

@ -240,7 +240,8 @@ export class PromptPipeline {
* Useful for continuing if computation was interrupted halfway through.
*/
_load_cached_responses(): {[key: string]: (LLMResponseObject | LLMResponseObject[])} {
return StorageCache.get(this._storageKey) || {};
if (this._storageKey === undefined) return {};
else return StorageCache.get(this._storageKey) || {};
}
/**
@ -248,7 +249,8 @@ export class PromptPipeline {
* (Overrides the existing responses stored in the cache.)
*/
_cache_responses(responses: Dict): void {
StorageCache.store(this._storageKey, responses);
if (this._storageKey !== undefined)
StorageCache.store(this._storageKey, responses);
}
async _prompt_llm(llm: LLM,

View File

@ -0,0 +1,39 @@
// Returns whether two sets are equal.
export function isEqual<T>(a: Set<T>, b: Set<T>): boolean {
return isSubset(a, b) && isSubset(b, a);
}
// Returns the union of two sets.
export function union<T>(a: Set<T>, b: Set<T>): Set<T> {
return new Set([...a, ...b]);
}
// Returns whether A is a subset of B.
export function isSubset<T>(a: Set<T>, b: Set<T>): boolean {
return [...a].every((x) => b.has(x));
}
export function isSuperset<T>(a: Set<T>, b: Set<T>): boolean {
return isSubset(b, a);
}
export function subtract<T>(a: Set<T>, b: Set<T>): Set<T> {
return new Set([...a].filter((x) => !b.has(x)));
}
// A is an "extension" of B and C if
// (1) A is a superset of B
// (2) The elements in A that are not in B are a subset of C.
export function isExtension<T>(a: Set<T>, b: Set<T>, c: Set<T>): boolean {
return isSuperset(a, b) && isSubset(subtract(a, b), c);
}
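// For instance: with A = {1, 2, 3}, B = {1, 2}, C = {3, 4}, A is a superset of B and
// A \ B = {3} is a subset of C, so isExtension(A, B, C) === true.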
// Returns whether A is an "extension" of B and C, ignoring empty strings.
export function isExtensionIgnoreEmpty(a: string[], b: string[], c: string[]) {
const emptyStringFilter = (x: string) => x !== "";
return isExtension(
new Set(a.filter(emptyStringFilter)),
new Set(b.filter(emptyStringFilter)),
new Set(c.filter(emptyStringFilter))
);
}

View File

@ -1,3 +1,5 @@
import { isEqual } from "./setUtils";
function len(o: object | string | Array<any>): number {
// Acts akin to Python's builtin 'len' method
if (Array.isArray(o)) {
@ -24,6 +26,15 @@ export function escapeBraces(str: string): string {
return str.replace(/[{}]/g, '\\$&');
}
/**
* Whether s1 and s2 contain the same set of template variables.
*/
export function containsSameTemplateVariables(s1: string, s2: string): boolean {
const vars1 = new Set(new StringTemplate(s1).get_vars());
const vars2 = new Set(new StringTemplate(s2).get_vars());
return isEqual(vars1, vars2);
}
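// An illustrative check (example strings, not from this diff), assuming get_vars()
// extracts the single-braced {variable} names:
//   containsSameTemplateVariables("Tell me about {animal}", "Facts on {animal}")  // => true
//   containsSameTemplateVariables("Tell me about {animal}", "Facts on {food}")    // => false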
/**
* Given a string, returns the same string with the \ before any braces \{ and \} removed. Does nothing else.
* @param str The string to transform

View File

@ -897,9 +897,9 @@ export const filterDict = (dict: Dict, keyFilterFunc: (key: string) => boolean)
};
export const processCSV = (csv: string): string[] => {
var matches = csv.match(/(\s*"[^"]+"\s*|\s*[^,]+|,)(?=,|$)/g);
let matches = csv.match(/(\s*"[^"]+"\s*|\s*[^,]+|,)(?=,|$)/g);
if (!matches) return;
for (var n = 0; n < matches.length; ++n) {
for (let n = 0; n < matches.length; ++n) {
matches[n] = matches[n].trim();
if (matches[n] == ',') matches[n] = '';
}
@ -964,4 +964,15 @@ export const toStandardResponseFormat = (r) => {
if ('chat_history' in r)
resp_obj.chat_history = r.chat_history;
return resp_obj;
};
// Check if the current browser window/tab is 'active' or not
export const browserTabIsActive = () => {
try {
const visible = document.visibilityState === 'visible';
return visible;
} catch(e) {
console.error(e);
return true; // indeterminate
}
};

View File

@ -2,7 +2,7 @@ import { queryLLM, executejs, executepy,
fetchExampleFlow, fetchOpenAIEval, importCache,
exportCache, countQueries, grabResponses,
generatePrompts, initCustomProvider,
removeCustomProvider, evalWithLLM, loadCachedCustomProviders } from "./backend/backend";
removeCustomProvider, evalWithLLM, loadCachedCustomProviders, fetchEnvironAPIKeys } from "./backend/backend";
const clone = (obj) => JSON.parse(JSON.stringify(obj));
@ -30,6 +30,8 @@ async function _route_to_js_backend(route, params) {
return fetchExampleFlow(params.name);
case 'fetchOpenAIEval':
return fetchOpenAIEval(params.name);
case 'fetchEnvironAPIKeys':
return fetchEnvironAPIKeys();
case 'initCustomProvider':
return initCustomProvider(params.code);
case 'removeCustomProvider':

View File

@ -10,6 +10,7 @@ import { APP_IS_RUNNING_LOCALLY } from './backend/utils';
// Initial project settings
const initialAPIKeys = {};
const initialFlags = { "aiSupport": true };
const initialLLMColors = {};
/** The color palette used for displaying info about different LLMs. */
@ -56,7 +57,21 @@ const useStore = create((set, get) => ({
// Keeping track of LLM API keys
apiKeys: initialAPIKeys,
setAPIKeys: (apiKeys) => {
set({apiKeys: apiKeys});
// Filter out any empty or incorrectly formatted API key values:
const new_keys = filterDict(apiKeys, (key) => typeof apiKeys[key] === "string" && apiKeys[key].length > 0);
// Only update API keys present in the new array; don't delete existing ones:
set({apiKeys: {...get().apiKeys, ...new_keys}});
},
// Flags to toggle on or off features across the application
flags: initialFlags,
getFlag: (flagName) => {
return get().flags[flagName] ?? false;
},
setFlag: (flagName, flagValue) => {
let flags = {...get().flags};
flags[flagName] = flagValue;
set({flags: flags});
},
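// A minimal usage sketch (illustrative, assuming the store is exported as useStore):
//   const aiAutocomplete = useStore((state) => state.getFlag('aiAutocomplete'));  // inside a component
//   useStore.getState().setFlag('aiAutocomplete', !aiAutocomplete);               // toggle the flag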
// Keep track of LLM colors, to ensure color consistency across various plots and displays

View File

@ -917,6 +917,58 @@
transform: skewX(-20deg);
}
/* AI button */
.ai-button {
position: relative;
padding: 2px 6px;
margin-top: -7px;
margin-right: 3px;
border-radius: 5px;
border: 1px solid #999;
font-size: 12px;
color: #666;
overflow: hidden;
box-shadow: 0 0 0 0 transparent;
-webkit-transition: all 0.2s ease-in;
-moz-transition: all 0.2s ease-in;
transition: all 0.2s ease-in;
cursor: pointer;
background: #f2d2f3;
}
.ai-button:hover {
background: rgb(177,63,204);
color: white;
box-shadow: 0 0 30px 5px rgb(177,63,204);
-webkit-transition: all 0.2s ease-out;
-moz-transition: all 0.2s ease-out;
transition: all 0.2s ease-out;
}
.ai-button:active {
background: rgb(129,40,151);
}
.ai-button:hover::before {
-moz-animation: sh02 0.5s 0s linear;
animation: sh02 0.5s 0s linear;
}
.ai-button::before {
content: '';
display: block;
width: 0px;
height: 86%;
position: absolute;
top: 7%;
left: 0%;
opacity: 0;
color: black;
background: #fff;
box-shadow: 0 0 50px 30px #fff;
-webkit-transform: skewX(-20deg);
-moz-transform: skewX(-20deg);
-ms-transform: skewX(-20deg);
-o-transform: skewX(-20deg);
transform: skewX(-20deg);
}
/* Make text blink */
.text-blink {
animation: blinker .75s linear infinite;

View File

@ -6,7 +6,7 @@ def readme():
setup(
name='chainforge',
version='0.2.7.8',
version='0.2.8.0',
packages=find_packages(),
author="Ian Arawjo",
description="A Visual Programming Environment for Prompt Engineering",