
TypeScript backend, HuggingFace models, JavaScript evaluators, Comment Nodes, and more (#81)

* Beginning to convert Python backend to TypeScript
* Change all fetch() calls to fetch_from_backend switcher
* wip converting query.py to query.ts
* wip started utils.js conversion. Tested that OpenAI API call works
* more progress on converting utils.py to TypeScript
* jest tests for query, utils, template.ts. Confirmed PromptPipeline works.
* wip converting queryLLM in flask_app to TS
* Tested queryLLM and StorageCache compressed saving/loading
* wip execute() in backend.ts
* Added execute() and tested w/ concrete func. Need to test eval()
* Added craco for optional webpack config. Config'd for TypeScript with Node.js packages browserify'd
* Execute JS code in iframe sandbox
* Tested and working JS Evaluator execution.
* wip swapping backends
* Tested TypeScript backend! :) woot
* Added fetchEnvironAPIKeys to Flask server to fetch os.environ keys when running locally
* Route Anthropic calls through Flask when running locally
* Added info button to Eval nodes. Rebuilt react
* Edits to info modal on Eval node
* Remove/error out on Python eval nodes when not running locally.
* Check browser compat and display error if not supported
* Changed all example flows to use JS. Bug fix in query.ts
* Refactored to LLMProvider to streamline model additions
* Added HuggingFace models API
* Added back Dalai call support, routing through Flask
* Remove flask app calls and socketio server that are no longer used
* Added Comment Nodes. Rebuilt react.
* Fix PaLM temp=0 build, update package versions and rebuild react
2023-06-30 15:11:20 -04:00
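The flow data below defines a prompt node ("prompt-food"), a JavaScript evaluator node ("eval-food"), a vis node, an inspect node, and a tabular data node of food-related prompts. The evaluator function stored (escaped) in the "code" field of the eval-food node is reproduced here, unescaped for readability, with added comments describing what it checks; the response fields it uses (text and meta) are taken from the flow data itself:

function evaluate(response) {
	// Look up the expected answer carried along as metadata
	// (the "Ideal" column of the tabular data node).
	let ideal = response.meta['Ideal'];
	// Pass when the model's reply (e.g. "Yes" or "No") starts with that expected answer.
	return response.text.startsWith(ideal);
}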
{"flow": {"nodes": [{"width": 312, "height": 311, "id": "prompt-food", "type": "prompt", "data": {"prompt": "{prompt}", "n": 1, "llms": [{"key": "aa3c0f03-22bd-416e-af4d-4bf5c4278c99", "settings": {"system_msg": "I want you to act as a gastronomer. Do not explain the answer you wrote. Answer Yes or No only.", "temperature": 1, "functions": [], "function_call": "", "top_p": 1, "stop": [], "presence_penalty": 0, "frequency_penalty": 0}, "name": "GPT3.5", "emoji": "\ud83d\ude42", "model": "gpt-3.5-turbo", "base_model": "gpt-3.5-turbo", "temp": 1, "formData": {"shortname": "GPT3.5", "model": "gpt-3.5-turbo", "system_msg": "I want you to act as a gastronomer. Do not explain the answer you wrote. Answer Yes or No only.", "temperature": 1, "functions": "", "function_call": "", "top_p": 1, "stop": "", "presence_penalty": 0, "frequency_penalty": 0}}]}, "position": {"x": 448, "y": 224}, "selected": false, "positionAbsolute": {"x": 448, "y": 224}, "dragging": false}, {"width": 333, "height": 182, "id": "eval-food", "type": "evaluator", "data": {"code": "function evaluate(response) {\n\tlet ideal = response.meta['Ideal'];\n\treturn response.text.startsWith(ideal);\n}", "language": "javascript"}, "position": {"x": 820, "y": 150}, "positionAbsolute": {"x": 820, "y": 150}}, {"width": 228, "height": 196, "id": "vis-food", "type": "vis", "data": {"input": "eval-food"}, "position": {"x": 1200, "y": 250}, "positionAbsolute": {"x": 1200, "y": 250}}, {"width": 302, "height": 260, "id": "inspect-food", "type": "inspect", "data": {"input": "prompt-food"}, "position": {"x": 820, "y": 400}, "positionAbsolute": {"x": 820, "y": 400}}, {"width": 423, "height": 417, "id": "table-food", "type": "table", "data": {"rows": [{"prompt": "Is the following string a food item or related to food? 100 grand bar", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? 15 bean soup", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? 3 musketeers ", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? 50/50 burger", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? 98% cocoa stevia bar", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? a.1. steak sauce", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? a1 steak sauce", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abalone", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abba-zaba", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abbacchio alla cacciatora", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abbamar", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abbaye de belloc", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abelia", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abeliophyllum", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abelmoschus", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abelmoschus moschatus", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? aberdeenshire ", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? 
aberfest", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abernethy biscuit", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abertam cheese", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abgoosht", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abies", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to food? abiu", "ideal": "Yes"}, {"prompt": "Is the following string a food item or related to foo