feat: implement human/LLM dual-format databank architecture with Joplin integration

- Restructure databank with collab/artifacts/human/llm top-level directories
- Move CTO and COO directories under pmo/artifacts/ as requested
- Create dual-format architecture for human-friendly markdown and LLM-optimized structured data
- Add Joplin integration pipeline in databank/collab/fromjoplin/
- Create intake system with templates, responses, and workflows
- Add sample files demonstrating human/LLM format differences
- Link to TSYSDevStack repository in main README
- Update PMO structure to reflect CTO/COO under artifacts/
- Add processing scripts and workflows for automated conversion
- Maintain clear separation between editable collab/ and read-only databank/
- Create comprehensive README documentation for the new architecture
- Ensure all changes align with the single-source-of-truth principle

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
2025-10-24 12:15:36 -05:00
parent 61919ae452
commit 919349aad2
34 changed files with 1154 additions and 14 deletions

@@ -0,0 +1,117 @@
# AI Tools and Agent Preferences Intake Template
## Overview
This template guides the collection of AI tool preferences and agent interaction guidelines.
## Interview Structure
### 1. Current Tool Usage
- Primary tools and their roles
- Subscription status and limitations
- Usage patterns and workflows
- Strengths and limitations of each tool
- Quota management and availability strategies
- Backup and alternative tool selections
### 2. Agent Guidelines and Rules
- Core operating principles
- Communication protocols and expectations
- Documentation standards and formats
- Quality assurance and validation approaches
- Error handling and recovery procedures
- Security and privacy considerations
### 3. Workflow Preferences
- Preferred interaction styles
- Response length and detail expectations
- Formatting and presentation preferences
- Decision-making and approval processes
- Feedback and iteration approaches
- Collaboration and delegation patterns
### 4. Technical Environment
- Development environment preferences
- Tool integration and interoperability
- Version control and change management
- Testing and quality assurance practices
- Deployment and delivery mechanisms
- Monitoring and observability requirements
### 5. Performance Optimization
- Token efficiency strategies
- Context window management
- Response time expectations
- Resource utilization considerations
- Cost optimization approaches
- Scalability and reliability requirements
## Response Format
Please provide responses in the following structured format:
```yaml
tool_usage:
  primary_tools:
    - name: ""
      role: ""
      subscription_status: ""
      usage_patterns: []
      strengths: []
      limitations: []
  quota_management:
    strategies: []
    backup_selections: []
  workflow_integration:
    primary_flows: []
    backup_flows: []
agent_guidelines:
  core_principles: []
  communication_protocols: []
  documentation_standards: []
  quality_assurance: []
  error_handling: []
  security_considerations: []
workflow_preferences:
  interaction_styles: []
  response_expectations:
    length_preference: ""
    detail_level: ""
  formatting_preferences: []
  decision_processes: []
  feedback_approaches: []
  collaboration_patterns: []
technical_environment:
  development_preferences: []
  tool_integration: []
  version_control: []
  testing_practices: []
  deployment_mechanisms: []
  monitoring_requirements: []
performance_optimization:
  token_efficiency: []
  context_management: []
  response_time: []
  resource_utilization: []
  cost_optimization: []
  scalability_requirements: []
```
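For reference, a partially completed response might look like the sketch below. The tool names and values are placeholders chosen for illustration, not actual tools or subscriptions from this databank.
```yaml
tool_usage:
  primary_tools:
    - name: "ExampleCodeAssistant"        # placeholder tool name, not an actual subscription
      role: "primary coding agent"
      subscription_status: "paid monthly plan"
      usage_patterns: ["daily development sessions"]
      strengths: ["fast code generation"]
      limitations: ["daily request quota"]
  quota_management:
    strategies: ["reserve quota for high-priority tasks"]
    backup_selections: ["ExampleFallbackTool"]   # placeholder backup tool
```
Only the fields relevant to a given intake need to be filled in; empty lists and strings can be left as-is for later passes.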
## Validation Criteria
- Alignment with current tool subscriptions
- Consistency with documented workflows
- Practicality of implementation
- Completeness of coverage
- Clarity of expectations
## Frequency
This intake should be updated:
- Semi-annually for tool changes
- As needed for workflow modifications
- Quarterly for performance optimization reviews