# PHASE 10: AI-Powered Development Platform

## Complete Task Documentation

**Status:** BLUE SKY → CONCRETE TASKS
**Total Duration:** 24-32 weeks
**Total Effort:** 400-550 hours
**Dependencies:** Phase 6 (UBA) recommended but not blocking for 10A

---

## Phase Overview

| Phase   | Name                            | Duration   | Tasks    | Effort        |
| ------- | ------------------------------- | ---------- | -------- | ------------- |
| **10A** | Project Structure Modernization | 6-8 weeks  | 9 tasks  | 80-110 hours  |
| **10B** | Frontend AI Assistant           | 6-8 weeks  | 8 tasks  | 100-130 hours |
| **10C** | Backend Creation AI             | 8-10 weeks | 10 tasks | 140-180 hours |
| **10D** | Unified AI Experience           | 4-6 weeks  | 6 tasks  | 60-80 hours   |
| **10E** | DEPLOY System Updates           | 1-2 weeks  | 4 tasks  | 20-30 hours   |
| **10F** | Legacy Migration System         | 2-3 weeks  | 5 tasks  | 40-50 hours   |

```
PHASE 10 DEPENDENCY GRAPH

10A: Project Structure ────────────────────────────────────────┐
         │                                                     │
         ├──────────────────────────┐                          │
         ↓                          ↓                          ↓
10B: Frontend AI         10F: Migration System        10E: DEPLOY Updates
         │                          │                          │
         └────────────┬─────────────┘                          │
                      ↓                                        │
              10C: Backend AI                                  │
                      │                                        │
                      └─────────────────┬──────────────────────┘
                                        ↓
                            10D: Unified Experience
```

---

# PHASE 10A: PROJECT STRUCTURE MODERNIZATION

## Overview

Transform the monolithic `project.json` into a React-style component file structure to enable AI assistance, improve Git collaboration, and enhance performance.

**Duration:** 6-8 weeks
**Total Effort:** 80-110 hours
**Priority:** CRITICAL - Blocks all AI features
**Depends on:** Phase 6 (UBA) recommended but not blocking

### The Problem

```
Current: Single project.json with 50,000+ lines
  - Can't fit in AI context window
  - 200,000 tokens to read entire project
  - Any edit risks corrupting unrelated components
  - Git diffs are meaningless

Target: Component-based file structure
  - 3,000 tokens to read one component
  - Surgical AI modifications
  - Meaningful Git diffs
  - 10x faster operations
```

### Target Structure

```
project/
├── nodegx.project.json          # Project metadata (~100 lines)
├── nodegx.routes.json           # Route definitions
├── nodegx.styles.json           # Global styles
├── components/
│   ├── _registry.json           # Component index
│   ├── App/
│   │   ├── component.json       # Metadata, ports
│   │   ├── nodes.json           # Node graph
│   │   └── connections.json     # Wiring
│   ├── pages/
│   │   ├── HomePage/
│   │   └── ProfilePage/
│   └── shared/
│       ├── Button/
│       └── Avatar/
└── models/
    ├── _registry.json
    └── User.json
```

---

## STRUCT-001: JSON Schema Definition

**Effort:** 12-16 hours (3 days)
**Priority:** Critical
**Blocks:** All other STRUCT tasks

### Description

Define formal JSON schemas for all new file formats. These schemas enable validation, IDE support, and serve as documentation.
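As a sketch of the `validator.ts` Ajv wrapper listed under Files to Create for this task, assuming Ajv v8 plus `ajv-formats` (needed for the `date-time` fields in `component.schema.json`); the function and result-type names are placeholders, not a settled API:

```typescript
// packages/noodl-editor/src/schemas/validator.ts — minimal sketch
import Ajv from 'ajv';
import addFormats from 'ajv-formats';
import componentSchema from './component.schema.json';
import nodesSchema from './nodes.schema.json';
import connectionsSchema from './connections.schema.json';

const ajv = new Ajv({ allErrors: true });
addFormats(ajv); // registers "date-time" and other string formats

// Compile each schema once at module load; compilation is the expensive part
const validators = {
  component: ajv.compile(componentSchema),
  nodes: ajv.compile(nodesSchema),
  connections: ajv.compile(connectionsSchema)
};

export interface SchemaValidationResult {
  valid: boolean;
  errors: string[];
}

export function validateFile(kind: keyof typeof validators, data: unknown): SchemaValidationResult {
  const validate = validators[kind];
  const valid = validate(data) as boolean;
  const errors = (validate.errors ?? []).map((e) => `${e.instancePath || '/'}: ${e.message ?? 'invalid'}`);
  return { valid, errors };
}
```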
### Deliverables ``` schemas/ ├── project-v2.schema.json # Root project file ├── component.schema.json # Component metadata ├── nodes.schema.json # Node definitions ├── connections.schema.json # Connection definitions ├── registry.schema.json # Component registry ├── routes.schema.json # Route definitions ├── styles.schema.json # Global styles └── model.schema.json # Model definitions ``` ### Schema: component.json ```json { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://nodegx.dev/schemas/component-v2.json", "title": "Nodegx Component", "type": "object", "required": ["id", "name", "type"], "properties": { "id": { "type": "string", "pattern": "^comp_[a-z0-9_]+$", "description": "Unique component identifier" }, "name": { "type": "string", "description": "Component name (folder name)" }, "displayName": { "type": "string", "description": "Human-readable display name" }, "path": { "type": "string", "description": "Original Noodl path for backward compatibility" }, "type": { "type": "string", "enum": ["root", "page", "visual", "logic", "cloud"], "description": "Component type" }, "description": { "type": "string" }, "category": { "type": "string" }, "tags": { "type": "array", "items": { "type": "string" } }, "ports": { "type": "object", "properties": { "inputs": { "type": "array", "items": { "$ref": "#/definitions/port" } }, "outputs": { "type": "array", "items": { "$ref": "#/definitions/port" } } } }, "dependencies": { "type": "array", "items": { "type": "string" }, "description": "Paths to components this component uses" }, "created": { "type": "string", "format": "date-time" }, "modified": { "type": "string", "format": "date-time" } }, "definitions": { "port": { "type": "object", "required": ["name", "type"], "properties": { "name": { "type": "string" }, "type": { "type": "string" }, "displayName": { "type": "string" }, "description": { "type": "string" }, "default": {}, "required": { "type": "boolean" } } } } } ``` ### Schema: nodes.json ```json { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://nodegx.dev/schemas/nodes-v2.json", "title": "Nodegx Component Nodes", "type": "object", "required": ["componentId", "nodes"], "properties": { "componentId": { "type": "string", "description": "Reference to parent component" }, "version": { "type": "integer", "description": "Schema version for migrations" }, "nodes": { "type": "array", "items": { "$ref": "#/definitions/node" } } }, "definitions": { "node": { "type": "object", "required": ["id", "type"], "properties": { "id": { "type": "string" }, "type": { "type": "string" }, "label": { "type": "string" }, "position": { "type": "object", "properties": { "x": { "type": "number" }, "y": { "type": "number" } } }, "parent": { "type": "string" }, "children": { "type": "array", "items": { "type": "string" } }, "properties": { "type": "object" }, "ports": { "type": "array" }, "dynamicports": { "type": "array" }, "states": { "type": "object" }, "metadata": { "type": "object" } } } } } ``` ### Schema: connections.json ```json { "$schema": "http://json-schema.org/draft-07/schema#", "$id": "https://nodegx.dev/schemas/connections-v2.json", "title": "Nodegx Component Connections", "type": "object", "required": ["componentId", "connections"], "properties": { "componentId": { "type": "string" }, "version": { "type": "integer" }, "connections": { "type": "array", "items": { "$ref": "#/definitions/connection" } } }, "definitions": { "connection": { "type": "object", "required": ["id", "from", "to"], "properties": { "id": { 
"type": "string" }, "from": { "type": "object", "required": ["node", "port"], "properties": { "node": { "type": "string" }, "port": { "type": "string" } } }, "to": { "type": "object", "required": ["node", "port"], "properties": { "node": { "type": "string" }, "port": { "type": "string" } } } } } } } ``` ### Implementation Steps 1. Study existing project.json structure exhaustively 2. Map all node types and their serialization 3. Design schemas with backward compatibility 4. Create JSON Schema files 5. Build validation utilities using Ajv 6. Test against 10+ real projects ### Acceptance Criteria - [ ] All 8 schema files defined - [ ] Schemas published to `/schemas` endpoint - [ ] Validation utility created and tested - [ ] IDE autocomplete works with schemas - [ ] 100% coverage of existing node types - [ ] Edge cases documented ### Files to Create ``` packages/noodl-editor/src/schemas/ ├── index.ts # Schema exports ├── project-v2.schema.json ├── component.schema.json ├── nodes.schema.json ├── connections.schema.json ├── registry.schema.json ├── routes.schema.json ├── styles.schema.json ├── model.schema.json └── validator.ts # Ajv validation wrapper ``` --- ## STRUCT-002: Export Engine Core **Effort:** 16-20 hours (4 days) **Priority:** Critical **Depends on:** STRUCT-001 **Blocks:** STRUCT-003, STRUCT-006 ### Description Build the engine that converts legacy `project.json` to the new multi-file format. ### Core Class ```typescript // packages/noodl-editor/src/editor/src/services/ProjectStructure/Exporter.ts import * as path from 'path'; import { validateSchema } from '@nodegx/schemas'; import * as fs from 'fs-extra'; import { LegacyProject, LegacyComponent } from './types/legacy'; import { ModernProject, ComponentFiles } from './types/modern'; export interface ExportOptions { outputDir: string; preserveOriginalPaths: boolean; validateOutput: boolean; onProgress?: (percent: number, message: string) => void; } export interface ExportResult { success: boolean; componentsExported: number; modelsExported: number; warnings: string[]; errors: string[]; outputPath: string; } export class ProjectExporter { private options: ExportOptions; constructor(options: ExportOptions) { this.options = options; } async export(project: LegacyProject): Promise { const result: ExportResult = { success: false, componentsExported: 0, modelsExported: 0, warnings: [], errors: [], outputPath: this.options.outputDir }; try { this.report(0, 'Starting export...'); // 1. Create output directory structure await this.createDirectoryStructure(); this.report(5, 'Directory structure created'); // 2. Export project metadata await this.exportProjectMetadata(project); this.report(10, 'Project metadata exported'); // 3. 
Export components const componentCount = Object.keys(project.components || {}).length; let processed = 0; const registry: ComponentRegistry = { version: 1, lastUpdated: new Date().toISOString(), components: {}, stats: { totalComponents: 0, totalNodes: 0, totalConnections: 0 } }; for (const [legacyPath, component] of Object.entries(project.components || {})) { try { const componentResult = await this.exportComponent(legacyPath, component); registry.components[componentResult.name] = componentResult.registryEntry; registry.stats.totalComponents++; registry.stats.totalNodes += componentResult.nodeCount; registry.stats.totalConnections += componentResult.connectionCount; result.componentsExported++; } catch (error) { result.warnings.push(`Failed to export component ${legacyPath}: ${error.message}`); } processed++; this.report(10 + (processed / componentCount) * 70, `Exported ${processed}/${componentCount} components`); } // 4. Write registry await this.writeRegistry(registry); this.report(85, 'Registry written'); // 5. Export models await this.exportModels(project); this.report(90, 'Models exported'); // 6. Export styles await this.exportStyles(project); this.report(95, 'Styles exported'); // 7. Validate if requested if (this.options.validateOutput) { const validation = await this.validateExport(); result.warnings.push(...validation.warnings); if (validation.errors.length > 0) { result.errors.push(...validation.errors); result.success = false; return result; } } this.report(100, 'Export complete'); result.success = true; } catch (error) { result.errors.push(`Export failed: ${error.message}`); result.success = false; } return result; } private async exportComponent(legacyPath: string, component: LegacyComponent): Promise { // Convert path: "/#__cloud__/SendGrid/Send Email" → "cloud/SendGrid/SendEmail" const folderPath = this.normalizePath(legacyPath); const outputDir = path.join(this.options.outputDir, 'components', folderPath); await fs.mkdir(outputDir, { recursive: true }); // Extract and write component.json const componentMeta = this.extractComponentMeta(legacyPath, component); await this.atomicWrite(path.join(outputDir, 'component.json'), JSON.stringify(componentMeta, null, 2)); // Extract and write nodes.json const nodes = this.extractNodes(component); await this.atomicWrite(path.join(outputDir, 'nodes.json'), JSON.stringify(nodes, null, 2)); // Extract and write connections.json const connections = this.extractConnections(component); await this.atomicWrite(path.join(outputDir, 'connections.json'), JSON.stringify(connections, null, 2)); // Write variants if exist if (component.variants && Object.keys(component.variants).length > 0) { await this.atomicWrite(path.join(outputDir, 'variants.json'), JSON.stringify(component.variants, null, 2)); } return { name: this.getComponentName(legacyPath), path: folderPath, nodeCount: nodes.nodes.length, connectionCount: connections.connections.length, registryEntry: { path: folderPath, type: this.inferComponentType(legacyPath, component), created: new Date().toISOString(), modified: new Date().toISOString(), nodeCount: nodes.nodes.length, connectionCount: connections.connections.length } }; } private extractComponentMeta(legacyPath: string, component: LegacyComponent): ComponentMeta { return { id: component.id || this.generateId(), name: this.getComponentName(legacyPath), displayName: component.name || this.getDisplayName(legacyPath), path: legacyPath, // PRESERVE original path for cross-component references! 
type: this.inferComponentType(legacyPath, component), ports: { inputs: this.extractPorts(component, 'input'), outputs: this.extractPorts(component, 'output') }, dependencies: this.extractDependencies(component), created: new Date().toISOString(), modified: new Date().toISOString() }; } private extractNodes(component: LegacyComponent): NodesFile { const nodes = (component.graph?.roots || []).map((node) => ({ id: node.id, type: node.type, // Keep EXACTLY as-is for component references label: node.label, position: { x: node.x, y: node.y }, parent: node.parent, children: node.children || [], properties: node.parameters || {}, ports: node.ports || [], dynamicports: node.dynamicports || [], states: node.states, metadata: node.metadata })); return { componentId: component.id, version: 1, nodes }; } private extractConnections(component: LegacyComponent): ConnectionsFile { const connections = (component.graph?.connections || []).map((conn, index) => ({ id: conn.id || `conn_${index}`, from: { node: conn.fromId, port: conn.fromProperty }, to: { node: conn.toId, port: conn.toProperty } })); return { componentId: component.id, version: 1, connections }; } private normalizePath(legacyPath: string): string { // "/#__cloud__/SendGrid/Send Email" → "cloud/SendGrid/SendEmail" return legacyPath .replace(/^\/#?/, '') // Remove leading /# or / .replace(/__cloud__/g, 'cloud') // Normalize cloud prefix .replace(/\s+/g, '') // Remove spaces .replace(/[^a-zA-Z0-9/]/g, '_'); // Replace special chars } private async atomicWrite(filePath: string, content: string): Promise { const tempPath = `${filePath}.tmp`; const backupPath = `${filePath}.backup`; try { await fs.writeFile(tempPath, content); // Verify write const verify = await fs.readFile(tempPath, 'utf-8'); if (verify !== content) { throw new Error('Write verification failed'); } // Backup existing if (await fs.pathExists(filePath)) { await fs.rename(filePath, backupPath); } // Atomic rename await fs.rename(tempPath, filePath); // Remove backup on success if (await fs.pathExists(backupPath)) { await fs.unlink(backupPath); } } catch (error) { // Restore from backup if (await fs.pathExists(backupPath)) { await fs.rename(backupPath, filePath); } throw error; } } private report(percent: number, message: string): void { this.options.onProgress?.(percent, message); } } ``` ### Implementation Steps 1. Create type definitions for legacy and modern formats 2. Implement core Exporter class 3. Implement path normalization (handle special chars) 4. Implement atomic file writes 5. Implement metadata extraction 6. Implement node extraction (preserve all fields) 7. Implement connection extraction 8. Implement registry generation 9. Add progress reporting 10. 
Test with 10+ real projects ### Acceptance Criteria - [ ] Exports all component types correctly - [ ] Preserves all node properties and metadata - [ ] Preserves all connections - [ ] Handles special characters in paths - [ ] Handles deeply nested components - [ ] Atomic writes prevent corruption - [ ] Progress reporting works - [ ] Performance: <30 seconds for 200 components ### Test Cases ```typescript describe('ProjectExporter', () => { it('should export simple project', async () => { const result = await exporter.export(simpleProject); expect(result.success).toBe(true); expect(result.componentsExported).toBe(5); }); it('should handle cloud components', async () => { const result = await exporter.export(cloudProject); expect(result.success).toBe(true); // Check cloud/ directory created }); it('should preserve original paths for references', async () => { const result = await exporter.export(projectWithRefs); const componentMeta = await readJSON('components/Header/component.json'); expect(componentMeta.path).toBe('/#Header'); // Original preserved }); it('should handle special characters in names', async () => { const result = await exporter.export(projectWithSpecialChars); // Verify folders created without errors }); it('should preserve AI metadata', async () => { const result = await exporter.export(projectWithAIHistory); const nodes = await readJSON('components/Function/nodes.json'); expect(nodes.nodes[0].metadata.prompt.history).toBeDefined(); }); }); ``` --- ## STRUCT-003: Import Engine Core **Effort:** 16-20 hours (4 days) **Priority:** Critical **Depends on:** STRUCT-001, STRUCT-002 **Blocks:** STRUCT-004 ### Description Build the engine that converts the new multi-file format back to legacy `project.json` for runtime compatibility. ### Core Class ```typescript // packages/noodl-editor/src/editor/src/services/ProjectStructure/Importer.ts export class ProjectImporter { async import(projectDir: string): Promise { // 1. Read project metadata const projectMeta = await this.readProjectFile(projectDir); // 2. Read registry const registry = await this.readRegistry(projectDir); // 3. Import all components const components: Record = {}; for (const [name, info] of Object.entries(registry.components)) { const componentDir = path.join(projectDir, 'components', info.path); const legacyPath = await this.getLegacyPath(componentDir); components[legacyPath] = await this.importComponent(componentDir); } // 4. Import models const models = await this.importModels(projectDir); // 5. Import styles const styles = await this.importStyles(projectDir); // 6. 
Reconstruct full project return { name: projectMeta.name, version: projectMeta.version, components, variants: await this.importVariants(projectDir), styles, cloudservices: projectMeta.cloudservices, metadata: projectMeta.metadata }; } async importComponent(componentDir: string): Promise { const meta = await this.readJSON(path.join(componentDir, 'component.json')); const nodes = await this.readJSON(path.join(componentDir, 'nodes.json')); const connections = await this.readJSON(path.join(componentDir, 'connections.json')); // Reconstruct legacy format return { id: meta.id, name: meta.displayName, graph: { roots: nodes.nodes.map((node) => ({ id: node.id, type: node.type, label: node.label, x: node.position?.x || 0, y: node.position?.y || 0, parameters: node.properties, ports: node.ports, dynamicports: node.dynamicports, children: node.children, metadata: node.metadata, states: node.states })), connections: connections.connections.map((conn) => ({ fromId: conn.from.node, fromProperty: conn.from.port, toId: conn.to.node, toProperty: conn.to.port })) } // ... other legacy fields }; } async importSingleComponent(componentPath: string): Promise { // For incremental imports - load just one component return this.importComponent(componentPath); } } ``` ### Round-Trip Validation ```typescript // packages/noodl-editor/src/editor/src/services/ProjectStructure/Validator.ts export class RoundTripValidator { async validate(original: LegacyProject, imported: LegacyProject): Promise { const errors: string[] = []; const warnings: string[] = []; // 1. Component count const origCount = Object.keys(original.components || {}).length; const impCount = Object.keys(imported.components || {}).length; if (origCount !== impCount) { errors.push(`Component count mismatch: ${origCount} → ${impCount}`); } // 2. 
Deep compare each component for (const [path, origComp] of Object.entries(original.components || {})) { const impComp = imported.components[path]; if (!impComp) { errors.push(`Missing component: ${path}`); continue; } // Node count const origNodes = origComp.graph?.roots?.length || 0; const impNodes = impComp.graph?.roots?.length || 0; if (origNodes !== impNodes) { errors.push(`Node count mismatch in ${path}: ${origNodes} → ${impNodes}`); } // Connection count const origConns = origComp.graph?.connections?.length || 0; const impConns = impComp.graph?.connections?.length || 0; if (origConns !== impConns) { errors.push(`Connection count mismatch in ${path}: ${origConns} → ${impConns}`); } // Deep compare nodes for (const origNode of origComp.graph?.roots || []) { const impNode = impComp.graph?.roots?.find((n) => n.id === origNode.id); if (!impNode) { errors.push(`Missing node ${origNode.id} in ${path}`); continue; } // Type must match exactly if (origNode.type !== impNode.type) { errors.push(`Node type mismatch: ${origNode.type} → ${impNode.type}`); } // Metadata must be preserved if (!deepEqual(origNode.metadata, impNode.metadata)) { warnings.push(`Metadata changed for node ${origNode.id} in ${path}`); } } } return { valid: errors.length === 0, errors, warnings }; } } ``` ### Acceptance Criteria - [ ] Imports all components correctly - [ ] Reconstructs legacy format exactly - [ ] Round-trip validation passes for all test projects - [ ] Single component import works - [ ] Performance: <5 seconds for 200 components - [ ] Handles missing files gracefully --- ## STRUCT-004: Editor Format Detection **Effort:** 6-8 hours (1.5 days) **Priority:** High **Depends on:** STRUCT-003 ### Description Add automatic detection of project format (legacy vs v2) when opening projects. 
### Implementation

```typescript
// packages/noodl-editor/src/editor/src/services/ProjectStructure/FormatDetector.ts
import * as path from 'path';
import * as fs from 'fs-extra';

export enum ProjectFormat {
  LEGACY = 'legacy',
  V2 = 'v2',
  UNKNOWN = 'unknown'
}

export interface FormatDetectionResult {
  format: ProjectFormat;
  confidence: 'high' | 'medium' | 'low';
  details: {
    hasProjectJson: boolean;
    hasNodegxProject: boolean;
    hasComponentsDir: boolean;
    hasRegistry: boolean;
  };
}

export class FormatDetector {
  async detect(projectPath: string): Promise<FormatDetectionResult> {
    const details = {
      hasProjectJson: await fs.pathExists(path.join(projectPath, 'project.json')),
      hasNodegxProject: await fs.pathExists(path.join(projectPath, 'nodegx.project.json')),
      hasComponentsDir: await fs.pathExists(path.join(projectPath, 'components')),
      hasRegistry: await fs.pathExists(path.join(projectPath, 'components', '_registry.json'))
    };

    // V2 format: has nodegx.project.json + components directory
    if (details.hasNodegxProject && details.hasComponentsDir && details.hasRegistry) {
      return { format: ProjectFormat.V2, confidence: 'high', details };
    }

    // Legacy format: has project.json only
    if (details.hasProjectJson && !details.hasNodegxProject) {
      return { format: ProjectFormat.LEGACY, confidence: 'high', details };
    }

    // Mixed or unknown
    if (details.hasProjectJson && details.hasNodegxProject) {
      // Both exist - prefer V2 but warn
      return { format: ProjectFormat.V2, confidence: 'medium', details };
    }

    return { format: ProjectFormat.UNKNOWN, confidence: 'low', details };
  }
}
```

### Integration with Project Loading

```typescript
// packages/noodl-editor/src/editor/src/models/projectmodel.ts
class ProjectModel {
  private format: ProjectFormat = ProjectFormat.LEGACY;
  private formatDetector = new FormatDetector();

  async loadProject(projectPath: string): Promise<void> {
    const detection = await this.formatDetector.detect(projectPath);
    this.format = detection.format;

    if (detection.confidence !== 'high') {
      console.warn(`Project format detection confidence: ${detection.confidence}`);
    }

    if (this.format === ProjectFormat.V2) {
      await this.loadV2Project(projectPath);
    } else if (this.format === ProjectFormat.LEGACY) {
      await this.loadLegacyProject(projectPath);
    } else {
      throw new Error('Unknown project format');
    }
  }

  private async loadV2Project(projectPath: string): Promise<void> {
    // Only load metadata and registry - components on demand
    this.metadata = await this.loadProjectMeta(projectPath);
    this.registry = await this.loadRegistry(projectPath);
    // Components loaded lazily via getComponent()
  }
}
```

### Acceptance Criteria

- [ ] Correctly identifies legacy format
- [ ] Correctly identifies V2 format
- [ ] Handles mixed/corrupted states
- [ ] Reports confidence level
- [ ] Integrates with project loading

---

## STRUCT-005: Lazy Component Loading

**Effort:** 12-16 hours (3 days)
**Priority:** High
**Depends on:** STRUCT-004

### Description

Implement on-demand component loading for V2 format projects to reduce memory usage and improve startup time.
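The `getComponent()` accessor that `loadV2Project()` above defers to is not spelled out in this plan; a minimal sketch of how it might delegate to the `ComponentLoader` shown in the Implementation below (the field names and the legacy fallback are assumptions):

```typescript
// Hypothetical addition to ProjectModel (see the STRUCT-004 integration snippet above)
class ProjectModel {
  private format: ProjectFormat = ProjectFormat.LEGACY;
  private projectPath: string;
  private componentLoader = new ComponentLoader();
  private components: Record<string, Component> = {}; // only populated for legacy projects

  async getComponent(componentPath: string): Promise<Component> {
    if (this.format === ProjectFormat.V2) {
      // V2: read the three component files on first access; repeat lookups hit the loader cache
      return this.componentLoader.loadComponent(this.projectPath, componentPath);
    }
    // Legacy: the entire project.json is already in memory
    return this.components[componentPath];
  }
}
```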
### Implementation

```typescript
// packages/noodl-editor/src/editor/src/services/ProjectStructure/ComponentLoader.ts
import * as path from 'path';

export class ComponentLoader {
  private cache: Map<string, { component: Component; loadedAt: number }> = new Map();
  private maxCacheAge = 5 * 60 * 1000; // 5 minutes
  private maxCacheSize = 50; // components

  async loadComponent(projectPath: string, componentPath: string): Promise<Component> {
    const cacheKey = `${projectPath}:${componentPath}`;

    // Check cache
    const cached = this.cache.get(cacheKey);
    if (cached && Date.now() - cached.loadedAt < this.maxCacheAge) {
      return cached.component;
    }

    // Load from files
    const componentDir = path.join(projectPath, 'components', componentPath);
    const [meta, nodes, connections] = await Promise.all([
      this.readJSON(path.join(componentDir, 'component.json')),
      this.readJSON(path.join(componentDir, 'nodes.json')),
      this.readJSON(path.join(componentDir, 'connections.json'))
    ]);

    const component = this.reconstructComponent(meta, nodes, connections);

    // Update cache
    this.cache.set(cacheKey, { component, loadedAt: Date.now() });
    this.pruneCache();

    return component;
  }

  async preloadComponents(projectPath: string, componentPaths: string[]): Promise<void> {
    // Parallel loading for known needed components
    await Promise.all(componentPaths.map((p) => this.loadComponent(projectPath, p)));
  }

  invalidate(componentPath?: string): void {
    if (componentPath) {
      // Invalidate specific component
      for (const key of this.cache.keys()) {
        if (key.includes(componentPath)) {
          this.cache.delete(key);
        }
      }
    } else {
      // Invalidate all
      this.cache.clear();
    }
  }

  private pruneCache(): void {
    if (this.cache.size > this.maxCacheSize) {
      // Remove oldest entries
      const entries = Array.from(this.cache.entries()).sort((a, b) => a[1].loadedAt - b[1].loadedAt);
      const toRemove = entries.slice(0, this.cache.size - this.maxCacheSize);
      toRemove.forEach(([key]) => this.cache.delete(key));
    }
  }
}
```

### Acceptance Criteria

- [ ] Components load on demand
- [ ] Cache improves repeated access
- [ ] Cache eviction works correctly
- [ ] Memory usage stays bounded
- [ ] Parallel preloading works
- [ ] Cache invalidation works

---

## STRUCT-006: Component-Level Save

**Effort:** 12-16 hours (3 days)
**Priority:** High
**Depends on:** STRUCT-002, STRUCT-005

### Description

Implement saving changes to individual component files instead of rewriting the entire project.
### Implementation ```typescript // packages/noodl-editor/src/editor/src/services/ProjectStructure/ComponentSaver.ts export class ComponentSaver { private pendingWrites: Map = new Map(); private writeDebounce = 500; // ms async saveComponent(projectPath: string, componentPath: string, component: Component): Promise { const componentDir = path.join(projectPath, 'components', componentPath); // Ensure directory exists await fs.mkdir(componentDir, { recursive: true }); // Prepare files const meta = this.extractMeta(component); const nodes = this.extractNodes(component); const connections = this.extractConnections(component); // Atomic writes await Promise.all([ this.atomicWrite(path.join(componentDir, 'component.json'), JSON.stringify(meta, null, 2)), this.atomicWrite(path.join(componentDir, 'nodes.json'), JSON.stringify(nodes, null, 2)), this.atomicWrite(path.join(componentDir, 'connections.json'), JSON.stringify(connections, null, 2)) ]); // Update registry await this.updateRegistry(projectPath, componentPath, component); } async saveComponentDebounced(projectPath: string, componentPath: string, component: Component): Promise { const key = `${projectPath}:${componentPath}`; this.pendingWrites.set(key, { content: JSON.stringify(component), timestamp: Date.now() }); // Debounced write setTimeout(async () => { const pending = this.pendingWrites.get(key); if (pending && Date.now() - pending.timestamp >= this.writeDebounce) { this.pendingWrites.delete(key); await this.saveComponent(projectPath, componentPath, JSON.parse(pending.content)); } }, this.writeDebounce); } private async updateRegistry(projectPath: string, componentPath: string, component: Component): Promise { const registryPath = path.join(projectPath, 'components', '_registry.json'); const registry = await this.readJSON(registryPath); registry.components[componentPath] = { ...registry.components[componentPath], modified: new Date().toISOString(), nodeCount: component.graph?.roots?.length || 0, connectionCount: component.graph?.connections?.length || 0 }; registry.lastUpdated = new Date().toISOString(); await this.atomicWrite(registryPath, JSON.stringify(registry, null, 2)); } } ``` ### Acceptance Criteria - [ ] Single component saves work - [ ] Atomic writes prevent corruption - [ ] Registry updates correctly - [ ] Debouncing prevents excessive writes - [ ] Auto-save integration works - [ ] Performance: <100ms per component --- ## STRUCT-007: Migration Wizard UI **Effort:** 10-14 hours (2.5 days) **Priority:** Medium **Depends on:** STRUCT-002, STRUCT-003 ### Description Create a user-friendly wizard for migrating projects from legacy to V2 format. ### UI Design ``` ┌─────────────────────────────────────────────────────────────────────────────┐ │ Project Structure Migration │ ├─────────────────────────────────────────────────────────────────────────────┤ │ │ │ Your project uses the legacy single-file format. 
Migrating to the new │ │ format enables powerful features: │ │ │ │ ✓ AI-powered editing assistance │ │ ✓ Better Git collaboration (meaningful diffs) │ │ ✓ 10x faster project loading │ │ ✓ Component-level version history │ │ │ │ ┌─────────────────────────────────────────────────────────────────────┐ │ │ │ Project Analysis │ │ │ │ │ │ │ │ Project: My Conference App │ │ │ │ Components: 54 │ │ │ │ Total Nodes: 3,420 │ │ │ │ Current Size: 2.4 MB │ │ │ │ │ │ │ │ Estimated time: ~30 seconds │ │ │ │ New size: ~2.6 MB (54 files) │ │ │ └─────────────────────────────────────────────────────────────────────┘ │ │ │ │ ┌─────────────────────────────────────────────────────────────────────┐ │ │ │ ⚠️ Pre-flight Checks │ │ │ │ │ │ │ │ ✓ All 54 components can be parsed │ │ │ │ ✓ No circular reference issues detected │ │ │ │ ✓ All node types supported │ │ │ │ ⚠ 2 components have very long paths (may be truncated) │ │ │ └─────────────────────────────────────────────────────────────────────┘ │ │ │ │ ☑ Create backup before migration (recommended) │ │ ☐ Delete legacy project.json after successful migration │ │ │ │ │ │ [Cancel] [Start Migration] │ │ │ └─────────────────────────────────────────────────────────────────────────────┘ ``` ### Progress View ``` ┌─────────────────────────────────────────────────────────────────────────────┐ │ Migration in Progress │ ├─────────────────────────────────────────────────────────────────────────────┤ │ │ │ ████████████████████████████████░░░░░░░░░░░░░░░░░░ 65% │ │ │ │ Current step: Exporting components/pages/SchedulePage... │ │ │ │ ┌─────────────────────────────────────────────────────────────────────┐ │ │ │ Progress Log [▼] │ │ │ ├─────────────────────────────────────────────────────────────────────┤ │ │ │ ✓ Created backup: project_backup_20260107_143022.json │ │ │ │ ✓ Created directory structure │ │ │ │ ✓ Exported project metadata │ │ │ │ ✓ Exported 32/54 components │ │ │ │ → Exporting components/pages/SchedulePage... 
│ │ │ └─────────────────────────────────────────────────────────────────────┘ │ │ │ │ [Cancel Migration] │ │ │ └─────────────────────────────────────────────────────────────────────────────┘ ``` ### Component Implementation ```typescript // packages/noodl-editor/src/editor/src/views/MigrationWizard/MigrationWizard.tsx interface MigrationWizardProps { projectPath: string; onComplete: (success: boolean) => void; onCancel: () => void; } export function MigrationWizard({ projectPath, onComplete, onCancel }: MigrationWizardProps) { const [step, setStep] = useState<'analysis' | 'progress' | 'complete' | 'error'>('analysis'); const [analysis, setAnalysis] = useState(null); const [progress, setProgress] = useState(0); const [progressMessage, setProgressMessage] = useState(''); const [error, setError] = useState(null); const [createBackup, setCreateBackup] = useState(true); const [deleteLegacy, setDeleteLegacy] = useState(false); useEffect(() => { // Run analysis on mount analyzeProject(projectPath).then(setAnalysis); }, [projectPath]); const startMigration = async () => { setStep('progress'); try { if (createBackup) { setProgressMessage('Creating backup...'); await createProjectBackup(projectPath); } const exporter = new ProjectExporter({ outputDir: projectPath, preserveOriginalPaths: true, validateOutput: true, onProgress: (percent, message) => { setProgress(percent); setProgressMessage(message); } }); const result = await exporter.export(await loadLegacyProject(projectPath)); if (result.success) { if (deleteLegacy) { await fs.unlink(path.join(projectPath, 'project.json')); } setStep('complete'); onComplete(true); } else { setError(result.errors.join('\n')); setStep('error'); } } catch (err) { setError(err.message); setStep('error'); } }; // ... render logic } ``` ### Acceptance Criteria - [ ] Analysis shows project statistics - [ ] Pre-flight checks identify issues - [ ] Progress indicator is accurate - [ ] Backup creation works - [ ] Cancel mid-migration works (with rollback) - [ ] Error handling shows clear messages - [ ] Success message with next steps --- ## STRUCT-008: Testing & Validation **Effort:** 16-20 hours (4 days) **Priority:** High **Depends on:** All other STRUCT tasks ### Description Comprehensive testing of the entire structure system with real projects. 
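The Test Suite below calls `loadProject`, `exportProject`, `importProject`, and `validateRoundTrip` without defining them; a minimal sketch of how these helpers might wrap the STRUCT-002/003 engines (the file location, relative import paths, and temp-directory strategy are assumptions):

```typescript
// packages/noodl-editor/tests/structure/helpers.ts — hypothetical round-trip harness
import * as os from 'os';
import * as path from 'path';
import { mkdtemp } from 'fs/promises';
import * as fs from 'fs-extra';
import { ProjectExporter } from '../../src/editor/src/services/ProjectStructure/Exporter';
import { ProjectImporter } from '../../src/editor/src/services/ProjectStructure/Importer';
import { RoundTripValidator } from '../../src/editor/src/services/ProjectStructure/Validator';
import { LegacyProject } from '../../src/editor/src/services/ProjectStructure/types/legacy';

export async function loadProject(projectPath: string): Promise<LegacyProject> {
  // Fixture projects are stored in the legacy single-file format
  return fs.readJSON(path.join(projectPath, 'project.json'));
}

export async function exportProject(project: LegacyProject): Promise<string> {
  // Export into a throwaway directory so fixtures are never modified
  const outputDir = await mkdtemp(path.join(os.tmpdir(), 'nodegx-roundtrip-'));
  const exporter = new ProjectExporter({ outputDir, preserveOriginalPaths: true, validateOutput: true });
  const result = await exporter.export(project);
  if (!result.success) {
    throw new Error(`Export failed: ${result.errors.join('; ')}`);
  }
  return outputDir;
}

export async function importProject(exportedDir: string): Promise<LegacyProject> {
  return new ProjectImporter().import(exportedDir);
}

export async function validateRoundTrip(original: LegacyProject, imported: LegacyProject) {
  return new RoundTripValidator().validate(original, imported);
}
```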
### Test Suite ```typescript // packages/noodl-editor/tests/structure/ describe('Project Structure Migration', () => { describe('Schema Validation', () => { it('should validate component.json schema'); it('should validate nodes.json schema'); it('should validate connections.json schema'); it('should reject invalid schemas with clear errors'); }); describe('Export Engine', () => { it('should export minimal project'); it('should export complex project with 200+ components'); it('should handle cloud components'); it('should handle special characters in paths'); it('should preserve all AI metadata'); it('should preserve dynamic ports'); it('should preserve component references'); it('should generate valid registry'); }); describe('Import Engine', () => { it('should import exported project'); it('should reconstruct exact legacy format'); it('should pass round-trip validation'); it('should import single component'); }); describe('Round-Trip Validation', () => { testProjects.forEach((project) => { it(`should pass round-trip for ${project.name}`, async () => { const original = await loadProject(project.path); const exported = await exportProject(original); const imported = await importProject(exported); const validation = await validateRoundTrip(original, imported); expect(validation.valid).toBe(true); expect(validation.errors).toHaveLength(0); }); }); }); describe('Performance', () => { it('should export 200 components in <30 seconds'); it('should import 200 components in <5 seconds'); it('should load single component in <100ms'); it('should save single component in <100ms'); }); describe('Edge Cases', () => { it('should handle empty project'); it('should handle project with no connections'); it('should handle circular component references'); it('should handle very deep nesting'); it('should handle components with 1000+ nodes'); it('should handle unicode in component names'); }); }); ``` ### Real Project Test Set Collect 10+ real Noodl projects for testing: - Small project (5-10 components) - Medium project (50-100 components) - Large project (200+ components) - Project with cloud functions - Project with AI-generated code - Project with complex nesting - Project with special characters - Project from old Noodl version ### Acceptance Criteria - [ ] 100% pass rate on unit tests - [ ] 100% pass rate on integration tests - [ ] All real projects pass round-trip - [ ] Performance benchmarks pass - [ ] Edge cases handled --- ## STRUCT-009: Documentation **Effort:** 6-8 hours (1.5 days) **Priority:** Medium **Depends on:** All other STRUCT tasks ### Documentation Files ``` docs/ ├── structure/ │ ├── overview.md # Why we did this │ ├── file-formats.md # Detailed format specs │ ├── migration-guide.md # How to migrate │ ├── troubleshooting.md # Common issues │ └── api-reference.md # For developers └── schemas/ └── README.md # Schema documentation ``` ### Acceptance Criteria - [ ] Overview explains benefits - [ ] File formats fully documented - [ ] Migration guide is step-by-step - [ ] Troubleshooting covers common issues - [ ] API reference for developers --- # PHASE 10B: FRONTEND AI ASSISTANT ## Overview Build an AI assistant that can understand, navigate, and modify Nodegx frontend components using natural language. **Duration:** 6-8 weeks **Total Effort:** 100-130 hours **Priority:** HIGH **Depends on:** PHASE 10A complete ### Vision ``` User: "Add a loading spinner to the UserProfile component while fetching data" AI: [Reading components/UserProfile/...] 
I'll add a loading state with spinner: - Added condition node for loading state - Added Spinner component from shared library - Connected fetch.isPending → spinner.visible [Modified: components/UserProfile/nodes.json] [Modified: components/UserProfile/connections.json] Done! The component now shows a spinner during fetch. ``` --- ## AI-001: Component Reading Tools **Effort:** 12-16 hours (3 days) **Priority:** Critical ### Description Create tools that allow AI to read and understand component structure. ### Tool Definitions ```typescript // packages/noodl-editor/src/editor/src/services/AI/tools/ComponentTools.ts export const componentTools = [ { name: 'list_components', description: 'List all components in the project with summary info', parameters: { type: 'object', properties: { filter: { type: 'string', description: 'Optional filter: "pages", "shared", "cloud", or path prefix' } } }, execute: async ({ filter }) => { const registry = await loadRegistry(); let components = Object.entries(registry.components); if (filter) { components = components.filter(([path]) => path.includes(filter) || path.startsWith(filter)); } return components.map(([path, info]) => ({ path, type: info.type, nodeCount: info.nodeCount, modified: info.modified })); } }, { name: 'read_component', description: 'Read complete component definition including metadata, nodes, and connections', parameters: { type: 'object', properties: { componentPath: { type: 'string', description: 'Path to component, e.g., "pages/HomePage" or "shared/Button"' } }, required: ['componentPath'] }, execute: async ({ componentPath }) => { const componentDir = path.join(projectPath, 'components', componentPath); const [meta, nodes, connections] = await Promise.all([ readJSON(path.join(componentDir, 'component.json')), readJSON(path.join(componentDir, 'nodes.json')), readJSON(path.join(componentDir, 'connections.json')) ]); return { meta, nodes, connections }; } }, { name: 'get_component_dependencies', description: 'Get all components that this component depends on', parameters: { type: 'object', properties: { componentPath: { type: 'string' } }, required: ['componentPath'] }, execute: async ({ componentPath }) => { const meta = await readComponentMeta(componentPath); return { direct: meta.dependencies, transitive: await getTransitiveDependencies(componentPath) }; } }, { name: 'find_components_using', description: 'Find all components that use a specific component or model', parameters: { type: 'object', properties: { targetPath: { type: 'string' }, targetType: { type: 'string', enum: ['component', 'model'] } }, required: ['targetPath'] }, execute: async ({ targetPath, targetType }) => { // Search through all component.json for dependencies const registry = await loadRegistry(); const using: string[] = []; for (const [path] of Object.entries(registry.components)) { const meta = await readComponentMeta(path); if (meta.dependencies?.includes(targetPath)) { using.push(path); } } return using; } }, { name: 'explain_component', description: 'Generate a natural language explanation of what a component does', parameters: { type: 'object', properties: { componentPath: { type: 'string' } }, required: ['componentPath'] }, execute: async ({ componentPath }) => { const { meta, nodes, connections } = await readFullComponent(componentPath); // Analyze structure const analysis = { inputPorts: meta.ports?.inputs || [], outputPorts: meta.ports?.outputs || [], visualNodes: nodes.nodes.filter((n) => isVisualNode(n.type)), logicNodes: nodes.nodes.filter((n) => 
isLogicNode(n.type)), dataNodes: nodes.nodes.filter((n) => isDataNode(n.type)), componentRefs: nodes.nodes.filter((n) => n.type.startsWith('component:')) }; return analysis; } } ]; ``` ### Acceptance Criteria - [ ] All 5 reading tools implemented - [ ] Tools return properly structured data - [ ] Error handling for missing components - [ ] Performance: each tool <500ms --- ## AI-002: Component Modification Tools **Effort:** 16-20 hours (4 days) **Priority:** Critical ### Tool Definitions ```typescript export const modificationTools = [ { name: 'add_node', description: 'Add a new node to a component', parameters: { type: 'object', properties: { componentPath: { type: 'string' }, node: { type: 'object', properties: { type: { type: 'string' }, label: { type: 'string' }, parent: { type: 'string' }, position: { type: 'object', properties: { x: { type: 'number' }, y: { type: 'number' } } }, properties: { type: 'object' } }, required: ['type'] } }, required: ['componentPath', 'node'] }, execute: async ({ componentPath, node }) => { const nodes = await readNodes(componentPath); const newNode = { id: generateNodeId(), ...node, position: node.position || calculatePosition(nodes, node.parent) }; nodes.nodes.push(newNode); // Update parent's children if specified if (node.parent) { const parent = nodes.nodes.find((n) => n.id === node.parent); if (parent) { parent.children = parent.children || []; parent.children.push(newNode.id); } } await saveNodes(componentPath, nodes); return { success: true, nodeId: newNode.id }; } }, { name: 'update_node_property', description: 'Update a property on an existing node', parameters: { type: 'object', properties: { componentPath: { type: 'string' }, nodeId: { type: 'string' }, property: { type: 'string' }, value: {} }, required: ['componentPath', 'nodeId', 'property', 'value'] }, execute: async ({ componentPath, nodeId, property, value }) => { const nodes = await readNodes(componentPath); const node = nodes.nodes.find((n) => n.id === nodeId); if (!node) { return { success: false, error: `Node ${nodeId} not found` }; } node.properties = node.properties || {}; node.properties[property] = value; await saveNodes(componentPath, nodes); return { success: true }; } }, { name: 'add_connection', description: 'Add a connection between two nodes', parameters: { type: 'object', properties: { componentPath: { type: 'string' }, from: { type: 'object', properties: { node: { type: 'string' }, port: { type: 'string' } }, required: ['node', 'port'] }, to: { type: 'object', properties: { node: { type: 'string' }, port: { type: 'string' } }, required: ['node', 'port'] } }, required: ['componentPath', 'from', 'to'] }, execute: async ({ componentPath, from, to }) => { const connections = await readConnections(componentPath); const newConnection = { id: generateConnectionId(), from, to }; connections.connections.push(newConnection); await saveConnections(componentPath, connections); return { success: true, connectionId: newConnection.id }; } }, { name: 'remove_node', description: 'Remove a node and its connections from a component', parameters: { type: 'object', properties: { componentPath: { type: 'string' }, nodeId: { type: 'string' } }, required: ['componentPath', 'nodeId'] }, execute: async ({ componentPath, nodeId }) => { // Remove node const nodes = await readNodes(componentPath); nodes.nodes = nodes.nodes.filter((n) => n.id !== nodeId); // Remove from parent's children nodes.nodes.forEach((n) => { if (n.children?.includes(nodeId)) { n.children = n.children.filter((c) => c !== 
nodeId); } }); // Remove connections const connections = await readConnections(componentPath); connections.connections = connections.connections.filter((c) => c.from.node !== nodeId && c.to.node !== nodeId); await saveNodes(componentPath, nodes); await saveConnections(componentPath, connections); return { success: true }; } }, { name: 'create_component', description: 'Create a new component', parameters: { type: 'object', properties: { name: { type: 'string' }, path: { type: 'string', description: 'e.g., "shared" or "pages"' }, type: { type: 'string', enum: ['visual', 'logic', 'page'] }, inputs: { type: 'array' }, outputs: { type: 'array' } }, required: ['name'] }, execute: async ({ name, path: basePath, type, inputs, outputs }) => { const componentPath = path.join(basePath || 'shared', name); const componentDir = path.join(projectPath, 'components', componentPath); await fs.mkdir(componentDir, { recursive: true }); // Create component.json await writeJSON(path.join(componentDir, 'component.json'), { id: generateComponentId(), name, type: type || 'visual', ports: { inputs: inputs || [], outputs: outputs || [] }, dependencies: [], created: new Date().toISOString(), modified: new Date().toISOString() }); // Create empty nodes.json await writeJSON(path.join(componentDir, 'nodes.json'), { componentId: generateComponentId(), version: 1, nodes: [] }); // Create empty connections.json await writeJSON(path.join(componentDir, 'connections.json'), { componentId: generateComponentId(), version: 1, connections: [] }); // Update registry await updateRegistry(componentPath); return { success: true, componentPath }; } } ]; ``` ### Acceptance Criteria - [ ] All modification tools implemented - [ ] Changes persist correctly - [ ] Undo support works - [ ] Validation prevents invalid states - [ ] Registry updates on changes --- ## AI-003: LangGraph Agent Setup **Effort:** 16-20 hours (4 days) **Priority:** Critical ### Agent Architecture ```typescript // packages/noodl-editor/src/editor/src/services/AI/FrontendAgent.ts import { ChatAnthropic } from '@langchain/anthropic'; import { StateGraph, MemorySaver } from '@langchain/langgraph'; interface AgentState { messages: Message[]; currentComponent: string | null; pendingChanges: Change[]; context: ProjectContext; } export class FrontendAgent { private graph: StateGraph; private memory: MemorySaver; constructor() { this.memory = new MemorySaver(); this.graph = this.buildGraph(); } private buildGraph(): StateGraph { const graph = new StateGraph({ channels: { messages: { default: () => [] }, currentComponent: { default: () => null }, pendingChanges: { default: () => [] }, context: { default: () => ({}) } } }); // Add nodes graph.addNode('understand', this.understandRequest); graph.addNode('gather_context', this.gatherContext); graph.addNode('plan_changes', this.planChanges); graph.addNode('execute_changes', this.executeChanges); graph.addNode('respond', this.generateResponse); // Add edges graph.addEdge('__start__', 'understand'); graph.addConditionalEdges('understand', this.routeAfterUnderstand, { needs_context: 'gather_context', ready_to_plan: 'plan_changes', clarify: 'respond' }); graph.addEdge('gather_context', 'plan_changes'); graph.addEdge('plan_changes', 'execute_changes'); graph.addEdge('execute_changes', 'respond'); graph.addEdge('respond', '__end__'); return graph.compile({ checkpointer: this.memory }); } private understandRequest = async (state: AgentState) => { const llm = new ChatAnthropic({ model: 'claude-sonnet-4-20250514' }); const response = await 
llm.invoke([{ role: 'system', content: UNDERSTAND_PROMPT }, ...state.messages]); return { ...state, understanding: response.content }; }; private gatherContext = async (state: AgentState) => { // Use reading tools to gather needed context const tools = componentTools; // Determine what context is needed const neededComponents = extractComponentReferences(state.understanding); const context = {}; for (const comp of neededComponents) { context[comp] = await tools.find((t) => t.name === 'read_component').execute({ componentPath: comp }); } return { ...state, context }; }; private planChanges = async (state: AgentState) => { const llm = new ChatAnthropic({ model: 'claude-sonnet-4-20250514' }); const response = await llm.invoke([ { role: 'system', content: PLANNING_PROMPT }, { role: 'user', content: JSON.stringify(state.context) }, ...state.messages ]); // Parse planned changes const changes = parseChangePlan(response.content); return { ...state, pendingChanges: changes }; }; private executeChanges = async (state: AgentState) => { const results = []; for (const change of state.pendingChanges) { const tool = modificationTools.find((t) => t.name === change.tool); if (tool) { const result = await tool.execute(change.params); results.push({ change, result }); } } return { ...state, executionResults: results }; }; async process(message: string, threadId: string): Promise { const config = { configurable: { thread_id: threadId } }; const result = await this.graph.invoke( { messages: [{ role: 'user', content: message }] }, config ); return { response: result.messages[result.messages.length - 1].content, changes: result.pendingChanges, success: result.executionResults?.every((r) => r.result.success) }; } } ``` ### Acceptance Criteria - [ ] Agent processes natural language requests - [ ] Context gathering is efficient - [ ] Change planning is accurate - [ ] Execution handles errors gracefully - [ ] Conversation memory persists --- ## AI-004: Conversation Memory & Caching **Effort:** 12-16 hours (3 days) **Priority:** High ### Implementation ```typescript // packages/noodl-editor/src/editor/src/services/AI/ConversationMemory.ts export class ConversationMemory { private messages: Message[] = []; private projectContextCache: Map = new Map(); private maxContextTokens = 100000; private cacheTTL = 5 * 60 * 1000; // 5 minutes addUserMessage(content: string): void { this.messages.push({ role: 'user', content, timestamp: Date.now() }); this.pruneIfNeeded(); } addAssistantMessage(content: string): void { this.messages.push({ role: 'assistant', content, timestamp: Date.now() }); } getMessagesForRequest(): Message[] { // Return messages formatted for API call return this.messages.map((m) => ({ role: m.role, content: m.content })); } getCachedContext(key: string): any | null { const cached = this.projectContextCache.get(key); if (cached && Date.now() - cached.timestamp < this.cacheTTL) { return cached.data; } return null; } setCachedContext(key: string, data: any): void { this.projectContextCache.set(key, { data, timestamp: Date.now() }); } private pruneIfNeeded(): void { // Estimate token count const estimatedTokens = this.messages.reduce((sum, m) => sum + Math.ceil(m.content.length / 4), 0); if (estimatedTokens > this.maxContextTokens * 0.8) { // Summarize older messages this.summarizeOldMessages(); } } private async summarizeOldMessages(): Promise { if (this.messages.length <= 10) return; const oldMessages = this.messages.slice(0, -10); const recentMessages = this.messages.slice(-10); const summary = await 
this.generateSummary(oldMessages); this.messages = [ { role: 'system', content: `Previous conversation summary:\n${summary}`, timestamp: Date.now() }, ...recentMessages ]; } } ``` ### Acceptance Criteria - [ ] Messages persist across turns - [ ] Context caching reduces API calls - [ ] Token count stays within limits - [ ] Summarization works correctly - [ ] Cache invalidation works --- ## AI-005: AI Panel UI **Effort:** 16-20 hours (4 days) **Priority:** High ### UI Component ```typescript // packages/noodl-editor/src/editor/src/views/AIPanel/AIPanel.tsx export function AIPanel() { const [messages, setMessages] = useState([]); const [input, setInput] = useState(''); const [isProcessing, setIsProcessing] = useState(false); const [pendingChanges, setPendingChanges] = useState([]); const agent = useFrontendAgent(); const handleSend = async () => { if (!input.trim() || isProcessing) return; const userMessage = input; setInput(''); setMessages((prev) => [...prev, { role: 'user', content: userMessage }]); setIsProcessing(true); try { const response = await agent.process(userMessage); setMessages((prev) => [...prev, { role: 'assistant', content: response.response }]); if (response.changes?.length > 0) { setPendingChanges(response.changes); } } catch (error) { setMessages((prev) => [ ...prev, { role: 'assistant', content: `Error: ${error.message}`, isError: true } ]); } finally { setIsProcessing(false); } }; return (
    <div className="ai-panel">
      <div className="ai-panel-header">AI Assistant</div>

      {/* Message history; sub-component names below are placeholders */}
      <div className="ai-panel-messages">
        {messages.map((msg, i) => (
          <MessageBubble key={i} message={msg} />
        ))}
        {isProcessing && <ProcessingIndicator />}
      </div>

      {/* Review panel for AI-proposed changes */}
      {pendingChanges.length > 0 && (
        <PendingChangesPanel changes={pendingChanges} onDismiss={() => setPendingChanges([])} />
      )}

      {/* Prompt input */}
      <div className="ai-panel-input">
        <textarea
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Describe what you want to change..."
        />
        <button onClick={handleSend} disabled={isProcessing}>
          Send
        </button>
      </div>
    </div>
  );
}
```
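`useFrontendAgent()` is referenced above but not specified anywhere in this phase. One possible shape, sketched here with an assumed file location, is a thin hook that owns a `FrontendAgent` instance and a stable thread id so the LangGraph `MemorySaver` checkpoints the conversation across turns:

```typescript
// packages/noodl-editor/src/editor/src/views/AIPanel/useFrontendAgent.ts — hypothetical
import { useMemo, useRef } from 'react';
import { FrontendAgent } from '../../services/AI/FrontendAgent';

export function useFrontendAgent() {
  // Stable thread id so conversation memory survives re-renders of the panel
  const threadIdRef = useRef(`ai-panel-${Date.now()}`);

  // One agent per mounted panel; FrontendAgent builds its graph once in the constructor
  const agent = useMemo(() => new FrontendAgent(), []);

  return useMemo(
    () => ({
      process: (message: string) => agent.process(message, threadIdRef.current)
    }),
    [agent]
  );
}
```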