New data query node for Directus backend integration

Richard Osborne
2025-12-30 11:55:30 +01:00
parent 6fd59e83e6
commit ae7d3b8a8b
52 changed files with 17798 additions and 303 deletions


@@ -0,0 +1,739 @@
# AGENT-001: Server-Sent Events (SSE) Node
## Overview
Create a new runtime node that establishes and manages Server-Sent Events (SSE) connections, enabling real-time streaming data from servers to Noodl applications. This is critical for agentic UI patterns where AI backends stream responses incrementally.
**Phase:** 3.5 (Real-Time Agentic UI)
**Priority:** CRITICAL (blocks Erleah development)
**Effort:** 3-5 days
**Risk:** Medium
---
## Problem Statement
### Current Limitation
The existing HTTP Request node only supports request-response patterns:
```
User Action → HTTP Request → Wait → Response → UI Update
```
This doesn't work for streaming use cases:
```
User Action → SSE Connect → Stream messages → Progressive UI Updates
Message 1 → UI Update
Message 2 → UI Update
Message 3 → UI Update
...
```
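For reference, this is the streaming pattern in plain browser JavaScript, which the SSE node wraps; the endpoint URL is illustrative, not part of this spec:
```javascript
// Sketch of the raw EventSource pattern the node wraps.
const source = new EventSource('/chat/123/stream'); // illustrative endpoint

source.onmessage = (event) => {
  // Messages arrive incrementally; update the UI per message rather than once at the end.
  let payload;
  try {
    payload = JSON.parse(event.data);
  } catch (e) {
    payload = event.data; // plain-text message
  }
  console.log('chunk received', payload);
};

source.onerror = () => {
  // The browser retries automatically unless the stream was closed for good.
  console.warn('SSE connection interrupted');
};
```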
### Real-World Use Cases
1. **AI Chat Responses** - Stream tokens as they generate (Erleah)
2. **Live Notifications** - Server pushes updates without polling
3. **Real-Time Dashboards** - Continuous metric updates
4. **Progress Updates** - Long-running backend operations
5. **Event Feeds** - News, social media, activity streams
---
## Goals
1. ✅ Establish and maintain SSE connections
2. ✅ Parse incoming messages (text and JSON)
3. ✅ Handle connection lifecycle (open, error, close)
4. ✅ Auto-reconnect on connection loss
5. ✅ Clean disconnection on node deletion
6. ✅ Support custom event types (via addEventListener)
---
## Technical Design
### Node Specification
```javascript
{
name: 'net.noodl.SSE',
displayNodeName: 'Server-Sent Events',
category: 'Data',
color: 'data',
docs: 'https://docs.noodl.net/nodes/data/sse',
searchTags: ['sse', 'stream', 'server-sent', 'events', 'realtime', 'websocket']
}
```
### Port Schema
#### Inputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `url` | string | Connection | SSE endpoint URL |
| `connect` | signal | Actions | Establish connection |
| `disconnect` | signal | Actions | Close connection |
| `autoReconnect` | boolean | Connection | Auto-reconnect on disconnect (default: true) |
| `reconnectDelay` | number | Connection | Delay before reconnect (ms, default: 3000) |
| `withCredentials` | boolean | Connection | Include credentials (default: false) |
| `customHeaders` | object | Connection | Custom headers (not supported by EventSource; document this limitation) |
#### Outputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `message` | object | Data | Raw message event object |
| `data` | * | Data | Parsed message data (JSON or text) |
| `eventType` | string | Data | Event type (if custom events used) |
| `connected` | signal | Events | Fired when connection opens |
| `disconnected` | signal | Events | Fired when connection closes |
| `error` | string | Events | Error message |
| `isConnected` | boolean | Status | Current connection state |
| `lastMessageTime` | number | Status | Timestamp of last message (ms) |
### State Machine
```
┌─────────────┐
START │ │
────┬─→│ DISCONNECTED│←──┐
│ │ │ │
│ └─────────────┘ │
│ │ │
│ [connect] │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ CONNECTING │ │
│ └─────────────┘ │
│ │ │
│ [onopen] │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
└──│ CONNECTED │ │
│ │ │
└─────────────┘ │
│ │ │
[onmessage] │
│ │ │
[onerror/close] │
│ │
└───────────┘
[autoReconnect
after delay]
```
---
## Implementation Details
### File Structure
```
packages/noodl-runtime/src/nodes/std-library/data/
├── ssenode.js # Main node implementation
└── ssenode.test.js # Unit tests
```
### Core Implementation
```javascript
var SSENode = {
name: 'net.noodl.SSE',
displayNodeName: 'Server-Sent Events',
category: 'Data',
color: 'data',
initialize: function() {
this._internal.eventSource = null;
this._internal.isConnected = false;
this._internal.messageBuffer = [];
this._internal.reconnectTimer = null;
this._internal.customEventListeners = new Map();
},
inputs: {
url: {
type: 'string',
displayName: 'URL',
group: 'Connection',
set: function(value) {
this._internal.url = value;
}
},
connect: {
type: 'signal',
displayName: 'Connect',
group: 'Actions',
valueChangedToTrue: function() {
this.doConnect();
}
},
disconnect: {
type: 'signal',
displayName: 'Disconnect',
group: 'Actions',
valueChangedToTrue: function() {
this.doDisconnect();
}
},
autoReconnect: {
type: 'boolean',
displayName: 'Auto Reconnect',
group: 'Connection',
default: true,
set: function(value) {
this._internal.autoReconnect = value;
}
},
reconnectDelay: {
type: 'number',
displayName: 'Reconnect Delay (ms)',
group: 'Connection',
default: 3000,
set: function(value) {
this._internal.reconnectDelay = value;
}
},
withCredentials: {
type: 'boolean',
displayName: 'With Credentials',
group: 'Connection',
default: false,
set: function(value) {
this._internal.withCredentials = value;
}
},
// For custom event types (beyond 'message')
eventType: {
type: 'string',
displayName: 'Listen for Event Type',
group: 'Connection',
set: function(value) {
this._internal.customEventType = value;
if (this._internal.eventSource && value) {
this.addCustomEventListener(value);
}
}
}
},
outputs: {
message: {
type: 'object',
displayName: 'Message',
group: 'Data',
getter: function() {
return this._internal.lastMessage;
}
},
data: {
type: '*',
displayName: 'Data',
group: 'Data',
getter: function() {
return this._internal.lastData;
}
},
eventType: {
type: 'string',
displayName: 'Event Type',
group: 'Data',
getter: function() {
return this._internal.lastEventType;
}
},
connected: {
type: 'signal',
displayName: 'Connected',
group: 'Events'
},
disconnected: {
type: 'signal',
displayName: 'Disconnected',
group: 'Events'
},
messageReceived: {
type: 'signal',
displayName: 'Message Received',
group: 'Events'
},
error: {
type: 'string',
displayName: 'Error',
group: 'Events',
getter: function() {
return this._internal.lastError;
}
},
isConnected: {
type: 'boolean',
displayName: 'Is Connected',
group: 'Status',
getter: function() {
return this._internal.isConnected;
}
},
lastMessageTime: {
type: 'number',
displayName: 'Last Message Time',
group: 'Status',
getter: function() {
return this._internal.lastMessageTime;
}
}
},
methods: {
doConnect: function() {
// Disconnect existing connection if any
if (this._internal.eventSource) {
this.doDisconnect();
}
const url = this._internal.url;
if (!url) {
this.setError('URL is required');
return;
}
try {
// Create EventSource with options
const options = {
withCredentials: this._internal.withCredentials || false
};
const eventSource = new EventSource(url, options);
this._internal.eventSource = eventSource;
// Connection opened
eventSource.onopen = () => {
this._internal.isConnected = true;
this._internal.lastError = null;
this.flagOutputDirty('isConnected');
this.flagOutputDirty('error');
this.sendSignalOnOutput('connected');
console.log('[SSE Node] Connected to', url);
};
// Message received (default event type)
eventSource.onmessage = (event) => {
this.handleMessage(event, 'message');
};
// Connection error/closed
eventSource.onerror = (error) => {
console.error('[SSE Node] Connection error:', error);
const wasConnected = this._internal.isConnected;
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
if (wasConnected) {
this.sendSignalOnOutput('disconnected');
}
// Check if connection is permanently closed
if (eventSource.readyState === EventSource.CLOSED) {
this.setError('Connection closed');
// Auto-reconnect if enabled
if (this._internal.autoReconnect) {
const delay = this._internal.reconnectDelay || 3000;
console.log(`[SSE Node] Reconnecting in ${delay}ms...`);
this._internal.reconnectTimer = setTimeout(() => {
this.doConnect();
}, delay);
}
}
};
// Add custom event listener if specified
if (this._internal.customEventType) {
this.addCustomEventListener(this._internal.customEventType);
}
} catch (e) {
this.setError(e.message);
console.error('[SSE Node] Failed to connect:', e);
}
},
doDisconnect: function() {
// Clear reconnect timer
if (this._internal.reconnectTimer) {
clearTimeout(this._internal.reconnectTimer);
this._internal.reconnectTimer = null;
}
// Close connection
if (this._internal.eventSource) {
this._internal.eventSource.close();
this._internal.eventSource = null;
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
this.sendSignalOnOutput('disconnected');
console.log('[SSE Node] Disconnected');
}
},
addCustomEventListener: function(eventType) {
if (!this._internal.eventSource || !eventType) return;
// Remove old listener if exists
const oldListener = this._internal.customEventListeners.get(eventType);
if (oldListener) {
this._internal.eventSource.removeEventListener(eventType, oldListener);
}
// Add new listener
const listener = (event) => {
this.handleMessage(event, eventType);
};
this._internal.eventSource.addEventListener(eventType, listener);
this._internal.customEventListeners.set(eventType, listener);
},
handleMessage: function(event, type) {
this._internal.lastMessageTime = Date.now();
this._internal.messageCount = (this._internal.messageCount || 0) + 1;
this._internal.lastEventType = type || 'message';
this._internal.lastMessage = {
data: event.data,
lastEventId: event.lastEventId,
type: type || 'message'
};
// Try to parse as JSON
try {
this._internal.lastData = JSON.parse(event.data);
} catch (e) {
// Not JSON, use raw string
this._internal.lastData = event.data;
}
this.flagOutputDirty('message');
this.flagOutputDirty('data');
this.flagOutputDirty('eventType');
this.flagOutputDirty('lastMessageTime');
this.sendSignalOnOutput('messageReceived');
},
setError: function(message) {
this._internal.lastError = message;
this.flagOutputDirty('error');
},
_onNodeDeleted: function() {
this.doDisconnect();
}
},
getInspectInfo: function() {
if (this._internal.isConnected) {
return {
type: 'value',
value: {
status: 'Connected',
url: this._internal.url,
lastMessage: this._internal.lastData,
messageCount: this._internal.messageCount || 0
}
};
}
return {
type: 'text',
value: 'Disconnected'
};
}
};
module.exports = {
node: SSENode
};
```
---
## Usage Examples
### Example 1: AI Chat Streaming
```
[Text Input: "Hello AI"]
→ [Send] signal
→ [HTTP Request] POST /chat/start → returns chatId
→ [String Format] "/chat/{chatId}/stream"
→ [SSE Node] url
→ [SSE Node] connect
[SSE Node] data
→ [Array] accumulate messages
→ [Repeater] render messages
[SSE Node] messageReceived
→ [Scroll To Bottom] in chat container
```
### Example 2: Live Notifications
```
[Component Mounted] signal
→ [SSE Node] connect to "/notifications/stream"
[SSE Node] data
→ [Show Toast] notification
[SSE Node] connected
→ [Variable] isLive = true → [Visual indicator]
[SSE Node] disconnected
→ [Variable] isLive = false → [Show offline banner]
```
### Example 3: Progress Tracker
```
[Start Upload] signal
→ [HTTP Request] POST /upload → returns uploadId
→ [SSE Node] connect to "/upload/{uploadId}/progress"
[SSE Node] data → { percent: number }
→ [Progress Bar] value
[SSE Node] data → { status: "complete" }
→ [SSE Node] disconnect
→ [Navigate] to success page
```
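For local testing of these flows, a throwaway streaming endpoint is handy. A minimal sketch assuming Node.js with Express (neither is a dependency of this spec):
```javascript
// Minimal Express endpoint sketch for testing the SSE node locally.
const express = require('express');
const app = express();

app.get('/stream', (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive'
  });
  res.flushHeaders();

  let count = 0;
  const timer = setInterval(() => {
    // Each SSE message is "data: <payload>\n\n"; JSON payloads exercise the node's parser.
    res.write(`data: ${JSON.stringify({ count: ++count })}\n\n`);
  }, 1000);

  req.on('close', () => clearInterval(timer)); // stop when the client disconnects
});

app.listen(3000);
```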
---
## Testing Checklist
### Functional Tests
- [ ] Connection establishes to valid SSE endpoint
- [ ] Connection fails gracefully with invalid URL
- [ ] Messages are received and parsed correctly
- [ ] JSON messages parse to objects
- [ ] Plain text messages output as strings
- [ ] `connected` signal fires on open
- [ ] `disconnected` signal fires on close
- [ ] `messageReceived` signal fires for each message
- [ ] `isConnected` reflects current state
- [ ] Auto-reconnect works after disconnect
- [ ] Reconnect delay is respected
- [ ] Manual disconnect stops auto-reconnect
- [ ] Custom event types are received
- [ ] withCredentials flag works correctly
- [ ] Node cleanup works (no memory leaks)
### Edge Cases
- [ ] Handles rapid connect/disconnect cycles
- [ ] Handles very large messages (>1MB)
- [ ] Handles malformed JSON gracefully
- [ ] Handles server sending error events
- [ ] Handles network going offline/online
- [ ] Handles multiple SSE nodes simultaneously
- [ ] Connection closes cleanly on component unmount
- [ ] Reconnect timer clears on manual disconnect
### Performance
- [ ] Memory usage stable over 1000+ messages
- [ ] No visible UI lag during streaming
- [ ] Message parsing doesn't block main thread
- [ ] Multiple connections don't interfere
---
## Browser Compatibility
EventSource (SSE) is supported in:
| Browser | Support |
|---------|---------|
| Chrome | ✅ Yes |
| Firefox | ✅ Yes |
| Safari | ✅ Yes |
| Edge | ✅ Yes |
| IE 11 | ❌ No (polyfill available) |
For IE 11 support, use [event-source-polyfill](https://www.npmjs.com/package/event-source-polyfill).
---
## Limitations & Workarounds
### Limitation 1: No Custom Headers
The EventSource API doesn't support custom headers (such as `Authorization`).
**Workaround:**
- Send auth token as query parameter: `/stream?token=abc123` (see the sketch below)
- Use cookie-based authentication (withCredentials: true)
- Or use WebSocket instead (AGENT-002)
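A minimal sketch of the first two options, with an illustrative endpoint and placeholder token:
```javascript
// Option 1: auth token as a query parameter (EventSource cannot set headers).
const authToken = 'abc123'; // placeholder; obtain from your auth flow
const source = new EventSource(
  `https://api.example.com/stream?token=${encodeURIComponent(authToken)}`
);

// Option 2: cookie-based auth; the browser attaches cookies when withCredentials is true.
const cookieSource = new EventSource('https://api.example.com/stream', {
  withCredentials: true
});
```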
### Limitation 2: No Request Body
SSE requests are GET-only, so they can't carry a request body.
**Workaround:**
- Create session/channel via POST first
- Connect SSE to session-specific URL: `/stream/{sessionId}` (sketched below)
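A sketch of that handshake, with illustrative `/chat` endpoints:
```javascript
// Create the session with a normal POST (which can carry a body),
// then attach the SSE stream to the session-specific URL.
async function startStream(prompt) {
  const res = await fetch('/chat/start', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt })
  });
  const { sessionId } = await res.json();

  return new EventSource(`/chat/${sessionId}/stream`); // the stream itself carries no body
}
```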
### Limitation 3: One-Way Communication
Server → Client only; the client can't send messages over the same connection.
**Workaround:**
- Use separate HTTP requests for client → server
- Or use WebSocket for bidirectional (AGENT-002)
---
## Documentation Requirements
### User-Facing Docs
Create: `docs/nodes/data/sse.md`
````markdown
# Server-Sent Events
Stream real-time data from your server to your Noodl app using Server-Sent Events (SSE). Perfect for live notifications, AI chat responses, progress updates, and real-time dashboards.
## When to Use
- **AI Chat**: Stream tokens as they generate
- **Notifications**: Push updates without polling
- **Dashboards**: Continuous metric updates
- **Progress**: Long-running operation status
## Basic Usage
1. Add SSE node to your component
2. Set the `URL` to your SSE endpoint
3. Send the `Connect` signal
4. Use the `data` output to access messages
5. Listen to `messageReceived` to trigger actions
## Example: Live Chat
[Screenshot showing SSE node connected to chat UI]
## Authentication
Since EventSource doesn't support custom headers, use query parameters:
```
URL: https://api.example.com/stream?token={authToken}
```
Or enable `With Credentials` for cookie-based auth.
## Auto-Reconnect
By default, SSE nodes auto-reconnect if connection drops. Configure:
- `Auto Reconnect`: Enable/disable
- `Reconnect Delay`: Wait time in milliseconds
## Custom Events
Servers can send named events. Use the `Event Type` input to listen for specific events.
```javascript
// Server sends:
event: notification
data: {"message": "New user signed up"}
// Noodl receives on 'notification' event type
```
````
### Technical Docs
Add to: `dev-docs/reference/NODE-PATTERNS.md`
Section on streaming nodes.
---
## Success Criteria
1. ✅ SSE node successfully streams data in test app
2. ✅ Auto-reconnect works reliably
3. ✅ No memory leaks over extended usage
4. ✅ Clear documentation with examples
5. ✅ Works in Erleah prototype for AI chat streaming
---
## Future Enhancements
Post-MVP features to consider:
1. **Message Buffering** - Store last N messages for replays
2. **Rate Limiting** - Throttle message processing
3. **Event Filtering** - Filter messages by criteria before output
4. **Last Event ID** - Resume from last message on reconnect
5. **Connection Pooling** - Share connection across components
---
## References
- [MDN: EventSource](https://developer.mozilla.org/en-US/docs/Web/API/EventSource)
- [HTML Living Standard: SSE](https://html.spec.whatwg.org/multipage/server-sent-events.html)
- [SSE vs WebSocket](https://ably.com/topic/server-sent-events-vs-websockets)
---
## Dependencies
- None (uses native EventSource API)
## Blocked By
- None
## Blocks
- AGENT-007 (Stream Parser Utilities) - needs SSE for testing
- Erleah development - requires streaming AI responses
---
## Estimated Effort Breakdown
| Phase | Estimate | Description |
|-------|----------|-------------|
| Core Implementation | 1 day | Basic SSE node with connect/disconnect/message |
| Error Handling | 0.5 day | Graceful failures, reconnect logic |
| Testing | 1 day | Unit tests, integration tests, edge cases |
| Documentation | 0.5 day | User docs, technical docs, examples |
| Edge Cases & Polish | 0.5-1 day | Performance, memory, browser compat |
**Total: 3.5-4 days**
Buffer: +1 day for unexpected issues = **4-5 days total**


@@ -0,0 +1,923 @@
# AGENT-002: WebSocket Node
## Overview
Create a new runtime node that establishes and manages WebSocket connections, enabling bidirectional real-time communication between Noodl applications and servers. This complements SSE (AGENT-001) by supporting two-way messaging patterns.
**Phase:** 3.5 (Real-Time Agentic UI)
**Priority:** HIGH
**Effort:** 3-5 days
**Risk:** Medium
---
## Problem Statement
### Current Limitation
The existing HTTP Request node and SSE node (AGENT-001) only support one-way communication:
```
HTTP: Client → Server → Response (one-shot)
SSE: Server → Client (one-way stream)
```
WebSocket enables true bidirectional communication:
```
WebSocket: Client ⇄ Server (continuous two-way)
```
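In plain browser JavaScript (which the node wraps), the same socket both sends and receives; the URL is illustrative:
```javascript
const socket = new WebSocket('wss://example.com/ws'); // illustrative endpoint

socket.onopen = () => {
  // Client → server
  socket.send(JSON.stringify({ type: 'hello' }));
};

socket.onmessage = (event) => {
  // Server → client, over the same connection
  console.log('received', event.data);
};
```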
### Real-World Use Cases
1. **Collaborative Editing** - Multiple users editing same document
2. **Gaming** - Real-time multiplayer interactions
3. **Chat Applications** - Send and receive messages
4. **Live Cursors** - Show where other users are pointing
5. **Device Control** - Send commands, receive telemetry
6. **Trading Platforms** - Real-time price updates + order placement
---
## Goals
1. ✅ Establish and maintain WebSocket connections
2. ✅ Send messages (text and binary)
3. ✅ Receive messages (text and binary)
4. ✅ Handle connection lifecycle (open, error, close)
5. ✅ Auto-reconnect with exponential backoff
6. ✅ Ping/pong heartbeat for connection health
7. ✅ Queue messages when disconnected
---
## Technical Design
### Node Specification
```javascript
{
name: 'net.noodl.WebSocket',
displayNodeName: 'WebSocket',
category: 'Data',
color: 'data',
docs: 'https://docs.noodl.net/nodes/data/websocket',
searchTags: ['websocket', 'ws', 'realtime', 'bidirectional', 'socket']
}
```
### Port Schema
#### Inputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `url` | string | Connection | WebSocket URL (ws:// or wss://) |
| `connect` | signal | Actions | Establish connection |
| `disconnect` | signal | Actions | Close connection |
| `send` | signal | Actions | Send message |
| `message` | * | Message | Data to send (JSON serialized if object) |
| `messageType` | enum | Message | 'text' or 'binary' (default: text) |
| `autoReconnect` | boolean | Connection | Auto-reconnect on disconnect (default: true) |
| `reconnectDelay` | number | Connection | Initial delay (ms, default: 1000) |
| `maxReconnectDelay` | number | Connection | Max delay with backoff (ms, default: 30000) |
| `protocols` | string | Connection | Comma-separated subprotocols |
| `queueWhenDisconnected` | boolean | Message | Queue messages while offline (default: true) |
| `heartbeatInterval` | number | Connection | Ping interval (ms, 0=disabled, default: 30000) |
#### Outputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `received` | * | Data | Received message data |
| `receivedRaw` | string | Data | Raw message string |
| `messageReceived` | signal | Events | Fired when message arrives |
| `messageSent` | signal | Events | Fired after send succeeds |
| `connected` | signal | Events | Fired when connection opens |
| `disconnected` | signal | Events | Fired when connection closes |
| `error` | string | Events | Error message |
| `isConnected` | boolean | Status | Current connection state |
| `queueSize` | number | Status | Messages waiting to send |
| `latency` | number | Status | Round-trip time (ms) |
### State Machine
```
┌─────────────┐
START │ │
────┬─→│ DISCONNECTED│←──┐
│ │ │ │
│ └─────────────┘ │
│ │ │
│ [connect] │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
│ │ CONNECTING │ │
│ └─────────────┘ │
│ │ │
│ [onopen] │
│ │ │
│ ▼ │
│ ┌─────────────┐ │
└──│ CONNECTED │ │
│ │ │
│ [send/recv]│ │
│ │ │
└─────────────┘ │
│ │ │
[onclose/error] │
│ │
└───────────┘
[autoReconnect
with backoff]
```
---
## Implementation Details
### File Structure
```
packages/noodl-runtime/src/nodes/std-library/data/
├── websocketnode.js # Main node implementation
└── websocketnode.test.js # Unit tests
```
### Core Implementation
```javascript
var WebSocketNode = {
name: 'net.noodl.WebSocket',
displayNodeName: 'WebSocket',
category: 'Data',
color: 'data',
initialize: function() {
this._internal.socket = null;
this._internal.isConnected = false;
this._internal.messageQueue = [];
this._internal.reconnectAttempts = 0;
this._internal.reconnectTimer = null;
this._internal.heartbeatTimer = null;
this._internal.lastPingTime = 0;
},
inputs: {
url: {
type: 'string',
displayName: 'URL',
group: 'Connection',
set: function(value) {
this._internal.url = value;
}
},
connect: {
type: 'signal',
displayName: 'Connect',
group: 'Actions',
valueChangedToTrue: function() {
this.doConnect();
}
},
disconnect: {
type: 'signal',
displayName: 'Disconnect',
group: 'Actions',
valueChangedToTrue: function() {
this.doDisconnect();
}
},
send: {
type: 'signal',
displayName: 'Send',
group: 'Actions',
valueChangedToTrue: function() {
this.doSend();
}
},
message: {
type: '*',
displayName: 'Message',
group: 'Message',
set: function(value) {
this._internal.messageToSend = value;
}
},
messageType: {
type: {
name: 'enum',
enums: [
{ label: 'Text', value: 'text' },
{ label: 'Binary', value: 'binary' }
]
},
displayName: 'Message Type',
group: 'Message',
default: 'text',
set: function(value) {
this._internal.messageType = value;
}
},
autoReconnect: {
type: 'boolean',
displayName: 'Auto Reconnect',
group: 'Connection',
default: true,
set: function(value) {
this._internal.autoReconnect = value;
}
},
reconnectDelay: {
type: 'number',
displayName: 'Reconnect Delay (ms)',
group: 'Connection',
default: 1000,
set: function(value) {
this._internal.reconnectDelay = value;
}
},
maxReconnectDelay: {
type: 'number',
displayName: 'Max Reconnect Delay (ms)',
group: 'Connection',
default: 30000,
set: function(value) {
this._internal.maxReconnectDelay = value;
}
},
protocols: {
type: 'string',
displayName: 'Protocols',
group: 'Connection',
set: function(value) {
this._internal.protocols = value;
}
},
queueWhenDisconnected: {
type: 'boolean',
displayName: 'Queue When Disconnected',
group: 'Message',
default: true,
set: function(value) {
this._internal.queueWhenDisconnected = value;
}
},
heartbeatInterval: {
type: 'number',
displayName: 'Heartbeat Interval (ms)',
group: 'Connection',
default: 30000,
set: function(value) {
this._internal.heartbeatInterval = value;
if (this._internal.isConnected) {
this.startHeartbeat();
}
}
}
},
outputs: {
received: {
type: '*',
displayName: 'Received',
group: 'Data',
getter: function() {
return this._internal.lastReceived;
}
},
receivedRaw: {
type: 'string',
displayName: 'Received Raw',
group: 'Data',
getter: function() {
return this._internal.lastReceivedRaw;
}
},
messageReceived: {
type: 'signal',
displayName: 'Message Received',
group: 'Events'
},
messageSent: {
type: 'signal',
displayName: 'Message Sent',
group: 'Events'
},
connected: {
type: 'signal',
displayName: 'Connected',
group: 'Events'
},
disconnected: {
type: 'signal',
displayName: 'Disconnected',
group: 'Events'
},
error: {
type: 'string',
displayName: 'Error',
group: 'Events',
getter: function() {
return this._internal.lastError;
}
},
isConnected: {
type: 'boolean',
displayName: 'Is Connected',
group: 'Status',
getter: function() {
return this._internal.isConnected;
}
},
queueSize: {
type: 'number',
displayName: 'Queue Size',
group: 'Status',
getter: function() {
return this._internal.messageQueue.length;
}
},
latency: {
type: 'number',
displayName: 'Latency (ms)',
group: 'Status',
getter: function() {
return this._internal.latency || 0;
}
}
},
methods: {
doConnect: function() {
// Disconnect existing
if (this._internal.socket) {
this.doDisconnect();
}
const url = this._internal.url;
if (!url) {
this.setError('URL is required');
return;
}
if (!url.startsWith('ws://') && !url.startsWith('wss://')) {
this.setError('URL must start with ws:// or wss://');
return;
}
try {
// Parse protocols
const protocols = this._internal.protocols
? this._internal.protocols.split(',').map(p => p.trim())
: undefined;
const socket = new WebSocket(url, protocols);
this._internal.socket = socket;
// Connection opened
socket.onopen = () => {
this._internal.isConnected = true;
this._internal.reconnectAttempts = 0;
this._internal.lastError = null;
this.flagOutputDirty('isConnected');
this.flagOutputDirty('error');
this.sendSignalOnOutput('connected');
// Start heartbeat
this.startHeartbeat();
// Flush queued messages
this.flushMessageQueue();
console.log('[WebSocket] Connected to', url);
};
// Message received
socket.onmessage = (event) => {
this.handleMessage(event.data);
};
// Connection closed
socket.onclose = (event) => {
console.log('[WebSocket] Closed:', event.code, event.reason);
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
this.sendSignalOnOutput('disconnected');
this.stopHeartbeat();
// Auto-reconnect
if (this._internal.autoReconnect && !event.wasClean) {
this.scheduleReconnect();
}
};
// Connection error
socket.onerror = (error) => {
console.error('[WebSocket] Error:', error);
this.setError('Connection error');
};
} catch (e) {
this.setError(e.message);
console.error('[WebSocket] Failed to connect:', e);
}
},
doDisconnect: function() {
// Clear timers
if (this._internal.reconnectTimer) {
clearTimeout(this._internal.reconnectTimer);
this._internal.reconnectTimer = null;
}
this.stopHeartbeat();
// Close socket
if (this._internal.socket) {
this._internal.socket.close(1000, 'Client disconnect');
this._internal.socket = null;
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
console.log('[WebSocket] Disconnected');
}
},
doSend: function() {
const message = this._internal.messageToSend;
if (message === undefined || message === null) {
return;
}
// If not connected, queue or drop
if (!this._internal.isConnected) {
if (this._internal.queueWhenDisconnected) {
this._internal.messageQueue.push(message);
this.flagOutputDirty('queueSize');
console.log('[WebSocket] Message queued (disconnected)');
} else {
this.setError('Cannot send: not connected');
}
return;
}
try {
const socket = this._internal.socket;
const messageType = this._internal.messageType || 'text';
// Serialize based on type
let data;
if (messageType === 'binary') {
// Convert to ArrayBuffer or Blob
if (typeof message === 'string') {
data = new TextEncoder().encode(message);
} else {
data = message;
}
} else {
// Text mode - serialize objects as JSON
if (typeof message === 'object') {
data = JSON.stringify(message);
} else {
data = String(message);
}
}
socket.send(data);
this.sendSignalOnOutput('messageSent');
} catch (e) {
this.setError('Send failed: ' + e.message);
}
},
handleMessage: function(data) {
this._internal.lastReceivedRaw = data;
// Try to parse as JSON
try {
this._internal.lastReceived = JSON.parse(data);
} catch (e) {
// Not JSON, use raw
this._internal.lastReceived = data;
}
// Check if it's a pong response
if (data === 'pong' && this._internal.lastPingTime) {
this._internal.latency = Date.now() - this._internal.lastPingTime;
this.flagOutputDirty('latency');
return; // Don't emit messageReceived for pong
}
this.flagOutputDirty('received');
this.flagOutputDirty('receivedRaw');
this.sendSignalOnOutput('messageReceived');
},
flushMessageQueue: function() {
const queue = this._internal.messageQueue;
if (queue.length === 0) return;
console.log(`[WebSocket] Flushing ${queue.length} queued messages`);
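// Note: doSend() reads messageToSend, so each queued item temporarily replaces the current input value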
while (queue.length > 0) {
this._internal.messageToSend = queue.shift();
this.doSend();
}
this.flagOutputDirty('queueSize');
},
scheduleReconnect: function() {
const baseDelay = this._internal.reconnectDelay || 1000;
const maxDelay = this._internal.maxReconnectDelay || 30000;
const attempts = this._internal.reconnectAttempts;
// Exponential backoff: baseDelay * 2^attempts
const delay = Math.min(baseDelay * Math.pow(2, attempts), maxDelay);
console.log(`[WebSocket] Reconnecting in ${delay}ms (attempt ${attempts + 1})`);
this._internal.reconnectTimer = setTimeout(() => {
this._internal.reconnectAttempts++;
this.doConnect();
}, delay);
},
startHeartbeat: function() {
this.stopHeartbeat();
const interval = this._internal.heartbeatInterval;
if (!interval || interval <= 0) return;
this._internal.heartbeatTimer = setInterval(() => {
if (this._internal.isConnected) {
this._internal.lastPingTime = Date.now();
try {
this._internal.socket.send('ping');
} catch (e) {
console.error('[WebSocket] Heartbeat failed:', e);
}
}
}, interval);
},
stopHeartbeat: function() {
if (this._internal.heartbeatTimer) {
clearInterval(this._internal.heartbeatTimer);
this._internal.heartbeatTimer = null;
}
},
setError: function(message) {
this._internal.lastError = message;
this.flagOutputDirty('error');
},
_onNodeDeleted: function() {
this.doDisconnect();
}
},
getInspectInfo: function() {
if (this._internal.isConnected) {
return {
type: 'value',
value: {
status: 'Connected',
url: this._internal.url,
latency: (this._internal.latency || 0) + 'ms',
queueSize: this._internal.messageQueue.length
}
};
}
return {
type: 'text',
value: 'Disconnected'
};
}
};
module.exports = {
node: WebSocketNode
};
```
---
## Usage Examples
### Example 1: Chat Application
```
[Text Input: message]
→ [WebSocket] message
[Send Button] clicked
→ [WebSocket] send signal
[WebSocket] connected
→ [Variable] isOnline = true
[WebSocket] received
→ [Array] add to messages
→ [Repeater] render chat
[WebSocket] disconnected
→ [Show Toast] "Connection lost, reconnecting..."
```
### Example 2: Collaborative Cursors
```
[Mouse Move Event]
→ [Debounce] 100ms
→ [Object] { x, y, userId }
→ [WebSocket] message
→ [WebSocket] send
[WebSocket] received → { x, y, userId }
→ [Array Filter] exclude own cursor
→ [Repeater] render other cursors
```
### Example 3: Real-Time Game
```
// Send player action
[Keyboard Event: space]
→ [Object] { action: "jump", timestamp: Date.now() }
→ [WebSocket] message
→ [WebSocket] send
// Receive game state
[WebSocket] received → { players: [], score: 100 }
→ [For Each] in players
→ [Sprite] update positions
```
### Example 4: IoT Device Control
```
// Send command
[Toggle Switch] changed
→ [Object] { device: "light-1", state: value }
→ [WebSocket] send
// Receive telemetry
[WebSocket] received → { temperature: 72, humidity: 45 }
→ [Number] temperature
→ [Gauge] display
```
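For local testing of these flows, a minimal echo server sketch, assuming the `ws` npm package (not a dependency of this spec):
```javascript
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    const text = data.toString();
    // Answer the node's heartbeat so the latency output gets a value.
    if (text === 'ping') {
      socket.send('pong');
      return;
    }
    // Echo everything else back to exercise received/messageReceived.
    socket.send(text);
  });
});
```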
---
## Testing Checklist
### Functional Tests
- [ ] Connection establishes to valid WebSocket server
- [ ] Connection fails gracefully with invalid URL
- [ ] Can send text messages
- [ ] Can send JSON objects (auto-serialized)
- [ ] Can receive text messages
- [ ] Can receive JSON messages (auto-parsed)
- [ ] `connected` signal fires on open
- [ ] `disconnected` signal fires on close
- [ ] `messageSent` signal fires after send
- [ ] `messageReceived` signal fires on receive
- [ ] `isConnected` reflects current state
- [ ] Auto-reconnect works after disconnect
- [ ] Exponential backoff increases delay
- [ ] Messages queue when disconnected
- [ ] Queued messages flush on reconnect
- [ ] Manual disconnect stops auto-reconnect
- [ ] Heartbeat sends ping messages
- [ ] Latency calculated from ping/pong
- [ ] Subprotocols negotiated correctly
- [ ] Node cleanup works (no memory leaks)
### Edge Cases
- [ ] Handles rapid connect/disconnect cycles
- [ ] Handles very large messages (>1MB)
- [ ] Handles binary data correctly
- [ ] Handles server closing connection unexpectedly
- [ ] Handles network going offline/online
- [ ] Queue doesn't grow unbounded
- [ ] Multiple WebSocket nodes don't interfere
- [ ] Connection closes cleanly on component unmount
### Performance
- [ ] Memory usage stable over 1000+ messages
- [ ] No visible UI lag during high-frequency messages
- [ ] Queue processing doesn't block main thread
- [ ] Heartbeat doesn't impact performance
---
## WebSocket vs SSE Decision Guide
Help users choose between WebSocket (AGENT-002) and SSE (AGENT-001):
| Feature | SSE | WebSocket |
|---------|-----|-----------|
| **Direction** | Server → Client only | Bidirectional |
| **Protocol** | HTTP/1.1, HTTP/2 | WebSocket protocol |
| **Auto-reconnect** | Native browser behavior | Manual implementation |
| **Message format** | Text (typically JSON) | Text or Binary |
| **Firewall friendly** | ✅ Yes (uses HTTP) | ⚠️ Sometimes blocked |
| **Complexity** | Simpler | More complex |
| **Use when** | Server pushes updates | Client needs to send data |
**Rule of thumb:**
- Need to **receive** updates → SSE
- Need to **send and receive** → WebSocket
---
## Browser Compatibility
WebSocket is supported in all modern browsers:
| Browser | Support |
|---------|---------|
| Chrome | ✅ Yes |
| Firefox | ✅ Yes |
| Safari | ✅ Yes |
| Edge | ✅ Yes |
| IE 11 | ⚠️ Partial (no binary frames) |
---
## Security Considerations
### 1. Use WSS (WebSocket Secure)
Always use `wss://` in production, not `ws://`. This encrypts traffic.
### 2. Authentication
The browser WebSocket API doesn't support custom headers. Options:
- Send an auth token as the first message after connect (see the sketch below)
- Include token in URL query: `wss://example.com/ws?token=abc123`
- Use cookie-based authentication
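A sketch of the first-message option; the message shape and token are illustrative and must match whatever the backend expects:
```javascript
const socket = new WebSocket('wss://api.example.com/ws');

socket.onopen = () => {
  // Send credentials immediately after the connection opens, before any application messages.
  socket.send(JSON.stringify({ type: 'auth', token: 'abc123' /* placeholder */ }));
};
```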
### 3. Message Validation
Always validate received messages server-side. Don't trust client data.
### 4. Rate Limiting
Implement server-side rate limiting to prevent abuse.
---
## Documentation Requirements
### User-Facing Docs
Create: `docs/nodes/data/websocket.md`
````markdown
# WebSocket
Enable real-time bidirectional communication with WebSocket servers. Perfect for chat, collaborative editing, gaming, and live dashboards.
## When to Use
- **Chat**: Send and receive messages
- **Collaboration**: Real-time multi-user editing
- **Gaming**: Multiplayer interactions
- **IoT**: Device control and telemetry
- **Trading**: Price updates + order placement
## WebSocket vs Server-Sent Events
- Use **SSE** when server pushes updates, client doesn't send
- Use **WebSocket** when client needs to send data to server
## Basic Usage
1. Add WebSocket node
2. Set `URL` to your WebSocket endpoint (wss://...)
3. Send `Connect` signal
4. To send: Set `message`, trigger `send` signal
5. To receive: Listen to `messageReceived`, read `received`
## Example: Chat
[Screenshot showing WebSocket in chat application]
## Queuing
When disconnected, messages can queue automatically:
- Enable `Queue When Disconnected`
- Messages send when reconnected
## Heartbeat
Keep connection alive with automatic ping/pong:
- Set `Heartbeat Interval` (milliseconds)
- Monitor `Latency` for connection health
## Security
Always use `wss://` (secure) in production, not `ws://`.
For authentication:
```javascript
// Send auth on connect
[WebSocket] connected
→ [Object] { type: "auth", token: authToken }
→ [WebSocket] message
→ [WebSocket] send
```
````
---
## Success Criteria
1. ✅ WebSocket node successfully connects and exchanges messages
2. ✅ Auto-reconnect works reliably with exponential backoff
3. ✅ Message queuing prevents data loss during disconnects
4. ✅ Heartbeat detects dead connections
5. ✅ No memory leaks over extended usage
6. ✅ Clear documentation with examples
7. ✅ Works in Erleah for real-time backend communication
---
## Future Enhancements
Post-MVP features to consider:
1. **Compression** - Enable permessage-deflate extension
2. **Binary Frames** - Better binary data support
3. **Subprotocol Handling** - React to negotiated protocol
4. **Message Buffering** - Batch sends for performance
5. **Connection Pooling** - Share socket across components
6. **Custom Heartbeat Messages** - Beyond ping/pong
---
## References
- [MDN: WebSocket API](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket)
- [WebSocket Protocol RFC 6455](https://tools.ietf.org/html/rfc6455)
- [WebSocket vs SSE](https://ably.com/topic/websockets-vs-sse)
---
## Dependencies
- None (uses native WebSocket API)
## Blocked By
- None
## Blocks
- Erleah development - requires bidirectional communication
---
## Estimated Effort Breakdown
| Phase | Estimate | Description |
|-------|----------|-------------|
| Core Implementation | 1.5 days | Basic WebSocket with send/receive |
| Auto-reconnect & Queue | 1 day | Exponential backoff, message queuing |
| Heartbeat | 0.5 day | Ping/pong latency tracking |
| Testing | 1 day | Unit tests, integration tests, edge cases |
| Documentation | 0.5 day | User docs, technical docs, examples |
| Polish | 0.5 day | Error handling, performance, cleanup |
**Total: 5 days**
Buffer: none needed (straightforward implementation)
**Final: 3-5 days** (the testing and polish phases can be compressed if needed)

File diff suppressed because it is too large


@@ -0,0 +1,883 @@
# AGENT-004: Optimistic Update Pattern
## Overview
Create a pattern and helper nodes for implementing optimistic UI updates - updating the UI immediately before the server confirms the change, then rolling back if the server rejects it. This creates a more responsive user experience for network operations.
**Phase:** 3.5 (Real-Time Agentic UI)
**Priority:** MEDIUM
**Effort:** 2-3 days
**Risk:** Low
---
## Problem Statement
### Current Pattern: Slow & Blocking
```
User clicks "Accept Connection"
Show loading spinner
Wait for server... (300-1000ms)
Update UI to show "Connected"
Hide spinner
```
**Problem:** User waits, UI feels sluggish.
### Desired Pattern: Fast & Optimistic
```
User clicks "Accept Connection"
Immediately show "Connected" (optimistic)
Send request to server (background)
IF success: Do nothing (already updated!)
IF failure: Roll back to "Pending", show error
```
**Benefit:** UI feels instant, even with slow network.
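The flow the node encapsulates, sketched in plain JavaScript; the Map-based store and endpoint are stand-ins for the Global Store (AGENT-003) and the real API:
```javascript
const connectionStore = new Map(); // stand-in for the Global Store

async function acceptConnection(id) {
  const previous = connectionStore.get(id);          // remember the old value
  connectionStore.set(id, { status: 'accepted' });   // 1. update the UI immediately

  try {
    const res = await fetch(`/connections/${id}/accept`, { method: 'POST' });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    // 2. success: nothing to do, the optimistic value is already on screen
  } catch (err) {
    connectionStore.set(id, previous);               // 3. failure: roll back
    console.warn('Failed to accept connection', err); // surface the error to the user
  }
}
```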
### Real-World Use Cases (Erleah)
1. **Accept Connection Request** - Show "Accepted" immediately
2. **Add to Timeline** - Item appears instantly
3. **Send Chat Message** - Message shows while sending
4. **Toggle Bookmark** - Star fills immediately
5. **Drag-Drop Reorder** - Items reorder before server confirms
---
## Goals
1. ✅ Apply optimistic update to variable/store
2. ✅ Commit if backend succeeds
3. ✅ Rollback if backend fails
4. ✅ Show pending state (optional)
5. ✅ Queue multiple optimistic updates
6. ✅ Handle race conditions (out-of-order responses)
7. ✅ Integrate with Global Store (AGENT-003)
---
## Technical Design
### Node Specification
```javascript
{
name: 'net.noodl.OptimisticUpdate',
displayNodeName: 'Optimistic Update',
category: 'Data',
color: 'orange',
docs: 'https://docs.noodl.net/nodes/data/optimistic-update'
}
```
### Port Schema
#### Inputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `apply` | signal | Actions | Apply optimistic update |
| `commit` | signal | Actions | Confirm update succeeded |
| `rollback` | signal | Actions | Revert update |
| `optimisticValue` | * | Update | Value to apply optimistically |
| `storeName` | string | Store | Global store name (if using store) |
| `key` | string | Store | Store key to update |
| `variableName` | string | Variable | Or Variable node to update |
| `transactionId` | string | Transaction | Unique ID for this update |
| `timeout` | number | Config | Auto-rollback after ms (default: 30000) |
#### Outputs
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| `value` | * | Data | Current value (optimistic or committed) |
| `isPending` | boolean | Status | Update awaiting confirmation |
| `isCommitted` | boolean | Status | Update confirmed |
| `isRolledBack` | boolean | Status | Update reverted |
| `applied` | signal | Events | Fires after optimistic apply |
| `committed` | signal | Events | Fires after commit |
| `rolledBack` | signal | Events | Fires after rollback |
| `timedOut` | signal | Events | Fires if timeout reached |
| `previousValue` | * | Data | Value before optimistic update |
| `error` | string | Events | Error message on rollback |
### State Machine
```
┌─────────┐
START │ │
────┬─→│ IDLE │←──────────┐
│ │ │ │
│ └─────────┘ │
│ │ │
│ [apply] │
│ │ │
│ ▼ │
│ ┌─────────┐ [commit]
│ │ PENDING │───────────┘
│ │ │
│ └─────────┘
│ │
│ [rollback]
│ [timeout]
│ │
│ ▼
│ ┌─────────┐
└──│ROLLED │
│BACK │
└─────────┘
```
---
## Implementation Details
### File Structure
```
packages/noodl-runtime/src/nodes/std-library/data/
├── optimisticupdatenode.js # Main node
├── optimisticmanager.js # Transaction manager
└── optimisticupdate.test.js # Tests
```
### Transaction Manager
```javascript
// optimisticmanager.js
class OptimisticUpdateManager {
constructor() {
this.transactions = new Map();
}
/**
* Apply optimistic update
*/
apply(transactionId, currentValue, optimisticValue, options = {}) {
if (this.transactions.has(transactionId)) {
throw new Error(`Transaction ${transactionId} already exists`);
}
const transaction = {
id: transactionId,
previousValue: currentValue,
optimisticValue: optimisticValue,
appliedAt: Date.now(),
status: 'pending',
timeout: options.timeout || 30000,
timer: null
};
// Set timeout for auto-rollback
if (transaction.timeout > 0) {
transaction.timer = setTimeout(() => {
this.timeout(transactionId);
}, transaction.timeout);
}
this.transactions.set(transactionId, transaction);
console.log(`[OptimisticUpdate] Applied: ${transactionId}`);
return {
value: optimisticValue,
isPending: true
};
}
/**
* Commit transaction (success)
*/
commit(transactionId) {
const transaction = this.transactions.get(transactionId);
if (!transaction) {
console.warn(`[OptimisticUpdate] Transaction not found: ${transactionId}`);
return null;
}
// Clear timeout
if (transaction.timer) {
clearTimeout(transaction.timer);
}
transaction.status = 'committed';
this.transactions.delete(transactionId);
console.log(`[OptimisticUpdate] Committed: ${transactionId}`);
return {
value: transaction.optimisticValue,
isPending: false,
isCommitted: true
};
}
/**
* Rollback transaction (failure)
*/
rollback(transactionId, error = null) {
const transaction = this.transactions.get(transactionId);
if (!transaction) {
console.warn(`[OptimisticUpdate] Transaction not found: ${transactionId}`);
return null;
}
// Clear timeout
if (transaction.timer) {
clearTimeout(transaction.timer);
}
transaction.status = 'rolled_back';
transaction.error = error;
const result = {
value: transaction.previousValue,
isPending: false,
isRolledBack: true,
error: error
};
this.transactions.delete(transactionId);
console.log(`[OptimisticUpdate] Rolled back: ${transactionId}`, error);
return result;
}
/**
* Auto-rollback on timeout
*/
timeout(transactionId) {
const transaction = this.transactions.get(transactionId);
if (!transaction) return null;
console.warn(`[OptimisticUpdate] Timeout: ${transactionId}`);
return this.rollback(transactionId, 'Request timed out');
}
/**
* Check if transaction is pending
*/
isPending(transactionId) {
const transaction = this.transactions.get(transactionId);
return transaction && transaction.status === 'pending';
}
/**
* Get transaction info
*/
getTransaction(transactionId) {
return this.transactions.get(transactionId);
}
/**
* Clear all transactions (cleanup)
*/
clear() {
this.transactions.forEach(transaction => {
if (transaction.timer) {
clearTimeout(transaction.timer);
}
});
this.transactions.clear();
}
}
const optimisticUpdateManager = new OptimisticUpdateManager();
module.exports = { optimisticUpdateManager };
```
### Optimistic Update Node
```javascript
// optimisticupdatenode.js
const { optimisticUpdateManager } = require('./optimisticmanager');
const { globalStoreManager } = require('./globalstore');
var OptimisticUpdateNode = {
name: 'net.noodl.OptimisticUpdate',
displayNodeName: 'Optimistic Update',
category: 'Data',
color: 'orange',
initialize: function() {
this._internal.transactionId = null;
this._internal.currentValue = null;
this._internal.previousValue = null;
this._internal.isPending = false;
this._internal.isCommitted = false;
this._internal.isRolledBack = false;
this._internal.error = null;
},
inputs: {
apply: {
type: 'signal',
displayName: 'Apply',
group: 'Actions',
valueChangedToTrue: function() {
this.doApply();
}
},
commit: {
type: 'signal',
displayName: 'Commit',
group: 'Actions',
valueChangedToTrue: function() {
this.doCommit();
}
},
rollback: {
type: 'signal',
displayName: 'Rollback',
group: 'Actions',
valueChangedToTrue: function() {
this.doRollback();
}
},
optimisticValue: {
type: '*',
displayName: 'Optimistic Value',
group: 'Update',
set: function(value) {
this._internal.optimisticValue = value;
}
},
storeName: {
type: 'string',
displayName: 'Store Name',
group: 'Store',
set: function(value) {
this._internal.storeName = value;
}
},
key: {
type: 'string',
displayName: 'Key',
group: 'Store',
set: function(value) {
this._internal.key = value;
}
},
transactionId: {
type: 'string',
displayName: 'Transaction ID',
group: 'Transaction',
set: function(value) {
this._internal.transactionId = value;
}
},
timeout: {
type: 'number',
displayName: 'Timeout (ms)',
group: 'Config',
default: 30000,
set: function(value) {
this._internal.timeout = value;
}
}
},
outputs: {
value: {
type: '*',
displayName: 'Value',
group: 'Data',
getter: function() {
return this._internal.currentValue;
}
},
isPending: {
type: 'boolean',
displayName: 'Is Pending',
group: 'Status',
getter: function() {
return this._internal.isPending;
}
},
isCommitted: {
type: 'boolean',
displayName: 'Is Committed',
group: 'Status',
getter: function() {
return this._internal.isCommitted;
}
},
isRolledBack: {
type: 'boolean',
displayName: 'Is Rolled Back',
group: 'Status',
getter: function() {
return this._internal.isRolledBack;
}
},
applied: {
type: 'signal',
displayName: 'Applied',
group: 'Events'
},
committed: {
type: 'signal',
displayName: 'Committed',
group: 'Events'
},
rolledBack: {
type: 'signal',
displayName: 'Rolled Back',
group: 'Events'
},
timedOut: {
type: 'signal',
displayName: 'Timed Out',
group: 'Events'
},
previousValue: {
type: '*',
displayName: 'Previous Value',
group: 'Data',
getter: function() {
return this._internal.previousValue;
}
},
error: {
type: 'string',
displayName: 'Error',
group: 'Events',
getter: function() {
return this._internal.error;
}
}
},
methods: {
doApply: function() {
const transactionId = this._internal.transactionId || this.generateTransactionId();
const optimisticValue = this._internal.optimisticValue;
// Get current value from store
let currentValue;
if (this._internal.storeName && this._internal.key) {
const store = globalStoreManager.getState(this._internal.storeName);
currentValue = store[this._internal.key];
} else {
currentValue = this._internal.currentValue;
}
// Apply optimistic update
const result = optimisticUpdateManager.apply(
transactionId,
currentValue,
optimisticValue,
{ timeout: this._internal.timeout }
);
// Update store if configured
if (this._internal.storeName && this._internal.key) {
globalStoreManager.setKey(
this._internal.storeName,
this._internal.key,
optimisticValue
);
}
// Update internal state
this._internal.previousValue = currentValue;
this._internal.currentValue = optimisticValue;
this._internal.isPending = true;
this._internal.isCommitted = false;
this._internal.isRolledBack = false;
this._internal.transactionId = transactionId;
this.flagOutputDirty('value');
this.flagOutputDirty('isPending');
this.flagOutputDirty('previousValue');
this.sendSignalOnOutput('applied');
},
doCommit: function() {
const transactionId = this._internal.transactionId;
if (!transactionId) {
console.warn('[OptimisticUpdate] No transaction to commit');
return;
}
const result = optimisticUpdateManager.commit(transactionId);
if (!result) return;
this._internal.isPending = false;
this._internal.isCommitted = true;
this.flagOutputDirty('isPending');
this.flagOutputDirty('isCommitted');
this.sendSignalOnOutput('committed');
},
doRollback: function(error = null) {
const transactionId = this._internal.transactionId;
if (!transactionId) {
console.warn('[OptimisticUpdate] No transaction to rollback');
return;
}
const result = optimisticUpdateManager.rollback(transactionId, error);
if (!result) return;
// Revert store if configured
if (this._internal.storeName && this._internal.key) {
globalStoreManager.setKey(
this._internal.storeName,
this._internal.key,
result.value
);
}
this._internal.currentValue = result.value;
this._internal.isPending = false;
this._internal.isRolledBack = true;
this._internal.error = result.error;
this.flagOutputDirty('value');
this.flagOutputDirty('isPending');
this.flagOutputDirty('isRolledBack');
this.flagOutputDirty('error');
this.sendSignalOnOutput('rolledBack');
if (result.error === 'Request timed out') {
this.sendSignalOnOutput('timedOut');
}
},
generateTransactionId: function() {
return 'tx_' + Date.now() + '_' + Math.random().toString(36).substr(2, 9);
},
_onNodeDeleted: function() {
// Clean up pending transaction
if (this._internal.transactionId) {
optimisticUpdateManager.rollback(this._internal.transactionId);
}
}
},
getInspectInfo: function() {
return {
type: 'value',
value: {
status: this._internal.isPending ? 'Pending' : 'Idle',
value: this._internal.currentValue,
transactionId: this._internal.transactionId
}
};
}
};
module.exports = {
node: OptimisticUpdateNode
};
```
---
## Usage Examples
### Example 1: Accept Connection (Erleah)
```
[Button: "Accept"] clicked
[Optimistic Update]
optimisticValue: { status: "accepted" }
storeName: "connections"
key: "connection-{id}"
timeout: 5000
[Optimistic Update] apply signal
[HTTP Request] POST /connections/{id}/accept
[HTTP Request] success
→ [Optimistic Update] commit
[HTTP Request] failure
→ [Optimistic Update] rollback
→ [Show Toast] "Failed to accept connection"
```
### Example 2: Add to Timeline
```
[AI Agent] suggests session
[Button: "Add to Timeline"] clicked
[Optimistic Update]
optimisticValue: { ...sessionData, id: tempId }
storeName: "agenda"
key: "timeline"
[Optimistic Update] value
→ [Array] push to timeline array
→ [Optimistic Update] optimisticValue
→ [Optimistic Update] apply
// Timeline immediately shows item!
[HTTP Request] POST /agenda/sessions
→ returns real session ID
[Success]
→ [Replace temp ID with real ID]
→ [Optimistic Update] commit
[Failure]
→ [Optimistic Update] rollback
→ [Show Error] "Could not add session"
```
### Example 3: Chat Message Send
```
[Text Input] → messageText
[Send Button] clicked
[Object] create message
id: tempId
text: messageText
status: "sending"
timestamp: now
[Optimistic Update]
storeName: "chat"
key: "messages"
[Optimistic Update] value (current messages)
→ [Array] push new message
→ [Optimistic Update] optimisticValue
→ [Optimistic Update] apply
// Message appears immediately with "sending" indicator
[HTTP Request] POST /messages
[Success] → real message from server
→ [Update message status to "sent"]
→ [Optimistic Update] commit
[Failure]
→ [Optimistic Update] rollback
→ [Update message status to "failed"]
→ [Show Retry Button]
```
### Example 4: Toggle Bookmark
```
[Star Icon] clicked
[Variable: isBookmarked] current value
→ [Expression] !value (toggle)
→ [Optimistic Update] optimisticValue
[Optimistic Update] apply
[Star Icon] filled = [Optimistic Update] value
// Star fills immediately
[HTTP Request] POST /bookmarks
[Success]
→ [Optimistic Update] commit
[Failure]
→ [Optimistic Update] rollback
→ [Show Toast] "Couldn't save bookmark"
// Star unfills on failure
```
---
## Testing Checklist
### Functional Tests
- [ ] Apply sets value optimistically
- [ ] Commit keeps optimistic value
- [ ] Rollback reverts to previous value
- [ ] Timeout triggers automatic rollback
- [ ] isPending reflects correct state
- [ ] Store integration works
- [ ] Multiple transactions don't interfere
- [ ] Transaction IDs are unique
- [ ] Signals fire at correct times
### Edge Cases
- [ ] Commit without apply (no-op)
- [ ] Rollback without apply (no-op)
- [ ] Apply twice with same transaction ID (error)
- [ ] Commit after timeout (no-op)
- [ ] Very fast success (commit before timeout)
- [ ] Network reconnect scenarios
- [ ] Component unmount cleans up transaction
### Performance
- [ ] No memory leaks with many transactions
- [ ] Timeout cleanup works correctly
- [ ] Multiple optimistic updates in quick succession
---
## Documentation Requirements
### User-Facing Docs
Create: `docs/nodes/data/optimistic-update.md`
````markdown
# Optimistic Update
Make your UI feel instant by updating immediately, then confirming with the server later. Perfect for actions like liking, bookmarking, or accepting requests.
## The Problem
Without optimistic updates:
```
User clicks button → Show spinner → Wait... → Update UI
(feels slow)
```
With optimistic updates:
```
User clicks button → Update UI immediately → Confirm in background
(feels instant!)
```
## Basic Pattern
1. Apply optimistic update (UI changes immediately)
2. Send request to server (background)
3. If success: Commit (keep the change)
4. If failure: Rollback (undo the change)
## Example: Like Button
[Full example with visual diagrams]
## With Global Store
For shared state, use with Global Store:
```
[Optimistic Update]
storeName: "posts"
key: "likes"
optimisticValue: likesCount + 1
```
All components subscribing to the store update automatically!
## Timeout
If server doesn't respond:
```
[Optimistic Update]
timeout: 5000 // Auto-rollback after 5s
```
## Best Practices
1. **Always handle rollback**: Show error message
2. **Show pending state**: "Saving..." indicator (optional)
3. **Use unique IDs**: Let node generate, or provide your own
4. **Set reasonable timeout**: 5-30 seconds depending on operation
````
---
## Success Criteria
1. ✅ Optimistic updates apply immediately
2. ✅ Rollback works on failure
3. ✅ Timeout prevents stuck pending states
4. ✅ Integrates with Global Store
5. ✅ No memory leaks
6. ✅ Clear documentation with examples
7. ✅ Works in Erleah for responsive interactions
---
## Future Enhancements
1. **Retry Logic** - Auto-retry failed operations
2. **Conflict Resolution** - Handle concurrent updates
3. **Offline Queue** - Queue updates when offline
4. **Animation Hooks** - Smooth transitions on rollback
5. **Batch Commits** - Commit multiple related transactions
---
## References
- [Optimistic UI](https://www.apollographql.com/docs/react/performance/optimistic-ui/) - Apollo GraphQL docs
- [React Query Optimistic Updates](https://tanstack.com/query/latest/docs/framework/react/guides/optimistic-updates)
---
## Dependencies
- AGENT-003 (Global State Store) - for store integration
## Blocked By
- AGENT-003
## Blocks
- None (optional enhancement for Erleah)
---
## Estimated Effort Breakdown
| Phase | Estimate | Description |
|-------|----------|-------------|
| Transaction Manager | 0.5 day | Core state machine |
| Optimistic Update Node | 1 day | Main node with store integration |
| Testing | 0.5 day | Unit tests, edge cases |
| Documentation | 0.5 day | User docs, examples |
**Total: 2.5 days**
Buffer: +0.5 day for edge cases = **3 days**
**Final: 2-3 days**

File diff suppressed because it is too large


@@ -0,0 +1,786 @@
# AGENT-006: State History & Time Travel
## Overview
Create a state history tracking system that enables undo/redo, time-travel debugging, and state snapshots. This helps users recover from mistakes and developers debug complex state interactions.
**Phase:** 3.5 (Real-Time Agentic UI)
**Priority:** LOW (nice-to-have)
**Effort:** 1-2 days
**Risk:** Low
---
## Problem Statement
### Current Limitation
State changes are permanent:
```
User makes mistake → State changes → Can't undo
Developer debugging → State changed 10 steps ago → Can't replay
```
No way to go back in time.
### Desired Pattern
```
User action → State snapshot → Change state
User: "Undo" → Restore previous snapshot
Developer: "Go back 5 steps" → Time travel to that state
```
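Conceptually, this is a snapshot array plus a moving index; a minimal sketch (the actual manager below tracks a Global Store rather than arbitrary values):
```javascript
class History {
  constructor(initialState, maxEntries = 50) {
    // Deep-copy snapshots so later mutations don't rewrite history.
    this.entries = [JSON.parse(JSON.stringify(initialState))];
    this.index = 0;
    this.maxEntries = maxEntries;
  }
  push(state) {
    // A new change discards any "redo" entries ahead of the current position.
    this.entries = this.entries.slice(0, this.index + 1);
    this.entries.push(JSON.parse(JSON.stringify(state)));
    if (this.entries.length > this.maxEntries) this.entries.shift();
    this.index = this.entries.length - 1;
  }
  undo() {
    if (this.index > 0) this.index--;
    return this.entries[this.index];
  }
  redo() {
    if (this.index < this.entries.length - 1) this.index++;
    return this.entries[this.index];
  }
}
```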
### Real-World Use Cases
1. **Undo Mistakes** - User accidentally removes item from timeline
2. **Debug State** - Developer replays sequence that caused bug
3. **A/B Comparison** - Save state, test changes, restore to compare
4. **Session Recovery** - Reload state after browser crash
5. **Feature Flags** - Toggle features on/off with instant rollback
---
## Goals
1. ✅ Track state changes automatically
2. ✅ Undo/redo state changes
3. ✅ Jump to specific state in history
4. ✅ Save/restore state snapshots
5. ✅ Limit history size (memory management)
6. ✅ Integrate with Global Store (AGENT-003)
7. ✅ Export/import history (debugging)
---
## Technical Design
### Node Specifications
We'll create THREE nodes:
1. **State History** - Track and manage history
2. **Undo** - Revert to previous state
3. **State Snapshot** - Save/restore snapshots
### State History Node
```javascript
{
name: 'net.noodl.StateHistory',
displayNodeName: 'State History',
category: 'Data',
color: 'blue',
docs: 'https://docs.noodl.net/nodes/data/state-history'
}
```
#### Ports: State History
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `storeName` | string | Store | Global store to track |
| `trackKeys` | string | Config | Comma-separated keys to track (blank = all) |
| `maxHistory` | number | Config | Max history entries (default: 50) |
| `enabled` | boolean | Config | Enable/disable tracking (default: true) |
| `clearHistory` | signal | Actions | Clear history |
| **Outputs** |
| `historySize` | number | Status | Number of entries in history |
| `canUndo` | boolean | Status | Can go back |
| `canRedo` | boolean | Status | Can go forward |
| `currentIndex` | number | Status | Position in history |
| `history` | array | Data | Full history array |
| `stateChanged` | signal | Events | Fires on any state change |
### Undo Node
```javascript
{
name: 'net.noodl.StateHistory.Undo',
displayNodeName: 'Undo',
category: 'Data',
color: 'blue'
}
```
#### Ports: Undo Node
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `storeName` | string | Store | Store to undo |
| `undo` | signal | Actions | Go back one step |
| `redo` | signal | Actions | Go forward one step |
| `jumpTo` | signal | Actions | Jump to specific index |
| `targetIndex` | number | Jump | Index to jump to |
| **Outputs** |
| `undone` | signal | Events | Fires after undo |
| `redone` | signal | Events | Fires after redo |
| `jumped` | signal | Events | Fires after jump |
### State Snapshot Node
```javascript
{
name: 'net.noodl.StateSnapshot',
displayNodeName: 'State Snapshot',
category: 'Data',
color: 'blue'
}
```
#### Ports: State Snapshot
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `storeName` | string | Store | Store to snapshot |
| `save` | signal | Actions | Save current state |
| `restore` | signal | Actions | Restore saved state |
| `snapshotName` | string | Snapshot | Name for this snapshot |
| `snapshotData` | object | Snapshot | Snapshot to restore (from export) |
| **Outputs** |
| `snapshot` | object | Data | Current saved snapshot |
| `saved` | signal | Events | Fires after save |
| `restored` | signal | Events | Fires after restore |
---
## Implementation Details
### File Structure
```
packages/noodl-runtime/src/nodes/std-library/data/
├── statehistorymanager.js # History tracking
├── statehistorynode.js # State History node
├── undonode.js # Undo/Redo node
├── statesnapshotnode.js # Snapshot node
└── statehistory.test.js # Tests
```
### State History Manager
```javascript
// statehistorymanager.js
const { globalStoreManager } = require('./globalstore');
class StateHistoryManager {
  constructor() {
    this.histories = new Map(); // storeName -> history
    this.snapshots = new Map(); // snapshotName -> state
    this._restoring = false; // Guard: suppress recording while we restore state ourselves
  }
/**
* Start tracking a store
*/
trackStore(storeName, options = {}) {
if (this.histories.has(storeName)) {
return; // Already tracking
}
const { maxHistory = 50, trackKeys = [] } = options;
const history = {
entries: [],
currentIndex: -1,
maxHistory,
trackKeys,
unsubscribe: null
};
// Get initial state
const initialState = globalStoreManager.getState(storeName);
history.entries.push({
state: JSON.parse(JSON.stringify(initialState)),
timestamp: Date.now(),
description: 'Initial state'
});
history.currentIndex = 0;
// Subscribe to changes
history.unsubscribe = globalStoreManager.subscribe(
storeName,
(nextState, prevState, changedKeys) => {
this.recordStateChange(storeName, nextState, changedKeys);
},
trackKeys
);
this.histories.set(storeName, history);
console.log(`[StateHistory] Tracking store: ${storeName}`);
}
/**
* Stop tracking a store
*/
stopTracking(storeName) {
const history = this.histories.get(storeName);
if (!history) return;
if (history.unsubscribe) {
history.unsubscribe();
}
this.histories.delete(storeName);
}
/**
* Record a state change
*/
  recordStateChange(storeName, newState, changedKeys) {
    const history = this.histories.get(storeName);
    if (!history) return;
    // Ignore the setState calls we issue ourselves during undo/redo/jump,
    // otherwise restoring an old state would be recorded as a brand new entry.
    if (this._restoring) return;
// If we're not at the end, truncate future
if (history.currentIndex < history.entries.length - 1) {
history.entries = history.entries.slice(0, history.currentIndex + 1);
}
// Add new entry
history.entries.push({
state: JSON.parse(JSON.stringify(newState)),
timestamp: Date.now(),
changedKeys: changedKeys,
description: `Changed: ${changedKeys.join(', ')}`
});
    // Enforce max history. When the oldest entry is dropped, every index shifts
    // down by one, so currentIndex already points at the entry just pushed;
    // otherwise advance it.
if (history.entries.length > history.maxHistory) {
history.entries.shift();
} else {
history.currentIndex++;
}
console.log(`[StateHistory] Recorded change in ${storeName}`, changedKeys);
}
/**
* Undo (go back)
*/
undo(storeName) {
const history = this.histories.get(storeName);
if (!history || history.currentIndex <= 0) {
return null; // Can't undo
}
history.currentIndex--;
const entry = history.entries[history.currentIndex];
    // Restore state without recording it as a new history entry
    this._restoring = true;
    globalStoreManager.setState(storeName, entry.state);
    this._restoring = false;
console.log(`[StateHistory] Undo in ${storeName} to index ${history.currentIndex}`);
return {
state: entry.state,
index: history.currentIndex,
canUndo: history.currentIndex > 0,
canRedo: history.currentIndex < history.entries.length - 1
};
}
/**
* Redo (go forward)
*/
redo(storeName) {
const history = this.histories.get(storeName);
if (!history || history.currentIndex >= history.entries.length - 1) {
return null; // Can't redo
}
history.currentIndex++;
const entry = history.entries[history.currentIndex];
    // Restore state without recording it as a new history entry
    this._restoring = true;
    globalStoreManager.setState(storeName, entry.state);
    this._restoring = false;
console.log(`[StateHistory] Redo in ${storeName} to index ${history.currentIndex}`);
return {
state: entry.state,
index: history.currentIndex,
canUndo: history.currentIndex > 0,
canRedo: history.currentIndex < history.entries.length - 1
};
}
/**
* Jump to specific point in history
*/
jumpTo(storeName, index) {
const history = this.histories.get(storeName);
if (!history || index < 0 || index >= history.entries.length) {
return null;
}
history.currentIndex = index;
const entry = history.entries[index];
    // Restore state without recording it as a new history entry
    this._restoring = true;
    globalStoreManager.setState(storeName, entry.state);
    this._restoring = false;
console.log(`[StateHistory] Jump to index ${index} in ${storeName}`);
return {
state: entry.state,
index: history.currentIndex,
canUndo: history.currentIndex > 0,
canRedo: history.currentIndex < history.entries.length - 1
};
}
/**
* Get history info
*/
getHistoryInfo(storeName) {
const history = this.histories.get(storeName);
if (!history) return null;
return {
size: history.entries.length,
currentIndex: history.currentIndex,
canUndo: history.currentIndex > 0,
canRedo: history.currentIndex < history.entries.length - 1,
entries: history.entries.map((e, i) => ({
index: i,
timestamp: e.timestamp,
description: e.description,
isCurrent: i === history.currentIndex
}))
};
}
/**
* Clear history
*/
clearHistory(storeName) {
const history = this.histories.get(storeName);
if (!history) return;
const currentState = globalStoreManager.getState(storeName);
history.entries = [{
state: JSON.parse(JSON.stringify(currentState)),
timestamp: Date.now(),
description: 'Reset'
}];
history.currentIndex = 0;
}
/**
* Save snapshot
*/
saveSnapshot(snapshotName, storeName) {
const state = globalStoreManager.getState(storeName);
const snapshot = {
name: snapshotName,
storeName: storeName,
state: JSON.parse(JSON.stringify(state)),
timestamp: Date.now()
};
this.snapshots.set(snapshotName, snapshot);
console.log(`[StateHistory] Saved snapshot: ${snapshotName}`);
return snapshot;
}
/**
* Restore snapshot
*/
restoreSnapshot(snapshotName) {
const snapshot = this.snapshots.get(snapshotName);
if (!snapshot) {
throw new Error(`Snapshot not found: ${snapshotName}`);
}
globalStoreManager.setState(snapshot.storeName, snapshot.state);
console.log(`[StateHistory] Restored snapshot: ${snapshotName}`);
return snapshot;
}
/**
* Export history (for debugging)
*/
exportHistory(storeName) {
const history = this.histories.get(storeName);
if (!history) return null;
return {
storeName,
entries: history.entries,
currentIndex: history.currentIndex,
exportedAt: Date.now()
};
}
/**
* Import history (for debugging)
*/
importHistory(historyData) {
const { storeName, entries, currentIndex } = historyData;
// Stop current tracking
this.stopTracking(storeName);
// Create new history
const history = {
entries: entries,
currentIndex: currentIndex,
maxHistory: 50,
trackKeys: [],
unsubscribe: null
};
this.histories.set(storeName, history);
// Restore to current index
const entry = entries[currentIndex];
globalStoreManager.setState(storeName, entry.state);
}
}
const stateHistoryManager = new StateHistoryManager();
module.exports = { stateHistoryManager };
```
### State History Node (abbreviated)
```javascript
// statehistorynode.js
const { stateHistoryManager } = require('./statehistorymanager');
var StateHistoryNode = {
name: 'net.noodl.StateHistory',
displayNodeName: 'State History',
category: 'Data',
color: 'blue',
initialize: function() {
this._internal.tracking = false;
},
inputs: {
storeName: {
type: 'string',
displayName: 'Store Name',
default: 'app',
set: function(value) {
this._internal.storeName = value;
this.startTracking();
}
},
enabled: {
type: 'boolean',
displayName: 'Enabled',
default: true,
set: function(value) {
if (value) {
this.startTracking();
} else {
this.stopTracking();
}
}
},
    maxHistory: {
      type: 'number',
      displayName: 'Max History',
      default: 50,
      set: function(value) {
        this._internal.maxHistory = value;
      }
    },
clearHistory: {
type: 'signal',
displayName: 'Clear History',
valueChangedToTrue: function() {
stateHistoryManager.clearHistory(this._internal.storeName);
this.updateOutputs();
}
}
},
outputs: {
historySize: {
type: 'number',
displayName: 'History Size',
getter: function() {
const info = stateHistoryManager.getHistoryInfo(this._internal.storeName);
return info ? info.size : 0;
}
},
canUndo: {
type: 'boolean',
displayName: 'Can Undo',
getter: function() {
const info = stateHistoryManager.getHistoryInfo(this._internal.storeName);
return info ? info.canUndo : false;
}
},
canRedo: {
type: 'boolean',
displayName: 'Can Redo',
getter: function() {
const info = stateHistoryManager.getHistoryInfo(this._internal.storeName);
return info ? info.canRedo : false;
}
}
},
methods: {
startTracking: function() {
if (this._internal.tracking) return;
stateHistoryManager.trackStore(this._internal.storeName, {
maxHistory: this._internal.maxHistory,
trackKeys: this._internal.trackKeys
});
this._internal.tracking = true;
this.updateOutputs();
},
stopTracking: function() {
if (!this._internal.tracking) return;
stateHistoryManager.stopTracking(this._internal.storeName);
this._internal.tracking = false;
},
updateOutputs: function() {
this.flagOutputDirty('historySize');
this.flagOutputDirty('canUndo');
this.flagOutputDirty('canRedo');
}
}
};
```
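### Undo Node (sketch)
The Undo node is intentionally thin: it forwards signals to `stateHistoryManager` and emits result signals. A minimal sketch following the same conventions as the nodes above (the State Snapshot node would be analogous, calling `saveSnapshot`/`restoreSnapshot`):
```javascript
// undonode.js (sketch)
const { stateHistoryManager } = require('./statehistorymanager');
var UndoNode = {
  name: 'net.noodl.StateHistory.Undo',
  displayNodeName: 'Undo',
  category: 'Data',
  color: 'blue',
  initialize: function() {
    this._internal.storeName = 'app';
  },
  inputs: {
    storeName: {
      type: 'string',
      displayName: 'Store Name',
      group: 'Store',
      default: 'app',
      set: function(value) {
        this._internal.storeName = value;
      }
    },
    targetIndex: {
      type: 'number',
      displayName: 'Target Index',
      group: 'Jump',
      set: function(value) {
        this._internal.targetIndex = value;
      }
    },
    undo: {
      type: 'signal',
      displayName: 'Undo',
      group: 'Actions',
      valueChangedToTrue: function() {
        // The manager returns null when there is nothing to undo
        if (stateHistoryManager.undo(this._internal.storeName)) {
          this.sendSignalOnOutput('undone');
        }
      }
    },
    redo: {
      type: 'signal',
      displayName: 'Redo',
      group: 'Actions',
      valueChangedToTrue: function() {
        if (stateHistoryManager.redo(this._internal.storeName)) {
          this.sendSignalOnOutput('redone');
        }
      }
    },
    jumpTo: {
      type: 'signal',
      displayName: 'Jump To',
      group: 'Actions',
      valueChangedToTrue: function() {
        if (stateHistoryManager.jumpTo(this._internal.storeName, this._internal.targetIndex)) {
          this.sendSignalOnOutput('jumped');
        }
      }
    }
  },
  outputs: {
    undone: { type: 'signal', displayName: 'Undone', group: 'Events' },
    redone: { type: 'signal', displayName: 'Redone', group: 'Events' },
    jumped: { type: 'signal', displayName: 'Jumped', group: 'Events' }
  }
};
module.exports = {
  node: UndoNode
};
```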
---
## Usage Examples
### Example 1: Undo Button
```
[Button: "Undo"] clicked
[Undo]
storeName: "app"
undo signal
[Undo] undone
→ [Show Toast] "Undone"
[State History] canUndo
→ [Button] disabled = !canUndo
```
### Example 2: Timeline Slider
```
[State History]
storeName: "app"
historySize → maxValue
currentIndex → value
[Slider] value changed
→ [Undo] targetIndex
→ [Undo] jumpTo signal
// User can scrub through history!
```
### Example 3: Save/Restore Checkpoint
```
[Button: "Save Checkpoint"] clicked
[State Snapshot]
storeName: "app"
snapshotName: "checkpoint-1"
save
// Later...
[Button: "Restore Checkpoint"] clicked
[State Snapshot]
snapshotName: "checkpoint-1"
restore
```
### Example 4: Debug Mode
```
// Dev tools panel
[State History] history
→ [Repeater] show each entry
[Entry] clicked
→ [Undo] jumpTo with entry.index
[Button: "Export History"]
→ [State History] exportHistory
→ [File Download] history.json
```
---
## Testing Checklist
### Functional Tests
- [ ] History tracks state changes
- [ ] Undo reverts to previous state (covered in the test sketch below)
- [ ] Redo goes forward
- [ ] Jump to specific index works
- [ ] Max history limit enforced
- [ ] Clear history works
- [ ] Snapshots save/restore correctly
- [ ] Export/import preserves history
### Edge Cases
- [ ] Undo at beginning (no-op)
- [ ] Redo at end (no-op)
- [ ] Jump to invalid index
- [ ] Change state while not at end (truncate future)
- [ ] Track empty store
- [ ] Very rapid state changes
- [ ] Large state objects (>1MB)
### Performance
- [ ] No memory leaks with long history
- [ ] History doesn't slow down app
- [ ] Deep cloning doesn't block UI
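For `statehistory.test.js`, a minimal Jest-style sketch of the undo/redo happy path. It assumes the `globalstore` module notifies subscribers synchronously, as in the AGENT-003 design:
```javascript
// statehistory.test.js (sketch)
const { globalStoreManager } = require('./globalstore');
const { stateHistoryManager } = require('./statehistorymanager');

test('undo reverts to the previous state and redo moves forward again', () => {
  globalStoreManager.setState('testStore', { count: 0 });
  stateHistoryManager.trackStore('testStore');

  globalStoreManager.setState('testStore', { count: 1 });
  globalStoreManager.setState('testStore', { count: 2 });

  stateHistoryManager.undo('testStore');
  expect(globalStoreManager.getState('testStore').count).toBe(1);

  stateHistoryManager.redo('testStore');
  expect(globalStoreManager.getState('testStore').count).toBe(2);

  stateHistoryManager.stopTracking('testStore');
});
```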
---
## Documentation Requirements
### User-Facing Docs
Create: `docs/nodes/data/state-history.md`
```markdown
# State History
Add undo/redo and time travel to your app. Track state changes and let users go back in time.
## Use Cases
- **Undo Mistakes**: User accidentally deletes something
- **Debug Complex State**: Developer traces bug through history
- **A/B Testing**: Save state, test, restore to compare
- **Session Recovery**: Reload after crash
## Basic Usage
**Step 1: Track State**
```
[State History]
storeName: "app"
maxHistory: 50
```
**Step 2: Add Undo**
```
[Button: "Undo"] clicked
→ [Undo] storeName: "app", undo signal
```
**Step 3: Disable When Can't Undo**
```
[State History] canUndo
→ [Button] disabled = !canUndo
```
## Time Travel
Build a history slider:
```
[State History] history → entries
→ [Slider] 0 to historySize
→ value changed
→ [Undo] jumpTo
```
## Snapshots
Save points you can return to:
```
[State Snapshot] save → checkpoint
[State Snapshot] restore ← checkpoint
```
## Best Practices
1. **Limit history size**: 50 entries prevents memory issues
2. **Track only what you need**: Use trackKeys for large stores
3. **Disable in production**: Enable only for dev/debug
```
---
## Success Criteria
1. ✅ Undo/redo works reliably
2. ✅ History doesn't leak memory
3. ✅ Snapshots save/restore correctly
4. ✅ Export/import for debugging
5. ✅ Clear documentation
6. ✅ Optional enhancement for Erleah
---
## Future Enhancements
1. **Diff Viewer** - Show what changed between states
2. **Branching History** - Tree instead of linear
3. **Selective Undo** - Undo specific changes only
4. **Persistence** - Save history to localStorage
5. **Collaborative Undo** - Undo others' changes
---
## Dependencies
- AGENT-003 (Global State Store)
## Blocked By
- AGENT-003
## Blocks
- None (optional feature)
---
## Estimated Effort Breakdown
| Phase | Estimate | Description |
|-------|----------|-------------|
| History Manager | 0.5 day | Core tracking system |
| Undo/Redo Node | 0.5 day | Node implementation |
| Snapshot Node | 0.5 day | Save/restore system |
| Testing | 0.5 day | Edge cases, memory leaks |
| Documentation | 0.5 day | User docs, examples |
**Total: 2.5 days**
Buffer: None needed
**Final: 1-2 days** (if scope kept minimal)

View File

@@ -0,0 +1,877 @@
# AGENT-007: Stream Parser Utilities
## Overview
Create utility nodes for parsing streaming data, particularly JSON streams, chunked responses, and incremental text accumulation. These utilities help process SSE and WebSocket messages that arrive in fragments.
**Phase:** 3.5 (Real-Time Agentic UI)
**Priority:** MEDIUM
**Effort:** 2-3 days
**Risk:** Low
---
## Problem Statement
### Current Limitation
Streaming data often arrives in fragments:
```
Chunk 1: {"type":"mes
Chunk 2: sage","conte
Chunk 3: nt":"Hello"}
```
Without parsing utilities, developers must manually:
1. Accumulate chunks
2. Detect message boundaries
3. Parse JSON safely
4. Handle malformed data
This is error-prone and repetitive.
### Desired Pattern
```
[SSE] data → raw chunks
[Stream Parser] accumulate & parse
[Complete JSON Object] → use in app
```
### Real-World Use Cases (Erleah)
1. **AI Chat Streaming** - Accumulate tokens into messages
2. **JSON Streaming** - Parse newline-delimited JSON (NDJSON)
3. **Progress Updates** - Extract percentages from stream
4. **CSV Streaming** - Parse CSV row-by-row
5. **Log Streaming** - Parse structured logs
---
## Goals
1. ✅ Accumulate text chunks into complete messages
2. ✅ Parse NDJSON (newline-delimited JSON)
3. ✅ Parse JSON chunks safely (handle incomplete JSON)
4. ✅ Extract values from streaming text (regex patterns)
5. ✅ Detect message boundaries (delimiters)
6. ✅ Buffer and flush patterns
7. ✅ Handle encoding/decoding
---
## Technical Design
### Node Specifications
We'll create FOUR utility nodes:
1. **Text Accumulator** - Accumulate chunks into complete text
2. **JSON Stream Parser** - Parse NDJSON or chunked JSON
3. **Pattern Extractor** - Extract values using regex
4. **Stream Buffer** - Buffer with custom flush logic
### Text Accumulator Node
```javascript
{
name: 'net.noodl.TextAccumulator',
displayNodeName: 'Text Accumulator',
category: 'Data',
color: 'green',
docs: 'https://docs.noodl.net/nodes/data/text-accumulator'
}
```
#### Ports: Text Accumulator
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `chunk` | string | Data | Text chunk to add |
| `add` | signal | Actions | Add chunk to buffer |
| `clear` | signal | Actions | Clear accumulated text |
| `delimiter` | string | Config | Message delimiter (default: "\\n") |
| `maxLength` | number | Config | Max buffer size (default: 1MB) |
| **Outputs** |
| `accumulated` | string | Data | Current accumulated text |
| `messages` | array | Data | Complete messages (split by delimiter) |
| `messageCount` | number | Status | Number of complete messages |
| `messageReceived` | signal | Events | Fires when a complete message arrives |
| `bufferSize` | number | Status | Current buffer size (bytes) |
| `cleared` | signal | Events | Fires after clear |
### JSON Stream Parser Node
```javascript
{
name: 'net.noodl.JSONStreamParser',
displayNodeName: 'JSON Stream Parser',
category: 'Data',
color: 'green'
}
```
#### Ports: JSON Stream Parser
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `chunk` | string | Data | JSON chunk |
| `parse` | signal | Actions | Trigger parse |
| `clear` | signal | Actions | Clear buffer |
| `format` | enum | Config | 'ndjson', 'array', 'single' |
| **Outputs** |
| `parsed` | * | Data | Parsed object |
| `success` | signal | Events | Fires on successful parse |
| `error` | string | Events | Parse error message |
| `isComplete` | boolean | Status | Object is complete |
### Pattern Extractor Node
```javascript
{
name: 'net.noodl.PatternExtractor',
displayNodeName: 'Pattern Extractor',
category: 'Data',
color: 'green'
}
```
#### Ports: Pattern Extractor
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `text` | string | Data | Text to extract from |
| `pattern` | string | Pattern | Regex pattern |
| `extract` | signal | Actions | Trigger extraction |
| `extractAll` | boolean | Config | Extract all matches vs first |
| **Outputs** |
| `match` | string | Data | Matched text |
| `matches` | array | Data | All matches |
| `groups` | array | Data | Capture groups |
| `found` | signal | Events | Fires when match found |
| `notFound` | signal | Events | Fires when no match |
### Stream Buffer Node
```javascript
{
name: 'net.noodl.StreamBuffer',
displayNodeName: 'Stream Buffer',
category: 'Data',
color: 'green'
}
```
#### Ports: Stream Buffer
| Port Name | Type | Group | Description |
|-----------|------|-------|-------------|
| **Inputs** |
| `data` | * | Data | Data to buffer |
| `add` | signal | Actions | Add to buffer |
| `flush` | signal | Actions | Flush buffer |
| `clear` | signal | Actions | Clear buffer |
| `flushSize` | number | Config | Auto-flush after N items |
| `flushInterval` | number | Config | Auto-flush after ms |
| **Outputs** |
| `buffer` | array | Data | Current buffer contents |
| `bufferSize` | number | Status | Items in buffer |
| `flushed` | signal | Events | Fires after flush |
| `flushedData` | array | Data | Data that was flushed |
---
## Implementation Details
### File Structure
```
packages/noodl-runtime/src/nodes/std-library/data/
├── textaccumulatornode.js # Text accumulator
├── jsonstreamparsernode.js # JSON parser
├── patternextractornode.js # Regex extractor
├── streambuffernode.js # Generic buffer
└── streamutils.test.js # Tests
```
### Text Accumulator Implementation
```javascript
// textaccumulatornode.js
var TextAccumulatorNode = {
name: 'net.noodl.TextAccumulator',
displayNodeName: 'Text Accumulator',
category: 'Data',
color: 'green',
initialize: function() {
this._internal.buffer = '';
this._internal.messages = [];
},
inputs: {
chunk: {
type: 'string',
displayName: 'Chunk',
group: 'Data',
set: function(value) {
this._internal.pendingChunk = value;
}
},
add: {
type: 'signal',
displayName: 'Add',
group: 'Actions',
valueChangedToTrue: function() {
this.addChunk();
}
},
clear: {
type: 'signal',
displayName: 'Clear',
group: 'Actions',
valueChangedToTrue: function() {
this.clearBuffer();
}
},
delimiter: {
type: 'string',
displayName: 'Delimiter',
group: 'Config',
default: '\n',
set: function(value) {
this._internal.delimiter = value;
}
},
maxLength: {
type: 'number',
displayName: 'Max Length (bytes)',
group: 'Config',
default: 1024 * 1024, // 1MB
set: function(value) {
this._internal.maxLength = value;
}
}
},
outputs: {
accumulated: {
type: 'string',
displayName: 'Accumulated',
group: 'Data',
getter: function() {
return this._internal.buffer;
}
},
messages: {
type: 'array',
displayName: 'Messages',
group: 'Data',
getter: function() {
return this._internal.messages;
}
},
messageCount: {
type: 'number',
displayName: 'Message Count',
group: 'Status',
getter: function() {
return this._internal.messages.length;
}
},
messageReceived: {
type: 'signal',
displayName: 'Message Received',
group: 'Events'
},
bufferSize: {
type: 'number',
displayName: 'Buffer Size',
group: 'Status',
getter: function() {
return new Blob([this._internal.buffer]).size;
}
},
cleared: {
type: 'signal',
displayName: 'Cleared',
group: 'Events'
}
},
methods: {
addChunk: function() {
const chunk = this._internal.pendingChunk;
if (!chunk) return;
// Add to buffer
this._internal.buffer += chunk;
// Check max length
const maxLength = this._internal.maxLength || (1024 * 1024);
if (this._internal.buffer.length > maxLength) {
console.warn('[TextAccumulator] Buffer overflow, truncating');
this._internal.buffer = this._internal.buffer.slice(-maxLength);
}
// Check for complete messages
const delimiter = this._internal.delimiter || '\n';
const parts = this._internal.buffer.split(delimiter);
// Keep last incomplete part in buffer
this._internal.buffer = parts.pop();
// Add complete messages
if (parts.length > 0) {
this._internal.messages = this._internal.messages.concat(parts);
this.flagOutputDirty('messages');
this.flagOutputDirty('messageCount');
this.sendSignalOnOutput('messageReceived');
}
this.flagOutputDirty('accumulated');
this.flagOutputDirty('bufferSize');
},
clearBuffer: function() {
this._internal.buffer = '';
this._internal.messages = [];
this.flagOutputDirty('accumulated');
this.flagOutputDirty('messages');
this.flagOutputDirty('messageCount');
this.flagOutputDirty('bufferSize');
this.sendSignalOnOutput('cleared');
}
},
getInspectInfo: function() {
return {
type: 'value',
value: {
bufferSize: this._internal.buffer.length,
messageCount: this._internal.messages.length,
lastMessage: this._internal.messages[this._internal.messages.length - 1]
}
};
}
};
module.exports = {
node: TextAccumulatorNode
};
```
### JSON Stream Parser Implementation
```javascript
// jsonstreamparsernode.js
var JSONStreamParserNode = {
name: 'net.noodl.JSONStreamParser',
displayNodeName: 'JSON Stream Parser',
category: 'Data',
color: 'green',
initialize: function() {
this._internal.buffer = '';
},
inputs: {
chunk: {
type: 'string',
displayName: 'Chunk',
group: 'Data',
set: function(value) {
this._internal.chunk = value;
}
},
parse: {
type: 'signal',
displayName: 'Parse',
group: 'Actions',
valueChangedToTrue: function() {
this.doParse();
}
},
clear: {
type: 'signal',
displayName: 'Clear',
group: 'Actions',
valueChangedToTrue: function() {
this._internal.buffer = '';
}
},
format: {
type: {
name: 'enum',
enums: [
{ label: 'NDJSON', value: 'ndjson' },
{ label: 'JSON Array', value: 'array' },
{ label: 'Single Object', value: 'single' }
]
},
displayName: 'Format',
group: 'Config',
default: 'ndjson'
}
},
outputs: {
parsed: {
type: '*',
displayName: 'Parsed',
group: 'Data',
getter: function() {
return this._internal.parsed;
}
},
success: {
type: 'signal',
displayName: 'Success',
group: 'Events'
},
error: {
type: 'string',
displayName: 'Error',
group: 'Events',
getter: function() {
return this._internal.error;
}
},
isComplete: {
type: 'boolean',
displayName: 'Is Complete',
group: 'Status',
getter: function() {
return this._internal.isComplete;
}
}
},
methods: {
doParse: function() {
const chunk = this._internal.chunk;
if (!chunk) return;
this._internal.buffer += chunk;
const format = this._internal.format || 'ndjson';
try {
if (format === 'ndjson') {
this.parseNDJSON();
} else if (format === 'single') {
this.parseSingleJSON();
} else if (format === 'array') {
this.parseJSONArray();
}
} catch (e) {
this._internal.error = e.message;
this._internal.isComplete = false;
this.flagOutputDirty('error');
this.flagOutputDirty('isComplete');
}
},
parseNDJSON: function() {
// NDJSON: one JSON per line
const lines = this._internal.buffer.split('\n');
// Keep last incomplete line
this._internal.buffer = lines.pop();
// Parse complete lines
const parsed = [];
for (const line of lines) {
if (line.trim()) {
try {
parsed.push(JSON.parse(line));
} catch (e) {
console.warn('[JSONStreamParser] Failed to parse line:', line);
}
}
}
if (parsed.length > 0) {
this._internal.parsed = parsed;
this._internal.isComplete = true;
this.flagOutputDirty('parsed');
this.flagOutputDirty('isComplete');
this.sendSignalOnOutput('success');
}
},
parseSingleJSON: function() {
// Try to parse complete JSON object
try {
const parsed = JSON.parse(this._internal.buffer);
this._internal.parsed = parsed;
this._internal.isComplete = true;
this._internal.buffer = '';
this.flagOutputDirty('parsed');
this.flagOutputDirty('isComplete');
this.sendSignalOnOutput('success');
} catch (e) {
// Not complete yet, wait for more chunks
this._internal.isComplete = false;
this.flagOutputDirty('isComplete');
}
},
parseJSONArray: function() {
// JSON array, possibly incomplete: [{"a":1},{"b":2}...
// Try to parse as-is, or add closing bracket
let buffer = this._internal.buffer.trim();
// Try parsing complete array
try {
const parsed = JSON.parse(buffer);
if (Array.isArray(parsed)) {
this._internal.parsed = parsed;
this._internal.isComplete = true;
this._internal.buffer = '';
this.flagOutputDirty('parsed');
this.flagOutputDirty('isComplete');
this.sendSignalOnOutput('success');
return;
}
} catch (e) {
// Not complete, try adding closing bracket
try {
const parsed = JSON.parse(buffer + ']');
if (Array.isArray(parsed)) {
this._internal.parsed = parsed;
this._internal.isComplete = false; // Partial
this.flagOutputDirty('parsed');
this.flagOutputDirty('isComplete');
this.sendSignalOnOutput('success');
}
} catch (e2) {
// Still not parseable
this._internal.isComplete = false;
this.flagOutputDirty('isComplete');
}
}
}
}
};
module.exports = {
node: JSONStreamParserNode
};
```
### Pattern Extractor Implementation (abbreviated)
```javascript
// patternextractornode.js
var PatternExtractorNode = {
name: 'net.noodl.PatternExtractor',
displayNodeName: 'Pattern Extractor',
category: 'Data',
color: 'green',
inputs: {
text: { type: 'string', displayName: 'Text' },
pattern: { type: 'string', displayName: 'Pattern' },
extract: { type: 'signal', displayName: 'Extract' },
extractAll: { type: 'boolean', displayName: 'Extract All', default: false }
},
outputs: {
match: { type: 'string', displayName: 'Match' },
matches: { type: 'array', displayName: 'Matches' },
groups: { type: 'array', displayName: 'Groups' },
found: { type: 'signal', displayName: 'Found' },
notFound: { type: 'signal', displayName: 'Not Found' }
},
methods: {
doExtract: function() {
const text = this._internal.text;
const pattern = this._internal.pattern;
if (!text || !pattern) return;
try {
const regex = new RegExp(pattern, this._internal.extractAll ? 'g' : '');
const matches = this._internal.extractAll
          ? [...text.matchAll(regex)] // 'g' flag is already set above; matchAll requires it
: [text.match(regex)];
if (matches && matches[0]) {
this._internal.match = matches[0][0];
this._internal.matches = matches.map(m => m[0]);
this._internal.groups = matches[0].slice(1);
this.flagOutputDirty('match');
this.flagOutputDirty('matches');
this.flagOutputDirty('groups');
this.sendSignalOnOutput('found');
} else {
this.sendSignalOnOutput('notFound');
}
} catch (e) {
console.error('[PatternExtractor] Invalid regex:', e);
}
}
}
};
```
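### Stream Buffer Implementation (sketch)
The Stream Buffer node is not spelled out above; here is a minimal sketch under the same conventions. The only non-obvious pieces are the two auto-flush triggers (size threshold and interval timer); everything else mirrors the other nodes:
```javascript
// streambuffernode.js (sketch)
var StreamBufferNode = {
  name: 'net.noodl.StreamBuffer',
  displayNodeName: 'Stream Buffer',
  category: 'Data',
  color: 'green',
  initialize: function() {
    this._internal.buffer = [];
    this._internal.flushedData = [];
    this._internal.flushTimer = null;
  },
  inputs: {
    data: { type: '*', displayName: 'Data', set: function(v) { this._internal.pendingData = v; } },
    add: { type: 'signal', displayName: 'Add', valueChangedToTrue: function() { this.addItem(); } },
    flush: { type: 'signal', displayName: 'Flush', valueChangedToTrue: function() { this.doFlush(); } },
    clear: { type: 'signal', displayName: 'Clear', valueChangedToTrue: function() { this.clearBuffer(); } },
    flushSize: { type: 'number', displayName: 'Flush Size', set: function(v) { this._internal.flushSize = v; } },
    flushInterval: {
      type: 'number',
      displayName: 'Flush Interval (ms)',
      set: function(v) {
        this._internal.flushInterval = v;
        this.scheduleFlush();
      }
    }
  },
  outputs: {
    buffer: { type: 'array', displayName: 'Buffer', getter: function() { return this._internal.buffer; } },
    bufferSize: { type: 'number', displayName: 'Buffer Size', getter: function() { return this._internal.buffer.length; } },
    flushed: { type: 'signal', displayName: 'Flushed' },
    flushedData: { type: 'array', displayName: 'Flushed Data', getter: function() { return this._internal.flushedData; } }
  },
  methods: {
    addItem: function() {
      this._internal.buffer.push(this._internal.pendingData);
      this.flagOutputDirty('buffer');
      this.flagOutputDirty('bufferSize');
      // Size-based auto-flush
      if (this._internal.flushSize && this._internal.buffer.length >= this._internal.flushSize) {
        this.doFlush();
      }
    },
    doFlush: function() {
      if (this._internal.buffer.length === 0) return;
      this._internal.flushedData = this._internal.buffer;
      this._internal.buffer = [];
      this.flagOutputDirty('buffer');
      this.flagOutputDirty('bufferSize');
      this.flagOutputDirty('flushedData');
      this.sendSignalOnOutput('flushed');
    },
    clearBuffer: function() {
      this._internal.buffer = [];
      this.flagOutputDirty('buffer');
      this.flagOutputDirty('bufferSize');
    },
    scheduleFlush: function() {
      // Time-based auto-flush: restart the timer whenever the interval changes
      if (this._internal.flushTimer) clearInterval(this._internal.flushTimer);
      if (this._internal.flushInterval > 0) {
        this._internal.flushTimer = setInterval(() => this.doFlush(), this._internal.flushInterval);
      }
    },
    _onNodeDeleted: function() {
      if (this._internal.flushTimer) clearInterval(this._internal.flushTimer);
    }
  }
};
module.exports = {
  node: StreamBufferNode
};
```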
---
## Usage Examples
### Example 1: AI Chat Streaming (Erleah)
```
[SSE] data → text chunks
[Text Accumulator]
delimiter: "" // No delimiter, accumulate all
chunk: data
add
[Text Accumulator] accumulated
→ [Text] display streaming message
// Real-time text appears as AI types!
```
### Example 2: NDJSON Stream
```
[SSE] data → NDJSON chunks
[JSON Stream Parser]
format: "ndjson"
chunk: data
parse
[JSON Stream Parser] parsed → array of objects
[For Each] object in array
→ [Process each object]
```
### Example 3: Extract Progress
```
[SSE] data → "Processing... 45% complete"
[Pattern Extractor]
text: data
pattern: "(\d+)%"
extract
[Pattern Extractor] groups → [0] = "45"
→ [Number] 45
→ [Progress Bar] value
```
### Example 4: Buffered Updates
```
[SSE] data → frequent updates
[Stream Buffer]
data: item
add
flushInterval: 1000 // Flush every second
[Stream Buffer] flushed
[Stream Buffer] flushedData → batched items
→ [Process batch at once]
// Reduces processing overhead
```
---
## Testing Checklist
### Functional Tests
- [ ] Text accumulator handles chunks correctly
- [ ] NDJSON parser splits on newlines
- [ ] Single JSON waits for complete object
- [ ] Array JSON handles incomplete arrays
- [ ] Pattern extractor finds matches
- [ ] Capture groups extracted correctly
- [ ] Buffer flushes on size/interval
- [ ] Clear operations work
### Edge Cases
- [ ] Empty chunks
- [ ] Very large chunks (>1MB)
- [ ] Malformed JSON
- [ ] Invalid regex patterns
- [ ] No matches found
- [ ] Buffer overflow
- [ ] Rapid chunks (stress test)
- [ ] Unicode/emoji handling
### Performance
- [ ] No memory leaks with long streams
- [ ] Regex doesn't cause ReDoS
- [ ] Large buffer doesn't freeze UI
---
## Documentation Requirements
### User-Facing Docs
Create: `docs/nodes/data/stream-utilities.md`
```markdown
# Stream Utilities
Tools for working with streaming data from SSE, WebSocket, or chunked HTTP responses.
## Text Accumulator
Collect text chunks into complete messages:
```
[SSE] → chunks
[Text Accumulator] → complete messages
```
Use cases:
- AI chat streaming
- Log streaming
- Progress messages
## JSON Stream Parser
Parse streaming JSON in various formats:
- **NDJSON**: One JSON per line
- **Single**: Wait for complete object
- **Array**: Parse partial JSON arrays
## Pattern Extractor
Extract values using regex:
```
Text: "Status: 45% complete"
Pattern: "(\d+)%"
→ Match: "45%", group 1: "45"
```
Use cases:
- Extract progress percentages
- Parse structured logs
- Find specific values
## Stream Buffer
Batch frequent updates:
```
[Rapid Updates] → [Buffer] → [Batch Process]
```
Reduces processing overhead.
## Best Practices
1. **Set max lengths**: Prevent memory issues
2. **Handle parse errors**: JSON might be incomplete
3. **Use delimiters**: For message boundaries
4. **Batch when possible**: Reduce processing
```
---
## Success Criteria
1. ✅ Handles streaming data reliably
2. ✅ Parses NDJSON correctly
3. ✅ Regex extraction works
4. ✅ No memory leaks
5. ✅ Clear documentation
6. ✅ Works with AGENT-001 (SSE) for Erleah
---
## Future Enhancements
1. **XML Stream Parser** - Parse chunked XML
2. **CSV Stream Parser** - Parse CSV row-by-row
3. **Binary Parsers** - Protocol buffers, msgpack
4. **Compression** - Decompress gzip/deflate streams
5. **Encoding Detection** - Auto-detect UTF-8/UTF-16
---
## References
- [NDJSON Spec](http://ndjson.org/)
- [JSON Streaming](https://en.wikipedia.org/wiki/JSON_streaming)
---
## Dependencies
- None (pure utilities)
## Blocked By
- None (can be developed independently)
## Blocks
- None (but enhances AGENT-001, AGENT-002)
---
## Estimated Effort Breakdown
| Phase | Estimate | Description |
|-------|----------|-------------|
| Text Accumulator | 0.5 day | Basic chunking logic |
| JSON Parser | 1 day | NDJSON, single, array formats |
| Pattern Extractor | 0.5 day | Regex wrapper |
| Stream Buffer | 0.5 day | Time/size-based flushing |
| Testing | 0.5 day | Edge cases, performance |
| Documentation | 0.5 day | User docs, examples |
**Total: 3.5 days**
Buffer: None needed
**Final: 2-3 days**

View File

@@ -0,0 +1,826 @@
# Can Noodl Build the New Agentic Erleah?
## Strategic Analysis & Roadmap
**Date:** December 30, 2025
**Author:** Strategic Analysis
**Status:** Recommendation - YES, with Phase 3.5 additions
---
## Executive Summary
**TL;DR:** Yes, Noodl CAN build the new agentic Erleah, but it requires adding a focused "Phase 3.5: Real-Time Agentic UI" series of nodes and features. This is actually a PERFECT test case for Noodl's capabilities and would result in features that benefit the entire Noodl ecosystem.
**Key Insights:**
1. **Foundation is solid** - Phases 1 & 2 created a modern React 19 + TypeScript base
2. **Core patterns exist** - HTTP nodes, state management, and event systems are already there
3. **Missing pieces are specific** - SSE streams, optimistic updates, and action dispatching
4. **High ROI** - Building these features makes Noodl better for ALL modern web apps
5. **Validation opportunity** - If Noodl can build Erleah, it proves the platform's maturity
---
## Current Capabilities Assessment
### ✅ What Noodl ALREADY Has
#### 1. Modern Foundation (Phase 1 Complete)
- **React 19** in both editor and runtime
- **TypeScript 5** with full type inference
- **Modern tooling** - webpack 5, Storybook 8
- **Performance** - Build times improved, hot reload snappy
#### 2. HTTP & API Integration (Phase 2 In Progress)
```javascript
// Current HTTP Node capabilities:
- GET/POST/PUT/DELETE/PATCH methods
- Authentication presets (Bearer, Basic, API Key)
- JSONPath response mapping
- Header and query parameter management
- Form data and URL-encoded bodies
- Timeout configuration
- Cancel requests
```
#### 3. State Management
```javascript
- Variable nodes for local state
- Object/Array manipulation nodes
- Component Inputs/Outputs for prop drilling
- Send Event/Receive Event for pub-sub
- States node for state machines
```
#### 4. Visual Components
```javascript
- Full React component library
- Responsive breakpoints (planned in NODES-001)
- Visual states (hover, pressed, disabled)
- Conditional rendering
- Repeater for dynamic lists
```
#### 5. Event System
```javascript
- Signal-based event propagation
- EventDispatcher for pub-sub patterns
- Connection-based data flow
- Debounce/Delay nodes for timing
```
### ❌ What Noodl Is MISSING for Erleah
#### 1. **Server-Sent Events (SSE) Support**
**Current Gap:** HTTP node only does request-response, no streaming
**Erleah Needs:**
```javascript
// Chat messages streaming in real-time
AI Agent: "I'm searching attendees..." [streaming]
AI Agent: "Found 8 matches..." [streaming]
AI Agent: "Adding to your plan..." [streaming]
```
**What's Required:**
- SSE connection node
- Stream parsing (JSON chunks)
- Progressive message accumulation
- Automatic reconnection on disconnect
#### 2. **WebSocket Support**
**Current Gap:** No WebSocket node exists
**Erleah Needs:**
```javascript
// Real-time bidirectional communication
User Backend: "Add this to timeline"
Backend User: "Timeline updated" [instant]
Backend User: "Connection request accepted" [push]
```
#### 3. **Optimistic UI Updates**
**Current Gap:** No pattern for "update UI first, sync later"
**Erleah Needs:**
```javascript
// Click "Accept" → immediate UI feedback
// Then backend call → roll back if it fails
```
**What's Required:**
- Transaction/rollback state management (see the sketch below)
- Pending state indicators
- Error recovery patterns
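In plain code, the transaction/rollback part the AGENT-004 node would wrap looks roughly like this. This is a sketch of the pattern only, not the node's API; `store` here is any object with `getState()`/`setState()`, not necessarily the Global Store node:
```javascript
// Optimistic update pattern (sketch): update UI first, sync later, roll back on failure
async function optimisticUpdate(store, key, optimisticValue, commitFn, timeoutMs = 5000) {
  const previousValue = store.getState()[key];
  store.setState({ [key]: optimisticValue }); // 1. Apply immediately for instant feedback

  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error('Commit timed out')), timeoutMs)
  );
  try {
    await Promise.race([commitFn(), timeout]); // 2. Sync with the backend
  } catch (err) {
    store.setState({ [key]: previousValue }); // 3. Roll back on failure or timeout
    throw err;
  }
}
```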
#### 4. **Action Dispatcher Pattern**
**Current Gap:** No concept of backend-triggered UI actions
**Erleah Needs:**
```javascript
// Backend (AI Agent) sends:
{
type: "OPEN_VIEW",
view: "agenda",
id: "session-123"
}
// Frontend automatically navigates
```
**What's Required:**
- Action queue/processor
- UI action vocabulary
- Safe execution sandbox
#### 5. **State Synchronization Across Views**
**Current Gap:** Component state is isolated, no global reactive store
**Erleah Needs:**
```javascript
// Chat sidebar updates → Timeline view updates
// Timeline view updates → Parking Lot updates
// All views stay in sync automatically
```
**What's Required:**
- Global observable store (like Zustand)
- Subscription mechanism
- Selective re-rendering
---
## Gap Analysis: Erleah Requirements vs Noodl Capabilities
### Feature Comparison Matrix
| Erleah Feature | Noodl Today | Gap Size | Effort to Add |
|----------------|-------------|----------|---------------|
| **Timeline View** | ✅ Repeater + Cards | None | 0 days |
| **Chat Sidebar** | ✅ Components | None | 0 days |
| **Parking Lot Sidebar** | ✅ Components | None | 0 days |
| **Card Layouts** | ✅ Visual nodes | None | 0 days |
| **HTTP API Calls** | ✅ HTTP Node | None | 0 days |
| **Authentication** | ✅ Auth presets | None | 0 days |
| **SSE Streaming** | ❌ None | Large | 3-5 days |
| **WebSocket** | ❌ None | Large | 3-5 days |
| **Optimistic Updates** | ❌ None | Medium | 2-3 days |
| **Action Dispatcher** | ⚠️ Partial | Medium | 2-4 days |
| **Global State** | ⚠️ Workarounds | Small | 2-3 days |
| **State History** | ❌ None | Small | 1-2 days |
| **Real-time Preview** | ✅ Existing | None | 0 days |
**Total New Development:** ~15-24 days
---
## Proposed Phase 3.5: Real-Time Agentic UI
Insert this between current Phase 3 and the rest of the roadmap.
### Task Series: AGENT (AI Agent Integration)
**Total Estimated:** 15-24 days (3-4 weeks)
| Task ID | Name | Estimate | Description |
|---------|------|----------|-------------|
| **AGENT-001** | Server-Sent Events Node | 3-5 days | SSE connection, streaming, auto-reconnect |
| **AGENT-002** | WebSocket Node | 3-5 days | Bidirectional real-time communication |
| **AGENT-003** | Global State Store | 2-3 days | Observable store like Zustand, cross-component |
| **AGENT-004** | Optimistic Update Pattern | 2-3 days | Transaction wrapper, rollback support |
| **AGENT-005** | Action Dispatcher | 2-4 days | Backend-to-frontend command execution |
| **AGENT-006** | State History & Time Travel | 1-2 days | Undo/redo, state snapshots |
| **AGENT-007** | Stream Parser Utilities | 2-3 days | JSON streaming, chunk assembly |
### AGENT-001: Server-Sent Events Node
**File:** `packages/noodl-runtime/src/nodes/std-library/data/ssenode.js`
```javascript
var SSENode = {
name: 'net.noodl.SSE',
displayNodeName: 'Server-Sent Events',
docs: 'https://docs.noodl.net/nodes/data/sse',
category: 'Data',
color: 'data',
searchTags: ['sse', 'stream', 'server-sent', 'events', 'realtime'],
initialize: function() {
this._internal.eventSource = null;
this._internal.isConnected = false;
this._internal.messageBuffer = [];
},
inputs: {
url: {
type: 'string',
displayName: 'URL',
group: 'Connection',
set: function(value) {
this._internal.url = value;
}
},
connect: {
type: 'signal',
displayName: 'Connect',
group: 'Actions',
valueChangedToTrue: function() {
this.doConnect();
}
},
disconnect: {
type: 'signal',
displayName: 'Disconnect',
group: 'Actions',
valueChangedToTrue: function() {
this.doDisconnect();
}
},
    autoReconnect: {
      type: 'boolean',
      displayName: 'Auto Reconnect',
      group: 'Connection',
      default: true,
      set: function(value) {
        this._internal.autoReconnect = value;
      }
    },
    reconnectDelay: {
      type: 'number',
      displayName: 'Reconnect Delay (ms)',
      group: 'Connection',
      default: 3000,
      set: function(value) {
        this._internal.reconnectDelay = value;
      }
    }
},
outputs: {
    message: {
      type: 'object',
      displayName: 'Message',
      group: 'Data',
      getter: function() {
        return this._internal.message;
      }
    },
    data: {
      type: '*',
      displayName: 'Parsed Data',
      group: 'Data',
      getter: function() {
        return this._internal.data;
      }
    },
connected: {
type: 'signal',
displayName: 'Connected',
group: 'Events'
},
disconnected: {
type: 'signal',
displayName: 'Disconnected',
group: 'Events'
},
    error: {
      type: 'string',
      displayName: 'Error',
      group: 'Events',
      getter: function() {
        return this._internal.error;
      }
    },
isConnected: {
type: 'boolean',
displayName: 'Is Connected',
group: 'Status',
getter: function() {
return this._internal.isConnected;
}
}
},
methods: {
doConnect: function() {
if (this._internal.eventSource) {
this.doDisconnect();
}
const url = this._internal.url;
if (!url) {
this.setError('URL is required');
return;
}
try {
const eventSource = new EventSource(url);
this._internal.eventSource = eventSource;
eventSource.onopen = () => {
this._internal.isConnected = true;
this.flagOutputDirty('isConnected');
this.sendSignalOnOutput('connected');
};
eventSource.onmessage = (event) => {
this.handleMessage(event);
};
eventSource.onerror = (error) => {
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
this.sendSignalOnOutput('disconnected');
if (this._internal.autoReconnect) {
setTimeout(() => {
if (!this._internal.eventSource || this._internal.eventSource.readyState === EventSource.CLOSED) {
this.doConnect();
}
}, this._internal.reconnectDelay || 3000);
}
};
} catch (e) {
this.setError(e.message);
}
},
doDisconnect: function() {
if (this._internal.eventSource) {
this._internal.eventSource.close();
this._internal.eventSource = null;
this._internal.isConnected = false;
this.flagOutputDirty('isConnected');
this.sendSignalOnOutput('disconnected');
}
},
handleMessage: function(event) {
try {
// Try to parse as JSON
const data = JSON.parse(event.data);
this._internal.message = event;
this._internal.data = data;
} catch (e) {
// Not JSON, use raw data
this._internal.message = event;
this._internal.data = event.data;
}
this.flagOutputDirty('message');
this.flagOutputDirty('data');
},
setError: function(message) {
this._internal.error = message;
this.flagOutputDirty('error');
},
_onNodeDeleted: function() {
this.doDisconnect();
}
}
};
module.exports = {
node: SSENode
};
```
### AGENT-003: Global State Store Node
**File:** `packages/noodl-runtime/src/nodes/std-library/data/globalstorenode.js`
```javascript
// Global store instance (singleton)
class GlobalStore {
constructor() {
this.stores = new Map();
this.subscribers = new Map();
}
createStore(name, initialState = {}) {
if (!this.stores.has(name)) {
this.stores.set(name, initialState);
this.subscribers.set(name, new Set());
}
return this.stores.get(name);
}
getState(name) {
return this.stores.get(name) || {};
}
setState(name, updates) {
const current = this.stores.get(name) || {};
const next = { ...current, ...updates };
this.stores.set(name, next);
this.notify(name, next);
}
subscribe(name, callback) {
if (!this.subscribers.has(name)) {
this.subscribers.set(name, new Set());
}
this.subscribers.get(name).add(callback);
// Return unsubscribe function
return () => {
this.subscribers.get(name).delete(callback);
};
}
notify(name, state) {
const subscribers = this.subscribers.get(name);
if (subscribers) {
subscribers.forEach(cb => cb(state));
}
}
}
const globalStoreInstance = new GlobalStore();
var GlobalStoreNode = {
name: 'net.noodl.GlobalStore',
displayNodeName: 'Global Store',
category: 'Data',
color: 'data',
initialize: function() {
this._internal.storeName = 'default';
this._internal.unsubscribe = null;
},
inputs: {
storeName: {
type: 'string',
displayName: 'Store Name',
default: 'default',
set: function(value) {
if (this._internal.unsubscribe) {
this._internal.unsubscribe();
}
this._internal.storeName = value;
this.setupSubscription();
}
},
set: {
type: 'signal',
displayName: 'Set',
valueChangedToTrue: function() {
this.doSet();
}
},
    key: {
      type: 'string',
      displayName: 'Key',
      set: function(value) {
        this._internal.key = value;
      }
    },
    value: {
      type: '*',
      displayName: 'Value',
      set: function(value) {
        this._internal.value = value;
      }
    }
},
outputs: {
state: {
type: 'object',
displayName: 'State',
getter: function() {
return globalStoreInstance.getState(this._internal.storeName);
}
},
stateChanged: {
type: 'signal',
displayName: 'State Changed'
}
},
methods: {
setupSubscription: function() {
const storeName = this._internal.storeName;
this._internal.unsubscribe = globalStoreInstance.subscribe(
storeName,
(newState) => {
this.flagOutputDirty('state');
this.sendSignalOnOutput('stateChanged');
}
);
// Trigger initial state
this.flagOutputDirty('state');
},
doSet: function() {
const key = this._internal.key;
const value = this._internal.value;
const storeName = this._internal.storeName;
if (key) {
globalStoreInstance.setState(storeName, { [key]: value });
}
},
_onNodeDeleted: function() {
if (this._internal.unsubscribe) {
this._internal.unsubscribe();
}
}
}
};
```
### AGENT-005: Action Dispatcher Node
**File:** `packages/noodl-runtime/src/nodes/std-library/data/actiondispatchernode.js`
```javascript
var ActionDispatcherNode = {
name: 'net.noodl.ActionDispatcher',
displayNodeName: 'Action Dispatcher',
category: 'Events',
color: 'purple',
initialize: function() {
this._internal.actionHandlers = new Map();
this._internal.pendingActions = [];
},
inputs: {
action: {
type: 'object',
displayName: 'Action',
set: function(value) {
this._internal.currentAction = value;
this.dispatch(value);
}
},
// Register handlers
registerHandler: {
type: 'signal',
displayName: 'Register Handler',
valueChangedToTrue: function() {
this.doRegisterHandler();
}
},
    handlerType: {
      type: 'string',
      displayName: 'Handler Type',
      set: function(value) {
        this._internal.handlerType = value;
      }
    }
},
outputs: {
    actionType: {
      type: 'string',
      displayName: 'Action Type',
      getter: function() {
        return this._internal.actionType;
      }
    },
    actionData: {
      type: 'object',
      displayName: 'Action Data',
      getter: function() {
        return this._internal.actionData;
      }
    },
    dispatched: {
      type: 'signal',
      displayName: 'Dispatched'
    },
    handlerCallback: {
      type: 'signal',
      displayName: 'Handler Callback'
    }
  },
methods: {
dispatch: function(action) {
if (!action || !action.type) return;
this._internal.actionType = action.type;
this._internal.actionData = action.data || {};
this.flagOutputDirty('actionType');
this.flagOutputDirty('actionData');
this.sendSignalOnOutput('dispatched');
// Execute registered handlers
const handler = this._internal.actionHandlers.get(action.type);
if (handler) {
handler(action.data);
}
},
doRegisterHandler: function() {
const type = this._internal.handlerType;
if (!type) return;
this._internal.actionHandlers.set(type, (data) => {
this._internal.actionData = data;
this.flagOutputDirty('actionData');
this.sendSignalOnOutput('handlerCallback');
});
}
}
};
```
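In an Erleah graph the dispatcher would typically sit directly behind the SSE node, so backend-sent actions drive the UI without polling. A sketch of the wiring, in the same flow notation used elsewhere in this document (the navigation node name is illustrative):
```
[SSE] data → [Action Dispatcher] action
[Action Dispatcher] actionType = "OPEN_VIEW"
  → [Navigate] to the view in actionData
```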
---
## Implementation Strategy: Phases 1-2-3.5-3-4-5
### Revised Roadmap
```
Phase 1: Foundation ✅ COMPLETE
├─ React 19 migration
├─ TypeScript 5 upgrade
└─ Storybook 8 migration
Phase 2: Core Features ⚙️ IN PROGRESS
├─ HTTP Node improvements ✅ COMPLETE
├─ Responsive breakpoints 🔄 ACTIVE
├─ Component migrations 🔄 ACTIVE
└─ EventDispatcher React bridge ⚠️ BLOCKED
Phase 3.5: Real-Time Agentic UI 🆕 PROPOSED
├─ AGENT-001: SSE Node (3-5 days)
├─ AGENT-002: WebSocket Node (3-5 days)
├─ AGENT-003: Global State Store (2-3 days)
├─ AGENT-004: Optimistic Updates (2-3 days)
├─ AGENT-005: Action Dispatcher (2-4 days)
├─ AGENT-006: State History (1-2 days)
└─ AGENT-007: Stream Utilities (2-3 days)
Total: 15-24 days (3-4 weeks)
Phase 3: Advanced Features
├─ Dashboard UX (DASH series)
├─ Git Integration (GIT series)
├─ Shared Components (COMP series)
├─ AI Features (AI series)
└─ Deployment (DEPLOY series)
```
### Critical Path for Erleah
To build Erleah, this is the minimum required path:
**Week 1-2: Phase 3.5 Core**
- AGENT-001 (SSE) - Absolutely critical for streaming AI responses
- AGENT-003 (Global Store) - Required for synchronized state
- AGENT-007 (Stream Utils) - Need to parse SSE JSON chunks
**Week 3: Phase 3.5 Enhancement**
- AGENT-004 (Optimistic Updates) - Better UX for user interactions
- AGENT-005 (Action Dispatcher) - AI agent can control UI
**Week 4: Erleah Development**
- Build Timeline view
- Build Chat sidebar with SSE
- Build Parking Lot sidebar
- Connect to backend
**Total:** 4 weeks to validated Erleah prototype in Noodl
---
## Why This Is GOOD for Noodl
### 1. **Validates Modern Architecture**
Building a complex, agentic UI proves that Noodl's React 19 + TypeScript migration was worth it. This is a real-world stress test.
### 2. **Features Benefit Everyone**
SSE, WebSocket, and Global Store aren't "Erleah-specific" - every modern web app needs these:
- Chat applications
- Real-time dashboards
- Collaborative tools
- Live notifications
- Streaming data visualization
### 3. **Competitive Advantage**
Flutterflow, Bubble, Webflow - none have agentic UI patterns built in. This would be a differentiator.
### 4. **Dogfooding**
Using Noodl to build a complex AI-powered app exposes UX issues and missing features that users face daily.
### 5. **Marketing Asset**
"Built with Noodl" becomes a powerful case study. Erleah is a sophisticated, modern web app that competes with pure-code solutions.
---
## Risks & Mitigations
### Risk 1: "We're adding too much complexity"
**Mitigation:** Phase 3.5 features are optional. Existing Noodl projects continue working. These are additive, not disruptive.
### Risk 2: "What if we hit a fundamental limitation?"
**Mitigation:** Start with Phase 3.5 AGENT-001 (SSE) as a proof-of-concept. If that works smoothly, continue. If it's a nightmare, reconsider.
### Risk 3: "We're delaying Phase 3 features"
**Mitigation:** Phase 3.5 is only 3-4 weeks. The learnings will inform Phase 3 (especially AI-001 AI Project Scaffolding).
### Risk 4: "SSE/WebSocket are complex to implement correctly"
**Mitigation:** Leverage existing libraries (EventSource is native, WebSocket is native). Focus on the Noodl integration layer, not reinventing protocols.
---
## Alternative: Hybrid Approach
If pure Noodl feels too risky, consider:
### Option A: Noodl Editor + React Runtime
- Build most of Erleah in Noodl
- Write 1-2 custom React components for SSE streaming in pure code
- Import as "Custom React Components" (already supported in Noodl)
**Pros:**
- Faster initial development
- No waiting for Phase 3.5
- Still validates Noodl for 90% of the app
**Cons:**
- Doesn't push Noodl forward
- Misses opportunity to build reusable features
### Option B: Erleah 1.0 in Code, Erleah 2.0 in Noodl
- Ship current Erleah version in pure React
- Use learnings to design Phase 3.5
- Rebuild Erleah 2.0 in Noodl with Phase 3.5 features
**Pros:**
- No business risk
- Informs Phase 3.5 design with real requirements
- Validates Noodl with second implementation
**Cons:**
- Slower validation loop
- Two separate codebases to maintain initially
---
## Recommendation
### ✅ Go Forward with Phase 3.5
**Rationale:**
1. **Timing is right** - Phases 1 & 2 created the foundation
2. **Scope is focused** - 7 tasks, 3-4 weeks, clear boundaries
3. **Value is high** - Erleah validates Noodl, features benefit everyone
4. **Risk is manageable** - Start with AGENT-001, can pivot if needed
### 📋 Action Plan
**Immediate (Next 2 weeks):**
1. Create AGENT-001 (SSE Node) task document
2. Implement SSE Node as proof-of-concept
3. Build simple streaming chat UI in Noodl to test
4. Evaluate: Did this feel natural? Were there blockers?
**If POC succeeds (Week 3-4):**
5. Complete AGENT-003 (Global Store)
6. Complete AGENT-007 (Stream Utils)
7. Build Erleah Timeline prototype in Noodl
**If POC struggles:**
8. Document specific pain points
9. Consider hybrid approach
10. Inform future node design
### 🎯 Success Criteria
Phase 3.5 is successful if:
- ✅ SSE Node can stream AI responses smoothly
- ✅ Global Store keeps views synchronized
- ✅ Building Erleah in Noodl feels productive, not painful
- ✅ The resulting app performs well (no visible lag)
- ✅ Code is maintainable (not a tangled node spaghetti)
---
## Conclusion
**Can Noodl build the new agentic Erleah?**
Yes - but only with Phase 3.5 additions. Without SSE, Global Store, and Action Dispatcher patterns, you'd be fighting the platform. With them, Noodl becomes a powerful tool for building modern, reactive, AI-powered web apps.
**Should you do it?**
Yes - this is a perfect validation moment. You've invested heavily in modernizing Noodl. Now prove it can build something cutting-edge. If Noodl struggles with Erleah, that's valuable feedback. If it succeeds, you have a compelling case study and a suite of new features.
**Timeline:**
- **Phase 3.5 Development:** 3-4 weeks
- **Erleah Prototype:** 1-2 weeks
- **Total to Validation:** 4-6 weeks
This is a strategic investment that pays dividends beyond just Erleah.
---
## Next Steps
1. **Review this document** with the team
2. **Decide on approach**: Full Phase 3.5, Hybrid, or Pure Code
3. **If Phase 3.5**: Start with AGENT-001 task creation
4. **If Hybrid**: Design the boundary between Noodl and custom React
5. **If Pure Code**: Document learnings for future Noodl improvements
**Question to answer:** What would prove to you that Noodl CAN'T build Erleah? Define failure criteria upfront so you can pivot quickly if needed.