VIEW-003: Trigger Chain Debugger - Known Issues
Status: ⚠️ UNSTABLE - REQUIRES INVESTIGATION
Last Updated: January 4, 2026
This document tracks critical bugs and issues discovered during testing that require investigation and fixing before VIEW-003 can be considered production-ready.
🔴 Critical Issues
Issue #1: Deduplication Too Aggressive
Status: CRITICAL - Causing data loss
Priority: HIGH
Discovered: January 4, 2026
Problem: The 5ms deduplication threshold implemented to fix "duplicate events" is now dropping legitimate signal steps. Real events that should be captured are being incorrectly filtered out.
Symptoms:
- Recording shows fewer events than actually occurred
- Missing steps in signal chains
- Incomplete trigger sequences in timeline
- User reports: "some actual signal steps missing from the recording"
Root Cause (Hypothesis):
- ViewerConnection's connectiondebugpulse handler may fire multiple times legitimately for:
  - Rapid sequential signals (e.g., button click → show toast → navigate)
  - Fan-out patterns (one signal triggering multiple downstream nodes)
  - Component instantiation events
- Our deduplication logic can't distinguish between:
  - True duplicates: the same event sent multiple times by a ViewerConnection bug
  - Legitimate rapid events: multiple distinct events happening within 5ms of each other (see the sketch below)
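To make the failure mode concrete, here is a minimal sketch of a fixed-threshold deduplication keyed on connectionId. It illustrates the approach described above; it is not the actual TriggerChainRecorder code, and the names are hypothetical.

```typescript
// Sketch of a fixed 5ms dedup threshold keyed on connectionId
// (illustrative, not the actual TriggerChainRecorder implementation).
const DEDUP_THRESHOLD_MS = 5;
const lastAccepted = new Map<string, number>(); // connectionId -> last accepted timestamp

function shouldCapture(connectionId: string, now: number): boolean {
  const previous = lastAccepted.get(connectionId);
  if (previous !== undefined && now - previous < DEDUP_THRESHOLD_MS) {
    // Dropped: could be a true duplicate, OR a legitimate second pulse from a
    // fan-out or rapid sequential signal on the same connection.
    return false;
  }
  lastAccepted.set(connectionId, now);
  return true;
}

// Failure mode: two distinct pulses 2ms apart on the same connection.
shouldCapture('conn-a', 100.0); // true  (captured)
shouldCapture('conn-a', 102.0); // false (legitimate event silently dropped)
```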
Impact:
- Feature is unreliable for debugging
- Cannot trust recorded data
- May miss critical steps in complex flows
Required Investigation:
- Add verbose debug logging to the connectiondebugpulse handler in ViewerConnection.ts
- Capture ALL raw events before deduplication, with timestamps
- Analyze patterns across different scenarios:
  - Simple button click → single action
  - Button click → multiple chained actions
  - Data flow through multiple nodes
  - Component navigation events
  - Hover/focus events (should these even be recorded?)
- Determine whether a ViewerConnection bug exists or these are legitimate high-frequency events
- Design smarter deduplication strategy (see Investigation Plan below)
Files Affected:
- packages/noodl-editor/src/editor/src/utils/triggerChain/TriggerChainRecorder.ts
- packages/noodl-editor/src/editor/src/ViewerConnection.ts
Issue #2: Event Filtering Strategy Undefined
Status: CRITICAL - No clear design
Priority: HIGH
Discovered: January 4, 2026
Problem:
There is no defined strategy for what types of events should be captured vs ignored. We're recording everything that comes through connectiondebugpulse, which may include:
- Visual updates (not relevant to signal flow)
- Hover/mouse events (noise)
- Render cycles (noise)
- Layout recalculations (noise)
- Legitimate signal triggers (the signals we actually want!)
Symptoms:
- Event count explosion (40 events for simple actions)
- Timeline cluttered with irrelevant events
- Hard to find actual signal flow in the noise
- Performance concerns with recording high-frequency events
Impact:
- Feature unusable for debugging complex flows
- Cannot distinguish signal from noise
- Recording performance may degrade with complex projects
Required Investigation:
- Categorize all possible event types:
  - What does connectiondebugpulse actually send?
  - What are the characteristics of each event type?
  - Can we identify event types from connectionId format?
- Define filtering rules (see the filter sketch after this list):
  - What makes an event a "signal trigger"?
  - What events should be ignored?
  - Should we have recording modes (all vs signals-only)?
- Test scenarios to document:
  - Button click → Show Toast
  - REST API call → Update UI
  - Navigation between pages
  - Data binding updates
  - Component lifecycle events
  - Timer triggers
  - User input (typing, dragging)
- Design decisions needed:
  - Should we filter at capture time or display time?
  - Should we expose filter controls to the user?
  - Should we categorize events visually in the timeline?
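One possible shape for the filtering rules, as a minimal sketch: capture everything, classify each event, and filter at display time based on a recording mode. All names here (RecordingMode, CapturedEvent, isVisible) are hypothetical and pending the Phase 1-2 findings; the categories mirror the classification proposed in Phase 3 below.

```typescript
// Hypothetical filtering shape: names, modes, and categories are proposals, not current API.
type EventCategory = 'signal' | 'data-flow' | 'visual' | 'lifecycle' | 'noise';
type RecordingMode = 'all' | 'signals-only';

interface CapturedEvent {
  connectionId: string;
  timestamp: number;
  category: EventCategory;
}

// Capture everything, decide visibility per recording mode at display time.
function isVisible(event: CapturedEvent, mode: RecordingMode): boolean {
  if (mode === 'all') return true;
  // Signals-only mode keeps the events that describe the actual trigger flow.
  return event.category === 'signal' || event.category === 'data-flow';
}
```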
Files Affected:
- packages/noodl-editor/src/editor/src/utils/triggerChain/TriggerChainRecorder.ts
- packages/noodl-editor/src/editor/src/ViewerConnection.ts
- packages/noodl-editor/src/editor/src/views/panels/TriggerChainDebuggerPanel/ (UI for filtering)
📋 Investigation Plan
Phase 1: Data Collection (1-2 days)
Goal: Understand what we're actually receiving
- Add comprehensive debug logging:

  ```typescript
  // In ViewerConnection.ts
  // (connectionId, uuids, and foundNode are assumed to be in scope in the handler)
  if (triggerChainRecorder.isRecording()) {
    console.log('🔥 RAW EVENT:', {
      connectionId,
      timestamp: performance.now(),
      extracted_uuids: uuids,
      found_node: foundNode?.type?.name
    });
    content.connectionsToPulse.forEach((connectionId: string) => {
      triggerChainRecorder.captureConnectionPulse(connectionId);
    });
  }
  ```

- Create test scenarios:
  - Simple: Button → Show Toast
  - Medium: Button → REST API → Update Text
  - Complex: Navigation → Load Data → Populate List
  - Edge case: Rapid button clicks
  - Edge case: Hover + Click interactions
- Capture and analyze:
  - Run each scenario
  - Export console logs
  - Count events by type
  - Identify patterns in connectionId format
  - Measure timing between events
Phase 2: Pattern Analysis (1 day)
Goal: Categorize events and identify duplicates vs signals
- Categorize captured events (see the analysis sketch at the end of this phase):
  - Group by connectionId patterns
  - Group by timing (< 1ms, 1-5ms, 5-50ms, > 50ms apart)
  - Group by node type
  - Group by component
- Identify true duplicates:
  - Events with identical connectionId and data
  - Events within < 1ms of each other (same frame)
  - Determine whether a ViewerConnection bug exists
- Identify signal patterns:
  - What do button click signals look like?
  - What do data flow signals look like?
  - What do navigation signals look like?
- Identify noise patterns:
  - Render updates?
  - Hover events?
  - Focus events?
  - Animation frame callbacks?
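As a minimal sketch of the grouping above (the data shapes are assumptions, not the recorder's actual types), a small helper can bucket exported raw events by inter-event gap and flag same-connection events inside the same frame as duplicate candidates:

```typescript
// Sketch of the Phase 2 grouping; assumes events are sorted by timestamp.
interface RawEvent {
  connectionId: string;
  timestamp: number; // performance.now() at capture time
}

type GapBucket = '<1ms' | '1-5ms' | '5-50ms' | '>50ms';

function bucketForGap(gapMs: number): GapBucket {
  if (gapMs < 1) return '<1ms';
  if (gapMs < 5) return '1-5ms';
  if (gapMs < 50) return '5-50ms';
  return '>50ms';
}

function analyze(events: RawEvent[]): { counts: Record<GapBucket, number>; duplicateCandidates: RawEvent[] } {
  const counts: Record<GapBucket, number> = { '<1ms': 0, '1-5ms': 0, '5-50ms': 0, '>50ms': 0 };
  const duplicateCandidates: RawEvent[] = [];
  for (let i = 1; i < events.length; i++) {
    const gap = events[i].timestamp - events[i - 1].timestamp;
    counts[bucketForGap(gap)]++;
    // The same connection firing again within the same frame is the strongest duplicate signal.
    if (gap < 1 && events[i].connectionId === events[i - 1].connectionId) {
      duplicateCandidates.push(events[i]);
    }
  }
  return { counts, duplicateCandidates };
}
```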
Phase 3: Design Solution (1 day)
Goal: Design intelligent filtering strategy
- Deduplication Strategy:
  - Option A: Per-connectionId + timestamp threshold (current approach)
  - Option B: Per-event-type deduplication with different thresholds per type
  - Option C: Semantic deduplication (same source node + same data = duplicate; see the sketch after this list)
  - Decision: Choose based on Phase 1-2 findings
- Filtering Strategy:
  - Option A: Capture all, filter at display time (user control)
  - Option B: Filter at capture time (performance)
  - Option C: Hybrid (capture signals only, but allow "verbose mode")
  - Decision: Choose based on performance measurements
- Event Classification:
  - Add eventCategory to the TriggerEvent type (see the sketch after this list)
  - Categories: 'signal' | 'data-flow' | 'visual' | 'lifecycle' | 'noise'
  - Visual indicators in the timeline (colors, icons)
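Two of the proposals above lend themselves to a short sketch: Option C's semantic deduplication and the eventCategory classification. Everything here is illustrative; the field names, the 16ms window, and the color mapping are assumptions, not the current TriggerEvent shape.

```typescript
// Illustrative only: proposed eventCategory field plus an Option C-style semantic dedup.
type EventCategory = 'signal' | 'data-flow' | 'visual' | 'lifecycle' | 'noise';

interface TriggerEvent {
  connectionId: string;
  timestamp: number;       // performance.now() at capture time
  eventCategory: EventCategory;
  payload?: unknown;       // hypothetical: whatever data the pulse carries
  // ...existing TriggerEvent fields stay unchanged
}

// Option C sketch: an event is a duplicate only if connectionId AND payload match
// within a short window. The 16ms window is a placeholder pending Phase 1-2 measurements.
const DUPLICATE_WINDOW_MS = 16;
const recentKeys = new Map<string, number>(); // semantic key -> last accepted timestamp

function isSemanticDuplicate(event: TriggerEvent): boolean {
  const key = `${event.connectionId}:${JSON.stringify(event.payload ?? null)}`;
  const previous = recentKeys.get(key);
  recentKeys.set(key, event.timestamp);
  return previous !== undefined && event.timestamp - previous < DUPLICATE_WINDOW_MS;
}

// Visual indicators in the timeline: one color token per category.
const categoryColor: Record<EventCategory, string> = {
  signal: 'green',
  'data-flow': 'blue',
  visual: 'gray',
  lifecycle: 'purple',
  noise: 'lightgray'
};
```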
Phase 4: Implementation (2-3 days)
- Implement chosen deduplication strategy
- Implement event filtering/classification
- Add UI controls for filter toggles (if needed)
- Update documentation
Phase 5: Testing & Validation (1 day)
- Test all scenarios from Phase 1
- Verify event counts are accurate
- Verify no legitimate signals are dropped
- Verify duplicates are eliminated
- Verify performance is acceptable
🎯 Success Criteria
Before marking VIEW-003 as stable:
- Can record button click → toast action with accurate event count (5-10 events max)
- No legitimate signal steps are dropped
- True duplicates are consistently filtered
- Event timeline is readable and useful
- Recording doesn't impact preview performance
- Deduplication strategy is documented and tested
- Event filtering rules are clear and documented
- User can distinguish signal flow from noise
📚 Related Documentation
- CHANGELOG.md - Implementation history
- ENHANCEMENT-connection-highlighting.md - Visual flow feature proposal
- ENHANCEMENT-step-by-step-debugger.md - Step-by-step execution proposal
- dev-docs/reference/DEBUG-INFRASTRUCTURE.md - ViewerConnection architecture
🔧 Current Workarounds
For developers testing VIEW-003:
- Expect inaccurate event counts - Feature is unstable
- Cross-reference with manual testing - Don't trust timeline alone
- Look for missing steps - Some events may be dropped
- Avoid rapid interactions - May trigger worst-case deduplication bugs
For users:
- Feature is marked experimental for a reason
- Use for general observation, not precise debugging
- Report anomalies to help with investigation
💡 Notes for Future Implementation
When fixing these issues, consider:
- Connection metadata: Can we get more info from ViewerConnection about event type?
- Runtime instrumentation: Should we add explicit "signal fired" events from runtime?
- Performance monitoring: Add metrics for recording overhead (see the sketch after this list)
- User feedback: Add UI indication when events are filtered
- Debug mode: Add "raw event log" panel for investigation
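For the performance-monitoring note above, a minimal sketch of a recording-overhead metric. The wrapper and report shape are hypothetical; only captureConnectionPulse is an existing recorder method named in this document.

```typescript
// Hypothetical overhead metric: time each capture call and report per-session totals.
let totalCaptureMs = 0;
let capturedEvents = 0;

function timedCapture(recorder: { captureConnectionPulse(id: string): void }, connectionId: string): void {
  const start = performance.now();
  recorder.captureConnectionPulse(connectionId);
  totalCaptureMs += performance.now() - start;
  capturedEvents += 1;
}

function overheadReport() {
  return {
    capturedEvents,
    totalCaptureMs: Number(totalCaptureMs.toFixed(2)),
    avgMsPerEvent: capturedEvents > 0 ? Number((totalCaptureMs / capturedEvents).toFixed(3)) : 0
  };
}
```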