# TASK-007E: Migration & Export Tools

## Overview

Implement tools for migrating data between backends and exporting schemas/data for production deployment. This includes a Parse-to-Local migration wizard and schema export for various production databases.

**Parent Task:** TASK-007 (Integrated Local Backend)
**Phase:** E (Migration & Export)
**Effort:** 8-10 hours
**Priority:** MEDIUM
**Depends On:** TASK-007A, TASK-007B, TASK-007D

---

## Objectives

1. Create schema export wizard (PostgreSQL, Supabase, PocketBase, JSON)
2. Build Parse Server migration wizard for existing Noodl users
3. Implement data export in multiple formats
4. Create connection switcher for moving to external backends
5. Document migration paths clearly

---

## User Stories

### Story 1: Export to Production Database

> As a developer ready to deploy, I want to export my local schema to PostgreSQL so I can set up my production database.

### Story 2: Migrate from Noodl Cloud

> As an existing Noodl user, I want to migrate my Parse-based project to the new local backend so I can continue development offline.

### Story 3: Move to Supabase

> As a developer scaling up, I want to switch from local backend to Supabase while keeping my data and schema intact.

---

## Implementation Steps

### Step 1: Export Wizard Component (3 hours)

**File:** `packages/noodl-editor/src/editor/src/views/BackendPanel/ExportWizard.tsx`

```tsx
import React, { useState } from 'react';

import { Dialog } from '@noodl-core-ui/components/layout/Dialog';
import { RadioGroup, Radio } from '@noodl-core-ui/components/inputs/RadioGroup';
import { Checkbox } from '@noodl-core-ui/components/inputs/Checkbox';
import { PrimaryButton } from '@noodl-core-ui/components/inputs/PrimaryButton';
import { SecondaryButton } from '@noodl-core-ui/components/inputs/SecondaryButton';
import { CodeEditor } from '@noodl-core-ui/components/inputs/CodeEditor';
import { IconCopy, IconDownload } from '@noodl-core-ui/components/common/Icon';

import styles from './ExportWizard.module.scss';

type ExportFormat = 'postgres' | 'supabase' | 'pocketbase' | 'json';
type ExportStep = 'options' | 'generating' | 'result';

interface Props {
  backendId: string;
  backendName: string;
  onClose: () => void;
}

export function ExportWizard({ backendId, backendName, onClose }: Props) {
  const [step, setStep] = useState<ExportStep>('options');
  const [format, setFormat] = useState<ExportFormat>('postgres');
  const [includeData, setIncludeData] = useState(false);
  const [includeSampleData, setIncludeSampleData] = useState(false);
  const [result, setResult] = useState('');
  const [error, setError] = useState<string | null>(null);

  const formatOptions = [
    { value: 'postgres', label: 'PostgreSQL', description: 'Standard PostgreSQL DDL statements' },
    { value: 'supabase', label: 'Supabase', description: 'PostgreSQL with Row Level Security policies' },
    { value: 'pocketbase', label: 'PocketBase', description: 'PocketBase collection schema JSON' },
    { value: 'json', label: 'JSON Schema', description: 'Portable JSON schema definition' }
  ];

  const handleExport = async () => {
    setStep('generating');
    setError(null);

    try {
      // Get schema
      let schemaResult = await window.electronAPI.backend.exportSchema(backendId, format);

      // Get data if requested
      if (includeData || includeSampleData) {
        const dataFormat = format === 'json' ? 'json' : 'sql';
        const dataResult = await window.electronAPI.backend.exportData(backendId, dataFormat);

        if (includeSampleData && !includeData) {
          // Limit to first 10 records per table
          schemaResult += '\n\n-- SAMPLE DATA (first 10 records per table)\n';
          schemaResult += limitDataToSample(dataResult, 10);
        } else {
          schemaResult += '\n\n-- DATA\n';
          schemaResult += dataResult;
        }
      }

      // Add format-specific headers/comments
      const finalResult = addFormatHeader(schemaResult, format, backendName);

      setResult(finalResult);
      setStep('result');
    } catch (e: any) {
      setError(e.message || 'Export failed');
      setStep('options');
    }
  };

  const handleCopy = async () => {
    await navigator.clipboard.writeText(result);
    // Show toast notification
  };

  const handleDownload = () => {
    const extension = format === 'json' || format === 'pocketbase' ? 'json' : 'sql';
    const filename = `${backendName.toLowerCase().replace(/\s+/g, '-')}-schema.${extension}`;

    const blob = new Blob([result], { type: 'text/plain' });
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = filename;
    a.click();
    URL.revokeObjectURL(url);
  };

  return (
    <Dialog title="Export Schema" onClose={onClose}>
      {step === 'options' && (
        <div className={styles.options}>
          <p>
            Export your database schema to use with a production database. The exported schema
            can be used to create tables in your target database.
          </p>

          <h3>Export Format</h3>
          <RadioGroup value={format} onChange={(v) => setFormat(v as ExportFormat)}>
            {formatOptions.map((opt) => (
              <Radio key={opt.value} value={opt.value}>
                <strong>{opt.label}</strong>
                <span className={styles.description}>{opt.description}</span>
              </Radio>
            ))}
          </RadioGroup>

          <h3>Data Options</h3>
          <Checkbox
            label="Include all data"
            checked={includeData}
            onChange={setIncludeData}
            description="Export all records from all tables. Use for full backup or migration."
          />
          <Checkbox
            label="Include sample data"
            checked={includeSampleData}
            onChange={setIncludeSampleData}
            description="Export a few records for testing your production setup."
          />

          {error && <div className={styles.error}>{error}</div>}

          <div className={styles.actions}>
            <SecondaryButton label="Cancel" onClick={onClose} />
            <PrimaryButton label="Export" onClick={handleExport} />
          </div>
        </div>
      )}

      {step === 'generating' && (
        <div className={styles.generating}>
          <p>Generating export...</p>
        </div>
      )}

      {step === 'result' && (
        <div className={styles.result}>
          <h3>Export Complete</h3>
          <CodeEditor value={result} readOnly />

          <h3>Next Steps</h3>
          {format === 'postgres' && (
            <ol>
              <li>Create a PostgreSQL database on your hosting provider</li>
              <li>Run this SQL in your database to create the tables</li>
              <li>Update your project to use the external database connection</li>
            </ol>
          )}
          {format === 'supabase' && (
            <ol>
              <li>Go to your Supabase project's SQL Editor</li>
              <li>Paste and run this SQL to create tables with RLS</li>
              <li>Configure authentication policies as needed</li>
              <li>Update your project to use Supabase as the backend</li>
            </ol>
          )}
          {format === 'pocketbase' && (
            <ol>
              <li>Open PocketBase Admin UI</li>
              <li>Import collections from the JSON file</li>
              <li>Configure API rules as needed</li>
              <li>Update your project to use PocketBase as the backend</li>
            </ol>
          )}
          {format === 'json' && (
            <ol>
              <li>Use this schema definition with any compatible database</li>
              <li>Convert to your target database's DDL format</li>
              <li>Create tables and configure your backend</li>
            </ol>
          )}

          <div className={styles.actions}>
            <SecondaryButton label="Copy" icon={IconCopy} onClick={handleCopy} />
            <SecondaryButton label="Download" icon={IconDownload} onClick={handleDownload} />
            <SecondaryButton label="Back" onClick={() => setStep('options')} />
            <PrimaryButton label="Done" onClick={onClose} />
          </div>
        </div>
      )}
    </Dialog>
  );
}

function addFormatHeader(content: string, format: ExportFormat, backendName: string): string {
  const timestamp = new Date().toISOString();

  if (format === 'json' || format === 'pocketbase') {
    // For JSON formats, wrap in metadata
    try {
      const parsed = JSON.parse(content);
      return JSON.stringify(
        {
          _meta: {
            exportedFrom: 'Nodegex Local Backend',
            backendName,
            exportedAt: timestamp,
            format
          },
          ...parsed
        },
        null,
        2
      );
    } catch {
      return content;
    }
  }

  // For SQL formats
  return `-- ============================================
-- Nodegex Schema Export
-- Backend: ${backendName}
-- Format: ${format.toUpperCase()}
-- Exported: ${timestamp}
-- ============================================

${content}`;
}

function limitDataToSample(data: string, limit: number): string {
  // For SQL inserts, keep only first N per table
  const lines = data.split('\n');
  const counts: Record<string, number> = {};

  return lines
    .filter((line) => {
      const match = line.match(/^INSERT INTO "(\w+)"/);
      if (!match) return true;
      const table = match[1];
      counts[table] = (counts[table] || 0) + 1;
      return counts[table] <= limit;
    })
    .join('\n');
}
```
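The wizard assumes a `window.electronAPI.backend` bridge exposed by the preload script. A minimal sketch of the two calls it needs, assuming the IPC channel names follow the `backend:*` convention used elsewhere in this task (the real handlers come from TASK-007B's BackendManager, so treat the names below as illustrative):

```ts
// Preload sketch: exposes the export calls the wizard uses.
// Channel names 'backend:export-schema' / 'backend:export-data' are assumptions.
import { contextBridge, ipcRenderer } from 'electron';

contextBridge.exposeInMainWorld('electronAPI', {
  backend: {
    // Returns DDL (SQL formats) or a JSON schema string, depending on `format`
    exportSchema: (backendId: string, format: string): Promise<string> =>
      ipcRenderer.invoke('backend:export-schema', { backendId, format }),
    // Returns INSERT statements or a JSON dump of all tables
    exportData: (backendId: string, format: 'sql' | 'json'): Promise<string> =>
      ipcRenderer.invoke('backend:export-data', { backendId, format })
  }
});
```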
---

### Step 2: Parse Migration Wizard (3 hours)

**File:** `packages/noodl-editor/src/editor/src/views/Migration/ParseMigrationWizard.tsx`

```tsx
import React, { useState } from 'react';

import { Dialog } from '@noodl-core-ui/components/layout/Dialog';
import { PrimaryButton } from '@noodl-core-ui/components/inputs/PrimaryButton';
import { SecondaryButton } from '@noodl-core-ui/components/inputs/SecondaryButton';
import { ProgressBar } from '@noodl-core-ui/components/feedback/ProgressBar';
import { TextInput } from '@noodl-core-ui/components/inputs/TextInput';
import { Checkbox } from '@noodl-core-ui/components/inputs/Checkbox';

import styles from './ParseMigrationWizard.module.scss';

type MigrationStep = 'connect' | 'review' | 'migrate' | 'complete' | 'error';

interface ParseConfig {
  endpoint: string;
  appId: string;
  masterKey?: string;
}

interface TablePreview {
  className: string;
  recordCount: number;
  fields: string[];
  selected: boolean;
}

interface Props {
  projectId: string;
  existingParseConfig?: ParseConfig;
  onComplete: (newBackendId: string) => void;
  onCancel: () => void;
}

export function ParseMigrationWizard({ projectId, existingParseConfig, onComplete, onCancel }: Props) {
  const [step, setStep] = useState<MigrationStep>('connect');
  const [parseConfig, setParseConfig] = useState<ParseConfig>(
    existingParseConfig || { endpoint: '', appId: '', masterKey: '' }
  );
  const [tables, setTables] = useState<TablePreview[]>([]);
  const [migrateData, setMigrateData] = useState(true);
  const [progress, setProgress] = useState(0);
  const [progressMessage, setProgressMessage] = useState('');
  const [error, setError] = useState<string | null>(null);
  const [newBackendId, setNewBackendId] = useState<string | null>(null);

  // Step 1: Connect and fetch schema
  const handleConnect = async () => {
    setError(null);

    try {
      // Validate connection
      const schemaResponse = await fetch(`${parseConfig.endpoint}/schemas`, {
        headers: {
          'X-Parse-Application-Id': parseConfig.appId,
          'X-Parse-Master-Key': parseConfig.masterKey || ''
        }
      });

      if (!schemaResponse.ok) {
        throw new Error('Failed to connect to Parse Server. Check your credentials.');
      }

      const schemaData = await schemaResponse.json();

      // Get record counts for each class
      const tablePreviews: TablePreview[] = [];
      for (const cls of schemaData.results || []) {
        // Skip internal Parse classes
        if (cls.className.startsWith('_') && cls.className !== '_User') {
          continue;
        }

        const countResponse = await fetch(
          `${parseConfig.endpoint}/classes/${cls.className}?count=1&limit=0`,
          {
            headers: {
              'X-Parse-Application-Id': parseConfig.appId,
              'X-Parse-Master-Key': parseConfig.masterKey || ''
            }
          }
        );
        const countData = await countResponse.json();

        tablePreviews.push({
          className: cls.className,
          recordCount: countData.count || 0,
          fields: Object.keys(cls.fields || {}).filter(
            (f) => !['objectId', 'createdAt', 'updatedAt', 'ACL'].includes(f)
          ),
          selected: true
        });
      }

      setTables(tablePreviews);
      setStep('review');
    } catch (e: any) {
      setError(e.message);
    }
  };

  // Step 2: Start migration
  const handleStartMigration = async () => {
    setStep('migrate');
    setProgress(0);
    setError(null);

    try {
      // Create new local backend
      setProgressMessage('Creating local backend...');
      const backend = await window.electronAPI.backend.create(
        `Migrated from ${parseConfig.appId}`
      );
      setNewBackendId(backend.id);
      setProgress(5);

      // Start the backend
      setProgressMessage('Starting backend...');
      await window.electronAPI.backend.start(backend.id);
      setProgress(10);

      // Get selected tables
      const selectedTables = tables.filter((t) => t.selected);
      const totalTables = selectedTables.length;

      // Fetch and migrate schema
      setProgressMessage('Fetching schema...');
      const schemaResponse = await fetch(`${parseConfig.endpoint}/schemas`, {
        headers: {
          'X-Parse-Application-Id': parseConfig.appId,
          'X-Parse-Master-Key': parseConfig.masterKey || ''
        }
      });
      const schemaData = await schemaResponse.json();
      const selectedSchema = schemaData.results.filter((cls: any) =>
        selectedTables.some((t) => t.className === cls.className)
      );

      setProgressMessage('Creating tables...');
      await window.electronAPI.invoke('backend:import-parse-schema', {
        backendId: backend.id,
        schema: selectedSchema
      });
      setProgress(20);

      // Migrate data if requested
      if (migrateData) {
        let completedTables = 0;
        let totalRecordsMigrated = 0;

        for (const table of selectedTables) {
          setProgressMessage(`Migrating ${table.className}...`);

          // Fetch in batches
          let skip = 0;
          const batchSize = 100;

          while (skip < table.recordCount) {
            const response = await fetch(
              `${parseConfig.endpoint}/classes/${table.className}?limit=${batchSize}&skip=${skip}`,
              {
                headers: {
                  'X-Parse-Application-Id': parseConfig.appId,
                  'X-Parse-Master-Key': parseConfig.masterKey || ''
                }
              }
            );
            const data = await response.json();

            if (data.results?.length > 0) {
              await window.electronAPI.invoke('backend:import-records', {
                backendId: backend.id,
                collection: table.className,
                records: data.results
              });
              totalRecordsMigrated += data.results.length;
            }

            skip += batchSize;

            // Update progress
            const tableProgress = Math.min(skip, table.recordCount) / table.recordCount;
            const overallProgress = 20 + ((completedTables + tableProgress) / totalTables) * 70;
            setProgress(Math.round(overallProgress));
          }

          completedTables++;
        }

        setProgressMessage(`Migrated ${totalRecordsMigrated} records`);
      }

      setProgress(100);
      setStep('complete');
    } catch (e: any) {
      setError(e.message);
      setStep('error');
    }
  };

  const toggleTableSelection = (className: string) => {
    setTables(
      tables.map((t) => (t.className === className ? { ...t, selected: !t.selected } : t))
    );
  };

  const totalRecords = tables
    .filter((t) => t.selected)
    .reduce((sum, t) => sum + t.recordCount, 0);

  return (
    <Dialog title="Migrate from Parse Server" onClose={onCancel}>
      {step === 'connect' && (
        <div className={styles.connect}>
          <p>
            Migrate your existing Parse Server data to a local backend. This will create a new
            local backend with all your data.
          </p>

          <TextInput
            label="Server URL"
            value={parseConfig.endpoint}
            onChange={(v) => setParseConfig({ ...parseConfig, endpoint: v })}
            placeholder="https://your-parse-server.com/parse"
          />
          <TextInput
            label="Application ID"
            value={parseConfig.appId}
            onChange={(v) => setParseConfig({ ...parseConfig, appId: v })}
            placeholder="your-app-id"
          />
          <TextInput
            label="Master Key"
            value={parseConfig.masterKey}
            onChange={(v) => setParseConfig({ ...parseConfig, masterKey: v })}
            placeholder="your-master-key"
            type="password"
          />

          {error && <div className={styles.error}>{error}</div>}

          <div className={styles.actions}>
            <SecondaryButton label="Cancel" onClick={onCancel} />
            <PrimaryButton label="Connect" onClick={handleConnect} />
          </div>
        </div>
      )}

      {step === 'review' && (
        <div className={styles.review}>
          <p>Select the tables you want to migrate. Uncheck any tables you want to skip.</p>

          <div className={styles.tableList}>
            {tables.map((table) => (
              <div key={table.className} className={styles.tableRow}>
                <Checkbox
                  checked={table.selected}
                  onChange={() => toggleTableSelection(table.className)}
                />
                <div className={styles.tableInfo}>
                  <strong>{table.className}</strong>
                  <span>{table.recordCount.toLocaleString()} records</span>
                  <span className={styles.fields}>
                    {table.fields.slice(0, 5).join(', ')}
                    {table.fields.length > 5 && ` +${table.fields.length - 5} more`}
                  </span>
                </div>
              </div>
            ))}
          </div>

          <Checkbox
            label="Migrate data"
            checked={migrateData}
            onChange={setMigrateData}
            description="If unchecked, only the schema will be migrated and tables will be empty."
          />

          <p className={styles.summary}>
            <strong>Summary:</strong> {tables.filter((t) => t.selected).length} tables,{' '}
            {migrateData ? `${totalRecords.toLocaleString()} records` : 'schema only'}
          </p>

          <div className={styles.actions}>
            <SecondaryButton label="Back" onClick={() => setStep('connect')} />
            <PrimaryButton
              label="Start Migration"
              onClick={handleStartMigration}
              isDisabled={tables.filter((t) => t.selected).length === 0}
            />
          </div>
        </div>
      )}

      {step === 'migrate' && (
        <div className={styles.migrate}>
          <p>{progressMessage}</p>
          <ProgressBar value={progress} />
          <p>{progress}%</p>
          <p className={styles.warning}>
            Please don't close this window until migration is complete.
          </p>
        </div>
      )}

      {step === 'complete' && (
        <div className={styles.complete}>
          <h3>Migration Complete!</h3>
          <p>
            Your data has been successfully migrated to a new local backend. You can now use
            this backend with your project.
          </p>
          <PrimaryButton label="Use This Backend" onClick={() => onComplete(newBackendId!)} />
        </div>
      )}

      {step === 'error' && (
        <div className={styles.errorStep}>
          <h3>Migration Failed</h3>
          <p className={styles.error}>{error}</p>
          <div className={styles.actions}>
            <SecondaryButton label="Cancel" onClick={onCancel} />
            <PrimaryButton label="Try Again" onClick={() => setStep('review')} />
          </div>
        </div>
      )}
    </Dialog>
  );
}
```
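One detail the wizard delegates to the `backend:import-records` handler: Parse's REST API returns typed values (`Date`, `Pointer`, `File`, `GeoPoint`) that a flat SQL store cannot ingest as-is. A sketch of the normalization that handler would need; the exact column mapping is an assumption and lives in TASK-007B's handler:

```ts
// Illustrative normalization for Parse REST payloads before insertion into
// the local SQL store. Shape of the problem only; not the final mapping.
type ParseValue = any;

function normalizeParseValue(value: ParseValue): any {
  if (value && typeof value === 'object') {
    switch (value.__type) {
      case 'Date':
        return value.iso; // store as ISO-8601 string
      case 'Pointer':
        return value.objectId; // store the referenced record's id
      case 'File':
        return value.url; // keep the hosted URL
      case 'GeoPoint':
        return JSON.stringify({ lat: value.latitude, lng: value.longitude });
      default:
        return JSON.stringify(value); // nested objects/arrays as JSON text
    }
  }
  return value;
}

function normalizeParseRecord(record: Record<string, ParseValue>) {
  const { ACL, ...rest } = record; // ACLs don't map to the local store
  return Object.fromEntries(
    Object.entries(rest).map(([k, v]) => [k, normalizeParseValue(v)])
  );
}
```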

---

### Step 3: Connection Switcher (2 hours)

**File:** `packages/noodl-editor/src/editor/src/views/BackendPanel/ConnectionSwitcher.tsx`

```tsx
import React, { useState } from 'react';

import { Dialog } from '@noodl-core-ui/components/layout/Dialog';
import { RadioGroup, Radio } from '@noodl-core-ui/components/inputs/RadioGroup';
import { TextInput } from '@noodl-core-ui/components/inputs/TextInput';
import { PrimaryButton } from '@noodl-core-ui/components/inputs/PrimaryButton';
import { SecondaryButton } from '@noodl-core-ui/components/inputs/SecondaryButton';

import styles from './ConnectionSwitcher.module.scss';

type BackendType = 'local' | 'supabase' | 'pocketbase' | 'custom';

interface Props {
  projectId: string;
  currentBackend: {
    type: string;
    id?: string;
    endpoint?: string;
  };
  onSwitch: (config: any) => Promise<void>;
  onClose: () => void;
}

export function ConnectionSwitcher({ projectId, currentBackend, onSwitch, onClose }: Props) {
  const [backendType, setBackendType] = useState<BackendType>(
    (currentBackend.type as BackendType) || 'local'
  );
  const [config, setConfig] = useState({
    localBackendId: currentBackend.id || '',
    supabaseUrl: '',
    supabaseKey: '',
    pocketbaseUrl: '',
    customEndpoint: '',
    customApiKey: ''
  });
  const [testing, setTesting] = useState(false);
  const [testResult, setTestResult] = useState<'success' | 'error' | null>(null);
  const [error, setError] = useState<string | null>(null);
  const [localBackends, setLocalBackends] = useState<any[]>([]);

  React.useEffect(() => {
    loadLocalBackends();
  }, []);

  const loadLocalBackends = async () => {
    try {
      const backends = await window.electronAPI.backend.list();
      setLocalBackends(backends);
    } catch (e) {
      console.error('Failed to load backends:', e);
    }
  };

  const handleTest = async () => {
    setTesting(true);
    setTestResult(null);
    setError(null);

    try {
      let testEndpoint = '';
      let testHeaders: Record<string, string> = {};

      switch (backendType) {
        case 'local': {
          const backend = localBackends.find((b) => b.id === config.localBackendId);
          if (!backend) throw new Error('Select a local backend');
          const status = await window.electronAPI.backend.status(backend.id);
          if (!status.running) {
            await window.electronAPI.backend.start(backend.id);
          }
          testEndpoint = `http://localhost:${status.port || backend.port}/health`;
          break;
        }
        case 'supabase':
          testEndpoint = `${config.supabaseUrl}/rest/v1/`;
          testHeaders = {
            apikey: config.supabaseKey,
            Authorization: `Bearer ${config.supabaseKey}`
          };
          break;
        case 'pocketbase':
          testEndpoint = `${config.pocketbaseUrl}/api/health`;
          break;
        case 'custom':
          testEndpoint = config.customEndpoint;
          if (config.customApiKey) {
            testHeaders = { Authorization: `Bearer ${config.customApiKey}` };
          }
          break;
      }

      const response = await fetch(testEndpoint, { headers: testHeaders });

      if (response.ok) {
        setTestResult('success');
      } else {
        throw new Error(`Connection failed: ${response.status}`);
      }
    } catch (e: any) {
      setTestResult('error');
      setError(e.message);
    } finally {
      setTesting(false);
    }
  };

  const handleSwitch = async () => {
    try {
      let newConfig: any;

      switch (backendType) {
        case 'local':
          newConfig = { type: 'local', id: config.localBackendId };
          break;
        case 'supabase':
          newConfig = { type: 'supabase', endpoint: config.supabaseUrl, apiKey: config.supabaseKey };
          break;
        case 'pocketbase':
          newConfig = { type: 'pocketbase', endpoint: config.pocketbaseUrl };
          break;
        case 'custom':
          newConfig = { type: 'custom', endpoint: config.customEndpoint, apiKey: config.customApiKey };
          break;
      }

      await onSwitch(newConfig);
      onClose();
    } catch (e: any) {
      setError(e.message);
    }
  };

  return (
    <Dialog title="Switch Backend" onClose={onClose}>
      <p>
        Switch your project to use a different backend. Your frontend code will continue to
        work with the new backend.
      </p>

      <h3>Backend Type</h3>
      <RadioGroup value={backendType} onChange={(v) => setBackendType(v as BackendType)}>
        <Radio value="local">
          <strong>Local Backend</strong>
          <span>SQLite database running locally</span>
        </Radio>
        <Radio value="supabase">
          <strong>Supabase</strong>
          <span>PostgreSQL with realtime and auth</span>
        </Radio>
        <Radio value="pocketbase">
          <strong>PocketBase</strong>
          <span>Lightweight Go backend</span>
        </Radio>
        <Radio value="custom">
          <strong>Custom REST API</strong>
          <span>Any REST-compatible backend</span>
        </Radio>
      </RadioGroup>

      {backendType === 'local' && (
        <select
          value={config.localBackendId}
          onChange={(e) => setConfig({ ...config, localBackendId: e.target.value })}
        >
          <option value="">Select a local backend...</option>
          {localBackends.map((b) => (
            <option key={b.id} value={b.id}>
              {b.name}
            </option>
          ))}
        </select>
      )}

      {backendType === 'supabase' && (
        <>
          <TextInput
            label="Project URL"
            value={config.supabaseUrl}
            onChange={(v) => setConfig({ ...config, supabaseUrl: v })}
            placeholder="https://your-project.supabase.co"
          />
          <TextInput
            label="Anon Key"
            value={config.supabaseKey}
            onChange={(v) => setConfig({ ...config, supabaseKey: v })}
            placeholder="your-anon-key"
            type="password"
          />
        </>
      )}

      {backendType === 'pocketbase' && (
        <TextInput
          label="Server URL"
          value={config.pocketbaseUrl}
          onChange={(v) => setConfig({ ...config, pocketbaseUrl: v })}
          placeholder="http://127.0.0.1:8090"
        />
      )}

      {backendType === 'custom' && (
        <>
          <TextInput
            label="Endpoint"
            value={config.customEndpoint}
            onChange={(v) => setConfig({ ...config, customEndpoint: v })}
            placeholder="https://api.example.com"
          />
          <TextInput
            label="API Key"
            value={config.customApiKey}
            onChange={(v) => setConfig({ ...config, customApiKey: v })}
            placeholder="your-api-key"
            type="password"
          />
        </>
      )}

      {testResult === 'success' && (
        <div className={styles.success}>✓ Connection successful</div>
      )}
      {error && <div className={styles.error}>{error}</div>}

      <div className={styles.actions}>
        <SecondaryButton label="Cancel" onClick={onClose} />
        <SecondaryButton
          label={testing ? 'Testing...' : 'Test Connection'}
          onClick={handleTest}
          isDisabled={testing}
        />
        <PrimaryButton label="Switch Backend" onClick={handleSwitch} />
      </div>
    </Dialog>
  );
}
```
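The component leaves persistence to the `onSwitch` prop. A sketch of how a parent view might implement it, assuming a `setSetting` method on the project model (an assumption; wire this to whatever settings API `projectmodel.ts` actually exposes):

```ts
// Sketch only: `setSetting` is an assumed settings API on the project model.
interface ProjectSettings {
  setSetting(key: string, value: unknown): void;
}

export async function switchProjectBackend(
  project: ProjectSettings,
  config: { type: 'local' | 'supabase' | 'pocketbase' | 'custom'; id?: string; endpoint?: string; apiKey?: string }
) {
  // Persist the connection so it survives editor restarts
  project.setSetting('backendConnection', config);

  // Local backends are addressed by id; resolve the id to a live endpoint
  if (config.type === 'local' && config.id) {
    const status = await window.electronAPI.backend.status(config.id);
    project.setSetting('backendEndpoint', `http://localhost:${status.port}`);
  } else {
    project.setSetting('backendEndpoint', config.endpoint);
  }
}
```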

---

### Step 4: Project Open Migration Prompt (2 hours)

**File:** `packages/noodl-editor/src/editor/src/views/Migration/BackendMigrationPrompt.tsx`

```tsx
import React from 'react';

import { Dialog } from '@noodl-core-ui/components/layout/Dialog';
import { PrimaryButton } from '@noodl-core-ui/components/inputs/PrimaryButton';
import { SecondaryButton } from '@noodl-core-ui/components/inputs/SecondaryButton';

import styles from './BackendMigrationPrompt.module.scss';

interface Props {
  projectName: string;
  parseConfig: {
    endpoint: string;
    appId: string;
  };
  onContinueWithParse: () => void;
  onMigrateToLocal: () => void;
  onDismiss: () => void;
}

export function BackendMigrationPrompt({
  projectName,
  parseConfig,
  onContinueWithParse,
  onMigrateToLocal,
  onDismiss
}: Props) {
  return (
    <Dialog title="New Local Backend Available" onClose={onDismiss}>
      <p>
        <strong>{projectName}</strong> is currently using a Parse Server backend. You can now
        use a local backend for faster development and offline support.
      </p>

      <div className={styles.options}>
        <div className={styles.option}>
          <h3>🌐 Continue with Parse</h3>
          <ul>
            <li>Keep using {parseConfig.endpoint}</li>
            <li>No changes to your data</li>
            <li>Requires internet connection</li>
          </ul>
          <SecondaryButton label="Continue with Parse" onClick={onContinueWithParse} />
        </div>

        <div className={styles.option}>
          <h3>💻 Migrate to Local</h3>
          <ul>
            <li>Works offline</li>
            <li>Faster development</li>
            <li>Easy export to production</li>
            <li>Free (no cloud costs)</li>
          </ul>
          <PrimaryButton label="Migrate to Local" onClick={onMigrateToLocal} />
        </div>
      </div>
    </Dialog>
  );
}
```
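The prompt is driven by the project-open check added to `projectmodel.ts` (see Files to Modify below). A sketch of that check, assuming hypothetical setting keys for the Parse config, the dismissal flag, and the switcher's stored connection:

```ts
// Sketch: decides whether to show BackendMigrationPrompt on project open.
// Setting keys ('cloudservices', 'backendMigrationPromptDismissed',
// 'backendConnection') are assumptions to adapt to the real project model.
export function shouldShowBackendMigrationPrompt(project: {
  getSetting: (key: string) => any;
}): boolean {
  // Only relevant for projects still pointing at a Parse Server
  const parseConfig = project.getSetting('cloudservices');
  if (!parseConfig?.endpoint || !parseConfig?.appId) return false;

  // Respect "Don't show again"
  if (project.getSetting('backendMigrationPromptDismissed')) return false;

  // Already migrated? The connection switcher stores the new backend type.
  const connection = project.getSetting('backendConnection');
  return !connection || connection.type !== 'local';
}
```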
---

## Files to Create

```
packages/noodl-editor/src/editor/src/views/BackendPanel/
├── ExportWizard.tsx
├── ExportWizard.module.scss
├── ConnectionSwitcher.tsx
└── ConnectionSwitcher.module.scss

packages/noodl-editor/src/editor/src/views/Migration/
├── ParseMigrationWizard.tsx
├── ParseMigrationWizard.module.scss
├── BackendMigrationPrompt.tsx
└── BackendMigrationPrompt.module.scss
```

## Files to Modify

```
packages/noodl-editor/src/editor/src/models/projectmodel.ts
- Add backend migration check on project open

packages/noodl-editor/src/editor/src/views/Launcher/BackendManager/BackendCard.tsx
- Add export button that opens ExportWizard
```

---

## Testing Checklist

### Export Wizard
- [ ] PostgreSQL export generates valid DDL
- [ ] Supabase export includes RLS policies
- [ ] PocketBase export creates valid JSON
- [ ] JSON schema is portable
- [ ] Data export works for all formats
- [ ] Sample data limiting works
- [ ] Copy to clipboard works
- [ ] Download works with correct filename

### Parse Migration
- [ ] Connection validation works
- [ ] Schema fetching succeeds
- [ ] Table selection works
- [ ] Progress tracking is accurate
- [ ] Data migration preserves all records
- [ ] Error handling shows useful messages
- [ ] Can retry after failure

### Connection Switcher
- [ ] Local backend selection works
- [ ] Supabase connection test works
- [ ] PocketBase connection test works
- [ ] Custom endpoint test works
- [ ] Backend switch persists in project

### Migration Prompt
- [ ] Shows for Parse-based projects
- [ ] "Don't show again" works
- [ ] Links to migration wizard
- [ ] Continue with Parse works

---

## Success Criteria

1. Users can export schema to any supported format
2. Parse migration preserves 100% of data
3. Connection switching is seamless
4. Clear guidance at each step
5. Error recovery is possible

---

## Dependencies

**Internal:**
- TASK-007A (LocalSQLAdapter - for schema/data export)
- TASK-007B (BackendManager - IPC handlers)
- TASK-007D (UI components)

**Blocks:**
- None (enables production deployment)

---

## Estimated Session Breakdown

| Session | Focus | Hours |
|---------|-------|-------|
| 1 | Export Wizard | 3 |
| 2 | Parse Migration Wizard | 3 |
| 3 | Connection Switcher + Prompt | 3 |
| **Total** | | **9** |