Interactive/Editable UI — Design Document
Created: February 13, 2026
Status: Planned (V1.0-V1.2)
Version: V1.0 (basic editing), V1.2 (collaborative)
Connection to Canvas-First Redesign: This feature is the editing layer for the Canvas-First Redesign. Canvas components are rendered by JSRuntime (Level 2 in the compilation chain) and made interactive/editable by this system. See architecture.md — Knowledge Pipeline for how JSRuntime fits into BaseMorphRuntime.
Overview
Make AI-generated UI components editable inline with haptic feedback. Users can touch/interact with components, edit them directly, and have the AI understand context from those interactions.
Motivation
Problem: AI-generated components (Phase 3b.2) are read-only after creation. Users can't:
- Edit a task card inline without opening a separate page
- Reorder list items by dragging
- Quickly modify component content without typing commands
Solution: Inline editing + haptic feedback:
- Editable components — User taps → fields become editable → saves → optionally triggers AI response
- Haptic feedback — Tactile response on interactions (pulse on save, vibrate on drag)
- Context-aware AI — When user edits, AI receives full component context + user changes
- Permission modes — Some edits save silently, others require parent approval, others trigger AI turn
Use Cases
| User | Scenario |
|---|---|
| Parent | Tap task card → edit description inline → AI updates memory |
| Teacher | Drag lesson plan items to reorder → haptic vibrate on drop → AI learns new sequence |
| Manager | Long-press calendar event → resize duration → AI adjusts schedule |
| Grandparent | Tap shopping list → add item inline → no AI needed, direct edit |
Architecture
ComponentSpec Extensions
New fields:
```typescript
interface ComponentSpec {
  id: string;
  type: string;
  props: object;
  children?: ComponentSpec[];
  events?: EventDeclaration[];
  // NEW Phase 3k fields:
  editable?: boolean;                               // enables inline editing
  edit_permissions?: "user" | "parent" | "ai_turn"; // who can edit + what happens
  haptic_feedback?: "pulse" | "vibrate" | "flash" | "none";
  auto_save?: boolean;                              // save without confirm button
  edit_schema?: object;                             // validation for edited values (JSON Schema)
}
```
Edit permission modes:
| Mode | Behavior | Use Case |
|---|---|---|
| `user` | Silent save, no AI involved | Simple list reordering, typo fixes |
| `parent` | Requires parent role | Modifying shared family data |
| `ai_turn` | Triggers new AI response with updated context | "AI, I changed this — what do you think?" |
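To make the new fields concrete, here is a minimal sketch of a spec a renderer might receive. The field names follow the ComponentSpec extension above; the interface is abridged to the new fields, and the concrete values (`card-123`, the JSON Schema) are illustrative, not part of the spec.

```typescript
// Abridged ComponentSpec: only the Phase 3k editing fields plus identity.
interface ComponentSpec {
  id: string;
  type: string;
  props: Record<string, unknown>;
  editable?: boolean;
  edit_permissions?: "user" | "parent" | "ai_turn";
  haptic_feedback?: "pulse" | "vibrate" | "flash" | "none";
  auto_save?: boolean;
  edit_schema?: object;
}

// A task card that saves silently (no AI turn) on simple edits.
const groceryCard: ComponentSpec = {
  id: "card-123",
  type: "card",
  props: { title: "Buy groceries" },
  editable: true,
  edit_permissions: "user", // silent save, no AI involved
  haptic_feedback: "pulse",
  auto_save: true,
  edit_schema: { type: "object", properties: { title: { type: "string", maxLength: 120 } } },
};
```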
Frontend Integration Actions
3 new actions:
```python
class FrontendIntegration(BaseInterface):
    # ... existing 10 actions

    async def enable_editing(self, component_id: str, fields: list[str] | None = None) -> ActionResult: ...
    async def lock_editing(self, component_id: str) -> ActionResult: ...
    async def apply_edits(self, component_id: str, changes: dict) -> ActionResult: ...
```
Flow:
1. AI renders: `show_card({title: "Buy groceries", editable: true, haptic_feedback: "pulse"})`
2. User clicks "Edit" → frontend enables `contentEditable` on title/body
3. User types "Buy groceries for party" → haptic pulse on each keystroke (optional)
4. User saves → `frontend__apply_edits` event with new title
5. If `edit_permissions="ai_turn"` → `POST /api/chat/component-event` → SSE stream with AI response
6. If `edit_permissions="user"` → silent save, component updates
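The save-time branching above can be sketched as a small router. The endpoint path comes from this document; the returned labels (`local_save`, `parent_role_required`) are illustrative names for the handling tiers, not an existing API.

```typescript
// Routes a saved edit according to its permission mode (see flow above).
type EditPermission = "user" | "parent" | "ai_turn";

function routeEdit(permission: EditPermission): string {
  switch (permission) {
    case "user":
      return "local_save";                     // silent save, no network round-trip
    case "parent":
      return "parent_role_required";           // backend verifies the editor's role
    case "ai_turn":
      return "POST /api/chat/component-event"; // triggers an AI response via SSE
  }
}
```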
Haptic Feedback Implementation
Tauri Commands:
```rust
// frontend/src-tauri/src/haptics.rs
#[tauri::command]
async fn trigger_haptic(pattern: HapticPattern) -> Result<(), MorpheeError> {
    #[cfg(target_os = "macos")]
    {
        // NSHapticFeedbackManager.defaultPerformer.perform(.alignment, performanceTime: .now)
    }
    #[cfg(target_os = "windows")]
    {
        // Windows.Devices.Haptics.SimpleHapticsController
    }
    #[cfg(target_os = "linux")]
    {
        // evdev vibration (limited hardware support)
    }
    #[cfg(target_os = "ios")]
    {
        // UIImpactFeedbackGenerator
    }
    #[cfg(target_os = "android")]
    {
        // Vibrator.vibrate(VibrationEffect.createOneShot(...))
    }
    Ok(())
}
```
Haptic patterns:
```rust
pub enum HapticPattern {
    Pulse,   // Light pulse (100ms)
    Vibrate, // Strong vibrate (200ms)
    Flash,   // Visual flash (no haptic)
    None,
}
```
Web fallback:
```typescript
// frontend/src/lib/haptics.ts
// Import path assumes Tauri v2 ('@tauri-apps/api/core'); v1 uses '@tauri-apps/api/tauri'.
import { invoke } from '@tauri-apps/api/core';

export function triggerHaptic(pattern: HapticPattern) {
  if (isTauri()) {
    // Use Tauri IPC
    invoke('trigger_haptic', { pattern });
  } else if ('vibrate' in navigator) {
    // Web Vibration API (limited browser support)
    const duration = pattern === 'pulse' ? 100 : pattern === 'vibrate' ? 200 : 0;
    navigator.vibrate(duration);
  }
}
```
Editable Component Types
| Component | Edit Actions | Haptic Feedback |
|---|---|---|
| card | Edit title/body inline (contentEditable) | Pulse on save |
| list | Reorder items (drag), add/remove | Vibrate on drop |
| table | Edit cell values inline | Flash row on update |
| form | Pre-fill, edit before submit | None (standard form) |
| calendar | Drag events, resize | Haptic on drop |
| kanban | Drag cards between columns | Vibrate on column change |
Component Event Flow
LOCAL events (no network):
- Dismiss, collapse, expand — pure frontend
AI_TURN events (triggers AI response):
```typescript
// User edits card title
const event: ComponentEvent = {
  type: 'ai_turn',
  component_id: 'card-123',
  component_type: 'card',
  action: 'edit',
  data: {
    field: 'title',
    old_value: 'Buy groceries',
    new_value: 'Buy groceries for party',
  },
};

// Frontend calls POST /api/chat/component-event with:
// { conversation_id: 'conv-456', event: event }

// Backend responds with SSE stream:
// AI receives: "User edited card 'Buy groceries' → 'Buy groceries for party'. Component context: {...}"
// AI responds: "Got it! Should I add party supplies to the list?"
```
API Endpoints
New endpoint:
POST /api/chat/component-event
Request:
```json
{
  "conversation_id": "uuid",
  "event": {
    "type": "ai_turn",
    "component_id": "card-123",
    "component_type": "card",
    "action": "edit",
    "data": {"field": "title", "new_value": "..."}
  }
}
```
Response: SSE stream (same format as /api/chat/)
- `token` — AI response tokens
- `tool_use` — Tool calls
- `tool_result` — Tool results
- `done` — End of stream
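Since the component-event endpoint reuses the chat SSE format, the frontend dispatcher needs to split incoming frames into event name and payload. A minimal sketch of parsing one frame, assuming the standard SSE `event:`/`data:` line layout (the event names come from the list above; `parseSseFrame` is a hypothetical helper, not an existing function):

```typescript
// Parses a single SSE frame (lines separated by "\n") into its event
// name and concatenated data payload. "message" is the SSE default
// event name when no "event:" line is present.
interface SseFrame {
  event: string;
  data: string;
}

function parseSseFrame(raw: string): SseFrame {
  let event = "message";
  const dataLines: string[] = [];
  for (const line of raw.split("\n")) {
    if (line.startsWith("event:")) event = line.slice(6).trim();
    else if (line.startsWith("data:")) dataLines.push(line.slice(5).trim());
  }
  return { event, data: dataLines.join("\n") };
}
```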
Effort Estimation
| Task | Size | Notes |
|---|---|---|
| Backend: Extend ComponentSpec models | S | Add 5 new fields |
| Backend: FrontendIntegration actions (3) | S | enable_editing, lock_editing, apply_edits |
| Backend: Component event API endpoint | M | SSE streaming with component context |
| Tauri: Haptic feedback commands | M | Platform-specific implementations |
| Frontend: Editable renderers (5 components) | M | card, list, table, calendar, kanban |
| Frontend: Haptic client wrapper | S | isTauri() routing + web fallback |
| Frontend: Component event dispatcher | S | Tier classification + HTTP dispatch |
| Tests: Backend + Tauri + Frontend | M | 30+ tests |
Total: Medium-Large (2 weeks)
Risks & Mitigations
| Risk | Mitigation |
|---|---|
| Haptic not available on all platforms | Graceful fallback (no error), visual feedback only |
| Accidental edits (fat-finger) | Confirm button for destructive edits, undo within 5 seconds |
| Edit conflicts (multi-user) | Show "X is editing this" indicator, last-write-wins + AI resolves conflict |
| Performance: haptic on every keystroke | Debounce haptic triggers (max 1 per 200ms) |
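The last mitigation (at most one haptic per 200 ms) amounts to a leading-edge throttle. A minimal sketch, where the wrapped `fire` callback stands in for the real haptic trigger (the factory name and return value are illustrative):

```typescript
// Returns a throttled trigger: fires at most once per minIntervalMs,
// dropping calls that arrive too soon. Returns true when it fired.
function makeDebouncedHaptic(minIntervalMs: number, fire: () => void) {
  let last = -Infinity;
  return (now: number = Date.now()): boolean => {
    if (now - last < minIntervalMs) return false; // dropped: too soon
    last = now;
    fire();
    return true;
  };
}
```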
Future Enhancements
Core Features
- Voice editing — Long-press → voice input → AI transcribes + applies
- Gesture controls — Swipe to delete, pinch to zoom (mobile)
- Collaborative editing — Show other users' cursors in real-time
- Edit history — "Show me previous versions of this card"
- Undo/redo — Cmd+Z to revert edits before AI processes them
- Drag-and-drop between components — Drag task from list to calendar, auto-creates event
- Component templates — Save component layouts as reusable templates
- Rich text editing — Markdown toolbar, emoji picker, @ mentions
ACL & Permission Enhancements
Component-Level Permissions (leveraging existing PermissionPolicy):
```typescript
interface ComponentACL {
  component_id: string;
  can_view: string[];        // Roles: ["parent", "child", "teacher"]
  can_edit: string[];        // Roles: ["parent"]
  can_delete: string[];      // Roles: ["parent"]
  require_approval: boolean; // Edit requires approval
  approvers: string[];       // User IDs or roles who can approve
  locked_by?: string;        // User ID currently editing (prevents conflicts)
  locked_until?: Date;       // Lock expiry (15 min default)
}
```
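One way the edit check over this shape could work: a live lock held by another user blocks the edit, otherwise role membership in `can_edit` decides. This is a sketch under that assumption (the interface is abridged to the fields the check uses; `canEdit` is a hypothetical helper):

```typescript
// Abridged ACL: only the fields the edit check needs.
interface ComponentACL {
  component_id: string;
  can_edit: string[];
  locked_by?: string;
  locked_until?: Date;
}

// A lock held by someone else and not yet expired denies the edit;
// otherwise the user needs at least one role listed in can_edit.
function canEdit(acl: ComponentACL, userId: string, roles: string[], now: Date): boolean {
  const lockActive =
    acl.locked_by !== undefined &&
    acl.locked_by !== userId &&
    acl.locked_until !== undefined &&
    acl.locked_until.getTime() > now.getTime();
  if (lockActive) return false;
  return roles.some((r) => acl.can_edit.includes(r));
}
```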
Use cases:
- Family Shared Calendar:
  - Parent creates calendar component → sets `can_edit: ["parent"]`
  - Kids can view and can propose changes (`edit_permissions: "ai_turn"`) → parent reviews in chat
  - Kids can drag their own events (`own_events_only` ACL rule)
- Classroom Assignments:
  - Teacher creates assignment card → sets `can_view: ["teacher", "student"]`, `can_edit: ["teacher"]`
  - Students can't edit assignment details
  - Students can edit their own submission field (field-level ACL)
- Team Task Board:
  - Manager creates task list → sets `can_edit: ["manager", "lead"]`
  - Team members can drag tasks (reorder) but can't edit task details
  - Completed tasks require manager approval before archiving
Field-Level Permissions:
```typescript
interface FieldACL {
  field_name: string;
  editable_by: string[]; // Roles who can edit this specific field
  visible_to: string[];  // Roles who can view this field
}

// Example: Task card
{
  fields: [
    {field: "title", editable_by: ["parent"], visible_to: ["parent", "child"]},
    {field: "assigned_to", editable_by: ["parent", "child"], visible_to: ["parent", "child"]},
    {field: "notes", editable_by: ["parent"], visible_to: ["parent"]} // Private notes
  ]
}
```
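Applied at render time, field-level ACLs mean the frontend only receives fields the viewer may see. A sketch of that filter (`visibleFields` is a hypothetical helper over the FieldACL shape above):

```typescript
interface FieldACL {
  field_name: string;
  editable_by: string[];
  visible_to: string[];
}

// Keeps only fields whose visible_to list intersects the viewer's roles.
function visibleFields(fields: FieldACL[], roles: string[]): string[] {
  return fields
    .filter((f) => f.visible_to.some((r) => roles.includes(r)))
    .map((f) => f.field_name);
}
```

With the task-card example above, a viewer holding only the `child` role would see `title` and `assigned_to` but not the private `notes` field.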
Approval Workflow for Edits:
- Kid edits shopping list (adds "candy") → `edit_permissions: "ai_turn"` triggers AI response
- AI: "Emma added candy to the shopping list — approve?" → parent approves/rejects
- Integration with existing ApprovalCard component
- Stored in conversation history: full attribution of who edited what
Component Ownership:
- Components have an `owner` field (user who created it)
- Owner has full permissions by default (edit, delete, share)
- Owner can delegate permissions: "Let Emily edit this calendar"
- Stored in `component_metadata` table: `{component_id, owner, created_at, permissions}`
Advanced Editing Features
Real-Time Collaborative Editing:
- Multiple users edit same component simultaneously
- Show presence indicators: "Mom is editing this card" (avatar + badge)
- WebSocket broadcasts edit events: `{type: "component_edit", user_id, component_id, field, cursor_position}`
- Operational Transform (OT) or CRDT for conflict-free merges
- Lock mechanism: First editor gets 15-min lock, others see "locked by X" banner
Edit Attribution & History:
```typescript
interface ComponentEditHistory {
  component_id: string;
  edits: Array<{
    user_id: string;
    timestamp: Date;
    field: string;
    old_value: any;
    new_value: any;
    approved_by?: string; // If edit required approval
  }>;
}
```
- Every edit tracked with full attribution
- UI: "View history" button → shows timeline of changes
- Rollback: "Undo Emma's last edit" → restores previous value
- Stored in `component_edit_history` table
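The rollback bullet ("Undo Emma's last edit") reduces to scanning the history backwards for that user's most recent entry and restoring its `old_value`. A sketch under that reading (the entry shape follows ComponentEditHistory above with a numeric timestamp for brevity; `undoLastEditBy` is a hypothetical helper):

```typescript
interface EditEntry {
  user_id: string;
  timestamp: number;
  field: string;
  old_value: unknown;
  new_value: unknown;
}

// Walks the history newest-first; returns the field and value to restore,
// or null if the user made no edits.
function undoLastEditBy(edits: EditEntry[], userId: string): { field: string; value: unknown } | null {
  for (let i = edits.length - 1; i >= 0; i--) {
    if (edits[i].user_id === userId) {
      return { field: edits[i].field, value: edits[i].old_value };
    }
  }
  return null;
}
```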
Component Locking:
- Prevent accidental overwrites during simultaneous edits
- First user to click "Edit" acquires lock (15 min expiry)
- Other users see: "Locked by Mom — editing in progress"
- Force unlock: Admin can break lock after timeout
- Stored in `component_locks` table: `{component_id, locked_by, locked_at, expires_at}`
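The lock rules above (first editor wins, 15-minute expiry, holder can re-acquire) can be sketched with an in-memory map standing in for the `component_locks` table; `acquireLock` is a hypothetical helper:

```typescript
interface Lock {
  locked_by: string;
  expires_at: number; // epoch ms
}

const LOCK_TTL_MS = 15 * 60 * 1000; // 15-minute expiry from the spec above
const locks = new Map<string, Lock>();

// Grants the lock if it is free, expired, or already held by this user.
function acquireLock(componentId: string, userId: string, now: number): boolean {
  const existing = locks.get(componentId);
  if (existing && existing.expires_at > now && existing.locked_by !== userId) {
    return false; // live lock held by someone else: show "Locked by X" banner
  }
  locks.set(componentId, { locked_by: userId, expires_at: now + LOCK_TTL_MS });
  return true;
}
```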
Edit Conflicts Resolution:
- Two users edit same component offline → sync creates conflict
- AI mediates: "Mom changed title to 'Grocery Shopping', you changed to 'Weekly Shopping' — which one?"
- Conflict resolution UI: side-by-side diff, "Keep theirs" / "Keep mine" / "Merge" buttons
- Stores both versions temporarily, user chooses final version
Bulk Editing:
- Select multiple components → apply batch edit
- Example: Select 3 task cards → change status to "completed" → haptic pulse on each
- Permission check: User must have edit rights on ALL selected components
- Undo applies to entire batch
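The permission rule above (edit rights on ALL selected components) is an all-or-nothing check. A minimal sketch, where `editableByUser` is assumed to be a precomputed set of component IDs the user may edit:

```typescript
// A bulk edit proceeds only if every selected component is editable
// by this user; one denial rejects the whole batch.
function canBulkEdit(selected: string[], editableByUser: Set<string>): boolean {
  return selected.every((id) => editableByUser.has(id));
}
```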
Haptic & Sensory Feedback Enhancements
Advanced Haptic Patterns:
```typescript
enum HapticPattern {
  Pulse,       // Light tap (100ms)
  Vibrate,     // Strong buzz (200ms)
  DoubleClick, // Two quick pulses (50ms + 50ms)
  Success,     // Ascending pattern (light → medium)
  Error,       // Harsh buzz (300ms)
  Warning,     // Three short pulses
  Custom,      // User-defined pattern
}
```
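On the web fallback path, these richer patterns map onto Web Vibration API arguments, since `navigator.vibrate` accepts either a single duration or an `[on, off, on, ...]` array. A sketch of that mapping; durations marked in the comments above are reused, the rest (and the off-gaps) are illustrative:

```typescript
type WebHapticPattern = "pulse" | "vibrate" | "double_click" | "success" | "error" | "warning";

// Returns the argument to pass to navigator.vibrate for each pattern.
function toVibration(pattern: WebHapticPattern): number | number[] {
  switch (pattern) {
    case "pulse": return 100;                    // light tap (100ms)
    case "vibrate": return 200;                  // strong buzz (200ms)
    case "double_click": return [50, 60, 50];    // two quick pulses
    case "success": return [40, 40, 80];         // ascending light then medium
    case "error": return 300;                    // harsh buzz (300ms)
    case "warning": return [50, 50, 50, 50, 50]; // three short pulses
  }
}
```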
Context-Aware Haptics:
- Different haptic per action type:
- Save → Success pattern
- Delete → Warning pattern
- Drag → Continuous light vibration
- Drop → Strong pulse
- Error → Error pattern
Haptic Accessibility Settings:
- `appearance.haptic_intensity`: `"low" | "medium" | "high" | "off"`
- `appearance.haptic_patterns_enabled`: boolean
- Users with motor disabilities can disable haptics or adjust intensity
Audio Feedback (alternative to haptics):
- Web browsers without vibration API → audio cues
- Configurable sounds: "click", "whoosh", "ding"
- `appearance.audio_feedback`: boolean
- Stored in user settings via SettingsIntegration
Visual Feedback (accessible alternative):
- Flash effect: component border briefly flashes on save
- Color change: component background briefly changes to success color (green pulse)
- Animation: subtle scale animation (component "bounces" on edit)
- `appearance.visual_feedback`: `"none" | "subtle" | "prominent"`
AI-Powered Editing Enhancements
Smart Auto-Fill:
- AI suggests values based on context: Editing task title → suggests similar task titles
- AI pre-fills fields: Creating event → AI suggests time based on "next available slot"
- Integration with Memory: AI remembers user preferences for field values
Natural Language Editing:
- Touch component → say "change this to tomorrow at 3pm" → AI parses + applies
- Voice commands: "add candy to this list", "move this task to high priority"
- Stored in conversation history: full audit trail of voice edits
Context-Aware Edit Suggestions:
- AI notices editing pattern: "You always change status to 'in progress' after creating tasks — auto-apply?"
- AI suggests field values: "Based on this task, I suggest assigning to Dad"
- Proactive optimization: "This calendar event conflicts with soccer practice — reschedule?"
Edit Templates:
- Save common edits as templates: "Homework task template" (priority: high, assigned_to: kid, due: weekday_evening)
- Apply template: Tap component → "Apply template" → select from list
- Templates stored per user in `user_edit_templates` table
Integration with Existing Systems
Edit History in Memory (Phase 2):
- Component edits stored as memories: "User edited shopping list to add milk"
- Searchable: "When did I add milk to the list?" → finds edit timestamp
- RAG includes edit context: AI knows what was recently changed
Editable Skills (Phase 3b.1):
- Skills can create editable components: "Morning briefing skill creates daily task list (editable)"
- Skills can respond to edits: User edits task → skill re-runs with updated data
Edit Notifications (Phase 3a):
- "Mom edited the vacation calendar — view changes?"
- Notification includes diff: "Changed: Paris → Rome"
- Click notification → opens component with changes highlighted
Edit Permissions from Settings (Phase 3j):
- Admin can configure default edit permissions via SettingsIntegration
- `settings.update("components.default_edit_permissions", {"can_edit": ["parent"]})`
- Applies to all future components created by the user
Privacy & Security
Edit Encryption:
- Sensitive component edits encrypted at rest
- Private components (visible only to owner) use end-to-end encryption
- Uses VaultProvider for encryption keys
Edit Audit Log:
- All edits logged with attribution (who, what, when, approved_by)
- GDPR compliant: User can export edit history
- Admin can review audit log: "Show all edits by Emma this week"
Edit Rollback Protection:
- Prevent accidental bulk rollback: "You're about to undo 10 edits — confirm?"
- Important edits require confirmation before rollback
- Destructive edits (delete field, remove item) require parent approval
Last Updated: February 13, 2026