Morphee Architecture
Overview
Morphee is a conversational AI agent platform for groups -- families, classrooms, teams. The architecture is built on one principle: everything is an Integration.
The LLM, the memory system, the frontend, and external services (Gmail, Calendar, etc.) are all Integrations with the same interface. The Agent Orchestrator is the only special component -- it runs the agentic loop that uses the LLM to think and other Integrations to act.
As of V2.0, Morphee runs on a unified Rust engine. The Python backend is retired. morphee-core contains all business logic (traits, providers, pipeline). morphee-server (axum) wraps it for multi-tenant cloud deployments. Tauri wraps it for local desktop/mobile.
The frontend talks to two Rust backends:
- morphee-server (HTTP/SSE/WS at /v1/) -- auth, groups, spaces, tasks, canvas, identity, SSO, OAuth, notifications, ACL, onboarding, interfaces, extensions, AI (chat, memory, skills, search, embed), WebSocket, knowledge network, neurons
- Tauri Rust backend (IPC) -- embeddings, vector search, Git, local LLM, audio/video, biometrics, WASM extensions, offline queue
The Integration/Interface abstraction means the orchestrator does not know or care which backend handles a given action. A "Cloud Claude" Interface calls morphee-server; a "Local Llama" Interface calls Tauri Rust. Same contract.
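As a concrete illustration of that shared contract, here is a minimal Rust sketch. The trait shape, ActionDescriptor, and the string-based payload are assumptions for illustration, not the actual morphee-core API:

```rust
// Hypothetical sketch of the shared Integration contract (names are illustrative).
struct ActionDescriptor {
    name: &'static str,
    description: &'static str,
}

trait Integration {
    /// List the actions this Integration exposes to the orchestrator.
    fn describe(&self) -> Vec<ActionDescriptor>;
    /// Execute one action by name. The caller does not know (or care)
    /// whether this hits a cloud API, a local model, or a WASM binary.
    fn execute(&self, action: &str, payload: &str) -> Result<String, String>;
}

// The Echo integration from the table above, as a toy implementation.
struct Echo;

impl Integration for Echo {
    fn describe(&self) -> Vec<ActionDescriptor> {
        vec![ActionDescriptor { name: "echo", description: "Return the payload unchanged" }]
    }
    fn execute(&self, action: &str, payload: &str) -> Result<String, String> {
        match action {
            "echo" => Ok(payload.to_string()),
            other => Err(format!("unknown action: {other}")),
        }
    }
}

fn main() {
    let echo = Echo;
    assert_eq!(echo.execute("echo", "hi").unwrap(), "hi");
    assert!(echo.execute("unknown", "").is_err());
}
```

Because the orchestrator only sees describe()/execute(), swapping a "Cloud Claude" Interface for a "Local Llama" one is a configuration change, not a code change.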
Core Concepts
Group
A collection of people sharing access to Morphee. Supports families, classrooms, teams, clubs.
Space
The central organizing concept. An isolated context where conversations, tasks, memory, and Interfaces live together.
Spaces can be nested -- a Space can contain sub-Spaces, forming a hierarchy. Sub-Spaces inherit their parent's Interfaces and memory access but can override with their own. This scales from a simple family to complex multi-client work environments.
Every user has an implicit personal Space. Shared Spaces are created for group contexts. The AI manages Space routing naturally through conversation.
Everything is an Integration
| Integration | Role | Actions |
|---|---|---|
| LLM | The brain -- thinks, decides, generates | chat, complete, embed, summarize |
| Memory | The knowledge -- remembers, recalls, searches | search, store, recall, forget |
| Frontend | The face -- renders UI dynamically (17 actions) | show_card, show_list, show_form, show_choices, show_actions, show_progress, show_table, show_calendar, show_kanban, show_video, show_image, render, update, enable_editing, lock_editing, apply_edits, dismiss |
| Tasks | Task management | list, create, update_status |
| Spaces | Space management | list, get_current |
| Cron | Scheduling | schedule, list, get, cancel |
| Notifications | Alerts & reminders | send, list, mark_read |
| Google Calendar | Calendar events (WASM extension) | list_events, create_event, update_event, delete_event, check_availability |
| Gmail | Email service (WASM extension) | list_emails, read_email, send_email, draft_email, reply_email |
| Filesystem | Local file management | list_files, read_file, write_file, delete_file, search_files |
| Skills | Dynamic workflow management | create, list, get, update, delete |
| Conversations | Conversation management | list_conversations, get_conversation, update_conversation, delete_conversation |
| Settings | Conversational settings | get_profile, update_profile, get_notification_preferences, update_notification_preferences, get_privacy_settings, update_privacy_settings, get_appearance_settings, update_appearance_settings |
| Slack | Team messaging (WASM extension) | send_message, list_channels, read_messages |
| JIRA | Issue tracking (WASM extension) | list_issues, get_issue, create_issue, update_issue, transition_issue, add_comment, list_projects |
| Echo | Testing | echo, delay_echo |
| Webhook | HTTP | receive, send |
| Onboarding | New user setup | create_group, create_spaces, complete |
All 18 original Python integrations have been replaced by WASM extensions or morphee-core built-in providers. The same describe() / execute() contract applies whether the integration is a compiled WASM binary or a Rust trait implementation.
Each Integration can have multiple Interfaces (configured instances). Example: Two LLM Interfaces -- "Claude" for reasoning, "Haiku" for summarization.
Interface = Integration + Configuration + Vault
An Interface adds credentials and settings to make an Integration operational:
- LLM Interface: API key (vault) + model + temperature
- Gmail Interface: OAuth tokens (vault) + account
- Memory Interface: pgvector (via PostgreSQL) + Git repo path
- Frontend Interface: WebSocket connection
Secret values (API keys, OAuth tokens) are never stored in the database. The Interface config stores vault:// references that resolve through the active VaultProvider at execution time. See interfaces.md -- Configuration Security & Vault for details.
Interfaces are scoped per-Space and per-Group.
Knowledge Pipeline & Runtime Hierarchy
Morphee's architecture follows a single principle: knowledge enters as conversation and exits as portable, shareable intelligence.
The Compilation Chain
Knowledge exists at different optimization levels, each backed by a runtime:
Level 0: Raw knowledge (conversations, memories) -> LLMRuntime (expensive, flexible)
Level 1: Structured skills (YAML step sequences) -> Skill executor (cheap, structured)
Level 2: Canvas components (React/JS) -> JSRuntime (fast, visual)
Level 3: Compiled extensions (.wasm binaries) -> WasmRuntime (fastest, portable)
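The chain above can be sketched as a simple level-to-runtime dispatch. The enum and function names are illustrative assumptions, not morphee-core types:

```rust
// Illustrative mapping of optimization level to runtime.
#[derive(Debug, Clone, Copy, PartialEq)]
enum KnowledgeLevel {
    Raw,      // Level 0: conversations, memories
    Skill,    // Level 1: YAML step sequences
    Canvas,   // Level 2: React/JS components
    Compiled, // Level 3: .wasm binaries
}

fn runtime_for(level: KnowledgeLevel) -> &'static str {
    match level {
        KnowledgeLevel::Raw => "LLMRuntime",       // expensive, flexible
        KnowledgeLevel::Skill => "Skill executor", // cheap, structured
        KnowledgeLevel::Canvas => "JSRuntime",     // fast, visual
        KnowledgeLevel::Compiled => "WasmRuntime", // fastest, portable
    }
}

fn main() {
    assert_eq!(runtime_for(KnowledgeLevel::Raw), "LLMRuntime");
    assert_eq!(runtime_for(KnowledgeLevel::Compiled), "WasmRuntime");
}
```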
Knowledge Pipeline Lifecycle
A Space (family recipes, classroom curriculum, team runbook) can be extracted, PII-stripped, compiled into an installable Integration, shared via the marketplace, and installed into another Space. Non-developers create "apps" by using Morphee -- their accumulated expertise IS the source code.
Request Flow
The LLM is the runtime of last resort. Vector search, structured skills, and compiled extensions handle most requests before the LLM is ever invoked.
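That fallback order can be sketched as a short-circuiting cascade. The handler names and signatures are illustrative, not the real Pipeline API:

```rust
// Hedged sketch of "LLM as runtime of last resort": try cheaper handlers first.
fn vector_search(q: &str) -> Option<String> {
    // Pretend the vector store has exactly one cached answer.
    if q == "capital of France" { Some("Paris (from memory)".to_string()) } else { None }
}

fn skill_match(_q: &str) -> Option<String> { None }    // no structured skill matches
fn wasm_extension(_q: &str) -> Option<String> { None } // no compiled extension matches

fn llm_fallback(q: &str) -> String {
    format!("LLM answer for: {q}") // expensive path, only reached last
}

fn handle(q: &str) -> String {
    vector_search(q)
        .or_else(|| skill_match(q))
        .or_else(|| wasm_extension(q))
        .unwrap_or_else(|| llm_fallback(q))
}

fn main() {
    assert_eq!(handle("capital of France"), "Paris (from memory)");
    assert!(handle("novel question").starts_with("LLM answer"));
}
```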
Space to Integration
Any Space can become a shareable Integration through the compilation chain.
Digital Organism Architecture
The Fractal Brain module introduces Universal Recursive Intelligence — a paradigm where everything in Morphee is an Organism that follows the same lifecycle: receive signal → recall → respond → learn.
Each SpaceOrganism has its own NeuronStore, edges, substrats, and dream cycle. Spaces learn independently — cross-space collaboration happens via EdgeKind::CrossOrganism edges managed by SpaceOrganismRegistry.
The SignalGraphExecutor propagates signals through the organism graph with safety bounds (max_depth=5, budget_ms=1000, max_fanout=8). SignalTrace records the path for reward attribution via learn().
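A toy version of that bounded propagation is sketched below over a plain adjacency list. Budget accounting and cycle detection are omitted, and the function names are assumptions, not the real SignalGraphExecutor API:

```rust
// Bounded signal propagation: stop at max_depth, cap fanout per node,
// and record the visited path (a stand-in for SignalTrace).
use std::collections::HashMap;

fn propagate(
    graph: &HashMap<&str, Vec<&str>>,
    node: &str,
    depth: usize,
    max_depth: usize,
    max_fanout: usize,
    trace: &mut Vec<String>,
) {
    trace.push(node.to_string()); // record the path for reward attribution
    if depth >= max_depth {
        return; // safety bound: depth limit
    }
    if let Some(edges) = graph.get(node) {
        for next in edges.iter().take(max_fanout) { // safety bound: fanout cap
            propagate(graph, next, depth + 1, max_depth, max_fanout, trace);
        }
    }
}

fn main() {
    let mut g = HashMap::new();
    g.insert("space_a", vec!["neuron_1", "neuron_2"]);
    g.insert("neuron_1", vec!["neuron_3"]);
    let mut trace = Vec::new();
    propagate(&g, "space_a", 0, 5, 8, &mut trace);
    assert_eq!(trace, vec!["space_a", "neuron_1", "neuron_3", "neuron_2"]);
}
```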
See features/fractal-brain.md for full design (21 files, ~7,500 lines, 167 tests).
System Components
1. Agent Orchestrator
The only "special" component. It runs the agentic conversation loop.
Location: crates/morphee-core/src/providers/agent_orchestrator.rs
The orchestrator uses the ToolCallingInferencer trait to call the LLM and the ToolExecutor trait to dispatch tool calls. The AnthropicInferencer (behind the cloud-llm feature gate) handles cloud LLM calls. CandleInferencer handles local GGUF inference. Both implement the same trait.
Key traits in crates/morphee-core/src/traits/orchestrator.rs:
- ToolCallingInferencer -- send messages + tool definitions, get back an AssistantResponse with a StopReason and an optional ToolCall list
- Orchestrator -- the agentic loop that calls the inferencer, dispatches tools, feeds results back
- ToolExecutor -- executes a single tool call, returns a ToolResult
2. Interface Manager
Interfaces are now managed by morphee-server via REST endpoints (/v1/interfaces/). WASM extensions replace all Python integrations. The describe() / execute() contract is unchanged -- WASM extensions implement it in compiled form.
The Tool Bridge converts registered Interface actions into Anthropic tool definitions using interface__action double-underscore naming (e.g., echo__echo). The Agent Orchestrator uses this to give the LLM access to all registered tools.
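The naming scheme is simple enough to sketch directly; the helper names below are hypothetical:

```rust
// interface__action naming used by the Tool Bridge (helper names are assumptions).
fn to_tool_name(interface: &str, action: &str) -> String {
    format!("{interface}__{action}")
}

/// Split a tool name back into (interface, action) on the first
/// double underscore; action names may themselves contain single underscores.
fn from_tool_name(tool: &str) -> Option<(&str, &str)> {
    tool.split_once("__")
}

fn main() {
    assert_eq!(to_tool_name("echo", "echo"), "echo__echo");
    assert_eq!(from_tool_name("gmail__send_email"), Some(("gmail", "send_email")));
}
```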
2b. VaultProvider
A lower-level system service (not an Integration) that securely stores and retrieves credentials. Interfaces depend on it to resolve their vault:// references at execution time.
The VaultProvider is per-device -- on a Mac it may use 1Password or OS Keychain, on the server it uses environment variables, on a phone it uses the OS keystore. The vault:// references in the Interface config are portable across all backends.
| Backend | Platform | Status |
|---|---|---|
| EnvVaultProvider | Server / Docker | Built |
| KeychainVaultProvider | Desktop (Tauri) -- OS Keychain | Built |
| MobileKeystoreVaultProvider | iOS Keychain / Android SQLite vault | Built |
| OnePasswordVaultProvider | Desktop (power users) -- 1Password CLI/SDK | Planned |
| CloudKmsVaultProvider | AWS/Azure/GCP secret managers | Planned |
See interfaces.md -- Configuration Security & Vault for the full design.
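A minimal sketch of vault:// resolution at execution time follows. The trait shape and helper are assumptions; an in-memory map stands in for any of the backends in the table above:

```rust
// Any backend that maps a key to a secret satisfies the provider contract.
use std::collections::HashMap;

trait VaultProvider {
    fn resolve(&self, key: &str) -> Option<String>;
}

/// Stand-in for EnvVaultProvider / KeychainVaultProvider / etc.
struct InMemoryVault(HashMap<String, String>);

impl VaultProvider for InMemoryVault {
    fn resolve(&self, key: &str) -> Option<String> {
        self.0.get(key).cloned()
    }
}

/// Config values like "vault://ANTHROPIC_API_KEY" resolve through the active
/// provider at execution time; plain values pass through unchanged.
/// Secrets themselves never touch the database.
fn resolve_config(value: &str, vault: &dyn VaultProvider) -> Option<String> {
    match value.strip_prefix("vault://") {
        Some(key) => vault.resolve(key),
        None => Some(value.to_string()),
    }
}

fn main() {
    let mut secrets = HashMap::new();
    secrets.insert("ANTHROPIC_API_KEY".to_string(), "s3cret".to_string());
    let vault = InMemoryVault(secrets);
    assert_eq!(resolve_config("vault://ANTHROPIC_API_KEY", &vault).as_deref(), Some("s3cret"));
    assert_eq!(resolve_config("temperature=0.7", &vault).as_deref(), Some("temperature=0.7"));
}
```

Because only the vault:// reference is stored, the same Interface config is portable across devices with different vault backends.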
3. Tauri Rust Backend
Location: frontend/src-tauri/src/
The Tauri v2 app includes a Rust backend that runs locally on the user's machine. It communicates with the frontend via IPC (invoke()). This is where local, privacy-preserving compute lives -- embeddings, vector search, LLM inference, speech, biometrics, and WASM extensions.
src-tauri/src/
├── main.rs
├── lib.rs # Tauri builder + setup + 76 command registration
├── state.rs # AppState (Mutex/AsyncMutex for subsystems)
├── error.rs # MorpheeError (thiserror + Serialize for IPC)
├── embeddings.rs # EmbeddingProvider -- fastembed/ONNX (desktop), candle/BERT (mobile)
├── vector_store.rs # VectorStore -- LanceDB (desktop), SQLite/rusqlite (mobile)
├── vector_router.rs # VectorRouter -- local vector-first routing
├── git_store.rs # GitStore (git2/libgit2, Markdown + YAML frontmatter, OpenMorph)
├── vault.rs # VaultProvider -- keyring (macOS/Win/Linux/iOS), SQLite (Android)
├── file_store.rs # FileStore (sandboxed fs per group, path traversal prevention)
├── action_queue.rs # ActionQueue -- offline action queue (JSON-file-backed)
├── llm.rs # LLMRuntime -- candle/GGUF inference (Phi-4, Llama 3B, Mistral 7B)
├── model_manager.rs # ModelManager -- download, switch, delete GGUF models
├── tts.rs # Text-to-speech (local, offline)
├── whisper.rs # Speech-to-text via ONNX Whisper (on-device)
├── liveness.rs # Liveness detection (face + voice)
├── face_encoder.rs # FaceEncoderProvider -- candle MobileFaceNet (128-dim)
├── voice_encoder.rs # VoiceEncoderProvider -- candle d-vector CNN (192-dim)
├── tokenizer.rs # Pure-Rust BERT WordPiece tokenizer (mobile)
├── error_codes.rs # Structured error codes with EN+FR locales
├── extensions/ # WASM Extension runtime (wasmer)
│ ├── models.rs # ExtensionManifest, ExtensionPermission, ResourceLimits
│ ├── permissions.rs # Permission checking
│ └── wasm_runtime.rs # WasmRuntime (wasmer, module cache)
└── commands/ # 76 IPC commands across 14 modules
Key crates: fastembed (ONNX embeddings, desktop), lancedb (embedded vector DB, desktop), candle (BERT embeddings + GGUF LLM + biometrics), rusqlite (SQLite vector store + Android vault, mobile), git2 (native Git with vendored libgit2), keyring (OS keychain), wasmer (WASM extension runtime). 165 desktop + 9 mobile-ml Rust tests.
3b. morphee-core
Location: crates/morphee-core/
The shared business logic crate. Contains all traits, types, and providers that run identically on server, desktop, mobile, and CLI.
Key traits:
- Pipeline -- the central orchestration point. Holds an Orchestrator, ExperienceStore, VectorDB, KnowledgeStore
- ToolCallingInferencer -- LLM abstraction (cloud or local)
- Orchestrator + ToolExecutor -- agentic loop
- ExperienceStore -- memory storage and retrieval
- VectorDB -- vector similarity search abstraction
- KnowledgeStore -- compiled knowledge bundles
Providers:
- AnthropicInferencer (behind the cloud-llm feature gate) -- Anthropic API with streaming + tool calling
- CandleInferencer -- local GGUF inference via candle
- AgentOrchestrator -- the agentic loop implementation with approval channels
Fractal Brain + Digital Organism (behind fractal-brain feature gate): Universal recursive intelligence where everything is an Organism. 21 files, ~7,500 lines, 167 tests. See Fractal Brain Design.
- Neuron-based recall — per-token BERT hidden states recursively segmented into NeuronTrees. Three modes: Exact (0 LLM), Variation (0 LLM), Novel (full LLM)
- Organism trait — universal receive() + learn() contract at 6 scales (Neuron → Network)
- SpaceOrganism — first production impl Organism. Each Space is an independent learning organism with NeuronMemory, edges, substrats, Pipeline fallback
- SignalGraphExecutor — safe signal propagation through the organism graph (depth/fanout/budget limits)
- Reward system — confidence tracking, quarantine, temporal decay, branch-level blame attribution
- Dream consolidation — 5-phase background cycle (reward cleanup, pruning, merging, mitosis detection)
- Multi-space — SpaceOrganismRegistry manages independent organisms with cross-space edges
- gRPC proto — organism.proto with 7 RPCs (Send, Learn, Observe, Chat, etc.)
RL Policy (behind rl-policy feature gate): Hierarchical Contextual Bandit for intelligent action selection. See ADR-012 and RL Policy Design.
- LinUCB primary -- Linear UCB with sliding window, Cholesky decomposition, no cold start. 406-dim state (embedding + user + temporal + budget context). Learns from experience #1.
- Neural PPO secondary -- PPO-clip with value head, triggered when LinUCB uncertain (high UCB spread). Catches non-linear patterns.
- Hierarchical action space -- 2-level selection: Level 1 categories (Memory/Skill/Wasm/LLM), Level 2 specific sub-arms.
- Multi-objective reward -- correctness, latency, cost, privacy with configurable presets (default, mobile, offline, child).
- Knowledge transfer -- Group-level priors via LinUCB A/b matrix averaging. Binary bundles for V2.1 Knowledge Marketplace.
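The multi-objective reward above can be sketched as a weighted blend. The weights, preset values, and normalization are illustrative assumptions, not the shipped defaults:

```rust
// Weighted multi-objective reward: correctness, latency, cost, privacy.
struct RewardWeights {
    correctness: f64,
    latency: f64,
    cost: f64,
    privacy: f64,
}

impl RewardWeights {
    /// Hypothetical "mobile" preset: weight on-device privacy and cost more
    /// than a default cloud profile would.
    fn mobile() -> Self {
        RewardWeights { correctness: 0.5, latency: 0.2, cost: 0.2, privacy: 0.1 }
    }
}

/// Each component is normalized to [0, 1]; latency and cost enter as
/// (1 - penalty) so that higher is always better.
fn reward(w: &RewardWeights, correctness: f64, latency_pen: f64, cost_pen: f64, privacy: f64) -> f64 {
    w.correctness * correctness
        + w.latency * (1.0 - latency_pen)
        + w.cost * (1.0 - cost_pen)
        + w.privacy * privacy
}

fn main() {
    let w = RewardWeights::mobile();
    // A perfect, free, instant, fully private answer scores 1.0.
    let r = reward(&w, 1.0, 0.0, 0.0, 1.0);
    assert!((r - 1.0).abs() < 1e-9);
}
```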
Tests: 600 tests (547 unit + 52 integration + 1 doc, with rl-policy + wasm-cranelift); 650 tests (597 unit + 52 integration + 1 doc, with fractal-brain).
3c. morphee-server -- ALL PHASES COMPLETE (A-E)
Location: crates/morphee-server/
Axum-based HTTP server wrapping morphee-core for headless/multi-tenant deployments. This is the production cloud backend, replacing the retired Python backend.
Phase A (Foundation): Auth trait (SupabaseAuth + MorpheeAuth + CompositeAuth), SessionManager (multi-tenant Pipeline pool), AuthIdentity middleware, RateLimiter, PostgresVectorDB (pgvector + HNSW), WebSocket handler, morphee-server binary.
Phase B (AI Endpoints): Chat (SSE streaming), Conversations CRUD, Memory (store/search/delete/embed), Skills CRUD, Unified Search. Persistence via sqlx (conversations, messages, skills tables).
Phase C (Auth CRUD + Entity Management): Full auth lifecycle (signup, signin, refresh, signout, GoTrue client), Groups CRUD, Spaces CRUD, Tasks CRUD, Canvas persistence, Child Identity (PIN auth, Morphee JWT), SSO (OAuth providers), OAuth token management, Notifications (push tokens, APNs/FCM), ACL (role-based access), Onboarding, Interface configuration, Step-up auth with challenge store, Parent consent flow.
Phase D (WASM Extensions): Real wasmer WASM execution (D1), host functions with SSRF protection (D2), server wiring + binary storage (D3), 5 integration extensions + audit logging (D4). All 6 extensions build for wasm32-wasip1.
Phase E (Knowledge Network): Knowledge bundles (compile, sign, publish, install), Specialist Neurons (domain-focused servers), Trust roots (Ed25519 knowledge signing for B2B trust chains), Shared ExperienceStore (pooled intelligence), Federated knowledge sharing (git-based peer sync), Metrics and health endpoints.
19 handler modules: auth_crud, consent, groups, invites, spaces, tasks, canvas, identity, sso, oauth, notifications, acl, onboarding, interfaces, extensions, knowledge, neurons, metrics, trust.
19 persistence modules: users, consents, groups, invites, spaces, tasks, canvas, identity, oauth, notifications, push_tokens, acl, interface_configs, extensions, conversations, messages, skills, knowledge_bundles, shared_experiences, neurons, sync_repos, trust_roots.
Tests: 153 tests (71 unit + 82 integration).
4. LLM Integration
The LLM is an Integration like any other. It is implemented via the ToolCallingInferencer trait in morphee-core:
- Cloud: AnthropicInferencer (behind the cloud-llm feature gate) -- Anthropic API with streaming + tool calling. Swapping to another cloud provider means implementing the same trait.
- Local: CandleInferencer -- candle/GGUF inference (Phi-4 Mini Q4, Llama 3B, Mistral 7B). Metal GPU on macOS/iOS, CPU on others.
Both go through the same Pipeline. The frontend does not know which inferencer is active -- the Integration/Interface abstraction handles routing.
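The swap can be sketched with a trait object. The struct names follow the document, but the method signature is heavily simplified and assumed:

```rust
// Two inferencers behind one trait; callers never see which backend runs.
trait ToolCallingInferencer {
    fn chat(&self, prompt: &str) -> String;
}

struct AnthropicInferencer; // cloud path (cloud-llm feature gate in the real crate)
struct CandleInferencer;    // local GGUF path

impl ToolCallingInferencer for AnthropicInferencer {
    fn chat(&self, prompt: &str) -> String { format!("[cloud] {prompt}") }
}
impl ToolCallingInferencer for CandleInferencer {
    fn chat(&self, prompt: &str) -> String { format!("[local] {prompt}") }
}

/// The Pipeline holds a trait object, so routing is a configuration decision.
fn pick(offline: bool) -> Box<dyn ToolCallingInferencer> {
    if offline { Box::new(CandleInferencer) } else { Box::new(AnthropicInferencer) }
}

fn main() {
    assert_eq!(pick(true).chat("hi"), "[local] hi");
    assert_eq!(pick(false).chat("hi"), "[cloud] hi");
}
```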
4b. Memory Integration
Memory is implemented via morphee-core traits:
- ExperienceStore -- store and retrieve memories (facts, preferences, events)
- VectorDB -- vector similarity search with configurable backends
- KnowledgeStore -- compiled knowledge bundles for sharing
Storage backends vary by deployment:
| Backend | Platform | Technology |
|---|---|---|
| PostgresVectorDB | morphee-server (cloud) | pgvector + HNSW indexes via sqlx |
| LanceDB | Tauri desktop | Embedded vector DB (Rust-native) |
| SQLite | Tauri mobile | rusqlite with vector similarity |
Git-backed Markdown remains the durable, human-readable memory layer -- per-Space git repos with Markdown files + YAML frontmatter. Branching, commit search, and temporal navigation are supported via git2 in the Tauri backend.
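A memory file in such a repo might look like the fragment below. The frontmatter field names are illustrative assumptions, not a documented schema:

```markdown
---
space: meal-planning
kind: preference
created: 2026-02-14
tags: [food, allergies]
---
Lucas is allergic to peanuts; avoid recipes containing peanut butter.
```

Because each memory is a plain Markdown file in a per-Space git repo, it stays human-readable and every change is versioned and auditable.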
Auto-summarization extracts facts/preferences/events after conversations reach 10+ messages and stores them as structured memories.
5. Task System
Background execution tracking. Each task moves through a defined lifecycle from creation to completion.
Tasks exist to track what the AI does -- the user does not create tasks manually.
6. Event System
Real-time event distribution via WebSocket. morphee-server handles WebSocket connections at /v1/ws with first-message JWT auth. Redis pub/sub distributes events across server instances.
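The first-message auth rule can be sketched as a tiny state machine. The frame format and validation below are simplified assumptions; real validation would verify the JWT signature and claims:

```rust
// Toy state machine for first-message JWT auth on the WebSocket:
// the first frame must carry a token, otherwise the connection is rejected.
enum WsState {
    AwaitingAuth,
    Authenticated,
    Closed,
}

fn on_frame(state: WsState, frame: &str) -> WsState {
    match state {
        WsState::AwaitingAuth => {
            // Stand-in check; a real server would validate the JWT itself.
            if frame.starts_with("auth:") && frame.len() > 5 {
                WsState::Authenticated
            } else {
                WsState::Closed // anything but an auth frame closes the socket
            }
        }
        other => other, // later frames flow to normal event handling
    }
}

fn main() {
    let s = on_frame(WsState::AwaitingAuth, "auth:eyJhbGciOi");
    assert!(matches!(s, WsState::Authenticated));
    let s = on_frame(WsState::AwaitingAuth, "subscribe:tasks");
    assert!(matches!(s, WsState::Closed));
}
```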
7. Frontend
Location: frontend/src/
frontend/src/
├── pages/ # Route pages (Canvas, Chat, Dashboard, Tasks, Spaces, Calendar, etc.)
│ └── settings/ # 12 settings tabs
├── components/
│ ├── ui/ # shadcn/ui components (26 installed, Radix-based)
│ ├── auth/ # AuthForm, SSOIcons
│ ├── chat/ # ChatBubble, ConversationList, ToolCallCard, ApprovalCard, renderers/
│ ├── layout/ # Sidebar, Header, BottomNav, Breadcrumbs
│ ├── search/ # SearchDialog (Cmd+K)
│ ├── settings/ # GoogleConnect, InterfaceConfigCard
│ ├── tasks/ # TaskList, TaskDetail, CreateTaskDialog
│ └── spaces/ # SpaceList, SpaceDetail, SpaceCard
├── store/ # Zustand stores (11 stores)
├── hooks/ # 20 hooks
├── lib/ # 19 utilities (api, auth, sse, websocket, runtime, tauri, etc.)
├── types/ # TypeScript types
└── styles/ # CSS variables (light/dark theme)
Canvas is the primary view (spatial, free-form with drag/dismiss). Chat is a collapsible Sheet drawer. The Vite proxy forwards /v1/ to morphee-server (port 3000). SSE client handles streaming events (token, tool_use, tool_result, approval_request, done, error, title).
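Server-side, the SSE event names listed above could be modeled as an enum with their wire names. The enum shape is an assumption; only the event names come from the document:

```rust
// SSE event types and the wire names the frontend client matches on.
#[derive(Debug, Clone, Copy)]
enum SseEvent {
    Token,
    ToolUse,
    ToolResult,
    ApprovalRequest,
    Done,
    Error,
    Title,
}

impl SseEvent {
    fn wire_name(self) -> &'static str {
        match self {
            SseEvent::Token => "token",
            SseEvent::ToolUse => "tool_use",
            SseEvent::ToolResult => "tool_result",
            SseEvent::ApprovalRequest => "approval_request",
            SseEvent::Done => "done",
            SseEvent::Error => "error",
            SseEvent::Title => "title",
        }
    }
}

fn main() {
    assert_eq!(SseEvent::ApprovalRequest.wire_name(), "approval_request");
    assert_eq!(SseEvent::Done.wire_name(), "done");
}
```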
8. Auth Flow: Dual JWT Path
Adults authenticate with Supabase-issued JWTs (GoTrue); child identities use PIN auth and receive a Morphee-issued JWT. CompositeAuth accepts both token types (see Phase C above).
9. WASM Extension Ecosystem
Location: crates/morphee-server/src/handlers/extensions.rs, frontend/src-tauri/src/extensions/
Third-party extensions via WebAssembly. Same .wasm binary runs on morphee-server (wasmer) AND Tauri frontend (wasmer). Extensions implement the same describe() / execute() contract as built-in integrations.
- WasmRuntime: executes .wasm extensions with host functions and SSRF protection
- JSRuntime: executes canvas/UI components in the browser/Tauri webview
Distribution: OCI registry at rg.morphee.ai
Security: 10 granular install-time permissions, code signing (Ed25519), resource limits, audit logging
OpenMorph: Extensions stored in .morph/extensions/*.wasm -- portable with Space
10. Interactive UI & Haptic Feedback
Editable AI components with haptic feedback. Desktop uses macOS NSHapticFeedbackManager / Windows Haptics API / Linux evdev. Mobile uses iOS UIImpactFeedbackGenerator / Android Vibrator. Editable components: card, list, table, calendar, kanban. Flow: AI renders, user edits inline, haptic pulse, save, optionally triggers new AI turn.
Data Flow
Chat Message
RAG context injection happens before each LLM call (search memory, append to system prompt). Auto-summarization triggers after 10+ messages.
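The injection step can be sketched as a pure function. The prompt formatting below is an assumption, not the actual template:

```rust
// Append retrieved memories to the system prompt before each LLM call.
fn inject_context(system_prompt: &str, memories: &[&str]) -> String {
    if memories.is_empty() {
        return system_prompt.to_string(); // nothing relevant found: prompt unchanged
    }
    let mut out = String::from(system_prompt);
    out.push_str("\n\nRelevant memories:\n");
    for m in memories {
        out.push_str("- ");
        out.push_str(m);
        out.push('\n');
    }
    out
}

fn main() {
    let p = inject_context("You are Morphee.", &["Lucas is allergic to peanuts"]);
    assert!(p.contains("Relevant memories:"));
    assert!(p.ends_with("- Lucas is allergic to peanuts\n"));
    assert_eq!(inject_context("You are Morphee.", &[]), "You are Morphee.");
}
```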
Spaces (automatic routing)
User: "Remind me to buy milk"
-> Agent: No Space specified -> personal Space
-> Memory: Store in personal Space
User: "In meal planning, add chicken to the shopping list"
-> Agent: Detects "meal planning" -> routes to that Space
-> Memory: Store in "Meal Planning" Space
User: "Create a space for the book club"
-> Agent: Creates new Space, invites group members
-> Memory: Initialize Space memory
Infrastructure
Development
Docker Compose:
- morphee-server (Rust/axum, port 3000)
- postgres (PostgreSQL + pgvector, port 54322)
- redis (Redis, port 6379)
- supabase-auth (GoTrue, port 9999)
Frontend runs locally:
- npm run dev (Vite, port 5173, proxy /v1/ -> port 3000)
- npm run tauri dev (Tauri desktop)
Production
- Server: morphee-server (Rust/axum) behind Traefik reverse proxy
- Frontend: Tauri desktop app (macOS, Windows, Linux) + Tauri mobile (iOS, Android)
- Tauri Rust: embedded LanceDB, Git, ONNX/GGUF models (local on user machine)
- Database: Managed PostgreSQL with pgvector extension
- Redis: Managed Redis (event distribution, rate limiting)
- Sync: Local Tauri repos push/pull to morphee-server for backup
Design Decisions
| Decision | Rationale |
|---|---|
| Everything is an Integration | Unified abstraction. LLM, Memory, Frontend, Gmail -- same contract. Now implemented as morphee-core traits + WASM extensions |
| Unified Rust Engine | Same Pipeline runs on server, desktop, mobile, CLI. Python retired. One language, one codebase, one test suite |
| Canvas-first UI | Spatial, persistent components -- chat is a collapsible drawer |
| Group/Space naming | Universal: families, classrooms, teams. Not corporate jargon |
| Personal Space | Users do not need to create a Space to start using Morphee |
| pgvector (server) + LanceDB (desktop) | Server uses PostgreSQL-native vectors for multi-tenant. Desktop uses embedded LanceDB for offline |
| Git-backed memory | Human-readable, versioned, auditable. 1 repo per space. libgit2 via git2 crate |
| GGUF via candle | Local LLM inference. Pure Rust, no C++ build chain. Hugging Face maintained |
| WASM extensions | Portable, sandboxed, same binary runs on server and desktop. Replaces all Python integrations |
| Offline-first | App works without internet. Server is a sync hub, not a hard dependency |
| VaultProvider for secrets | Pluggable vault backends (env, OS Keychain, 1Password, mobile keystore). Secrets never in DB |
| Knowledge as Product | Compiled expertise (WASM bundles) can be shared, signed, and sold via marketplace |
Related Documentation
- ROADMAP.md -- Development roadmap and vision
- features/unified-rust-engine.md -- V2.0 Unified Rust Engine design
- features/fractal-brain.md -- Fractal Brain + Digital Organism architecture
- features/digital-brain-vision.md -- Digital Brain Vision: nature-inspired neuron growth and maturation
- decisions/ADR-011.md -- ADR for V2.0 architecture decision
- interfaces.md -- Integration/Interface system guide
- api.md -- API reference
- status.md -- Implementation status
- testing.md -- Testing guide
- deployment.md -- Deployment guide
Last Updated: March 1, 2026