
V2.0 — Unified Rust Engine

Created: February 27, 2026
Status: Phase A ✅ + Phase B ✅ + Phase C ✅ + Phase D ✅ + Phase E ✅ — all backend phases complete. Frontend URL cutover done (C6).
ADR: ADR-011 (supersedes ADR-006)
Depends on: morphee-core Phase 3 (DONE), WASM extensions V1.2 (DONE)

Core idea: morphee-core IS the product. The Python backend is replaced by morphee-server (axum + tonic), so the same Pipeline runs on desktop, mobile, server, and CLI. The server's unique value is shared intelligence — pooled experiences, compiled WASM solvers, and federated knowledge.


Why Now

  • morphee-core exists — 15+ traits, Pipeline, EventBus, FeedbackLoop, 351 tests
  • ~80% overlap with Python backend — maintaining both is tech debt
  • No users yet — clean migration window before production traffic
  • Knowledge sharing needs unified code — server must run the same Pipeline as local

Architecture Overview

Before (V1.x — Dual Backend)

Frontend ──HTTP──→ Python Backend (FastAPI, 18 integrations, pgvector, Redis)
    │
    └───IPC───→ Tauri Rust Backend (morphee-core, local compute)

Two brains, two codebases, knowledge can't flow between them.

After (V2.0 — Unified Engine)

                ┌─────────────────────────────────┐
                │          morphee-core           │
                │  Pipeline · EventBus · Feedback │
                │ 15 traits · all implementations │
                └────┬────────┬─────────┬────┬────┘
                     │        │         │    │
       ┌─────────────┘        │         │    └──────────────┐
       ▼                      ▼         ▼                   ▼
┌────────────┐      ┌───────────────┐ ┌───────────┐ ┌────────────────┐
│ Tauri App  │      │morphee-server │ │morphee-cli│ │   cargo add    │
│ desktop &  │      │ axum + tonic  │ │   clap    │ │  morphee-core  │
│  mobile    │      │ multi-tenant  │ │ one-shot  │ │ (any Rust app) │
│  IPC cmds  │      │ REST + gRPC   │ │  Kaggle   │ └────────────────┘
└────────────┘      └───────────────┘ └───────────┘

One brain, one codebase, knowledge flows everywhere.


What The Server Uniquely Adds

Local morphee-core handles everything a single user needs. The server adds what only a server can:

| Capability | Why Server-Only | Value |
|---|---|---|
| Shared ExperienceStore | Aggregate learnings across thousands of users | Hive mind — everyone benefits from everyone's usage |
| Knowledge Signing (Auth trait) | B2B trust chains for compiled knowledge | Corporations verify WASM solver provenance |
| OAuth Callbacks | Google/GitHub/Apple OAuth needs reachable URLs | Users sign in with existing accounts |
| Push Notifications | APNs/FCM require server-side credentials | Real-time alerts on mobile |
| Multi-Tenant Sessions | 1000s of concurrent Pipelines with isolation | Web users, API consumers |
| Federated Git Sync | Central remote for .morph/ repos | Cross-device, cross-user sync |
| Heavy Inference | GPU cluster (H100) for complex reasoning | Queries local hardware can't handle |
| Specialist Neurons | Domain-focused instances with concentrated expertise | Better accuracy per domain |

New Traits

Auth Trait

Authentication AND knowledge signing in one contract. Critical for B2B where corporations need to verify the provenance of compiled knowledge.

use std::collections::HashMap;
use std::time::Duration;

use async_trait::async_trait;
use serde::{Deserialize, Serialize};

/// Identity — who is this user/service?
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Identity {
    pub id: String,
    pub provider: AuthProvider,   // Supabase, Morphee, ServiceAccount
    pub roles: Vec<String>,       // admin, user, child, service
    pub group_id: Option<String>,
    pub metadata: HashMap<String, serde_json::Value>,
}

/// Signature on compiled knowledge
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct KnowledgeSignature {
    pub signer_id: String,
    pub algorithm: String,        // Ed25519, RSA-PSS
    pub signature: Vec<u8>,
    pub signed_at: u64,
    pub chain: Vec<String>,       // Trust chain: [root_ca, intermediate, signer]
}

#[async_trait]
pub trait Auth: Send + Sync {
    /// Verify a token → returns identity
    async fn verify_token(&self, token: &str) -> Result<Identity, MorpheeError>;

    /// Issue a token (for child accounts, service accounts)
    async fn issue_token(&self, identity: &Identity, ttl: Duration)
        -> Result<String, MorpheeError>;

    /// Sign compiled knowledge (WASM solver, skill, pattern)
    async fn sign_knowledge(&self, content: &[u8], signer: &Identity)
        -> Result<KnowledgeSignature, MorpheeError>;

    /// Verify a signature on knowledge — checks trust chain
    async fn verify_knowledge(&self, content: &[u8], sig: &KnowledgeSignature)
        -> Result<bool, MorpheeError>;

    /// List trusted signers (for a group/organization)
    async fn trusted_signers(&self, group_id: &str)
        -> Result<Vec<Identity>, MorpheeError>;
}

Built-in implementations:

| Implementation | Backend | Use Case |
|---|---|---|
| SupabaseAuth | GoTrue JWT verification | Default — users authenticate via Supabase |
| MorpheeAuth | Ed25519 key pair | Child accounts, service-to-service, knowledge signing |
| CompositeAuth | Wraps multiple Auth impls | Peek JWT iss → route to correct verifier |

Knowledge signing flow:

Teacher compiles "Math Tutoring" → WASM bundle
→ Auth.sign_knowledge(wasm_bytes, teacher_identity) → KnowledgeSignature
→ Published to marketplace with signature

Corporate admin installs "Math Tutoring"
→ Auth.verify_knowledge(wasm_bytes, signature)
→ Checks trust chain: Morphee root CA → teacher verified → signature valid
→ Installed ✓
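The chain check inside verify_knowledge can be sketched as follows. This is a simplified illustration, not the server code: the function name, the representation of the chain as signer IDs, and the trusted-root set are all hypothetical, and the actual cryptographic verification of each link (Ed25519 over the content hash) is elided.

```rust
use std::collections::HashSet;

/// Hypothetical sketch: a chain is accepted when it is anchored at a trusted
/// root and terminates at the signer named on the signature. Real per-link
/// signature verification is elided.
fn chain_is_trusted(chain: &[String], signer_id: &str, trusted_roots: &HashSet<String>) -> bool {
    match chain.split_first() {
        // The first entry must be a known root CA...
        Some((root, _)) if trusted_roots.contains(root) => {
            // ...and the last entry must match the signature's signer_id.
            chain.last().map(String::as_str) == Some(signer_id)
        }
        _ => false,
    }
}

fn main() {
    let roots: HashSet<String> = ["morphee-root-ca".to_string()].into();
    let chain = vec![
        "morphee-root-ca".to_string(),
        "teacher-verification".to_string(),
        "teacher-42".to_string(),
    ];
    assert!(chain_is_trusted(&chain, "teacher-42", &roots));
    assert!(!chain_is_trusted(&chain, "someone-else", &roots));
    println!("trust chain accepted");
}
```

A production implementation would additionally verify each link's signature over the next entry, check expiry, and consult the trust_roots table per group.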

PostgresVectorDB

Server-grade VectorDB using pgvector with HNSW indexing:

pub struct PostgresVectorDB {
    pool: sqlx::PgPool,
    table: String,
    dimensions: usize,
}

#[async_trait]
impl VectorDB for PostgresVectorDB {
    async fn insert(&self, id: &str, embedding: &[f32], content: &str,
                    metadata: serde_json::Value) -> Result<(), MorpheeError> {
        // Postgres cannot bind an identifier as a parameter — interpolate the
        // (validated) table name, bind everything else.
        let sql = format!(
            "INSERT INTO {} (id, embedding, content, metadata) VALUES ($1, $2, $3, $4)",
            self.table
        );
        sqlx::query(&sql)
            .bind(id).bind(embedding).bind(content).bind(metadata)
            .execute(&self.pool).await?;
        Ok(())
    }

    async fn search(&self, embedding: &[f32], top_k: usize,
                    filter: Option<&str>) -> Result<Vec<SearchResult>, MorpheeError> {
        // pgvector cosine distance with HNSW index
        // ...
    }

    async fn delete(&self, id: &str) -> Result<(), MorpheeError> { /* ... */ }
}
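For intuition, the quantity pgvector's vector_cosine_ops orders by is cosine distance, 1 − cos(a, b). The plain-Rust illustration below is not the server code, just the math the HNSW index approximates:

```rust
/// Cosine distance as pgvector defines it: 1 - cos(a, b).
/// Smaller means closer; search returns the top_k rows with the
/// smallest distance to the query embedding.
fn cosine_distance(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len(), "embeddings must share dimensions");
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    1.0 - dot / (na * nb)
}

fn main() {
    // Parallel vectors are at distance 0; orthogonal vectors at distance 1.
    assert!(cosine_distance(&[1.0, 0.0], &[2.0, 0.0]).abs() < 1e-6);
    assert!((cosine_distance(&[1.0, 0.0], &[0.0, 1.0]) - 1.0).abs() < 1e-6);
    println!("ok");
}
```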

Knowledge Architecture

A. Knowledge As Product

The server's ExperienceStore is the aggregation point. Every user's Pipeline contributes, everyone benefits.

     Local User A                  Local User B
      (cooking)                     (homework)
          │                             │
          ▼                             ▼
  ExperienceStore A             ExperienceStore B
  - unit_conversion: 95%        - quadratic: 92%
  - recipe_scale: 88%           - fractions: 97%
          │                             │
          └──────────────┬──────────────┘
                         ▼
                ┌────────────────┐
                │     Server     │
                │     Shared     │
                │   Experience   │
                │     Store      │
                └───────┬────────┘
                        │
              ┌─────────┼─────────┐
              ▼         ▼         ▼
     unit_conversion quadratic recipe_scale
         .wasm        .wasm       .wasm
      (compiled,   (compiled,  (compiled,
        shared)      shared)     shared)

Business model:

  • Free tier: Local-only Pipeline. Learn from your own usage. Good.
  • Paid tier: Connected to server. Inherit compiled knowledge from all users. Great from day one.
  • Marketplace (V2.1): Sell your compiled expertise. Teacher's "Math Tutoring" bundle = real money.

B. Federated Knowledge Network

morphee-core instances form a peer network using git as the transport layer:

┌──────────────┐    git push/pull    ┌──────────────┐
│   Family A   │◄───────────────────►│    Server    │
│  (desktop)   │                     │   (central   │
└──────────────┘                     │    remote)   │
                                     │              │
┌──────────────┐    git push/pull    │   Curates,   │
│   Family B   │◄───────────────────►│    ranks,    │
│   (mobile)   │                     │ distributes  │
└──────────────┘                     └──────┬───────┘
                                            │
                                     ┌──────▼───────┐
                                     │   Registry   │
                                     │    (OCI +    │
                                     │     git)     │
                                     └──────────────┘
  • Compiled patterns travel as git objects inside .morph/
  • WASM solvers distributed via OCI registry (rg.morphee.ai)
  • Server curates: ranks patterns by usage + accuracy, prunes low-quality
  • Users opt-in to contribute anonymized experiences
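The curation step ("ranks patterns by usage + accuracy, prunes low-quality") could be sketched like this. The Pattern type, scoring formula, and weights are illustrative assumptions, not the server's actual policy:

```rust
/// Hypothetical curation pass: score each pattern by accuracy weighted by
/// log-usage, prune anything below a floor, and rank the rest descending.
#[derive(Debug)]
struct Pattern { name: String, accuracy: f32, usage: u64 }

fn curate(mut patterns: Vec<Pattern>, floor: f32) -> Vec<Pattern> {
    // Log-damped usage so a heavily used mediocre pattern cannot
    // outrank an accurate one forever. Weights are made up.
    let score = |p: &Pattern| p.accuracy * (1.0 + (p.usage as f32).ln_1p() / 10.0);
    patterns.retain(|p| score(p) >= floor);            // prune low-quality
    patterns.sort_by(|a, b| score(b).total_cmp(&score(a))); // rank best-first
    patterns
}

fn main() {
    let ranked = curate(vec![
        Pattern { name: "unit_conversion".into(), accuracy: 0.95, usage: 4_000 },
        Pattern { name: "quadratic".into(),       accuracy: 0.92, usage: 9_000 },
        Pattern { name: "junk".into(),            accuracy: 0.20, usage: 3 },
    ], 0.5);
    // "junk" scores below the floor and is pruned; two patterns survive.
    assert_eq!(ranked.len(), 2);
    println!("top pattern: {}", ranked[0].name);
}
```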

C. Specialist Neurons

Domain-focused morphee-server instances that concentrate expertise:

             ┌──────────────────┐
             │   VectorRouter   │
             │  (client-side)   │
             └────────┬─────────┘
                      │ classify domain
         ┌────────────┼────────────┐
         ▼            ▼            ▼
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ math.morphee │ │ cook.morphee │ │ edu.morphee  │
│     .ai      │ │     .ai      │ │     .ai      │
│              │ │              │ │              │
│   500 WASM   │ │   200 WASM   │ │   300 WASM   │
│   solvers    │ │   solvers    │ │   solvers    │
│ 50K problems │ │ 30K recipes  │ │ 40K lessons  │
│    solved    │ │    solved    │ │    solved    │
└──────────────┘ └──────────────┘ └──────────────┘
  • Each neuron runs morphee-core with domain-specialized ExperienceStore
  • Local clients ask VectorRouter: "What domain is this?" → route to specialist
  • Specialists improve faster because traffic is concentrated
  • New neurons spin up when a domain reaches critical mass
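The client-side routing decision above can be sketched as nearest-centroid classification. Everything here is a simplification under assumed types: each neuron is represented as (domain, centroid embedding, endpoint), and the router picks the endpoint whose centroid is most cosine-similar to the query embedding:

```rust
/// Hypothetical VectorRouter sketch: route a query embedding to the
/// specialist neuron whose domain centroid it is closest to.
fn route<'a>(query: &[f32], neurons: &'a [(&'a str, Vec<f32>, &'a str)]) -> Option<&'a str> {
    // neurons: (domain, centroid embedding, gRPC endpoint)
    fn cos(a: &[f32], b: &[f32]) -> f32 {
        let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
        let norm = |v: &[f32]| v.iter().map(|x| x * x).sum::<f32>().sqrt();
        dot / (norm(a) * norm(b))
    }
    neurons
        .iter()
        .max_by(|a, b| cos(query, &a.1).total_cmp(&cos(query, &b.1)))
        .map(|(_, _, endpoint)| *endpoint)
}

fn main() {
    // Toy 2-D "embeddings"; real centroids would be 384-D (see schema below).
    let neurons = vec![
        ("math",    vec![1.0, 0.0], "math.morphee.ai:50051"),
        ("cooking", vec![0.0, 1.0], "cook.morphee.ai:50051"),
    ];
    // A query near the math centroid routes to the math neuron.
    assert_eq!(route(&[0.9, 0.1], &neurons), Some("math.morphee.ai:50051"));
    println!("routed");
}
```

In practice the router would also fall back to a general endpoint when no similarity clears a confidence threshold.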

Python Backend → Rust Migration Map

What Moves Where

| Python Module | Lines | Rust Destination | Status |
|---|---|---|---|
| chat/orchestrator.py | ~800 | Pipeline.process() | Already done |
| chat/llm.py | ~600 | Inferencer trait | Already done |
| chat/vector_router.py | ~200 | Router trait | Already done |
| memory/ | ~1,200 | VectorDB + KnowledgeStore | Already done |
| skills/ | ~800 | Compiler + Strategy | Already done |
| interfaces/integrations/ (18) | ~3,000 | WASM extensions | Phase D |
| auth/ | ~400 | Auth trait + axum middleware | Phase A |
| api/ (18 routers) | ~2,500 | axum routes | Phase B-C |
| services/ (user, group, space) | ~1,500 | axum + sqlx handlers | Phase C |
| websocket/ | ~300 | EventBus → axum WebSocket | Phase B |
| utils/ | ~500 | morphee-core utilities | Phase B |

Total Python to retire: ~11,800 lines
Already covered by morphee-core: ~3,600 lines (~30%)
New Rust to write: ~5,000 lines (axum routes, sqlx queries, WASM extensions)
Net reduction: ~3,200 fewer lines of code

Migration Phases

Phase A: morphee-server Foundation ✅ (Feb 27, 2026)

Server skeleton built. Python backend stays alive as primary.

  • morphee-server crate with axum + tonic
  • Auth trait + SupabaseAuth + MorpheeAuth + CompositeAuth implementations
  • Knowledge signing (Ed25519 key generation, sign, verify, trust chains)
  • PostgresVectorDB (pgvector + HNSW, sqlx)
  • Session manager (multi-tenant Pipeline pool)
  • Health endpoint, CORS, rate limiting middleware
  • SSE event streaming (Pipeline EventBus → HTTP)
  • WebSocket support (real-time events)
  • Tests: auth, vector DB, session lifecycle
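The session manager listed above can be sketched as a minimal multi-tenant pool. The types and names here are hypothetical: the real manager would hold a Pipeline per session, and cleanup would run on a background task rather than on demand.

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Hypothetical session entry — the real one would own a Pipeline.
struct Session { user_id: String, last_seen: Instant }

/// One entry per session id, evicted after an idle timeout. Isolation is
/// by construction: a tenant can only reach the entry under its own id.
struct SessionPool { sessions: HashMap<String, Session>, idle_timeout: Duration }

impl SessionPool {
    fn new(idle_timeout: Duration) -> Self {
        Self { sessions: HashMap::new(), idle_timeout }
    }

    /// Fetch-or-create a session and refresh its idle clock.
    fn touch(&mut self, session_id: &str, user_id: &str) {
        self.sessions
            .entry(session_id.to_string())
            .and_modify(|s| s.last_seen = Instant::now())
            .or_insert(Session { user_id: user_id.to_string(), last_seen: Instant::now() });
    }

    /// Drop sessions that have been idle longer than the timeout.
    fn evict_idle(&mut self) {
        let cutoff = self.idle_timeout;
        self.sessions.retain(|_, s| s.last_seen.elapsed() < cutoff);
    }
}

fn main() {
    let mut pool = SessionPool::new(Duration::from_secs(1800));
    pool.touch("sess-1", "user-a");
    pool.touch("sess-2", "user-b");
    pool.evict_idle(); // nothing has been idle for 30 minutes yet
    assert_eq!(pool.sessions.len(), 2);
    println!("pool holds {} sessions", pool.sessions.len());
}
```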

Phase B: AI Endpoints Migration ✅ (Feb 27, 2026)

All AI-related endpoints migrated from Python to morphee-server.

  • POST /v1/chat → morphee-server (Pipeline.process + SSE streaming)
  • POST /v1/embed → morphee-server (Embedder trait)
  • GET /v1/memory/search → morphee-server (VectorDB trait)
  • POST /v1/memory → morphee-server (VectorDB + KnowledgeStore)
  • GET/POST /v1/skills, GET/PUT/DELETE /v1/skills/{id} → morphee-server (CRUD + persistence)
  • GET/POST /v1/conversations, GET/PATCH/DELETE /v1/conversations/{id}, GET /v1/conversations/{id}/messages → morphee-server
  • GET /v1/search → unified search across conversations, memory, skills
  • ToolCallingInferencer + Orchestrator + ToolExecutor traits in morphee-core
  • AnthropicInferencer — cloud LLM behind cloud-llm feature gate
  • AgentOrchestrator — agentic tool-calling loop with approval channels
  • Integration tests: 40 morphee-server tests (8 unit + 32 integration)
  • Python backend proxies AI endpoints to morphee-server
  • Frontend switches to morphee-server for AI calls

Phase C: CRUD + Auth Migration ✅ (Feb 27, 2026)

All application-level data management migrated from Python to morphee-server. 171 routes, 19 handler modules, 19 persistence modules.

  • User CRUD (sqlx, replaces asyncpg)
  • Group CRUD + membership
  • Space CRUD + hierarchy (parent_space_id)
  • Task CRUD (replaces Python task service)
  • Canvas CRUD (replaces Python canvas endpoints)
  • Notification system (replaces Redis pub/sub with EventBus)
  • OAuth callback endpoints (Google, Apple, Microsoft)
  • SSO flow (GoTrue integration via axum)
  • Data export (GDPR Article 15/20)
  • Settings service
  • Database migrations (sqlx migrate)
  • Frontend URL cutover (/api/* → /v1/*, 12 new endpoints, ~70 path updates)

Phase D: WASM Extension Migration ✅ (Feb 27, 2026)

Real wasmer WASM execution with host functions, SSRF protection, binary storage, audit logging. All 6 extensions build for wasm32-wasip1.

  • WASM runtime (wasmer) with host function bridge
  • SSRF protection for HTTP host functions
  • Binary storage in PostgreSQL
  • Extension upload, install, execute, enable/disable
  • Audit logging for all extension executions
  • 5 integration extensions + echo proof-of-concept

Phase E: Knowledge Network ✅ (Feb 27, 2026)

Federated knowledge and specialist neuron endpoints implemented.

  • Shared ExperienceStore (PostgreSQL-backed, multi-tenant aggregation)
  • Knowledge signing integration (Auth trait → sign on compile, verify on install)
  • Federated git sync endpoints (.morph/ repo management)
  • Specialist neuron routing (domain classification → endpoint selection)
  • Knowledge quality scoring (accuracy, usage count, freshness)
  • Trust root management

morphee-server Endpoint Design

All endpoints use the /v1/ prefix. See docs/api.md for the full 171-route reference.

REST (axum) — Key Routes

# Auth
POST /v1/auth/signup → Sign up (GoTrue)
POST /v1/auth/signin → Sign in → JWT token
POST /v1/auth/refresh → Refreshed JWT
POST /v1/auth/sso/{provider} → SSO redirect URL
POST /v1/auth/sso/callback → OAuth callback handler
POST /v1/identity/auth/child → Issue child JWT (MorpheeAuth)

# Chat / Processing
POST /v1/chat → Chat with tool calling (SSE)
POST /v1/chat/component-event → Component event dispatch
POST /v1/chat/approve/{id} → Approve pending action
POST /v1/process → Process query (Pipeline SSE)

# Conversations
GET /v1/conversations → List conversations
POST /v1/conversations → Create conversation
POST /v1/conversations/sync-local → Sync local conversation
GET /v1/conversations/{id}/messages → List messages
POST /v1/conversations/{id}/messages/{msg_id}/pin → Pin message
POST /v1/conversations/{id}/messages/{msg_id}/react → Add reaction

# Memory
POST /v1/embed → Embed text
POST /v1/memory → Insert memory
GET /v1/memory/search → Search memory (vector similarity)

# Knowledge
POST /v1/knowledge/sign → Sign compiled knowledge
POST /v1/knowledge/verify → Verify knowledge signature

# CRUD (application data)
GET /v1/auth/me → Current user profile
POST /v1/groups → Create group
GET /v1/spaces → List spaces in group
GET /v1/tasks → List tasks
GET /v1/canvas → List canvases

# Extensions
GET /v1/extensions → Installed extensions
POST /v1/extensions/upload → Upload WASM binary
POST /v1/extensions/{id}/execute → Execute extension action

# Interfaces
GET /v1/interfaces/{name}/config → Get config by integration type
PUT /v1/interfaces/{name}/config → Upsert config by integration type

# Specialist Neurons
GET /v1/neurons → List available specialist neurons
POST /v1/neurons/{domain}/process → Route to specialist neuron

# OAuth
GET /v1/oauth/authorize → OAuth authorization URL
GET /v1/oauth/{provider}/status → Connection status

# Health
GET /health → Server health

gRPC (tonic)

service MorpheeService {
  // Processing — server-streaming events
  rpc Process(ProcessRequest) returns (stream Event);
  rpc ChatStream(ChatRequest) returns (stream ChatEvent);

  // Embeddings
  rpc Embed(EmbedRequest) returns (EmbedResponse);
  rpc EmbedBatch(EmbedBatchRequest) returns (EmbedBatchResponse);

  // Memory
  rpc MemoryInsert(MemoryRecord) returns (Empty);
  rpc MemorySearch(SearchRequest) returns (SearchResponse);

  // Knowledge
  rpc SignKnowledge(SignRequest) returns (SignatureResponse);
  rpc VerifyKnowledge(VerifyRequest) returns (VerifyResponse);
  rpc ListCompiled(Empty) returns (CompiledList);

  // Events — subscribe to all Pipeline events
  rpc Subscribe(SubscribeRequest) returns (stream Event);

  // Neurons
  rpc RouteToNeuron(NeuronRequest) returns (stream Event);
}

Frontend Changes

Extension Installer Improvements

The frontend needs better UX for discovering, installing, and managing WASM extensions — especially since all 18 integrations become extensions.

  • Extension Store page — browse available extensions with search, categories, ratings
  • One-click install — download WASM, verify signature, register with Pipeline
  • Permission review — clear display of what each extension can access
  • Extension settings — per-extension configuration (OAuth tokens via vault)
  • Update notifications — alert when new extension versions are available
  • Dependency resolution — some extensions depend on others (e.g., Gmail needs Google OAuth)

API Client Migration

Frontend HTTP client switches from Python backend URLs to morphee-server:

// Before: dual backend
const chatResponse = await fetch('/api/chat', { ... });   // → Python
const embedding = await invoke('embed_text', { text });   // → Tauri Rust

// After: unified (/v1/ routes, per the C6 URL cutover)
const chatResponse = await fetch('/v1/chat', { ... });    // → morphee-server
const embedding = await fetch('/v1/embed', { ... });      // → morphee-server
// OR Tauri IPC (same morphee-core, local)
const embedding = await invoke('embed_text', { text });   // → morphee-core

The frontend detects whether it's running in Tauri (local Pipeline) or browser (morphee-server) and routes accordingly. The response format is identical because both run morphee-core.


Database Schema

morphee-server uses PostgreSQL via sqlx (compile-time checked queries).

The existing schema stays — sqlx replaces asyncpg as the driver. New tables:

-- Knowledge signatures for trust chains
CREATE TABLE knowledge_signatures (
    id           UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    content_hash TEXT NOT NULL,                -- SHA-256 of signed content
    signer_id    UUID NOT NULL REFERENCES users(id),
    algorithm    TEXT NOT NULL DEFAULT 'Ed25519',
    signature    BYTEA NOT NULL,
    trust_chain  JSONB NOT NULL DEFAULT '[]',
    signed_at    TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    UNIQUE(content_hash, signer_id)
);

-- Shared experience store (aggregated, anonymized)
CREATE TABLE shared_experiences (
    id                UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    input_hash        TEXT NOT NULL,
    input_embedding   vector(384),
    domain            TEXT NOT NULL,
    route_taken       TEXT NOT NULL,
    strategy_used     TEXT NOT NULL,
    signal            TEXT NOT NULL CHECK (signal IN ('reject', 'accept', 'approve')),
    score             JSONB,
    duration_ms       BIGINT NOT NULL,
    compilation_level INT NOT NULL DEFAULT 0,
    contributor_count INT NOT NULL DEFAULT 1,
    created_at        TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_shared_exp_domain ON shared_experiences(domain);
CREATE INDEX idx_shared_exp_embedding ON shared_experiences
    USING hnsw (input_embedding vector_cosine_ops);

-- Specialist neuron registry
CREATE TABLE specialist_neurons (
    id                UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    domain            TEXT NOT NULL UNIQUE,
    endpoint          TEXT NOT NULL,           -- e.g., "math.morphee.ai:50051"
    total_experiences BIGINT NOT NULL DEFAULT 0,
    compiled_solvers  INT NOT NULL DEFAULT 0,
    accuracy          REAL NOT NULL DEFAULT 0.0,
    status            TEXT NOT NULL DEFAULT 'active',
    created_at        TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Trust chain roots (for B2B)
CREATE TABLE trust_roots (
    id         UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    group_id   UUID NOT NULL REFERENCES groups(id) ON DELETE CASCADE,
    public_key BYTEA NOT NULL,
    label      TEXT NOT NULL,                  -- e.g., "TechCorp Root CA"
    expires_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

Testing Strategy

| Layer | What | Tool |
|---|---|---|
| morphee-core traits | Unit tests per implementation | cargo test (351 existing + new) |
| Auth trait | JWT verify, knowledge sign/verify, trust chains | cargo test --features server |
| PostgresVectorDB | Insert, search, delete, HNSW performance | cargo test + test PostgreSQL |
| morphee-server endpoints | HTTP request/response, SSE streaming | cargo test + reqwest integration |
| gRPC service | Protobuf round-trip, streaming | cargo test + tonic test client |
| Session manager | Lifecycle, concurrent sessions, cleanup | cargo test |
| Frontend integration | Full chat flow through morphee-server | Playwright E2E |
| WASM extensions | Each migrated integration has test suite | cargo test in extension crate |
| Knowledge network | Sign → publish → distribute → verify | Integration test |
Target: 500+ tests for morphee-server (versus the Python backend's 2,014, adjusted for removed overlap)


Success Criteria

  1. Python backend fully retired — zero Python processes in production
  2. Same API contract — frontend works without changes (or minimal)
  3. All 18 integrations as WASM — no hardcoded Python integrations remain
  4. Knowledge signing works — B2B can verify WASM solver provenance
  5. Shared ExperienceStore live — server aggregates learnings, local users benefit
  6. Specialist neurons running — at least 3 domains (math, cooking, education)
  7. Performance: 10x concurrent capacity — Rust handles more users per machine
  8. Test coverage: 80%+ — matching the Python backend's standard

Timeline

| Phase | Duration | Deliverable |
|---|---|---|
| A: Server Foundation | Week 1-2 | Auth trait, PostgresVectorDB, session manager, health |
| B: AI Endpoints | Week 3-4 | Chat, embed, memory, skills — Python proxies to Rust |
| C: CRUD + Auth | Week 5-7 | Users, groups, spaces, tasks, OAuth — Python retired |
| D: WASM Extensions | Week 8-12 | All 18 integrations as WASM, extension store UX |
| E: Knowledge Network | Week 10-14 | Shared experience, federated sync, specialist neurons |
| Total | ~14 weeks | Full V2.0 |

Phase D and E overlap (weeks 10-12 are parallel).


Open Questions

  1. PostgreSQL OR keep LanceDB for server vectors? — pgvector is proven at scale, but LanceDB avoids a dependency. Decision: use both. PostgresVectorDB for multi-tenant shared data, LanceDB for per-session ephemeral vectors.

  2. How to handle Python-specific libraries? — Some integrations use Python-only SDKs (e.g., google-api-python-client). In WASM, these become HTTP calls to the service APIs directly. More work, but more portable.

  3. gRPC or REST-only? — gRPC adds complexity but enables efficient streaming and typed clients. Decision: both. REST for browser clients, gRPC for service-to-service and CLI.

  4. When to sunset Python backend? — After Phase C (week 7). Keep it running read-only for 2 weeks as safety net, then remove.

  5. Specialist neuron auto-provisioning? — Manual at first (configure domain → endpoint mapping). Auto-scaling in V2.1 based on traffic patterns.


References