Conversation Context Control Implementation
ARCHIVE — This document is historical reference only. It may contain outdated information. See docs/status.md for current project state.
Item: H-FEAT-009 — No conversation context control
Priority: 🟠 High
Effort: 🏷️ M (Medium)
Personas: Thomas, Seb, Julie (professional/power users)
Status: Phase A Complete (Backend + API), Phase B Pending (Frontend UI)
Overview
Implements conversation context control with two features:
- Message Pinning: Pin up to 5 messages per conversation that are always included in AI context
- Context Window Control: Adjust how many messages are sent to the LLM (short/medium/long/full)
Phase A: Backend & API ✅ COMPLETE
Database Migration
File: supabase/migrations/015_add_message_pinning_and_context_window.sql
- Added `pinned` boolean column to `messages` table (default `FALSE`)
- Added `context_window_size` column to `conversations` table (VARCHAR, default `'medium'`)
- Added index on `messages(conversation_id, pinned)` for performance
- Added check constraint on `context_window_size` values
Backend Models
File: backend/chat/models.py
- Added `pinned: bool = False` to `Message` model
- Added `context_window_size: str = "medium"` to `Conversation` model
- Added `context_window_size` to `ConversationUpdate` with pattern validation
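The pattern validation on `ConversationUpdate` can be illustrated with a stdlib-only sketch. The real model presumably uses the framework's field validation; the helper name below is hypothetical, but the accepted values mirror the database check constraint:

```python
import re

# Allowed values, mirroring the database check constraint
WINDOW_SIZE_PATTERN = re.compile(r"^(short|medium|long|full)$")

def validate_context_window_size(value: str) -> str:
    """Reject anything outside the four allowed sizes (hypothetical helper)."""
    if not WINDOW_SIZE_PATTERN.fullmatch(value):
        raise ValueError(f"invalid context_window_size: {value!r}")
    return value
```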
Backend Service
File: backend/chat/service.py
New methods added to ChatService:
```python
async def pin_message(message_id, conversation_id, group_id) -> Optional[Message]
async def unpin_message(message_id, conversation_id, group_id) -> Optional[Message]
async def get_pinned_messages(conversation_id, group_id) -> list[Message]
```
Constraints:
- Maximum 5 pinned messages per conversation
- Group-based security checks
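The pin-limit constraint can be sketched with an in-memory stand-in for the service (the real `ChatService` is database-backed and also enforces the group checks, which are elided here; class and storage details below are assumptions for illustration):

```python
import asyncio
from dataclasses import dataclass
from uuid import UUID, uuid4

MAX_PINNED = 5  # per-conversation pin limit from the spec

@dataclass
class Message:
    id: UUID
    conversation_id: UUID
    pinned: bool = False

class InMemoryChatService:
    """Illustrative stand-in; the real service is backed by the database."""

    def __init__(self):
        self.messages: dict[UUID, Message] = {}

    async def get_pinned_messages(self, conversation_id: UUID) -> list[Message]:
        return [m for m in self.messages.values()
                if m.conversation_id == conversation_id and m.pinned]

    async def pin_message(self, message_id: UUID, conversation_id: UUID):
        # Enforce the 5-pin cap before flipping the flag
        if len(await self.get_pinned_messages(conversation_id)) >= MAX_PINNED:
            return None
        msg = self.messages.get(message_id)
        if msg is None or msg.conversation_id != conversation_id:
            return None  # ownership/group checks elided in this sketch
        msg.pinned = True
        return msg

async def demo():
    svc = InMemoryChatService()
    conv = uuid4()
    ids = [uuid4() for _ in range(6)]
    for i in ids:
        svc.messages[i] = Message(id=i, conversation_id=conv)
    # Try to pin six messages; only the first five should succeed
    return [await svc.pin_message(i, conv) for i in ids]

results = asyncio.run(demo())
# first five pins succeed, the sixth is rejected with None
```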
Backend API Endpoints
File: backend/api/chat.py
New endpoints:
```
POST   /api/chat/conversations/{conversation_id}/messages/{message_id}/pin
DELETE /api/chat/conversations/{conversation_id}/messages/{message_id}/pin
GET    /api/chat/conversations/{conversation_id}/messages/pinned
```
`PATCH /api/chat/conversations/{conversation_id}` was updated to accept a `context_window_size` field.
Frontend Types
File: frontend/src/types/index.ts
- Added `pinned?: boolean` to `Message` interface
- Added `context_window_size?: 'short' | 'medium' | 'long' | 'full'` to `Conversation` interface
- Added `context_window_size` to `ConversationUpdate` interface
Frontend API Client
File: frontend/src/lib/api.ts
New methods in conversations object:
```typescript
pinMessage(conversationId: string, messageId: string): Promise<Message>
unpinMessage(conversationId: string, messageId: string): Promise<Message>
pinnedMessages(conversationId: string): Promise<Message[]>
```
Phase B: Frontend UI 🚧 PENDING
Components to Create
- `PinnedMessages.tsx` — Display pinned messages at top of conversation
  - Compact card layout with distinctive styling (border, background color)
  - Shows first 100 chars of content + "..." if truncated
  - Click to scroll to original message in conversation
  - Unpin button (X icon)
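The 100-character preview rule above is easy to pin down precisely. A sketch (Python for illustration; the real helper would live in the TSX component, and the function name is hypothetical):

```python
def preview(content: str, limit: int = 100) -> str:
    """Return the first `limit` characters, appending '...' only when truncated."""
    return content if len(content) <= limit else content[:limit] + "..."
```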
- `ConversationSettingsDialog.tsx` — Context window controls
  - Trigger: Gear icon in chat header (next to conversation title)
  - Radio group for context window size:
    - Short: ~10 messages (faster, cheaper)
    - Medium: ~25 messages (default, balanced)
    - Long: ~50 messages (comprehensive context)
    - Full: All messages (expensive, use sparingly)
  - Show estimated token count indicator
  - Save button (calls `conversations.update()`)
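For the token count indicator, a rough character-based heuristic is usually good enough at the UI level. A sketch — the 4-characters-per-token ratio is an assumption, not a measurement against any particular tokenizer:

```python
def estimate_tokens(messages: list[str]) -> int:
    """Rough estimate: ~4 characters per token, at least 1 token per message."""
    return sum(max(1, len(m) // 4) for m in messages)
```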
- `ChatBubble.tsx` updates — Pin button in message actions
  - Add pin/unpin button to message dropdown menu (next to edit/regenerate/retry)
  - Icon: Pin (filled when pinned, outline when not)
  - Visual indicator: pinned messages show small pin icon in corner
  - Disable pin button when 5 messages already pinned (show tooltip)
- `Chat.tsx` updates — Integrate components
  - Add `<PinnedMessages />` component above message list
  - Add gear icon button in header to open `ConversationSettingsDialog`
  - Fetch pinned messages on conversation load
  - Update local state when pin/unpin actions succeed
Translation Keys to Add
File: frontend/src/i18n/locales/en/common.json
```json
{
  "chat": {
    "contextSettings": "Context Settings",
    "contextWindowSize": "Context Window Size",
    "contextShort": "Short (~10 messages)",
    "contextMedium": "Medium (~25 messages)",
    "contextLong": "Long (~50 messages)",
    "contextFull": "Full (all messages)",
    "contextDescription": "Controls how many messages are sent to the AI",
    "pinnedMessages": "Pinned Messages",
    "pinMessage": "Pin message",
    "unpinMessage": "Unpin message",
    "pinLimit": "Maximum 5 pinned messages",
    "noPinnedMessages": "No pinned messages yet",
    "pinTooltip": "Pin this message to always include it in AI context",
    "unpinTooltip": "Unpin this message",
    "estimatedTokens": "Estimated tokens: {{count}}"
  }
}
```
Backend Integration 🚧 PENDING (Phase B.2)
Orchestrator Updates
File: backend/chat/orchestrator.py
TODO (Phase B.2):
Update context building logic:
```python
async def _build_context(self, conversation_id: UUID, group_id: UUID) -> list[Message]:
    # 1. Get pinned messages (always include)
    pinned = await self.chat_service.get_pinned_messages(conversation_id, group_id)

    # 2. Get conversation settings
    conversation = await self.chat_service.get_conversation(conversation_id, group_id)
    window_size = conversation.context_window_size

    # 3. Map window size to message limit
    limits = {"short": 10, "medium": 25, "long": 50, "full": 1000}
    limit = limits.get(window_size, 25)

    # 4. Fetch the most recent messages up to the limit
    recent = await self.chat_service.get_messages(
        conversation_id, group_id, limit=limit
    )

    # 5. Combine pinned + recent, dropping recent messages that are already pinned
    pinned_ids = {p.id for p in pinned}
    all_messages = pinned + [m for m in recent if m.id not in pinned_ids]

    # 6. Sort chronologically
    return sorted(all_messages, key=lambda m: m.created_at)
```
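The merge-and-sort steps (5–6) can be exercised in isolation. A self-contained check using stand-in message objects (`combine_context` is a hypothetical extraction of that logic, not a function in the codebase):

```python
from datetime import datetime, timedelta
from types import SimpleNamespace

def combine_context(pinned, recent):
    """Pinned messages always survive; duplicates are dropped; order is chronological."""
    pinned_ids = {p.id for p in pinned}
    merged = pinned + [m for m in recent if m.id not in pinned_ids]
    return sorted(merged, key=lambda m: m.created_at)

t0 = datetime(2025, 1, 1)

def msg(i, minutes):
    return SimpleNamespace(id=i, created_at=t0 + timedelta(minutes=minutes))

pinned = [msg(1, 0)]                          # an old pinned message
recent = [msg(1, 0), msg(2, 5), msg(3, 10)]   # recent window also contains it

context = combine_context(pinned, recent)
# → ids [1, 2, 3]; message 1 appears only once
```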
Testing
Backend Tests ✅ PASSED
- All 17 chat_service tests passed
- Frontend build successful (981 KB JS, 63 KB CSS)
Frontend Tests 🚧 PENDING
Create tests in `frontend/src/components/chat/__tests__/`:
- `PinnedMessages.test.tsx` — rendering, click to scroll, unpin
- `ConversationSettingsDialog.test.tsx` — radio group, save
- `ChatBubble.test.tsx` — pin button, visual indicator
User Experience
Before (Current)
- ❌ No way to ensure important instructions stay in context
- ❌ No control over how much history AI sees
- ❌ Power users frustrated by context limitations
After (When Complete)
- ✅ Pin critical messages (e.g., project requirements, code snippets)
- ✅ Adjust context window based on conversation needs
- ✅ Visual feedback on pinned status
- ✅ Better token cost control
Security & Performance
Security
- ✅ Group-based access control on all endpoints
- ✅ Pin limit (5) prevents abuse
- ✅ Small input surface — endpoints accept only IDs in the path, no free-form user input
Performance
- ✅ Index on `messages(conversation_id, pinned)` for fast pinned queries
- ✅ Context window controls reduce LLM API costs
- ⚠️ Full context window can be expensive — show warning in UI
Next Steps
- Complete Phase B.1: Create frontend UI components
  - Priority: ChatBubble pin button (quickest user value)
  - Then: PinnedMessages display
  - Finally: ConversationSettingsDialog
- Complete Phase B.2: Update orchestrator context building
  - Implement `_build_context()` logic described above
  - Add tests for pinned message prioritization
- User Testing:
  - Test with Thomas (manager persona) — pin project requirements
  - Test with Seb (contractor persona) — pin API docs
  - Test with Julie (employee persona) — adjust context for complex discussions
- Documentation:
  - Add to API docs (`docs/api.md`)
  - Update ROADMAP.md Phase 3e.5 status
  - Add to test results (`docs/test-results.md`)
Estimated Remaining Effort
- Phase B.1 (Frontend UI): 2-3 hours
- Phase B.2 (Orchestrator): 1 hour
- Testing & Polish: 1 hour
- Total: 4-5 hours
Files Modified
Backend
- ✅ `supabase/migrations/015_add_message_pinning_and_context_window.sql`
- ✅ `backend/chat/models.py`
- ✅ `backend/chat/service.py`
- ✅ `backend/api/chat.py`
Frontend
- ✅ `frontend/src/types/index.ts`
- ✅ `frontend/src/lib/api.ts`
- 🚧 `frontend/src/components/chat/PinnedMessages.tsx` (NEW)
- 🚧 `frontend/src/components/chat/ConversationSettingsDialog.tsx` (NEW)
- 🚧 `frontend/src/components/chat/ChatBubble.tsx` (UPDATE)
- 🚧 `frontend/src/pages/Chat.tsx` (UPDATE)
- 🚧 `frontend/src/i18n/locales/en/common.json` (UPDATE)
Pending
- 🚧 `backend/chat/orchestrator.py` (Phase B.2)
Status: Ready for Phase B frontend implementation. Backend infrastructure complete and tested.