konar est. 2026
CASE STUDY · 01
MULTI-PLATFORM AI WORKSPACE  ·  2025 — NOW  ·  SOLE ARCHITECT

An AI workspace with coworkers, not chatbots.

Trilo is an AI workspace for chat, projects, workflows, meetings, and a knowledge graph, with 11 role-based AI coworker templates (copywriter, devops engineer, data analyst, project manager, support agent, and six more) that join every doc as live peers, with persistent memory and tool access, the way a teammate does. One monorepo ships web, iOS, Android and desktop clients, plus a Python real-time transcription agent. Live at trilo.chat.

STATUS
live
trilo.chat · paid plans
CLIENTS
4 surfaces
web · ios · android · desktop
BACKEND
bun · elysia
drizzle · 18 background job processors
AGENT RUNTIME
AI-planned DAG
3-way review gates · 200K token cap
MCP
77 tools
14 modules · in anthropic registry

One product that replaces chat, task management, docs, spreadsheets, video calls, calendar, and social media scheduling, connected by a knowledge graph that remembers context across all of them. The differentiator isn't the feature list; it's the AI layer. The 11 starter coworker templates (copywriter, devops engineer, data analyst, content strategist, frontend engineer, marketing advisor, project manager, SEO specialist, social media manager, support agent, e-commerce analyst) ship as full members of the workspace with persistent memory, tool access, and a place in the team structure. Trilo was designed from the first commit to be AI-native.

  ┌─ trilo monorepo  (turborepo · bun workspaces · biome)
  │
  ├──▶ apps/web              [ next.js 16 · react 19 · tailwind · zustand · yjs ]
  │      └── lexical rich-text · real-time collab (CRDT) · stripe checkout
  │
  ├──▶ apps/backend          [ bun runtime · elysia · drizzle ORM ]
  │      ├── 18 background job processors
  │      ├── websockets + livekit integration
  │      ├── openrouter (multi-model via openai-compatible client)
  │      ├── opik LLM observability
  │      ├── composio toolkit  (gmail · github · linear · shopify · 11 total)
  │      ├── billing — stripe (subs · metered · connect) + apple iap
  │      └── mcp/server.ts  ──▶  77 tools across 14 modules · in anthropic registry
  │
  ├──▶ apps/mobile           [ expo · react native · ios + android · apple IAP ]
  │
  ├──▶ apps/desktop          [ tauri 2 · rust · 33 source files ]
  │
  ├──▶ apps/landing-page     [ next.js · vercel ]
  │
  ├──▶ agent/                [ python · livekit · real-time transcription ]
  │
  └─ packages/db             [ postgres via supabase · drizzle ]
01  ·  DECISION
AI-planned DAG agent runtime

Off-the-shelf workflow tools (Temporal, Inngest, LangGraph) assume human-authored linear sequences. Trilo's runtime in workflowOrchestrator.ts uses an LLM as the planner: it generates the dependency graph from natural language, dispatches each step to a different AI coworker by capability, enforces workspace-scoped pinned tool parameters, and gates progress through three-way review (approve / reject / regenerate-with-feedback). A 200K-token completion budget caps cost per workflow, and a safety guard force-serializes execution when the planner returns zero dependsOn annotations (a check that exists because it failed in production).
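The force-serialize guard can be sketched like this (type and function names are illustrative, not Trilo's actual runtime): when the planner emits a "graph" with no edges at all, chain the steps instead of fanning them all out at once.

```typescript
// Sketch of the zero-dependsOn guard (names hypothetical). A planner that
// returns no edges almost certainly meant a sequence, not a fully parallel fan-out.
interface PlannedStep {
  id: string;
  coworkerId: string;  // which AI coworker handles this step
  dependsOn: string[]; // step ids that must complete first
}

function forceSerializeIfFlat(steps: PlannedStep[]): PlannedStep[] {
  const hasEdges = steps.some((s) => s.dependsOn.length > 0);
  if (hasEdges) return steps; // the planner produced a real DAG: trust it
  // Zero edges: chain each step onto the previous one.
  return steps.map((s, i) => ({
    ...s,
    dependsOn: i === 0 ? [] : [steps[i - 1].id],
  }));
}
```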

02  ·  DECISION
AI coworkers as live Yjs peers

Coworker templates connect to every Lexical page as server-side Yjs peers via Hocuspocus, appearing in awareness with a shimmer effect on the exact blocks they're editing. They sit inside the doc with persistent memory and tool access, the way a teammate does. Initialization order is load-bearing: the Yjs binding has to register before the WebSocket connects, and the Lexical → Yjs writeback listener has to register after sync. Get it wrong and Lexical's normalization permanently accumulates phantom paragraphs across every user's view.
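The load-bearing ordering can be made explicit in a small init function (the provider interface here is a stand-in, not the real Hocuspocus API): binding before connect, writeback only after sync.

```typescript
// Minimal sketch of the init order described above. The Yjs → Lexical binding
// registers before the socket opens, and the Lexical → Yjs writeback listener
// registers only once the initial sync has landed — otherwise Lexical's
// normalization writes phantom paragraphs back into the shared doc.
interface CollabProvider {
  connect(): void;
  onSynced(cb: () => void): void;
}

function initCollab(
  provider: CollabProvider,
  registerBinding: () => void,   // Yjs → Lexical binding
  registerWriteback: () => void, // Lexical → Yjs listener
): void {
  registerBinding();                    // 1. before the WebSocket connects
  provider.onSynced(registerWriteback); // 3. deferred until after sync
  provider.connect();                   // 2. now let remote state stream in
}
```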

03  ·  DECISION
77-tool MCP server in the Anthropic registry

Trilo ships a spec-compliant MCP server (built on @modelcontextprotocol/sdk, live at api.trilo.chat/mcp) and is published in the Anthropic MCP Registry. 77 tools across 14 domain modules: boards, tasks, messages, meetings, knowledge graph, pages (26 alone), spreadsheets and more. Auth supports PAT, bot tokens, and OAuth 2.1 with workspace-level isolation enforced on every call. Schemas are hand-authored Zod, used as the single source of truth shared with internal coworkers. Distribution by protocol, not by integration.
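The per-call isolation can be sketched as a dispatch check (names are hypothetical; Trilo's real schemas are hand-authored Zod): whichever auth path was used, the token resolves to exactly one workspace, and the call's arguments are validated against it rather than trusted.

```typescript
// Sketch of workspace-level isolation enforced on every tool call.
// The workspace id in the context comes from the verified token (PAT, bot
// token, or OAuth 2.1) — never from the caller's arguments.
interface ToolContext {
  workspaceId: string; // derived server-side from the auth token
}

interface ToolCall {
  tool: string; // e.g. "tasks.create" (illustrative name)
  args: { workspaceId: string; [k: string]: unknown };
}

function dispatch(ctx: ToolContext, call: ToolCall): string {
  if (call.args.workspaceId !== ctx.workspaceId) {
    throw new Error(`${call.tool}: workspace mismatch, call rejected`);
  }
  return `${call.tool} ok in workspace ${ctx.workspaceId}`;
}
```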

04  ·  DECISION
Rust ↔ Swift FFI for native StoreKit

The desktop app is Tauri 2, but the desktop story isn't "Tauri over Electron". The Rust shell calls Swift directly via extern "C" for StoreKit 2 (five IPC functions covering fetch, purchase, entitlements, restore and the transaction listener) plus Sign in with Apple. objc2 reaches into NSApplication.dockTile for badge counts where Tauri's API was broken, and tweaks WKWebView compositing to fix scroll artifacts in virtualized lists. The result is a native macOS App Store build that ships from the same Rust workspace as Linux and Windows.

05  ·  DECISION
Unified credit ledger across 5 billing surfaces

Stripe subscriptions, Stripe Billing Meters (metered AI overage), Stripe Connect (referral payouts), Apple IAP (8 product IDs across iOS + Mac App Store) and partner/promo plans all flow into a single Postgres credit ledger via a consistent resetCreditsForNewPeriod() regardless of which provider triggered the event. Referral commissions enforce a 30-day hold with atomic eligible → processing transitions before generating a Stripe Transfer with a deterministic idempotency key derived from sorted commission IDs. Full marketplace-payout correctness with deterministic idempotency.
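The deterministic key derivation can be sketched in a few lines (the prefix and hash truncation are illustrative, not Trilo's exact scheme): sorting the commission IDs first means the same batch always produces the same key, so a retried payout run can never double-pay.

```typescript
import { createHash } from "node:crypto";

// Sketch of a deterministic idempotency key derived from sorted commission
// IDs. Order-insensitive by construction: however the batch was fetched,
// identical sets of commissions hash to the identical key.
function transferIdempotencyKey(commissionIds: string[]): string {
  const canonical = [...commissionIds].sort().join(",");
  const digest = createHash("sha256").update(canonical).digest("hex");
  return "transfer_" + digest.slice(0, 32);
}
```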

06  ·  DECISION
Sammy: multi-speaker transcription infrastructure

Sammy is a Python LiveKit agent that delivers real-time multi-speaker transcription. It is not a conversational voice AI. The interesting work is in infrastructure: per-participant STT stream isolation with a custom 100ms / 5-frame backpressure drop policy, dual-path delivery (Redis WebSocket for authenticated members and a LiveKit data channel for guests), and a 15-second TTL timestamp dedup that survives LiveKit's forked-child dispatcher model where in-memory sets in the parent are invisible to the child.
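The TTL dedup is simple to sketch in language-agnostic form (the real agent is Python, and because LiveKit's dispatcher forks a child per room, the production store has to live outside the parent process, e.g. in Redis; the in-memory map here just illustrates the TTL semantics).

```typescript
// Illustrative 15-second TTL dedup keyed on (speaker, timestamp) pairs.
// Entries are evicted lazily on lookup; only the first sighting within the
// window is delivered downstream.
class TtlDedup {
  private seen = new Map<string, number>(); // key → expiry (ms epoch)
  constructor(private ttlMs = 15_000) {}

  /** Returns true the first time a key is seen within the TTL window. */
  firstSeen(key: string, now = Date.now()): boolean {
    for (const [k, exp] of this.seen) if (exp <= now) this.seen.delete(k);
    if (this.seen.has(key)) return false;
    this.seen.set(key, now + this.ttlMs);
    return true;
  }
}
```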

PLATFORM   STACK                                                        SCOPE            STATE
web        next.js 16 · react 19 · tailwind · zustand · yjs · lexical   sole architect   in production
backend    bun · elysia · supabase · postgres · redis · drizzle         sole architect   in production
mobile     expo · react native · ios + android · apple IAP              sole architect   shipped
desktop    tauri 2 · rust · shared react frontend                       sole architect   shipped
agent      python · livekit · real-time transcription pipeline          sole architect   in production

Trilo is live at trilo.chat with paid plans, on the App Store, and as a Tauri desktop binary, all from a single monorepo, by one architect. The technical bets that made this possible: an AI-planned DAG runtime where the planner is itself an LLM and capability dispatch routes steps to different coworker models; AI coworkers that connect to every Lexical page as live Yjs peers (they don't wait behind a slash command); a 77-tool MCP server in the Anthropic registry that lets external agents operate Trilo natively; and a Rust ↔ Swift FFI layer that makes the macOS build a native App Store binary. The deploy pipeline posts its own status updates into Trilo's chat. The workspace runs itself.