How it works
How Fathym turns every working session into compound intelligence — and what you can configure along the way.
1. Work, as normal
2. Capture, automatically
3. Compound, in the background
4. Crystallize, over time
Fathym doesn't replace your tools. It wraps around them. Claude Code, Cursor, Windsurf, Codex, Cline, Ollama — whatever you already use keeps working exactly as it does today. You just run fai first.
That one command loads your three vaults — session, project, and personal — and hands all of that context to whatever AI you're using next. No configuration. No re-explaining yourself. Your context travels with you, not with the tool.
Your context loads instantly. Session 50 picks up exactly where session 49 left off — even across tools.
As you work, Fathym captures every AI interaction in real time — via hooks inside Claude Code, Cursor, and any other tool you've connected. You don't press anything. It just happens.
Captures are written as JSONL files on a journal/<session-name> git branch — not a database, not someone else's cloud. Your project directory. Your git history. Plain files you can read, inspect, and take with you.
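Because captures are plain JSONL, each line is a standalone JSON object you can read back with nothing but the standard library. A minimal TypeScript sketch; the field names here are illustrative, not fai's actual schema:

```typescript
// Hypothetical shape of one capture line (illustrative, not fai's real schema).
interface CaptureEvent {
  ts: string; // ISO timestamp
  kind: "transcript" | "file_change" | "terminal" | "git";
  tool: string; // which connected AI tool emitted the event
  data: string; // event payload
}

// A JSONL journal is just newline-delimited JSON objects.
const journal = [
  `{"ts":"2024-05-01T10:02:11Z","kind":"terminal","tool":"claude-code","data":"git status"}`,
  `{"ts":"2024-05-01T10:02:30Z","kind":"file_change","tool":"cursor","data":"src/auth.ts"}`,
].join("\n");

// Reading it back is line-by-line JSON.parse: no database, no cloud.
const events: CaptureEvent[] = journal
  .split("\n")
  .filter((line) => line.length > 0)
  .map((line) => JSON.parse(line));

console.log(events.length); // 2
console.log(events[1].kind); // "file_change"
```

The point of the format: every capture is inspectable with `cat`, greppable, and diffable in git history like any other file.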
Every connected AI tool writes to the same session. Seal when you're done — everything you did is now queued for synthesis.
AI transcripts, file changes, terminal commands, git events — written as JSONL files continuously. Zero AI calls. Zero cost.
You seal the session. The journal branch merges to pending — the staging area for synthesis. Other active sessions on the project inherit your context.
When enough sessions have sealed, four synthesis agents run in parallel — one each for Context, Pattern, Decision, and Petnames sub-vaults.
Synthesized knowledge commits to the kept branch — the stable, distilled record every new session inherits from.
Session vault — Live captures. JSONL files on journal/<session>. AI transcripts, file changes, terminal activity. Temporary — feeds the knowledge vaults through synthesis.
Project vault — What this project knows. Architecture decisions, naming conventions, stack choices. Shared by everyone on the project. Exports FAI-PROJECT.md — read by every AI on startup.
Personal vault — How you work, everywhere. Lives at ~/.fai/ — follows you across every project and machine. Exports FAI-PERSONAL.md.
This is where Fathym goes beyond anything your AI tools do on their own. Their memory is per-session, per-tool, and gone when you switch. What Fathym builds is structured, typed context — decisions with lineage, patterns with frequency, preferences with evidence.
In the background, synthesis agents read your captures and update the four sub-vaults: Context (current focus), Pattern (recurring behaviours), Decision (architectural choices), and Petnames (session naming). These aggregate into FAI-PROJECT.md and FAI-PERSONAL.md — what every AI reads on startup.
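The idea of typed context aggregating into a single export can be sketched in a few lines of TypeScript. Everything below is illustrative: the entry shapes and the rendered output are assumptions, not fai's real vault schema.

```typescript
// Illustrative only: decisions carry lineage (which session produced them),
// patterns carry frequency. Synthesis aggregates both into one markdown
// export that every AI reads on startup.
interface Decision {
  choice: string;
  reason: string;
  session: string; // lineage: the session this decision came from
}

interface Pattern {
  behaviour: string;
  seenCount: number; // frequency: how often synthesis has observed it
}

function exportProjectVault(decisions: Decision[], patterns: Pattern[]): string {
  const lines = ["# FAI-PROJECT", "", "## Decisions"];
  for (const d of decisions) {
    lines.push(`- ${d.choice} (why: ${d.reason}; from session: ${d.session})`);
  }
  lines.push("", "## Patterns");
  for (const p of patterns) {
    lines.push(`- ${p.behaviour} (seen ${p.seenCount}x)`);
  }
  return lines.join("\n");
}

const md = exportProjectVault(
  [{ choice: "Use Deno", reason: "zero-config TypeScript", session: "auth" }],
  [{ behaviour: "conventional commit messages", seenCount: 12 }],
);
console.log(md);
```

Structured entries like these are what separate a vault from a chat log: each fact has a type, a source, and evidence, so the export stays small and authoritative.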
Here's the test: delete your .fai folder. You'll feel the regression immediately — the AI suddenly asking things it already knew, forgetting preferences you'd stopped thinking about. That gap is what Fathym is filling.
This isn't memory, and it isn't context. It's trained instinct — the compound result of everything you've done, synthesized into something the AI applies without being asked.
Compounding intelligence is powerful. But there's a fourth step most tools skip entirely: crystallization.
When Fathym notices you running the same workflow repeatedly — same input, same output, same structure — it can propose locking that pattern into a deterministic micro-agent. Typed input schema. Typed output. No AI thrashing. No model variance.
The pattern runs fast, cheap, and reliably. If a model upgrade breaks it, you fix the prompt once. And the more you crystallize, the less AI you actually need.
Think of it like micro frameworks for your workflow. Crystallize the repetitive stuff. Save the AI for the work that actually needs it.
1. The vault notices you've run the same workflow 12 times. Same inputs, same structure, same preferences applied every time.
2. A typed input schema and output spec are generated. You review and approve, or edit to refine.
3. The crystallized agent runs without full AI. Input in, output out. Cheap, fast, predictable.
4. If a new model or preference changes the output, you update the prompt once. The agent self-corrects everywhere.
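What a crystallized micro-agent boils down to can be sketched as a deterministic, typed function. The workflow here (formatting a conventional commit message) and every name in it are hypothetical; this illustrates the shape, not fai's actual API.

```typescript
// Hypothetical crystallized micro-agent: a repeated workflow locked into
// plain logic. Typed input, typed output, no model call, no variance.
interface CommitInput {
  type: "feat" | "fix" | "chore"; // schema constrains what can go in
  scope: string;
  summary: string;
}

interface CommitOutput {
  message: string;
}

function commitAgent(input: CommitInput): CommitOutput {
  // The "prompt" now lives here as code. If a preference or model change
  // breaks the output, you edit this once and every run self-corrects.
  return { message: `${input.type}(${input.scope}): ${input.summary}` };
}

const out = commitAgent({
  type: "feat",
  scope: "auth",
  summary: "add session resurrection",
});
console.log(out.message); // "feat(auth): add session resurrection"
```

Because the input and output are typed, the agent is testable like any other function: same input, same output, every time.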
Sessions are how you organize context. Name a session "auth" and everything you build — decisions, patterns, active workstreams — is scoped to that name. Close it when you're done.
Six months later, when auth work comes back up, you don't start from scratch. You resurrect that session. The branch reopens. Every decision from last time is still there. Your AI picks up like no time has passed.
And when you and a teammate are working on the same project? Every time your session seals, their session automatically receives your latest context. No merge conflicts. No catch-up conversations.
Sessions let you scope context to the work — not the calendar. Past work is never lost. It's just archived, waiting to be resumed.
fai's vault structure never changes. What changes is when synthesis runs and which model runs it. Configure once with fai synthesize — then it runs automatically.
Capture-only is free forever. A developer who captures for six months and then switches on synthesis gets six months of history distilled into compound intelligence at once. Capture is never wasted.
Local models (Ollama, LM Studio) run synthesis free on your machine. No API key. No cost. Full privacy.
Everything captured to disk as JSONL. No synthesis, no AI calls. Vaults are searchable and inspectable. No API key needed.
Capture + synthesis on your schedule. Works with local models (Ollama, LM Studio) or cloud (OpenRouter, Groq, OpenAI). ~$1–2/day at moderate cadence with cloud.
Synthesis runs after a configured number of seals. Maximum freshness at a higher cost: compound intelligence lands immediately after each work block.
Install in under a minute. Your first session starts building context immediately — and it only gets smarter from there.
deno run -A jsr:@fathym/fai/install