# Ogham MCP
Ogham (pronounced "OH-um") -- persistent, searchable shared memory for AI coding agents. Works across clients.
## Contents
- Retrieval quality -- 97.2% R@10 on LongMemEval
- The problem
- Quick start
- Installation methods -- Claude Code, OpenCode, Docker, source
- SSE transport -- multi-agent setup
- CLI -- command-line interface
- Configuration -- env vars, embedding providers, temporal search, lifecycle hooks
- MCP tools -- memory, search, graph, profiles, import/export
- Skills -- ogham-research, ogham-recall, ogham-maintain
- Scoring and condensing
- Database setup -- Supabase, Neon, vanilla Postgres
- Architecture
## Retrieval quality
97.2% Recall@10 on LongMemEval (500 questions, ICLR 2025). No LLM in the search pipeline -- one PostgreSQL query, no neural rerankers, no knowledge graph.
End-to-end QA accuracy on LongMemEval (retrieval + LLM reads and answers):
| System | Accuracy | Architecture |
|---|---|---|
| OMEGA | 95.4% | Classification + extraction pipeline |
| Observational Memory (Mastra) | 94.9% | Observation extraction + GPT-5-mini |
| Hindsight (Vectorize) | 91.4% | 4 memory types + Gemini-3 |
| Zep (Graphiti) | 71.2% | Temporal knowledge graph + GPT-4o |
| Mem0 | 49.0% | RAG-based |
Retrieval only (R@10 -- no LLM in the search loop):
| System | R@10 | Architecture |
|---|---|---|
| Ogham | 97.2% | 1 SQL query (pgvector + tsvector CCF hybrid search) |
| LongMemEval paper baseline | 78.4% | Session decomposition + fact-augmented keys |
Other retrieval systems that report similar R@10 numbers typically use cross-encoder reranking, NLI verification, knowledge graph enrichment, and LLM-as-a-judge pipelines. Ogham reaches 97.2% with one Postgres query.
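As a rough sketch of what a single-query hybrid search can look like -- note this is illustrative only: the table and column names (`memories`, `embedding`, `search_vector`) and the fusion weights are assumptions, not Ogham's actual schema or scoring formula:

```sql
-- Illustrative hybrid search in one Postgres query: fuse vector similarity
-- (pgvector) with full-text rank (tsvector). Schema and weights are hypothetical.
SELECT id,
       content,
       0.7 * (1 - (embedding <=> :query_embedding))           -- cosine similarity
     + 0.3 * ts_rank(search_vector, plainto_tsquery(:query))  -- keyword rank
       AS score
FROM memories
ORDER BY score DESC
LIMIT 10;
```

Real hybrid implementations often normalize the two score distributions before fusing them, since cosine similarity and `ts_rank` are on different scales.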
These tables measure different things. QA accuracy tests whether the full system (retrieval + LLM) produces the correct answer. R@10 tests whether retrieval alone finds the right memories. Ogham is a retrieval engine -- it finds the memories, your LLM reads them.

Ogham's per-category R@10 on LongMemEval:
| Category | R@10 | Questions |
|---|---|---|
| single-session-assistant | 100% | 56 |
| knowledge-update | 100% | 78 |
| single-session-user | 98.6% | 70 |
| multi-session | 97.3% | 133 |
| single-session-preference | 96.7% | 30 |
| temporal-reasoning | 93.5% | 133 |
Full breakdown: ogham-mcp.dev/features
## The problem
AI coding agents forget everything between sessions. Switch from Claude Code to Cursor to Kiro to OpenCode and context is lost. Decisions, gotchas, architectural patterns -- gone. You end up repeating yourself, re-explaining your codebase, re-debugging the same issues.
Ogham gives your agents a shared memory that persists across sessions and clients.
## Quick start
### 1. Install
```shell
uvx ogham-mcp init
```
This runs the setup wizard. It walks you through everything: database connection, embedding provider, schema migration, and writes MCP client configs for Claude Code, Cursor, VS Code, and others.
You need a database before running this. Either create a free Supabase project or a Neon database. The wizard handles the rest.
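The connection string goes into `DATABASE_URL`. For example (hypothetical host and credentials -- substitute the values from your Supabase or Neon dashboard):

```shell
# Hypothetical connection string -- replace host, user, password, and
# database name with the values from your provider's dashboard.
export DATABASE_URL='postgresql://user:password@db.example.com:5432/postgres'
echo "$DATABASE_URL"
```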
Using Neon or self-hosted Postgres? Install with the postgres extra so the driver is available:
```shell
uvx --from 'ogham-mcp[postgres]' ogham-mcp init
```
### 2. Add to your MCP client
The wizard configures everything and writes your client config -- including all environment variables the server needs. For Claude Code, it runs `claude mcp add` automatically. For other clients, copy the config snippet it prints.
### 3. Use it
Tell your agent to remember something, then ask about it later -- from the same client or a different one. It works because they all share the same database.
## Manual setup
### Tools (5)

- `memory` -- Store and retrieve persistent memories across AI sessions.
- `search` -- Perform hybrid search (pgvector + tsvector) on stored memories.
- `graph` -- Interact with the knowledge graph of stored information.
- `profiles` -- Manage user or agent profiles for memory scoping.
- `import_export` -- Import or export memory data.

### Environment variables

- `DATABASE_URL` (required) -- Connection string for the PostgreSQL database (Supabase, Neon, or self-hosted).

### Configuration

```json
{
  "mcpServers": {
    "ogham": {
      "command": "uvx",
      "args": ["ogham-mcp"],
      "env": {
        "DATABASE_URL": "your-db-url-here"
      }
    }
  }
}
```
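Once the server is registered, MCP clients invoke its tools with standard `tools/call` requests. A hypothetical invocation of the `search` tool (the argument name `query` is illustrative -- check the tool's actual input schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "why did we switch connection pooling to pgbouncer" }
  }
}
```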