Engram MCP Server

Add it to Claude Code

Run this in a terminal:
claude mcp add engram -- npx -y engram-sdk

🧠 Engram

The intelligence layer for AI agents

Every AI agent is born smart but amnesiac. Engram fixes that. It doesn't just store memories -- it learns, consolidates patterns, detects contradictions, and surfaces context you didn't ask for.


Install

npm install -g engram-sdk
engram init

That's it. Works with Claude Code, Cursor, or any MCP client. Also available as a REST API and TypeScript SDK.


Why Engram

Existing memory solutions are storage layers -- they save facts and retrieve them. Engram is an intelligence layer with three tiers:

| Tier | What it does | Who has it |
|------|--------------|------------|
| Explicit Memory | Stores facts, preferences, conversation turns | Everyone |
| Implicit Memory | Detects behavioral patterns from how users work | Engram only |
| Synthesized Memory | Consolidation produces insights nobody asked for | Engram only |

Key insight: Engram invests intelligence at read time (when the query is known), not write time (when you don't know what'll matter). This is the fundamental architectural difference from Mem0, Zep, and LangMem.
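The read-time vs. write-time split can be illustrated with a toy sketch (this is not Engram's code; real systems score with embeddings, and naive term overlap stands in here):

```typescript
// Toy contrast between the two approaches. A write-time system must guess
// what matters while storing; a read-time system stores raw memories cheaply
// and spends its effort ranking them once the query is known.

type Memory = { content: string; createdAt: number };

const store: Memory[] = [];

// Write time: just append -- no summarization, no guessing what will matter.
function writeMemory(content: string): void {
  store.push({ content, createdAt: Date.now() });
}

// Read time: rank every memory against the now-known query.
function readMemories(query: string, limit: number): Memory[] {
  const terms = query.toLowerCase().split(/\s+/);
  return store
    .map(m => ({
      m,
      // naive relevance: how many query terms appear in the memory
      score: terms.filter(t => m.content.toLowerCase().includes(t)).length,
    }))
    .filter(x => x.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map(x => x.m);
}
```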


Benchmarks

Evaluated on LOCOMO -- the standard benchmark for agent memory systems. Same benchmark Mem0 used to claim state of the art.

| System | Accuracy | Tokens/Query |
|--------|----------|--------------|
| Engram | 80.0% | 1,504 |
| Full Context | 88.4% | 23,423 |
| Mem0 (published) | 66.9% | -- |
| MEMORY.md | 28.8% | -- |

10 conversations, 1,540 questions, 4 categories. 19.6% relative improvement over Mem0 with 93.6% fewer tokens than full context.

Full context (dumping the entire conversation history) scores highest but uses about 15x more tokens and can't scale past context-window limits. Engram closes most of the gap while using 93.6% fewer tokens.
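The headline figures follow directly from the table above:

```typescript
// Reproduce the reported benchmark arithmetic from the table values.
const engramTokens = 1_504;
const fullContextTokens = 23_423;
const engramAcc = 80.0;
const mem0Acc = 66.9;

// Token savings vs. dumping full context:
const tokenSavings = (1 - engramTokens / fullContextTokens) * 100;

// Relative accuracy improvement over Mem0's published number:
const relImprovement = (engramAcc / mem0Acc - 1) * 100;

console.log(tokenSavings.toFixed(1)); // 93.6
console.log(relImprovement.toFixed(1)); // 19.6
```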

Full benchmark methodology and per-category breakdown


Features

  • MCP Server -- 10 memory tools for Claude Code, Cursor, and any MCP client
  • REST API -- Full HTTP API for any language or framework
  • TypeScript SDK -- Embedded use for Node.js agents
  • CLI -- Interactive REPL, bulk operations, eval tools
  • Model-agnostic -- Works with Gemini, OpenAI, Ollama, Groq, Cerebras (any OpenAI-compatible provider)
  • Zero infrastructure -- SQLite, no Docker, no Neo4j, no Redis
  • Consolidation -- LLM-powered memory merging, contradiction detection, pattern discovery
  • Entity-aware recall -- Knows "Sarah" in the query should boost memories about Sarah
  • Bi-temporal model -- Tracks when facts were true, not just when they were stored
  • Spreading activation -- Graph-based context surfacing

Quick Start

MCP Setup (Claude Code / Cursor)

npm install -g engram-sdk
engram init

REST API

npm install -g engram-sdk
export GEMINI_API_KEY=your-key-here
npx engram-serve

Server starts on http://127.0.0.1:3800.

Remember and Recall

# Store a memory
curl -X POST http://localhost:3800/v1/memories \
  -H "Content-Type: application/json" \
  -d '{"content": "User prefers TypeScript over JavaScript", "type": "semantic"}'

# Recall relevant memories
curl "http://localhost:3800/v1/memories/recall?context=language+preferences&limit=5"
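The same endpoints can be called from Node 18+ (which ships a global fetch). A sketch, using only the paths and fields from the curl examples above; the Bearer header applies only when ENGRAM_AUTH_TOKEN is set on the server, and everything else is illustrative:

```typescript
// Minimal REST client sketch for the endpoints shown above.
const BASE = "http://127.0.0.1:3800"; // adjust if PORT is changed

function headers(token?: string): Record<string, string> {
  const h: Record<string, string> = { "Content-Type": "application/json" };
  if (token) h["Authorization"] = `Bearer ${token}`; // only if auth is enabled
  return h;
}

function recallUrl(context: string, limit = 5): string {
  const q = new URLSearchParams({ context, limit: String(limit) });
  return `${BASE}/v1/memories/recall?${q}`;
}

async function remember(content: string, type = "semantic"): Promise<Response> {
  return fetch(`${BASE}/v1/memories`, {
    method: "POST",
    headers: headers(),
    body: JSON.stringify({ content, type }),
  });
}

async function recall(context: string, limit = 5): Promise<Response> {
  return fetch(recallUrl(context, limit), { headers: headers() });
}
```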

TypeScript SDK

import { Vault } from 'engram-sdk';

const vault = new Vault({ owner: 'my-agent' });

await vault.remember('User prefers TypeScript');
const memories = await vault.recall('language preferences');
await vault.consolidate();

API Reference

Full REST API and MCP tool documentation: engram.fyi/docs


Configuration

| Variable | Description | Default |
|----------|-------------|---------|
| GEMINI_API_KEY | Gemini API key for embeddings and consolidation | -- |
| ENGRAM_LLM_BASE_URL | Custom API base URL (Groq, Cerebras, Ollama, etc.) | provider default |
| ENGRAM_LLM_MODEL | LLM model name | provider default |
| ENGRAM_DB_PATH | SQLite database path | ~/.engram/default.db |
| PORT | Server port | 3800 |
| ENGRAM_AUTH_TOKEN | Bearer token for API auth | -- |

Benchmarks & Eval Scripts

This repo contains the evaluation scripts used to benchmark Engram:

  • eval-locomo.ts -- LOCOMO benchmark (the main result)
  • eval-letta.ts -- Letta Context-Bench evaluation
  • eval-codebase-v2.ts -- Enterprise codebase navigation benchmark
  • eval-enron.ts

Tools (3)

remember -- Store a fact, preference, or conversation turn in the memory vault.
recall -- Retrieve relevant memories based on context and semantic similarity.
consolidate -- Trigger memory merging, contradiction detection, and pattern discovery.

Environment Variables

GEMINI_API_KEY -- Gemini API key for embeddings and consolidation
ENGRAM_LLM_BASE_URL -- Custom API base URL for OpenAI-compatible providers
ENGRAM_LLM_MODEL -- LLM model name to use
ENGRAM_DB_PATH -- SQLite database path
PORT -- Server port
ENGRAM_AUTH_TOKEN -- Bearer token for API auth

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "engram-sdk"]
    }
  }
}
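If the server needs an API key, the claude_desktop_config.json format also accepts a per-server env block; a sketch, using the key name from the Configuration table above:

```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "engram-sdk"],
      "env": {
        "GEMINI_API_KEY": "your-key-here"
      }
    }
  }
}
```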

Try it

Remember that I prefer using TypeScript over JavaScript for all new projects.
Recall my previous language preferences and suggest a stack for my new project.
Consolidate my recent memories to identify any patterns in my workflow.
Find any memories related to 'Sarah' and summarize what I know about her.

Frequently Asked Questions

What are the key features of Engram?

Entity-aware recall for context-sensitive memory retrieval. Bi-temporal model tracking when facts were true. LLM-powered consolidation for contradiction detection and pattern discovery. Zero infrastructure requirement using local SQLite storage. Model-agnostic support for OpenAI-compatible providers.

What can I use Engram for?

Maintaining long-term user preferences across different coding sessions. Detecting behavioral patterns in developer workflows to suggest optimizations. Synthesizing insights from fragmented conversation history. Managing entity-specific context for complex project management tasks.

How do I install Engram?

Install Engram by running: npm install -g engram-sdk && engram init

What MCP clients work with Engram?

Engram works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
