GrantAi MCP Server

Add it to Claude Code

Run this in a terminal:
claude mcp add grant-ai -- docker run -i --rm --pull always -v grantai-data:/data ghcr.io/solonai-com/grantai-memory:1.8.6

GrantAi

Deterministic Memory for AI. Local. Private. Secure.

Website • Download • Documentation


The Problem

Every AI system today has the same flaw: it guesses instead of remembers.

RAG (Retrieval-Augmented Generation) converts your documents into vectors — numerical approximations of meaning. When you query, it returns content that is mathematically similar to your question. Similar is not the same as correct.

Ask for "HIPAA encryption penalties" and RAG returns chunks that look like compliance content. Maybe the right section. Maybe adjacent paragraphs. Maybe hallucinated ranges. You pay for every token retrieved, whether relevant or not.

This is the Retrieval Tax:

  • Re-retrieval — Same questions, same searches, same cost
  • Over-retrieval — 20 chunks when you need 3
  • Labor — Engineers tuning embeddings instead of building products
  • Risk — Approximate answers in domains that require precision

Enterprise AI spends 85% of compute on inference. Most of that is wasted on retrieving content that doesn't answer the question.

The Solution

GrantAi is deterministic memory for AI agents.

Instead of similarity search, GrantAi uses direct addressing. Every piece of knowledge has a unique identifier. Retrieval is a lookup, not a search. You get the exact content you indexed — verbatim, with attribution, in milliseconds.
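
The idea can be sketched in a few lines of Python. This is a conceptual illustration only — the class, hashing scheme, and method names below are invented for the sketch, not GrantAi's actual implementation: content is stored under a stable identifier, and retrieval returns it verbatim.

```python
import hashlib

class DeterministicStore:
    """Toy content-addressed memory: retrieval is a dict lookup, not a search."""

    def __init__(self):
        self._entries = {}

    def teach(self, content: str, source: str) -> str:
        # Derive a stable key from the content itself; the same content
        # always maps to the same address.
        key = hashlib.sha256(content.encode()).hexdigest()[:16]
        self._entries[key] = {"content": content, "source": source}
        return key

    def infer(self, key: str) -> dict:
        # Exact lookup: the verbatim entry or a KeyError, never a
        # "similar" neighbor.
        return self._entries[key]

store = DeterministicStore()
key = store.teach("API rate limit is 100 requests/minute.", source="api-notes")
entry = store.infer(key)
print(entry["content"], "--", entry["source"])
```

Because the key is derived from the content, the same content always resolves to the same address, and a lookup either returns the exact stored entry (with its attribution) or fails explicitly — there is no approximate middle ground.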

| RAG | GrantAi |
| --- | --- |
| Returns similar content | Returns the exact content |
| 10-20 chunks, hope one is right | 1-3 sentences, always right |
| Slows down as corpus grows | Milliseconds regardless of size |
| No attribution | Full audit trail |
| Approximate | Deterministic |

Result: 97% reduction in tokens sent to the LLM. Faster responses. Lower cost. No hallucination from retrieval.

Why It Matters

  • Compliance — Exact citations, not paraphrased guesses
  • Multi-Agent — Shared memory across your AI workforce with speaker attribution
  • Cost — Pay for answers, not for searching
  • Security — 100% local, AES-256 encrypted, zero data egress

Quick Start

macOS / Linux (Native)

# 1. Download from https://solonai.com/grantai/download
# 2. Extract and install
./install.sh

# 3. Restart your AI tool (Claude Code, Cursor, etc.)

Docker (All Platforms)

docker pull ghcr.io/solonai-com/grantai-memory:1.8.6

Add to your Claude Desktop config, claude_desktop_config.json (typically ~/Library/Application Support/Claude/ on macOS, %APPDATA%\Claude\ on Windows, ~/.config/Claude/ on Linux):

{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "ghcr.io/solonai-com/grantai-memory:1.8.6"]
    }
  }
}

Supported Platforms

| Platform | Method | Status |
| --- | --- | --- |
| macOS (Apple Silicon) | Native | Supported |
| Linux (x64) | Native | Supported |
| Windows | Native | Supported |
| All Platforms | Docker | Supported |

MCP Tools

GrantAi provides these tools to your AI:

| Tool | Description |
| --- | --- |
| grantai_infer | Query memory for relevant context |
| grantai_teach | Store content for future recall |
| grantai_learn | Import files or directories |
| grantai_health | Check server status |
| grantai_summarize | Store session summaries |
| grantai_project | Track project state |
| grantai_snippet | Store code patterns |
| grantai_git | Import git commit history |
| grantai_capture | Save conversation turns for continuity |
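
As an illustration, a typical session might chain several of these tools. The call style below follows the examples later in this document; the path parameter and query text are assumptions for the sketch, not a verified API reference.

```
# Check the server is reachable
grantai_health()

# Import a project directory once
grantai_learn(path="./docs")

# Later, recall exact content by query
grantai_infer(input="deployment checklist")
```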

Multi-Agent Memory Sharing

Multiple agents can share knowledge through GrantAi's memory layer.

Basic shared memory (no setup required)

# Any agent stores
grantai_teach(
    content="API rate limit is 100 requests/minute.",
    source="api-notes"
)

# Any agent retrieves
grantai_infer(input="API rate limiting")

All agents read from and write to the same memory pool. No configuration needed.

With agent attribution (optional)

Use speaker to track which agent stored what, and from_agents to filter retrieval:

# Store with identity
grantai_teach(
    content="API uses Bearer token auth.",
    source="api-research",
    speaker="researcher"  # optional
)

# Retrieve from specific agent
grantai_infer(
    input="API authentication",
    from_agents=["researcher"]  # optional filter
)

When to use `speaker`

| Scenario | Use speaker? | Why |
| --- | --- | --- |
| Shared knowledge base | No | All contributions equal, no filtering needed |
| Session continuity | No | Same context, just persist and retrieve |
| Research → Code handoff | Yes | Coder filters for researcher's findings only |
| Role-based trust | Yes | Filtering by agent requires attribution |

Try it

  • Store this API documentation in my memory using grantai_teach so I can reference it later.
  • Use grantai_infer to find the authentication requirements for the project we discussed yesterday.
  • Import the current project directory into GrantAi memory using grantai_learn.
  • Save this code snippet for a reusable React component using grantai_snippet.
  • Summarize our current session and store it in memory with grantai_summarize.

Frequently Asked Questions

What are the key features of GrantAi?

Deterministic memory retrieval using direct addressing instead of vector similarity. 100% local, AES-256 encrypted storage with zero data egress. Shared memory pool for multi-agent collaboration with speaker attribution. Full audit trail and verbatim content retrieval with attribution. Significant reduction in token usage compared to traditional RAG.

What can I use GrantAi for?

Maintaining long-term context across multiple AI coding sessions. Ensuring compliance by retrieving exact citations rather than paraphrased guesses. Sharing research findings between different AI agents in a team. Storing and retrieving specific code patterns and project state. Building secure, private knowledge bases for enterprise AI applications.

How do I install GrantAi?

Pull the Docker image with docker pull ghcr.io/solonai-com/grantai-memory:1.8.6, then register it with your MCP client (see Quick Start above). Native installers for macOS, Linux, and Windows are available at https://solonai.com/grantai/download.

What MCP clients work with GrantAi?

GrantAi works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
