Install Massive Context MCP
Pick your client, copy the command, done.
1. Add it to Claude Code

claude mcp add massive-context-mcp -- uvx massive-context-mcp

Environment Variables
Set these before running Massive Context MCP.
- RLM_DATA_DIR: Directory path for storing RLM data. (optional)
- OLLAMA_URL: URL for the Ollama inference service. (optional)

Available Tools (5)
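Both variables are optional, so a client script only needs to supply fallbacks when they are unset. A minimal sketch of resolving them, assuming `~/.rlm-data` as the default data directory (mirroring the example prompt below) and Ollama's conventional local URL; neither default is documented by this server, so verify before relying on them:

```python
import os
from pathlib import Path

# Assumed defaults: ~/.rlm-data matches the example prompts on this page,
# and http://localhost:11434 is Ollama's conventional local port. Check the
# server's own documentation before depending on either value.
def resolve_config() -> dict:
    return {
        "data_dir": Path(os.environ.get("RLM_DATA_DIR", "~/.rlm-data")).expanduser(),
        "ollama_url": os.environ.get("OLLAMA_URL", "http://localhost:11434"),
    }
```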
Once configured, Massive Context MCP gives your AI agent access to:
- rlm_system_check: Verify system requirements, including macOS, Apple Silicon, RAM, and Homebrew.
- rlm_setup_ollama: Install Ollama via Homebrew with a managed service and auto-updates.
- rlm_setup_ollama_direct: Install Ollama via direct download for headless environments.
- rlm_ollama_status: Check Ollama availability to detect whether free local inference is available.
- rlm_auto_analyze: Perform one-step analysis by auto-detecting type, chunking, and querying.

Try It Out
After setup, try these prompts with your AI agent:
→Analyze the provided massive dataset in ~/.rlm-data and summarize the key findings.
→Check if my system is ready for local inference using the rlm_system_check tool.
→Perform an auto-analysis on the document to extract the main themes.
→Verify if Ollama is currently running and available for local processing.
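The last prompt maps to the rlm_ollama_status tool. How that check works internally isn't shown here, but a minimal availability probe can be sketched against Ollama's standard `/api/tags` model-listing endpoint (an assumption about the approach, not this server's actual code):

```python
import urllib.request

def ollama_available(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    Probes /api/tags, Ollama's model-listing endpoint. Returns False on
    connection failure, timeout, or a non-200 response.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # URLError (incl. refused connections) subclasses OSError
        return False
```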
Prerequisites & system requirements
- An MCP-compatible client (Claude Code, Cursor, Windsurf, Claude Desktop, or Codex)
- Python 3.8+ with pip installed
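The prerequisites above can be checked before installing. A short sketch; the uvx check reflects the install command on this page, and the exact prerequisites may differ per client:

```python
import shutil
import sys

def check_python(minimum=(3, 8)) -> bool:
    """True if the running interpreter meets the Python 3.8+ prerequisite."""
    return sys.version_info[:2] >= minimum

def check_uvx() -> bool:
    """True if uvx (used by the claude mcp add command above) is on PATH."""
    return shutil.which("uvx") is not None
```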
Alternative installation methods
From Source
git clone https://github.com/egoughnour/massive-context-mcp.git && cd massive-context-mcp && uv sync