# 🦘 Clangaroo

**Fast C++ code intelligence for LLMs via MCP**

## ✨ About
**NOTE (January 2026):** Claude Code now has built-in support for LSPs, making this unnecessary. Since it may still be useful in other agentic harnesses, I'll leave the project here for now.
Clangaroo enables Claude Code, Gemini CLI, and other coding agents to jump around your C++ codebase with ease. It gives your bestest LLM pals fast, direct lookup of C/C++ symbols, functions, definitions, call hierarchies, type hierarchies, and more.
Clangaroo combines the speed of Tree-sitter parsing with the accuracy of clangd LSP, optionally enhanced by Google Gemini Flash AI for deeper insights. Let your AI buddies spend more time coding and less time stumbling around.
**But WHY did you make this?** I ❤️ using Claude Code, but every time it auto-compacts and then starts grepping around for the function we've been working on forever, I die a little bit inside.

**But aren't there already a few MCPs that do this? Why do we need another?** I spent some time searching and found both MCP-language-server and Serena, which both look perfectly nice! Unfortunately, neither worked for me 😭
Clangaroo is meant to be super simple and is intended to 'just work'.
## 📚 Table of Contents
- 🚀 Quick Start
- 🎯 Features
- 💬 Usage Examples
- 🛠️ Available Tools
- 🤖 AI Features (Optional)
- ⚙️ Configuration Reference
- 📋 Requirements
- 🔧 Troubleshooting
- 📄 License
- 🙏 Acknowledgments
## 🚀 Quick Start

### 1. Install Clangaroo

```bash
git clone https://github.com/jasondk/clangaroo
cd clangaroo
pip install -e .
```
### 2. Special compilation step for your C++ project

The clangd LSP needs you to do this once:

```bash
# For Makefile-based projects
make clean
compiledb make
# (Some people prefer using 🐻)
bear -- make

# For CMake projects
cmake -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
cp build/compile_commands.json .
```

This will create a special `compile_commands.json` file in your project root.
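For reference, `compile_commands.json` is just a JSON array with one entry per translation unit, recording how each file is compiled (the paths and flags below are a hypothetical example, not output from your build):

```json
[
  {
    "directory": "/path/to/your/cpp/project",
    "command": "clang++ -std=c++17 -Iinclude -c src/main.cpp -o build/main.o",
    "file": "src/main.cpp"
  }
]
```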
### 3. Configure Claude Desktop or other MCP client
Did you know you can now add MCP servers to LM Studio?
N.B.: Using `--ai-enabled` will call Google Gemini and incur a small cost via your Gemini API key, if provided. This is usually very minor as long as you use Gemini Flash or Flash Lite.

Note: Please replace `command` and `--project` with the correct paths for your system, and replace `your-google-ai-api-key` with your API key (if using one). If you don't wish to use the AI-enhanced features, simply leave out all the `--ai` options and the API key.
```json
{
  "mcpServers": {
    "clangaroo": {
      "command": "/usr/local/bin/clangaroo",
      "args": [
        "--project", "/path/to/your/cpp/project",
        "--warmup",
        "--warmup-limit", "10",
        "--log-level", "info",
        "--ai-enabled",
        "--ai-provider", "gemini-2.5-flash",
        "--ai-cache-days", "14",
        "--ai-cost-limit", "15.0",
        "--call-hierarchy-depth", "10",
        "--ai-analysis-level", "summary",
        "--ai-context-level", "minimal"
      ],
      "env": {
        "CLANGAROO_AI_API_KEY": "your-google-ai-api-key"
      }
    }
  }
}
```
**📍 Claude Desktop config file locations**

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
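If Claude Desktop doesn't pick up the server, a common culprit is malformed JSON in the config file. As a quick sanity check (a sketch using only the Python standard library; the inline `sample` is a hypothetical stand-in for your real config file), you can parse it and confirm the `clangaroo` entry is present:

```python
import json

# Hypothetical sample; paste the contents of your real
# claude_desktop_config.json here instead.
sample = '''
{
  "mcpServers": {
    "clangaroo": {
      "command": "/usr/local/bin/clangaroo",
      "args": ["--project", "/path/to/your/cpp/project"]
    }
  }
}
'''

config = json.loads(sample)  # raises json.JSONDecodeError if the JSON is malformed
server = config["mcpServers"]["clangaroo"]
print("command:", server["command"])
print("project:", server["args"][server["args"].index("--project") + 1])
```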
`--ai-analysis-level`: Default depth of AI analysis (default: `summary`).

- `summary`: Quick overview with key points
- `detailed`: Comprehensive analysis with examples and context

`--ai-context-level`: Default depth of context (default: `minimal`).

- `minimal`: Just the symbol and immediate documentation
- `local`: Include surrounding code in the same file
- `f
## Environment Variables

- `CLANGAROO_AI_API_KEY`: Google Gemini API key for AI-enhanced code analysis
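The `env` block in the MCP config is what delivers this key to the server process; outside an MCP client you can set it yourself. A minimal sketch (standard library only, placeholder value) of how a process sees the variable:

```python
import os

# Set the key for this process only; in the MCP config, the "env" block
# does this for you. The value here is a placeholder, not a real key.
os.environ.setdefault("CLANGAROO_AI_API_KEY", "your-google-ai-api-key")

api_key = os.environ.get("CLANGAROO_AI_API_KEY")
print("AI features available" if api_key else "AI features disabled (no key)")
```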