GZOO Cortex MCP Server

Add it to Claude Code

Run this in a terminal:
claude mcp add gzoo-cortex -- npx -y @gzoo/cortex mcp

GZOO Cortex

Local-first knowledge graph for developers. Watches your project files, extracts entities and relationships using LLMs, and lets you query across all your projects in natural language.

“What architecture decisions have I made across projects?”

Cortex finds decisions from your READMEs, TypeScript files, config files, and conversation exports — then synthesizes an answer with source citations.

Why

You work on multiple projects. Decisions, patterns, and context are scattered across hundreds of files. You forget what you decided three months ago. You re-solve problems you already solved in another repo.

Cortex watches your project directories, extracts knowledge automatically, and gives it back to you when you need it.

What It Does

  • Watches your project files (md, ts, js, json, yaml) for changes
  • Extracts entities: decisions, patterns, components, dependencies, constraints, action items
  • Infers relationships between entities across projects
  • Detects contradictions when decisions conflict
  • Queries in natural language with source citations
  • Routes intelligently between cloud and local LLMs
  • Respects privacy — restricted projects never leave your machine
  • Web dashboard with knowledge graph visualization, live feed, and query explorer
  • MCP server for direct integration with Claude Code

Quick Start

1. Install

npm install -g @gzoo/cortex

Or install from source:

git clone https://github.com/gzoonet/cortex.git
cd cortex
npm install && npm run build && npm link

2. Setup

Run the interactive wizard:

cortex init

This walks you through:

  • LLM provider — Anthropic, Google Gemini, Groq, OpenRouter, or Ollama (local)
  • API key — saved securely to ~/.cortex/.env
  • Routing mode — cloud-first, hybrid, local-first, or local-only
  • Watch directories — which directories Cortex should monitor
  • Budget limit — monthly LLM spend cap

Config is stored at ~/.cortex/cortex.config.json. API keys go in ~/.cortex/.env.
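
A config might look roughly like this. This is a sketch only — the field names below are illustrative assumptions, not the exact schema; run `cortex config list` to see the real keys:

```json
{
  "provider": "anthropic",
  "routingMode": "hybrid",
  "watchDirectories": ["~/projects/app", "~/projects/api"],
  "budgetLimitUsd": 20,
  "exclude": ["node_modules", "dist", ".git"]
}
```

Note that API keys never live in this file — they stay in ~/.cortex/.env.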

3. Register Projects

cortex projects add my-app ~/projects/app
cortex projects add api ~/projects/api
cortex projects list                       # verify

4. Watch & Query

cortex watch                               # start watching for changes
cortex query "what caching strategies am I using?"
cortex query "what decisions have I made about authentication?"
cortex find "PostgreSQL" --expand 2
cortex contradictions

5. Web Dashboard

cortex serve                               # open http://localhost:3710

Excluding Files & Directories

Cortex ignores node_modules, dist, .git, and other common directories by default. To add more:

cortex config exclude add docs             # exclude a directory
cortex config exclude add "*.log"          # exclude by pattern
cortex config exclude list                 # see all excludes
cortex config exclude remove docs          # remove an exclude

How It Works

Cortex runs a pipeline on every file change:

  1. Parse — file content is chunked by a language-aware parser (tree-sitter for code, remark for markdown)
  2. Extract — LLM identifies entities (decisions, components, patterns, etc.)
  3. Relate — LLM infers relationships between new and existing entities
  4. Detect — contradictions and duplicates are flagged automatically
  5. Store — entities, relationships, and vectors go into SQLite + LanceDB
  6. Query — natural language queries search the graph and synthesize answers

All data stays local in ~/.cortex/. Only LLM API calls leave your machine (and never for restricted projects).

LLM Providers

Cortex is provider-agnostic. It supports:

  • Anthropic Claude (Sonnet, Haiku) — via native Anthropic API
  • Google Gemini — via OpenAI-compatible API
  • Any OpenAI-compatible API — OpenRouter, local proxies, etc.
  • Ollama (Mistral, Llama, etc.) — fully local, no cloud required

Routing Modes

Mode         Cloud Cost           Quality   GPU Required
cloud-first  Varies by provider   Highest   No
hybrid       Reduced              High      Yes (Ollama)
local-first  Minimal              Good      Yes (Ollama)
local-only   $0                   Good      Yes (Ollama)

Hybrid mode routes high-volume tasks (entity extraction, ranking) to Ollama and reasoning-heavy tasks (relationship inference, queries) to your cloud provider.

Requirements

  • Node.js 20+
  • LLM API key for cloud modes — Anthropic, Google Gemini, or any OpenAI-compatible provider
  • Ollama — required for hybrid and local routing modes

Configuration

All config lives in ~/.cortex/cortex.config.json. API keys are in ~/.cortex/.env.

cortex config list                       # see all current settings

Tools (4)

  • get_status — Returns the current status of the Cortex knowledge graph.
  • list_projects — Lists all projects currently registered and being watched by Cortex.
  • find_entity — Searches for specific entities like decisions, components, or patterns within the knowledge graph.
  • query_cortex — Executes a natural language query across projects to retrieve information with source citations.
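
When a client like Claude Code invokes one of these tools, it goes over MCP's standard JSON-RPC `tools/call` method. A request might look like this (the `query` argument name here is an assumption — the actual input schema is advertised by the server's `tools/list` response):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_cortex",
    "arguments": { "query": "what caching strategies am I using?" }
  }
}
```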

Environment Variables

ANTHROPIC_API_KEY — API key for Anthropic services if using cloud-based LLM routing.

MCP Client Configuration

claude_desktop_config.json

{
  "mcpServers": {
    "gzoo-cortex": {
      "command": "npx",
      "args": ["-y", "@gzoo/cortex", "mcp"]
    }
  }
}

Try it

What architecture decisions have I made across my projects regarding authentication?
Find all entities related to 'PostgreSQL' in my codebase.
Are there any contradictions in my project decisions?
What caching strategies am I currently using in my active projects?

Frequently Asked Questions

What are the key features of GZOO Cortex?

  • Watches project files for changes in real time
  • Extracts entities and relationships using LLMs
  • Supports natural language querying with source citations
  • Detects contradictions and duplicates across projects
  • Privacy-focused, with local-first data storage

What can I use GZOO Cortex for?

  • Retrieving past architectural decisions scattered across multiple repositories
  • Identifying conflicting patterns or constraints in large codebases
  • Synthesizing project context for documentation or onboarding
  • Tracking action items and dependencies across different project directories

How do I install GZOO Cortex?

Install GZOO Cortex by running: npm install -g @gzoo/cortex

What MCP clients work with GZOO Cortex?

GZOO Cortex works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
