Fremem (formerly MCP Memory Server)

A persistent vector memory server for Windsurf, VS Code, and other MCP-compliant editors.

🌟 Philosophy

  • Privacy-first, local-first AI memory: Your data stays on your machine.
  • No vendor lock-in: Uses open standards and local files.
  • Built for MCP: Designed specifically to enhance Windsurf, Cursor, and other MCP-compatible IDEs.

â„šī¸ Status (v0.2.0)

Stable:

  • ✅ Local MCP memory with Windsurf/Cursor
  • ✅ Multi-project isolation
  • ✅ Ingestion of Markdown docs

Not stable yet:

  • 🚧 Auto-ingest (file watching)
  • 🚧 Memory pruning
  • 🚧 Remote sync

Note: There are two ways to run this server:

  1. Local IDE (stdio): Used by Windsurf/Cursor (default).
  2. Docker/Server (HTTP): Used for remote or containerized deployments (exposes port 8000).

đŸĨ Health Check

To verify the server binary runs correctly:

# From within the virtual environment
python -m fremem.server --help

✅ Quickstart (5-Minute Setup)

There are two ways to set this up: Global Install (recommended for ease of use) or Local Dev.

Option A: Global Install (Like `npm -g`)

This method allows you to run fremem from anywhere without managing virtual environments manually.

1. Install pipx (if not already installed):

MacOS (via Homebrew):

brew install pipx
pipx ensurepath
# Restart your terminal after this!

Linux/Windows: See pipx installation instructions.

2. Install fremem:

# Install from PyPI
pipx install fremem

# Verify installation
fremem --help

Configure Windsurf / VS Code:

Since pipx puts the executable in your PATH, the config is simpler:

{
  "mcpServers": {
    "memory": {
      "command": "fremem",
      "args": [],
      "env": {
        "MCP_MEMORY_PATH": "/Users/YOUR_USERNAME/mcp-memory-data"
      }
    }
  }
}

Note on MCP_MEMORY_PATH: This is where fremem stores its persistent database. You can point it at any directory you like; fremem will create it if it doesn't exist. We recommend something like ~/mcp-memory-data or ~/.fremem-data. The value in the config must be an absolute path.
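For example, to set up the recommended location and print the absolute path to paste into the JSON config:

```shell
# Create the data directory ahead of time (fremem can also create it on first run)
mkdir -p "$HOME/mcp-memory-data"

# Print the expanded absolute path to paste into the config
echo "$HOME/mcp-memory-data"
```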

Option B: Local Dev Setup

1. Clone and Setup

git clone https://github.com/iamjpsharma/fremem.git
cd fremem

# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies AND the package in editable mode
pip install -e .

2. Configure Windsurf / VS Code (Local Dev)

Add this to your mcpServers configuration (e.g., ~/.codeium/windsurf/mcp_config.json):

Note: Replace /ABSOLUTE/PATH/TO/fremem with the actual full path to the cloned directory.

{
  "mcpServers": {
    "memory": {
      "command": "/ABSOLUTE/PATH/TO/fremem/.venv/bin/python",
      "args": ["-m", "fremem.server"],
      "env": {
        "MCP_MEMORY_PATH": "/ABSOLUTE/PATH/TO/fremem/mcp_memory_data"
      }
    }
  }
}

In local dev mode, it's common to store the data inside the repo (ignored by git), but you can use any absolute path.

🚀 Usage

0. HTTP Server (New)

You can run the server via HTTP (SSE) if you prefer:

# Run on port 8000
python -m fremem.server_http

Connect to the SSE endpoint at http://localhost:8000/sse and POST messages to http://localhost:8000/messages.
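SSE is a line-oriented plain-text protocol, so a client just reads `event:`/`data:` lines from the /sse stream. Below is a minimal, generic parser sketch; the sample `endpoint` event is an assumption based on how MCP SSE servers typically announce the messages URL, not a captured fremem response:

```python
def parse_sse(stream_lines):
    """Parse Server-Sent Events from an iterable of text lines.

    Yields (event, data) tuples. Generic sketch; the exact event
    names fremem emits are not specified here.
    """
    event, data = "message", []
    for line in stream_lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "":  # a blank line terminates one event
            if data:
                yield event, "\n".join(data)
            event, data = "message", []

# Hypothetical frame: server announces where to POST messages
frames = ["event: endpoint\n", "data: /messages?session=abc\n", "\n"]
print(list(parse_sse(frames)))  # [('endpoint', '/messages?session=abc')]
```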

đŸŗ Run with Docker

To run the server in a container:

# Build the image
docker build -t fremem .

# Run the container
# Mount your local data directory to /data inside the container
docker run -p 8000:8000 -v $(pwd)/mcp_memory_data:/data fremem

The server will be available at http://localhost:8000/sse.
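If you prefer Compose, a minimal docker-compose.yml equivalent to the run command above (image name, port, and volume mapping mirror the commands in this section):

```yaml
# docker-compose.yml — equivalent to the docker run command above
services:
  fremem:
    build: .
    ports:
      - "8000:8000"
    volumes:
      - ./mcp_memory_data:/data
```

Then start it with `docker compose up`.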

1. Ingestion (Adding Context)

Use the included helper script ingest.sh to add files to a specific project.

# Usage: ./ingest.sh <project_id> <file1> <file2> ...

# Example: Project "Thaama"
./ingest.sh project-thaama \
  docs/architecture.md \
  src/main.py

# Example: Project "OpenClaw"
./ingest.sh project-openclaw \
  README.md \
  CONTRIBUTING.md
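Under the hood, ingestion splits documents into chunks before embedding and storing them. fremem's actual chunking strategy isn't documented here; the sketch below illustrates the general idea with naive heading-based splitting:

```python
def chunk_markdown(text, max_chars=500):
    """Split Markdown into chunks on top-level headings, then by size.

    A naive illustration; fremem's real chunking strategy may differ.
    """
    sections, current = [], []
    for line in text.splitlines():
        if line.startswith("# ") and current:
            sections.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("\n".join(current))
    # Further split oversized sections into fixed-size windows
    chunks = []
    for s in sections:
        for i in range(0, len(s), max_chars):
            chunks.append(s[i:i + max_chars])
    return chunks

doc = "# Intro\nhello\n# Design\nworld"
print(chunk_markdown(doc))  # ['# Intro\nhello', '# Design\nworld']
```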

💡 Project ID Naming Convention

It is recommended to use a consistent prefix for your project IDs to avoid collisions:

  • project-thaama
  • project-openclaw
  • project-myapp
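fremem itself does not necessarily enforce this convention, but a small validator can keep IDs consistent across your scripts. The regex below is an assumption inferred from the examples above, not part of fremem:

```python
import re

# Assumed convention: "project-" prefix plus a lowercase slug
PROJECT_ID = re.compile(r"^project-[a-z0-9][a-z0-9-]*$")

def valid_project_id(pid: str) -> bool:
    """Check a project ID against the suggested 'project-<slug>' convention."""
    return bool(PROJECT_ID.match(pid))

print(valid_project_id("project-thaama"))  # True
print(valid_project_id("Thaama"))          # False
```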

2. Connect in Editor

Once configured, the following tools will be available to the AI Assistant:

  • memory_search(project_id, q, filter=None): Semantic search. Supports metadata filtering (e.g., filter={"type": "code"}). Returns distance scores.
  • memory_add(project_id, id, text): Manual addition.
  • memory_list_sources(project_id): List the files ingested into a project.
  • memory_delete_source(project_id, source): Remove a specific file.
  • memory_stats(project_id): Get chunk count.
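To illustrate the filter and distance semantics of memory_search, here is a toy, self-contained mock. This is not fremem's actual implementation (which uses LanceDB and real embeddings); the vectors and metadata below are made up:

```python
import math

def cosine_distance(a, b):
    """Cosine distance: 0.0 means identical direction, larger means less similar."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

# Toy index: (embedding, metadata, text) — real embeddings come from a model
index = [
    ([1.0, 0.0], {"type": "code"}, "def main(): ..."),
    ([0.0, 1.0], {"type": "doc"},  "Architecture overview"),
]

def memory_search_mock(query_vec, filter=None, k=2):
    """Illustrates metadata filtering + distance scores; not fremem's code."""
    hits = [(cosine_distance(query_vec, vec), meta, text)
            for vec, meta, text in index
            if filter is None or all(meta.get(m) == v for m, v in filter.items())]
    return sorted(hits, key=lambda h: h[0])[:k]

print(memory_search_mock([1.0, 0.0], filter={"type": "code"}))
# [(0.0, {'type': 'code'}, 'def main(): ...')]
```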

Tools

  • memory_search: Performs a semantic search across project memory.
  • memory_add: Manually adds text content to a specific project memory.
  • memory_list_sources: Lists all files or sources ingested for a specific project.
  • memory_delete_source: Removes a specific file or source from project memory.
  • memory_stats: Retrieves the chunk count and statistics for a project.

Environment Variables

  • MCP_MEMORY_PATH (required): Absolute path where the persistent database files are stored.

Try it

→ Search my project-thaama memory for information regarding the architecture design.
→ What are the current statistics for the project-openclaw memory?
→ List all the source files currently indexed in project-myapp.
→ Add a new memory entry to project-thaama about the recent API changes.

Frequently Asked Questions

What are the key features of Fremem?

  • Persistent local vector memory using LanceDB.
  • Multi-project isolation for context management.
  • Semantic search across project documentation and code.
  • Support for both stdio (IDE) and HTTP (SSE) transport modes.
  • Privacy-first design with no external API keys required.

What can I use Fremem for?

  • Maintaining long-term context across different coding projects in Windsurf or Cursor.
  • Ingesting project-specific documentation so AI assistants can answer questions about internal architecture.
  • Managing local knowledge bases without relying on cloud-based vector databases.
  • Tracking and searching through project-specific notes and code snippets.

How do I install Fremem?

Install Fremem by running: pipx install fremem

What MCP clients work with Fremem?

Fremem works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
