Fremem (formerly MCP Memory Server)
A persistent vector memory server for Windsurf, VS Code, and other MCP-compliant editors.
📚 Philosophy
- Privacy-first, local-first AI memory: Your data stays on your machine.
- No vendor lock-in: Uses open standards and local files.
- Built for MCP: Designed specifically to enhance Windsurf, Cursor, and other MCP-compatible IDEs.
ℹ️ Status (v0.2.0)
Stable:
- ✅ Local MCP memory with Windsurf/Cursor
- ✅ Multi-project isolation
- ✅ Ingestion of Markdown docs
Not stable yet:
- 🚧 Auto-ingest (file watching)
- 🚧 Memory pruning
- 🚧 Remote sync
Note: There are two ways to run this server:
- Local IDE (stdio): Used by Windsurf/Cursor (default).
- Docker/Server (HTTP): Used for remote deployments or Docker (exposes port 8000).
🏥 Health Check
To verify the server binary runs correctly:
# From within the virtual environment
python -m fremem.server --help
⚡ Quickstart (5-Minute Setup)
There are two ways to set this up: Global Install (recommended for ease of use) or Local Dev.
Option A: Global Install (Like `npm -g`)
This method allows you to run fremem from anywhere without managing virtual environments manually.
1. Install pipx (if not already installed):
MacOS (via Homebrew):
brew install pipx
pipx ensurepath
# Restart your terminal after this!
Linux/Windows: See pipx installation instructions.
2. Install fremem:
# Install from PyPI
pipx install fremem
# Verify installation
fremem --help
3. Configure Windsurf / VS Code:
Since pipx puts the executable in your PATH, the config is simpler:
{
"mcpServers": {
"memory": {
"command": "fremem",
"args": [],
"env": {
"MCP_MEMORY_PATH": "/Users/YOUR_USERNAME/mcp-memory-data"
}
}
}
}
Note on MCP_MEMORY_PATH: This is where fremem will store its persistent database. You can point it at any directory you like (the directory is created if it doesn't already exist). We recommend something like ~/mcp-memory-data or ~/.fremem-data. It must be an absolute path.
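For example, to create the data directory ahead of time and print the absolute path to paste into the config (the `~/mcp-memory-data` location is just the suggestion above, not a requirement):

```shell
# Create a data directory for fremem's persistent database
mkdir -p "$HOME/mcp-memory-data"

# Print the absolute path to use as MCP_MEMORY_PATH
echo "$HOME/mcp-memory-data"
```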
Option B: Local Dev Setup
1. Clone and Setup
git clone https://github.com/iamjpsharma/fremem.git
cd fremem
# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate
# Install dependencies AND the package in editable mode
pip install -e .
2. Configure Windsurf / VS Code (Local Dev)
Add this to your mcpServers configuration (e.g., ~/.codeium/windsurf/mcp_config.json):
Note: Replace /ABSOLUTE/PATH/TO/fremem with the actual full path to the cloned directory.
{
"mcpServers": {
"memory": {
"command": "/ABSOLUTE/PATH/TO/fremem/.venv/bin/python",
"args": ["-m", "fremem.server"],
"env": {
"MCP_MEMORY_PATH": "/ABSOLUTE/PATH/TO/fremem/mcp_memory_data"
}
}
}
}
In local dev mode, it's common to store the data inside the repo (ignored by git), but you can use any absolute path.
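A malformed `mcp_config.json` can silently prevent the server from loading. One way to catch JSON typos before restarting the editor is to run the file through Python's built-in `json.tool`. This sketch writes a sample config to /tmp for illustration; substitute your real config path (e.g., ~/.codeium/windsurf/mcp_config.json):

```shell
# Write a sample config to /tmp (substitute your real config file)
cat > /tmp/mcp_config.json <<'EOF'
{
  "mcpServers": {
    "memory": {
      "command": "/ABSOLUTE/PATH/TO/fremem/.venv/bin/python",
      "args": ["-m", "fremem.server"],
      "env": { "MCP_MEMORY_PATH": "/tmp/mcp_memory_data" }
    }
  }
}
EOF

# json.tool exits non-zero on syntax errors
python3 -m json.tool /tmp/mcp_config.json > /dev/null && echo "config OK"
```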
🚀 Usage
0. HTTP Server (New)
You can run the server via HTTP (SSE) if you prefer:
# Run on port 8000
python -m fremem.server_http
Access the SSE endpoint at http://localhost:8000/sse and send messages to http://localhost:8000/messages.
🐳 Run with Docker
To run the server in a container:
# Build the image
docker build -t fremem .
# Run the container
# Mount your local data directory to /data inside the container
docker run -p 8000:8000 -v $(pwd)/mcp_memory_data:/data fremem
The server will be available at http://localhost:8000/sse.
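If you prefer Docker Compose, here is a minimal sketch equivalent to the `docker run` command above (the service name and relative volume path are illustrative assumptions, not part of the repo):

```yaml
services:
  fremem:
    build: .
    ports:
      - "8000:8000"
    volumes:
      # Mount local data directory to /data inside the container
      - ./mcp_memory_data:/data
```

Then start it with `docker compose up`.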
1. Ingestion (Adding Context)
Use the included helper script ingest.sh to add files to a specific project.
# ingest.sh <file1> <file2> ...
# Example: Project "Thaama"
./ingest.sh project-thaama \
docs/architecture.md \
src/main.py
# Example: Project "OpenClaw"
./ingest.sh project-openclaw \
README.md \
CONTRIBUTING.md
💡 Project ID Naming Convention
It is recommended to use a consistent prefix for your project IDs to avoid collisions:
- project-thaama
- project-openclaw
- project-myapp
2. Connect in Editor
Once configured, the following tools will be available to the AI Assistant:
- `memory_search(project_id, q, filter=None)`: Semantic search. Supports metadata filtering (e.g., `filter={"type": "code"}`). Returns distance scores.
- `memory_add(project_id, id, text)`: Manually add text to a project's memory.
- `memory_list_sources(project_id)`: List the files ingested for a project.
- `memory_delete_source(project_id, source)`: Remove a specific file.
- `memory_stats(project_id)`: Get the chunk count.
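Under the hood, editors invoke these tools via MCP's `tools/call` request. An illustrative JSON-RPC payload for a search (the arguments follow the signatures above; the project ID and query text are made-up examples):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_search",
    "arguments": {
      "project_id": "project-thaama",
      "q": "How does the ingestion pipeline chunk Markdown?"
    }
  }
}
```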
Environment Variables
- `MCP_MEMORY_PATH` (required): Absolute path where the persistent database files are stored.