Local Mem0 MCP Server

Add it to Claude Code

Run this in a terminal:

claude mcp add mem0-mcp -- docker exec -i mem0-mcp-server python /app/src/server.py

A fully self-hosted Model Context Protocol (MCP) server that integrates Mem0 for persistent memory capabilities. Enables AI assistants like Claude to store and retrieve contextual information across conversations.

✨ Features

  • 🧠 Persistent Memory: Store and retrieve memories across conversations
  • 🔒 Fully Self-Hosted: No external APIs or cloud dependencies
  • 🐳 Containerized: Complete Docker deployment with one command
  • 🚀 Easy Installation: Single script setup for Windows, Mac, and Linux
  • 🤖 Local AI Models: Uses Ollama with phi3:mini and nomic-embed-text
  • 📊 Vector Storage: PostgreSQL with pgvector for efficient memory search
  • 🔌 MCP Compatible: Works with Claude Desktop and other MCP-capable AI tools

🚀 Quick Start

Prerequisites

  • Docker with Docker Compose
  • Git

Installation

Windows:

git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
install.bat

Mac/Linux:

git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
chmod +x install.sh
./install.sh

The installation will:

  1. Build the MCP server container
  2. Start PostgreSQL and Ollama services
  3. Download AI models (~2.5GB total)
  4. Configure Claude Desktop integration
  5. Test the installation

Testing

After installation and configuration:

  1. Restart Claude Desktop completely (close and reopen)
  2. Verify MCP server: Type /mcp - should list mem0-local as available
  3. Test memory storage: "Remember that I'm testing the MCP memory system today"
  4. Test memory retrieval: "What do you remember about me?"
  5. Verify persistence: Restart Claude Desktop and ask again - memories should persist

Troubleshooting MCP Connection:

  • If /mcp shows no servers, check the configuration file path and JSON syntax
  • Ensure Docker containers are running: docker ps
  • Check MCP server logs: docker logs mem0-mcp-server
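
To rule out a JSON syntax error in the configuration file, Python's built-in json.tool validator works on any platform (the Linux path is shown here; substitute the path for your OS):

python -m json.tool ~/.config/Claude/claude_desktop_config.json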

šŸ—ļø Architecture

┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   Claude        │    │   MCP Server     │    │   PostgreSQL    │
│   Desktop       │◄──►│   (FastMCP)      │◄──►│   + pgvector    │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │     Ollama       │
                       │   phi3:mini +    │
                       │ nomic-embed-text │
                       └──────────────────┘
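
Internally, the MCP server wires Mem0 to these services. The following is a minimal sketch of how such a stack is typically configured with Mem0's Python SDK, assuming Mem0's ollama and pgvector providers; the connection values are illustrative placeholders, not this repository's actual settings:

from mem0 import Memory

# Sketch only: provider names follow Mem0's config schema, while the
# database name and credentials below are placeholders.
config = {
    "llm": {
        "provider": "ollama",
        "config": {"model": "phi3:mini", "ollama_base_url": "http://localhost:11434"},
    },
    "embedder": {
        "provider": "ollama",
        "config": {"model": "nomic-embed-text", "ollama_base_url": "http://localhost:11434"},
    },
    "vector_store": {
        "provider": "pgvector",
        "config": {"dbname": "mem0", "user": "mem0", "password": "mem0",
                   "host": "localhost", "port": 5432},
    },
}

memory = Memory.from_config(config)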

🔧 Configuration

Claude Desktop MCP Configuration

After installation, configure Claude Desktop to use the MCP server:

Windows: Edit %APPDATA%\Claude\claude_desktop_config.json

Mac: Edit ~/Library/Application Support/Claude/claude_desktop_config.json

Linux: Edit ~/.config/Claude/claude_desktop_config.json

Add this configuration:

{
  "mcpServers": {
    "mem0-local": {
      "command": "docker",
      "args": [
        "exec", "-i", "mem0-mcp-server",
        "python", "/app/src/server.py"
      ]
    }
  }
}
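
The -i flag keeps the container's stdin attached, which the STDIO transport relies on; without it the server cannot receive MCP requests.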

System Configuration

The system is configured for local operation by default:

  • MCP Server: Runs in Docker container with STDIO transport
  • Database: PostgreSQL with pgvector on port 5432
  • AI Models: Local Ollama instance on port 11434
  • Memory Storage: User-isolated memories with vector embeddings
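
A few lines of standard-library Python confirm that both services are reachable from the host (ports as listed above; /api/tags is Ollama's model-listing endpoint):

import socket
import urllib.request

# Ollama serves HTTP on 11434; /api/tags lists the installed models.
urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5)
# PostgreSQL accepts TCP connections on 5432.
socket.create_connection(("localhost", 5432), timeout=5).close()
print("Ollama and PostgreSQL are reachable")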

📋 Available Memory Operations

  • add_memory: Store new memories
  • search_memories: Find relevant memories by query
  • get_all_memories: Retrieve all memories for a user
  • update_memory: Modify existing memories
  • delete_memory: Remove specific memories
  • delete_all_memories: Clear all memories for a user
  • get_memory_stats: Get memory statistics
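
These tools correspond to Mem0's standard memory API. As a rough sketch of the presumed mapping (reusing the memory object from the configuration sketch above; get_memory_stats is this server's own addition with no direct Mem0 equivalent):

# Hypothetical tool-to-Mem0 mapping, for illustration only.
memory.add("I prefer concise code explanations", user_id="alice")  # add_memory
memory.search("code explanations", user_id="alice")                # search_memories
memory.get_all(user_id="alice")                                    # get_all_memories
memory.update(memory_id="<id>", data="Updated preference")         # update_memory
memory.delete(memory_id="<id>")                                    # delete_memory
memory.delete_all(user_id="alice")                                 # delete_all_memories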

🔍 Troubleshooting

Check services:

docker ps

View logs:

docker-compose -f docker-compose.local.yml logs

Restart services:

docker-compose -f docker-compose.local.yml restart

Clean restart:

docker-compose -f docker-compose.local.yml down -v
# Run install script again
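
Note that the -v flag also removes the named volumes, including the PostgreSQL data volume, so a clean restart erases all stored memories.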

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Mem0 - Memory management framework
  • FastMCP - MCP server implementation
  • Ollama - Local AI model inference
  • pgvector - Vector similarity search

Try it

→Remember that I prefer my code explanations to be concise and focused on best practices.
→What do you remember about my project preferences?
→Search my memories for the notes I saved about the API integration.
→Delete the memory regarding my old project requirements.
→Show me the statistics of how many memories I have stored.

Frequently Asked Questions

What are the key features of Local Mem0 MCP Server?

Persistent memory storage across different AI conversations. Fully self-hosted architecture with no external API dependencies. Containerized deployment using Docker for easy setup. Local AI model inference via Ollama. Efficient vector-based memory search using PostgreSQL and pgvector.

What can I use Local Mem0 MCP Server for?

Maintaining long-term context for coding projects across multiple sessions. Storing personal preferences and work habits for a more personalized AI assistant. Managing research notes and findings that need to be retrieved in future chats. Building a private, local knowledge base that stays on your machine.

How do I install Local Mem0 MCP Server?

Install Local Mem0 MCP Server by cloning the repository and running the platform install script: git clone https://github.com/Synapse-OS/local-mem0-mcp.git && cd local-mem0-mcp, followed by install.bat on Windows or chmod +x install.sh && ./install.sh on Mac/Linux.

What MCP clients work with Local Mem0 MCP Server?

Local Mem0 MCP Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
