RobotMem MCP Server

Local setup required. This server must be installed on your machine before you register it in Claude Code.

1. Set up the server locally

Run this once to install the server before adding it to Claude Code.

Run in terminal:
pip install robotmem

2. Register it in Claude Code

After the local setup is done, run this command to point Claude Code at the installed server (this matches the server entry point shown in the Configuration section below):

Run in terminal:
claude mcp add robotmem -- python -m robotmem.mcp

README.md

robotmem — Let Robots Learn from Experience

Your robot ran 1000 experiments, starting from scratch every time. robotmem stores episode experiences — parameters, trajectories, outcomes — and retrieves the most relevant ones to guide future decisions.

FetchPush experiment: +25 percentage-point success-rate improvement (42% → 67%), CPU-only, reproducible in 5 minutes.

Quick Start

pip install robotmem
from robotmem import learn, recall, save_perception, start_session, end_session

# Start an episode
session = start_session(context='{"robot_id": "arm-01", "task": "push"}')

# Record experience
learn(
    insight="grip_force=12.5N yields highest grasp success rate",
    context='{"params": {"grip_force": {"value": 12.5, "unit": "N"}}, "task": {"success": true}}'
)

# Retrieve experiences (structured filtering + spatial nearest-neighbor)
memories = recall(
    query="grip force parameters",
    context_filter='{"task.success": true}',
    spatial_sort='{"field": "spatial.position", "target": [1.3, 0.7, 0.42]}'
)

# Store perception data
save_perception(
    description="Grasp trajectory: 30 steps, success",
    perception_type="procedural",
    data='{"sampled_actions": [[0.1, -0.3, 0.05, 0.8], ...]}'
)

# End episode (auto-consolidation + proactive recall)
end_session(session_id=session["session_id"])

7 APIs

| API | Purpose |
| --- | --- |
| learn | Record physical experiences (parameters / strategies / lessons) |
| recall | Retrieve experiences — BM25 + vector hybrid search with context_filter and spatial_sort |
| save_perception | Store perception / trajectory / force data (visual / tactile / proprioceptive / auditory / procedural) |
| forget | Delete incorrect memories |
| update | Correct memory content |
| start_session | Begin an episode |
| end_session | End an episode (auto-consolidation + proactive recall) |

Key Features

Structured Experience Retrieval

Not just vector search — robotmem understands the structure of robot experiences:

# Retrieve only successful experiences
recall(query="push to target", context_filter='{"task.success": true}')

# Find spatially nearest scenarios
recall(query="grasp object", spatial_sort='{"field": "spatial.object_position", "target": [1.3, 0.7, 0.42]}')

# Combine: success + distance < 0.05m
recall(
    query="push",
    context_filter='{"task.success": true, "params.final_distance.value": {"$lt": 0.05}}'
)
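The dotted-path filter semantics above can be sketched in plain Python. This is illustrative only: `match_filter` and `get_path` are assumed helpers written for this sketch, not robotmem's actual implementation, and only equality and the `$lt` operator from the examples are handled.

```python
import json

def get_path(ctx: dict, dotted: str):
    """Walk a dotted path like 'params.final_distance.value' through nested dicts."""
    cur = ctx
    for key in dotted.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return None
        cur = cur[key]
    return cur

def match_filter(ctx: dict, context_filter: str) -> bool:
    """True if every clause matches: {'$lt': x} compares, plain values test equality."""
    for path, cond in json.loads(context_filter).items():
        value = get_path(ctx, path)
        if isinstance(cond, dict) and "$lt" in cond:
            if value is None or not value < cond["$lt"]:
                return False
        elif value != cond:
            return False
    return True

ctx = {"task": {"success": True}, "params": {"final_distance": {"value": 0.03}}}
flt = '{"task.success": true, "params.final_distance.value": {"$lt": 0.05}}'
print(match_filter(ctx, flt))  # True: successful and within 0.05 m
```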

Context JSON — 4 Sections

{
    "params":  {"grip_force": {"value": 12.5, "unit": "N", "type": "scalar"}},
    "spatial": {"object_position": [1.3, 0.7, 0.42], "target_position": [1.25, 0.6, 0.42]},
    "robot":   {"id": "fetch-001", "type": "Fetch", "dof": 7},
    "task":    {"name": "push_to_target", "success": true, "steps": 38}
}

Each recalled memory automatically extracts params / spatial / robot / task as top-level fields.
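A context payload in this four-section layout can be assembled with the standard library and passed to `learn` as a JSON string. A minimal sketch: the `build_context` helper is not part of robotmem, only the section names follow the schema above.

```python
import json

def build_context(params=None, spatial=None, robot=None, task=None) -> str:
    """Assemble the four-section context JSON, dropping any empty sections."""
    sections = {"params": params, "spatial": spatial, "robot": robot, "task": task}
    return json.dumps({k: v for k, v in sections.items() if v is not None})

ctx = build_context(
    params={"grip_force": {"value": 12.5, "unit": "N", "type": "scalar"}},
    task={"name": "push_to_target", "success": True, "steps": 38},
)
print(ctx)  # JSON string ready for learn(insight=..., context=ctx)
```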

Memory Consolidation + Proactive Recall

end_session automatically triggers:

  • Consolidation: Merges similar memories with Jaccard similarity > 0.50 (protects constraint / postmortem / high-confidence entries)
  • Proactive Recall: Returns historically relevant memories for the next episode
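The merge rule can be illustrated with token-set Jaccard similarity, a sketch of the idea only: robotmem's actual tokenization and merge logic may differ.

```python
def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two memories' token sets: |A ∩ B| / |A ∪ B|."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    if not ta and not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

m1 = "grip_force=12.5N yields highest grasp success rate"
m2 = "grip_force=12.5N yields highest grasp success"
print(f"{jaccard(m1, m2):.2f}")  # above the 0.50 merge threshold
```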

FetchPush Demo

cd examples/fetch_push
pip install gymnasium-robotics
PYTHONPATH=../../src python demo.py  # 90 episodes, ~2 min

Three-phase experiment: baseline → memory writing → memory utilization. Expected Phase C success rate is 10–20 percentage points higher than Phase A.

Architecture

SQLite + FTS5 + vec0
├── BM25 full-text search (jieba CJK tokenizer)
├── Vector search (FastEmbed ONNX, CPU-only)
├── RRF fusion ranking
├── Structured filtering (context_filter)
└── Spatial nearest-neighbor sorting (spatial_sort)
  • CPU-only, no GPU required
  • Single-file database ~/.robotmem/memory.db
  • MCP Server (7 tools) or direct Python import
  • Web management UI: robotmem web
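The RRF fusion step in the stack above can be sketched as standard reciprocal-rank fusion with the common k=60 constant; robotmem's exact k and tie-breaking are not documented here, so treat this as an illustration of the technique, not the library's code.

```python
def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: each doc scores sum(1 / (k + rank)) over lists containing it."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25 = ["mem3", "mem1", "mem7"]  # hypothetical BM25 full-text ranking
vec = ["mem1", "mem7", "mem3"]   # hypothetical vector-similarity ranking
print(rrf_fuse([bm25, vec]))     # ['mem1', 'mem3', 'mem7']
```

A document ranked near the top of both lists outranks one that tops a single list, which is why mem1 wins here.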

Comparison

| Feature | MemoryVLA (Academic) | Mem0 (Product) | robotmem |
| --- | --- | --- | --- |
| Target users | Specific VLA models | Text AI | Robotic AI |
| Memory format | Vectors (opaque) | Text | Natural language + perception + parameters |
| Structured filtering | No | No | Yes (context_filter) |
| Spatial retrieval | No | No | Yes (spatial_sort) |
| Physical parameters | No | No | Yes (params section) |
| Installation | Compile from paper code | pip install | pip install |
| Database | Embedded | Cloud | Local SQLite |

License

Apache-2.0

Tools (7)

learn: Record physical experiences including parameters, strategies, and lessons.
recall: Retrieve experiences using BM25 and vector hybrid search with context filtering and spatial sorting.
save_perception: Store perception, trajectory, or force data such as visual, tactile, or procedural data.
forget: Delete incorrect memories.
update: Correct existing memory content.
start_session: Begin a new robotic episode.
end_session: End an episode, triggering auto-consolidation and proactive recall.

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "robotmem": {
      "command": "python",
      "args": ["-m", "robotmem.mcp"]
    }
  }
}

Try it

Recall all successful grasp experiences where the grip force was greater than 10N.
Find memories related to 'push to target' that are spatially nearest to the coordinates [1.3, 0.7, 0.42].
Start a new session for robot arm-01 to perform a push task.
Save a new perception record for a grasp trajectory with 30 steps.
End the current session and consolidate the learned experiences.

Frequently Asked Questions

What are the key features of RobotMem?

  • Hybrid vector and BM25 search for retrieving episodic experiences
  • Structured filtering based on task success and physical parameters
  • Spatial nearest-neighbor sorting for physical task scenarios
  • Automatic memory consolidation using Jaccard similarity
  • Local-only storage in a single-file SQLite database

What can I use RobotMem for?

  • Improving success rates for robotic manipulation tasks like FetchPush
  • Storing and retrieving proprioceptive and tactile data for robot learning
  • Managing robot memory across multiple experimental episodes
  • Filtering past robot experiences by task outcome and spatial constraints

How do I install RobotMem?

Install RobotMem by running: pip install robotmem

What MCP clients work with RobotMem?

RobotMem works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
