# OpenClaw MCP Server

Model Context Protocol (MCP) server for OpenClaw AI assistant integration.
## Why I Built This
Hey! I created this MCP server because I didn't want to rely solely on messaging channels to communicate with OpenClaw. What really excites me is the ability to connect OpenClaw to the Claude web UI. Essentially, my chat can delegate tasks to my Claw bot, which then handles everything else, like spinning up Claude Code to fix issues for me.
Think of it as an AI assistant orchestrating another AI assistant. Pretty cool, right?
## Quick Start
### Docker (Recommended)
Pre-built images are published to GitHub Container Registry on every release.
```bash
docker pull ghcr.io/freema/openclaw-mcp:latest
```
Create a `docker-compose.yml`:
```yaml
services:
  mcp-bridge:
    image: ghcr.io/freema/openclaw-mcp:latest
    container_name: openclaw-mcp
    restart: unless-stopped
    ports:
      - "3000:3000"
    environment:
      - OPENCLAW_URL=http://host.docker.internal:18789
      - OPENCLAW_GATEWAY_TOKEN=${OPENCLAW_GATEWAY_TOKEN}
      - AUTH_ENABLED=true
      - MCP_CLIENT_ID=openclaw
      - MCP_CLIENT_SECRET=${MCP_CLIENT_SECRET}
      - MCP_ISSUER_URL=${MCP_ISSUER_URL:-}
      - CORS_ORIGINS=https://claude.ai
    extra_hosts:
      - "host.docker.internal:host-gateway"
    read_only: true
    security_opt:
      - no-new-privileges
```
Generate secrets and start:
```bash
export MCP_CLIENT_SECRET=$(openssl rand -hex 32)
export OPENCLAW_GATEWAY_TOKEN=your-gateway-token
docker compose up -d
```
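Before bringing the stack up, a quick pre-flight check can catch missing secrets. This is just a sketch; the variable names match the compose file above:

```shell
# Pre-flight sketch: make sure the secrets referenced by docker-compose.yml
# are set before running `docker compose up -d`.
export MCP_CLIENT_SECRET="$(openssl rand -hex 32)"
export OPENCLAW_GATEWAY_TOKEN="your-gateway-token"

for var in MCP_CLIENT_SECRET OPENCLAW_GATEWAY_TOKEN; do
  eval "val=\$$var"
  if [ -z "$val" ]; then
    echo "error: $var is not set" >&2
    exit 1
  fi
done
echo "environment OK"
```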
Then, in Claude.ai, add a custom MCP connector pointing at your server, authenticating with `MCP_CLIENT_ID=openclaw` and your `MCP_CLIENT_SECRET`.
Tip: Pin a specific version instead of `latest` for production: `ghcr.io/freema/openclaw-mcp:1.1.0`.
### Local (Claude Desktop)
```bash
npx openclaw-mcp
```
Add to your Claude Desktop config:
```json
{
  "mcpServers": {
    "openclaw": {
      "command": "npx",
      "args": ["openclaw-mcp"],
      "env": {
        "OPENCLAW_URL": "http://127.0.0.1:18789",
        "OPENCLAW_GATEWAY_TOKEN": "your-gateway-token",
        "OPENCLAW_TIMEOUT_MS": "300000"
      }
    }
  }
}
```
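If you want to check the file before restarting Claude Desktop, here's a small sketch. The path is illustrative; Claude Desktop keeps `claude_desktop_config.json` in an OS-specific directory (e.g. under `~/Library/Application Support/Claude/` on macOS):

```shell
# Sketch: write the config above to a file and verify it is well-formed JSON
# before restarting Claude Desktop. Adjust the destination path for your OS.
cat > claude_desktop_config.json <<'EOF'
{
  "mcpServers": {
    "openclaw": {
      "command": "npx",
      "args": ["openclaw-mcp"],
      "env": {
        "OPENCLAW_URL": "http://127.0.0.1:18789",
        "OPENCLAW_GATEWAY_TOKEN": "your-gateway-token",
        "OPENCLAW_TIMEOUT_MS": "300000"
      }
    }
  }
}
EOF
python3 -m json.tool claude_desktop_config.json > /dev/null && echo "config OK"
```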
### Remote (Claude.ai) without Docker
```bash
AUTH_ENABLED=true MCP_CLIENT_ID=openclaw MCP_CLIENT_SECRET=your-secret \
MCP_ISSUER_URL=https://mcp.your-domain.com \
CORS_ORIGINS=https://claude.ai OPENCLAW_GATEWAY_TOKEN=your-gateway-token \
npx openclaw-mcp --transport sse --port 3000
```
Important: When running behind a reverse proxy (Caddy, nginx, etc.), you must set `MCP_ISSUER_URL` (or `--issuer-url`) to your public HTTPS URL. Without this, OAuth metadata will advertise `http://localhost:3000` and clients will fail to authenticate.
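A minimal guard for this requirement (a sketch, using the variable names above) fails fast when auth is enabled but the issuer URL is missing or not HTTPS:

```shell
# Sketch: refuse to start when AUTH_ENABLED=true but MCP_ISSUER_URL is not a
# public HTTPS URL, which would leave OAuth metadata advertising localhost.
AUTH_ENABLED=true
MCP_ISSUER_URL="https://mcp.your-domain.com"   # replace with your public URL

if [ "$AUTH_ENABLED" = "true" ]; then
  case "$MCP_ISSUER_URL" in
    https://*) echo "issuer OK: $MCP_ISSUER_URL" ;;
    *)
      echo "error: MCP_ISSUER_URL must be a public HTTPS URL" >&2
      exit 1
      ;;
  esac
fi
```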
See Installation Guide for details.
## Architecture
```
┌─────────────────────────────────────────────────────────────┐
│                         Your Server                         │
│                                                             │
│  ┌─────────────────┐      ┌───────────────────────────┐     │
│  │    OpenClaw     │      │       OpenClaw MCP        │     │
│  │    Gateway      │─────►│       Bridge Server       │     │
│  │    :18789       │      │       :3000               │     │
│  │                 │      │                           │     │
│  │  OpenAI-compat  │      │  - OAuth 2.1 auth         │     │
│  │  /v1/chat/...   │      │  - CORS protection        │     │
│  └─────────────────┘      │  - Input validation       │     │
│                           └─────────────┬─────────────┘     │
│                                         │                   │
└─────────────────────────────────────────┼───────────────────┘
                                          │ HTTPS + OAuth 2.1
                                          ▼
                                ┌──────────────────┐
                                │    Claude.ai     │
                                │   (MCP Client)   │
                                └──────────────────┘
```
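Under the hood, the bridge translates MCP tool calls into requests against the gateway's OpenAI-compatible chat endpoint (the `/v1/chat/...` path in the diagram). A sketch of such a request; the model name is a placeholder, so check what your gateway actually expects:

```shell
# Sketch of the request the bridge forwards to the gateway's OpenAI-compatible
# chat-completions endpoint. "openclaw" as a model name is a placeholder.
OPENCLAW_URL="http://127.0.0.1:18789"
payload='{"model":"openclaw","messages":[{"role":"user","content":"ping"}]}'

echo "POST $OPENCLAW_URL/v1/chat/completions"
# With a running gateway and token, the actual call would look like:
#   curl -s "$OPENCLAW_URL/v1/chat/completions" \
#     -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
#     -H "Content-Type: application/json" \
#     -d "$payload"
```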
## Environment Variables
| Variable | Required | Description |
|----------|----------|-------------|
| `OPENCLAW_URL` | required | The URL of your OpenClaw gateway |
| `OPENCLAW_GATEWAY_TOKEN` | required | Authentication token for the OpenClaw gateway |
| `AUTH_ENABLED` | optional | Enable or disable OAuth2 authentication |
| `MCP_CLIENT_ID` | optional | Client ID for OAuth2 authentication |
| `MCP_CLIENT_SECRET` | optional | Client secret for OAuth2 authentication |
| `MCP_ISSUER_URL` | optional | Public HTTPS URL for OAuth metadata |