MCP-Crawl4AI MCP Server

Local setup required. This server has to be installed and prepared on your machine before you register it in Claude Code.
Step 1: Set the server up locally

Run this once to install and prepare the server before adding it to Claude Code.

Run in terminal
pip install mcp-crawl4ai
mcp-crawl4ai --setup
Step 2: Register it in Claude Code

After the local setup is done, run this command to register the server with Claude Code.

Run in terminal
claude mcp add mcp-crawl4ai -- mcp-crawl4ai

This points Claude Code at the mcp-crawl4ai entry point that pip installed in step 1, so no build path is needed.

README.md

MCP-Crawl4AI

A Model Context Protocol server for web crawling powered by Crawl4AI


Overview

MCP-Crawl4AI is a Model Context Protocol server that gives AI systems access to the live web. Built on FastMCP v3 and Crawl4AI, it exposes 4 tools, 2 resources, and 3 prompts through the standardized MCP interface, backed by a lifespan-managed headless Chromium browser.

Only 2 runtime dependencies: fastmcp and crawl4ai.

[!TIP] Full documentation site →

Key Features

  • Full MCP compliance via FastMCP v3 with tool annotations (readOnlyHint, destructiveHint, etc.)
  • 4 focused tools centered on canonical scrape/crawl plus session lifecycle/artifacts
  • 3 prompts for common LLM workflows (summarize, extract schema, compare pages)
  • 2 resources exposing server configuration and version info
  • Headless Chromium managed as a lifespan singleton (start once, reuse everywhere)
  • Multiple transports — stdio (default) and Streamable HTTP
  • LLM-optimized output — markdown, cleaned HTML, raw HTML, or plain text
  • Canonical option groups for extraction, runtime, diagnostics, sessions, rendering, and traversal
  • List and deep traversal in one crawl contract
  • Session-aware workflows with explicit session close and artifact retrieval tools
  • Auto browser setup — detects missing Playwright browsers and installs automatically

Installation

pip
pip install mcp-crawl4ai
mcp-crawl4ai --setup       # one-time: installs Playwright browsers

uv (recommended)
uv add mcp-crawl4ai
mcp-crawl4ai --setup       # one-time: installs Playwright browsers

Docker
docker build -t mcp-crawl4ai .
docker run -p 8000:8000 mcp-crawl4ai

The Docker image includes Playwright browsers — no separate setup needed.
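For long-running deployments, the docker run invocation above can be captured in a compose file. A minimal sketch using the image tag from the build step (the restart policy is a suggested addition, not part of the project; the port mapping mirrors the run example):

```yaml
services:
  mcp-crawl4ai:
    image: mcp-crawl4ai        # built via: docker build -t mcp-crawl4ai .
    ports:
      - "8000:8000"            # same port mapping as the docker run example
    restart: unless-stopped    # assumption: keep the server up across restarts
```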

Development
git clone https://github.com/wyattowalsh/mcp-crawl4ai.git
cd mcp-crawl4ai
uv sync --group dev
mcp-crawl4ai --setup

Tools (2)

  • crawl: Performs web crawling and scraping tasks with support for deep traversal and structured extraction.
  • close_session: Closes an active crawling session and cleans up resources.

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "crawl4ai": {
      "command": "mcp-crawl4ai"
    }
  }
}
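The config entry above can be merged into an existing claude_desktop_config.json without clobbering other registered servers. A minimal sketch in Python; the config path in the usage comment is an assumption and varies by OS:

```python
import json
from pathlib import Path

def add_server(config_path: Path, name: str, command: str) -> dict:
    """Merge one MCP server entry into the config, preserving existing servers."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})[name] = {"command": command}
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Hypothetical usage (on macOS the file usually lives under
# ~/Library/Application Support/Claude/):
# add_server(Path("claude_desktop_config.json"), "crawl4ai", "mcp-crawl4ai")
```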

Try it

Crawl the documentation page at https://example.com/docs and summarize the key features in markdown.
Extract the pricing table from https://example.com/pricing and format it as a JSON object.
Perform a deep crawl of https://example.com to find all links related to 'API documentation'.
Scrape the latest news articles from https://example.com/news and provide a bulleted list of headlines.

Frequently Asked Questions

What are the key features of MCP-Crawl4AI?

Full MCP compliance via FastMCP v3. Headless Chromium managed as a lifespan singleton. LLM-optimized output including markdown, cleaned HTML, and plain text. Support for session-aware workflows and deep site traversal. Automatic browser setup and Playwright dependency management.

What can I use MCP-Crawl4AI for?

Automated research and data gathering from live websites. Converting complex web pages into LLM-friendly markdown format. Building session-aware scraping workflows for multi-page documentation sites. Extracting structured data from dynamic web content for analysis.

How do I install MCP-Crawl4AI?

Install MCP-Crawl4AI by running: pip install mcp-crawl4ai && mcp-crawl4ai --setup

What MCP clients work with MCP-Crawl4AI?

MCP-Crawl4AI works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
