MCP WebScout MCP Server

Local setup required. This server must be cloned and prepared on your machine before you register it in Claude Code.
1

Set the server up locally

Run this once to clone and prepare the server before adding it to Claude Code.

Run in terminal
git clone <repository>
cd mcp-webscout
pip install -e ".[dev]"
2

Register it in Claude Code

After the local setup is done, run this command to point Claude Code at the installed server.

Run in terminal
claude mcp add -e "DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY}" mcp-webscout -- python -m mcp_webscout

Run it with the virtual environment from step 1 activated so the mcp_webscout module is importable.

Required: DEEPSEEK_API_KEY (plus 2 optional variables)
README.md

Web search and intelligent content extraction with LLM-powered analysis.

MCP WebScout

A Model Context Protocol (MCP) server providing web search (DuckDuckGo) and intelligent content extraction with LLM-powered analysis.

Features

  • search: Search the web using DuckDuckGo
  • fetch: Advanced web fetching with Crawl4AI and LLM extraction

System Requirements

Requirement Version Notes
Python >= 3.10 Required runtime environment
pip latest Package manager (included with Python)
Playwright latest Required by Crawl4AI for browser automation
DeepSeek API Key - Required for LLM extraction mode
Proxy (optional) - Needed only for users in mainland China

Python Dependencies (14 packages; key packages listed below)

Package Version Purpose
mcp >=1.0.0 MCP protocol implementation
duckduckgo-search >=3.0.0 DuckDuckGo search API
requests >=2.32.0 HTTP requests
beautifulsoup4 >=4.12.0 HTML parsing
openai >=1.30.0 OpenAI API client for DeepSeek
crawl4ai >=0.5.0 Advanced web scraping

Quick Start

Get started in 5 steps:

1. Clone and Setup Environment

git clone <repository>
cd mcp-webscout
python -m venv .venv

On Windows:

.venv\Scripts\activate

On macOS/Linux:

source .venv/bin/activate

2. Install Dependencies

pip install -e ".[dev]"

3. Install Playwright Browsers

playwright install chromium

4. Configure Environment Variables

cp .env.example .env

Edit .env and add your configuration:

# Required for LLM extraction
DEEPSEEK_API_KEY=sk-your-actual-key-here

# Required for mainland China users
PROXY_URL=http://127.0.0.1:7890
USE_PROXY=true
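As an illustration of how USE_PROXY gates PROXY_URL, here is a minimal sketch. It assumes the server treats USE_PROXY as a case-insensitive boolean flag; the actual implementation may differ:

```python
import os

def resolve_proxy_config(env=None):
    """Hypothetical sketch: USE_PROXY decides whether PROXY_URL is used at all."""
    env = os.environ if env is None else env
    use_proxy = env.get("USE_PROXY", "false").lower() == "true"
    proxy_url = env.get("PROXY_URL") if use_proxy else None
    return use_proxy, proxy_url

# With the .env values above:
print(resolve_proxy_config({"USE_PROXY": "true", "PROXY_URL": "http://127.0.0.1:7890"}))
# → (True, 'http://127.0.0.1:7890')
```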

5. Verify Installation

# Run tests
pytest tests/ -v

# Test the server
python -m mcp_webscout --help

Detailed Configuration

For detailed environment setup instructions, see ENV_SETUP.md.

Usage

As a Command

mcp-webscout

As a Python Module

python -m mcp_webscout

With Claude Desktop

Add to your claude_desktop_config.json:

Basic Configuration
{
  "mcpServers": {
    "webscout": {
      "command": "mcp-webscout"
    }
  }
}
With Environment Variables (Recommended)
{
  "mcpServers": {
    "webscout": {
      "command": "mcp-webscout",
      "env": {
        "DEEPSEEK_API_KEY": "sk-your-key-here",
        "PROXY_URL": "http://127.0.0.1:7890",
        "USE_PROXY": "true",
        "DEFAULT_MAX_LENGTH": "5000",
        "PYTHONUTF8": "1"
      }
    }
  }
}
Windows Configuration
{
  "mcpServers": {
    "webscout": {
      "command": "python",
      "args": ["-m", "mcp_webscout"],
      "env": {
        "DEEPSEEK_API_KEY": "sk-your-key-here",
        "PROXY_URL": "http://127.0.0.1:7890",
        "USE_PROXY": "true",
        "PYTHONUTF8": "1"
      }
    }
  }
}
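If you prefer to add one of the entries above programmatically, here is a hedged sketch. The helper name is hypothetical, and the location of claude_desktop_config.json varies by OS, so the path is passed in explicitly:

```python
import json
import pathlib

def add_webscout_entry(config_path):
    """Insert (or overwrite) the basic webscout entry in claude_desktop_config.json."""
    path = pathlib.Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["webscout"] = {
        "command": "mcp-webscout",
        "env": {"DEEPSEEK_API_KEY": "sk-your-key-here"},  # placeholder key
    }
    path.write_text(json.dumps(config, indent=2))
```

Existing entries under mcpServers are preserved; only the webscout key is replaced.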

Tools

search

Search the web using DuckDuckGo.

Parameters:

Name Type Required Description
query string Yes Search query
max_results integer No Maximum results (1-10, default: 5)

Returns:

Formatted search results with titles, URLs, and snippets.

Example:

{
  "query": "Python programming",
  "max_results": 3
}
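The result layout described under Returns can be sketched as follows. The field names (title, href, body) mirror the duckduckgo-search library's convention and are an assumption, not the server's actual code:

```python
def format_results(results, max_results=5):
    """Illustrative only: render search hits as numbered title / URL / snippet blocks."""
    blocks = []
    for i, hit in enumerate(results[:max_results], start=1):
        blocks.append(f"{i}. {hit['title']}\n   {hit['href']}\n   {hit['body']}")
    return "\n\n".join(blocks)

sample = [
    {"title": "Welcome to Python.org", "href": "https://www.python.org",
     "body": "The official home of the Python programming language."},
]
print(format_results(sample, max_results=3))
```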

fetch

Advanced web fetching with Crawl4AI and LLM extraction.

Parameters:

Name Type Required Description
url string Yes URL to fetch
mode string No Extraction mode: simple, llm (default: simple)
prompt string No Custom extraction prompt for LLM mode
max_length integer No Maximum characters (default: 5000)
use_proxy boolean No Use proxy (default: true)

Returns:

Fetched and optionally extracted content.
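To make the defaults in the parameter table concrete, here is a hypothetical argument normalizer; the server's real validation logic may differ:

```python
def normalize_fetch_args(url, mode="simple", prompt=None, max_length=5000, use_proxy=True):
    """Sketch of the documented defaults and constraints for the fetch tool."""
    if mode not in ("simple", "llm"):
        raise ValueError(f"unknown extraction mode: {mode!r}")
    if mode != "llm":
        prompt = None  # a custom prompt only applies to LLM extraction
    return {"url": url, "mode": mode, "prompt": prompt,
            "max_length": max_length, "use_proxy": use_proxy}

print(normalize_fetch_args("https://example.com"))
# → {'url': 'https://example.com', 'mode': 'simple', 'prompt': None, 'max_length': 5000, 'use_proxy': True}
```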


Environment Variables

DEEPSEEK_API_KEY (required): API key for LLM extraction mode
PROXY_URL (optional): Proxy URL for users in mainland China
USE_PROXY (optional): Enable proxy usage


Try it

Search for the latest updates on the Model Context Protocol and summarize the top 3 results.
Fetch the content of this URL and extract the key technical specifications using LLM mode.
Find recent articles about Python 3.13 features and provide a concise summary.
Use the fetch tool to get the main content from this documentation page, limiting the output to 2000 characters.

Frequently Asked Questions

What are the key features of MCP WebScout?

Web search capabilities via DuckDuckGo. Advanced web content fetching using Crawl4AI. LLM-powered content extraction and summarization. Configurable extraction modes and character limits.

What can I use MCP WebScout for?

Researching current events or technical documentation by searching and summarizing web content. Extracting structured data from complex websites for analysis within Claude. Automating the retrieval of specific information from multiple web sources. Bypassing regional access restrictions using proxy configuration for web scraping.

How do I install MCP WebScout?

Clone the repository, create and activate a virtual environment, then run pip install -e ".[dev]" from the project directory. See the Quick Start section for the full steps, including Playwright browser installation.

What MCP clients work with MCP WebScout?

MCP WebScout works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
