MCP LLMS-TXT Documentation Server

Fetch and audit documentation from user-defined llms.txt index files.

Overview

llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.

MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open-source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple fetch_docs tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.

llms-txt

You can find llms.txt files for LangGraph and LangChain here:

Library llms.txt
LangGraph Python https://langchain-ai.github.io/langgraph/llms.txt
LangGraph JS https://langchain-ai.github.io/langgraphjs/llms.txt
LangChain Python https://python.langchain.com/llms.txt
LangChain JS https://js.langchain.com/llms.txt
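An llms.txt file is itself a small markdown index: a title, an optional summary, and sections of links to detailed markdown pages. A minimal sketch of the format (the section name and link targets below are illustrative, not copied from any real index):

```markdown
# LangGraph

> LangGraph is a framework for building stateful, multi-actor LLM applications.

## Docs

- [Quickstart](https://langchain-ai.github.io/langgraph/tutorials/): Build your first graph
- [Concepts](https://langchain-ai.github.io/langgraph/concepts/): Core ideas and terminology
```

The fetch_docs tool reads URLs listed in files like this, subject to the domain access controls described below.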

Quickstart

Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
Choose an `llms.txt` file to use.
  • For example, you can use the LangGraph llms.txt file: https://langchain-ai.github.io/langgraph/llms.txt

Note: Security and Domain Access Control

For security reasons, mcpdoc implements strict domain access controls:

  1. Remote llms.txt files: When you specify a remote llms.txt URL (e.g., https://langchain-ai.github.io/langgraph/llms.txt), mcpdoc automatically adds only that specific domain (langchain-ai.github.io) to the allowed domains list. This means the tool can only fetch documentation from URLs on that domain.

  2. Local llms.txt files: When using a local file, NO domains are automatically added to the allowed list. You MUST explicitly specify which domains to allow using the --allowed-domains parameter.

  3. Adding additional domains: To allow fetching from domains beyond those automatically included:

    • Use --allowed-domains domain1.com domain2.com to add specific domains
    • Use --allowed-domains '*' to allow all domains (use with caution)

This security measure prevents unauthorized access to domains not explicitly approved by the user, ensuring that documentation can only be retrieved from trusted sources.
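The allowlist logic described above can be sketched as a simple domain check. This is an illustrative sketch, not mcpdoc's actual implementation, which may differ in details such as scheme handling or subdomain matching:

```python
from urllib.parse import urlparse

def extract_domain(url: str) -> str:
    # Return the host portion of a URL, e.g. "langchain-ai.github.io".
    return urlparse(url).netloc

def is_allowed(url: str, allowed_domains: set[str]) -> bool:
    # "*" permits every domain, mirroring --allowed-domains '*'.
    if "*" in allowed_domains:
        return True
    return extract_domain(url) in allowed_domains

# A remote llms.txt URL seeds the allowlist with its own domain only.
allowed = {extract_domain("https://langchain-ai.github.io/langgraph/llms.txt")}
print(is_allowed("https://langchain-ai.github.io/langgraph/concepts.md", allowed))  # True
print(is_allowed("https://example.com/docs.md", allowed))  # False
```

For a local llms.txt file, `allowed` would start empty, so every fetch fails until you pass --allowed-domains explicitly.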

(Optional) Test the MCP server locally with your `llms.txt` file(s) of choice:
uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" "LangChain:https://python.langchain.com/llms.txt" \
    --transport sse \
    --port 8082 \
    --host localhost

Run the MCP Inspector and connect to the running server:

npx @modelcontextprotocol/inspector

  • Here, you can test the tool calls.
Connect to Cursor
  • Open Cursor Settings and the MCP tab.
  • This will open the ~/.cursor/mcp.json file.


  • Paste the following into the file (we use the langgraph-docs-mcp name and link to the LangGraph and LangChain llms.txt files).
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
  • Confirm that the server is running in the Cursor Settings MCP tab.

Tools (2)

fetch_docs: Read URLs within any of the provided llms.txt files to retrieve documentation context.
list_docs: List documentation sources from the user-defined llms.txt index files.
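Under the hood, an MCP client invokes these tools with JSON-RPC tools/call requests, per the MCP specification. A sketch of such a request for fetch_docs (the argument name "url" is an assumption here; check the tool schema reported by the server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch_docs",
    "arguments": {
      "url": "https://langchain-ai.github.io/langgraph/llms.txt"
    }
  }
}
```

Because every fetch goes through this tool call, the host application can surface each request and its returned context for auditing.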

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}

Try it

List the available documentation sources from my llms.txt index.
Fetch the documentation for LangGraph from the URL provided in the llms.txt file.
Retrieve the latest context for LangChain JS using the fetch_docs tool.
Audit the documentation content returned from the LangGraph Python index.

Frequently Asked Questions

What are the key features of MCP LLMS-TXT Documentation Server?

  • User-defined list of llms.txt files for custom documentation indexing.
  • Strict domain access controls to ensure documentation is only fetched from trusted sources.
  • Full auditability of tool calls and returned context for transparent retrieval.
  • Support for both remote and local llms.txt files with configurable security parameters.
  • Compatibility with MCP host applications like Cursor, Windsurf, and Claude Desktop.

What can I use MCP LLMS-TXT Documentation Server for?

  • Developers needing precise context from LangChain or LangGraph documentation within their IDE.
  • Teams wanting to audit the specific documentation context being fed into LLMs.
  • Users who want to restrict LLM documentation retrieval to specific, trusted domains.
  • Standardizing documentation access across different AI coding assistants using the MCP protocol.

How do I install MCP LLMS-TXT Documentation Server?

Install MCP LLMS-TXT Documentation Server by running: uvx --from mcpdoc mcpdoc --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" --transport sse --port 8082 --host localhost

What MCP clients work with MCP LLMS-TXT Documentation Server?

MCP LLMS-TXT Documentation Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
