LLMS-TXT Documentation Server MCP Server


Add it to Claude Code

Run this in a terminal.

claude mcp add llms-txt-docs -- uvx --from mcpdoc mcpdoc --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt" --transport stdio

Fetch and audit documentation from user-defined llms.txt index files.

MCP LLMS-TXT Documentation Server

Overview

llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.
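Per the llms.txt convention, the file is a markdown index: an H1 title, a blockquote summary, and sections of links to detailed docs. A minimal sketch of extracting those links (the sample content below is illustrative, not the real LangGraph file):

```python
import re

def parse_llms_txt(text: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from markdown links in an llms.txt index."""
    return re.findall(r"\[([^\]]+)\]\((https?://[^)\s]+)\)", text)

sample = """# LangGraph

> Build stateful, multi-actor applications with LLMs.

## Docs

- [Quickstart](https://langchain-ai.github.io/langgraph/tutorials/introduction/): Get started
- [Concepts](https://langchain-ai.github.io/langgraph/concepts/): Core ideas
"""

for title, url in parse_llms_txt(sample):
    print(title, url)
```

An MCP host can hand this link list to the model, which then decides which URLs are worth fetching for the task at hand.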

MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open source MCP server that provides MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple fetch_docs tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.

llms-txt

You can find llms.txt files for LangGraph and LangChain here:

  • LangGraph Python: https://langchain-ai.github.io/langgraph/llms.txt
  • LangGraph JS: https://langchain-ai.github.io/langgraphjs/llms.txt
  • LangChain Python: https://python.langchain.com/llms.txt
  • LangChain JS: https://js.langchain.com/llms.txt

Quickstart

Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
Choose an `llms.txt` file to use.
  • For example, here's the LangGraph llms.txt file.

Note: Security and Domain Access Control

For security reasons, mcpdoc implements strict domain access controls:

  1. Remote llms.txt files: When you specify a remote llms.txt URL (e.g., https://langchain-ai.github.io/langgraph/llms.txt), mcpdoc automatically adds only that specific domain (langchain-ai.github.io) to the allowed domains list. This means the tool can only fetch documentation from URLs on that domain.

  2. Local llms.txt files: When using a local file, NO domains are automatically added to the allowed list. You MUST explicitly specify which domains to allow using the --allowed-domains parameter.

  3. Adding additional domains: To allow fetching from domains beyond those automatically included:

    • Use --allowed-domains domain1.com domain2.com to add specific domains
    • Use --allowed-domains '*' to allow all domains (use with caution)

This security measure prevents unauthorized access to domains not explicitly approved by the user, ensuring that documentation can only be retrieved from trusted sources.
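The allowlist rules above can be sketched as follows. This is a simplified illustration of the behavior described, not mcpdoc's actual implementation:

```python
from urllib.parse import urlparse

def is_allowed(url: str, allowed_domains: set[str]) -> bool:
    """Return True if the URL's host is in the allowlist ('*' allows all)."""
    if "*" in allowed_domains:
        return True
    host = urlparse(url).hostname or ""
    return host in allowed_domains

# The domain of a remote llms.txt URL is added to the allowlist automatically.
allowed = {urlparse("https://langchain-ai.github.io/langgraph/llms.txt").hostname}

print(is_allowed("https://langchain-ai.github.io/langgraph/concepts/", allowed))  # True
print(is_allowed("https://example.com/docs", allowed))                            # False
```

With a local llms.txt file, `allowed` starts empty, so every fetch is rejected until you pass --allowed-domains explicitly.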

(Optional) Test the MCP server locally with your `llms.txt` file(s) of choice:
uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" "LangChain:https://python.langchain.com/llms.txt" \
    --transport sse \
    --port 8082 \
    --host localhost


Then run the MCP Inspector and connect it to the running server:
npx @modelcontextprotocol/inspector


  • Here, you can test the tool calls.
Connect to Cursor
  • Open Cursor Settings and select the MCP tab.
  • This will open the ~/.cursor/mcp.json file.


  • Paste the following into the file (we use the langgraph-docs-mcp name and link to the LangGraph llms.txt).
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
  • Confirm that the server is running in the MCP tab.

Tools (1)

fetch_docs: Reads URLs within any of the provided llms.txt files to retrieve documentation content.
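For reference, an MCP client invokes this tool with a JSON-RPC 2.0 tools/call request, per the MCP specification. A hedged example (the exact argument name mcpdoc expects for the URL is an assumption here):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch_docs",
    "arguments": {
      "url": "https://langchain-ai.github.io/langgraph/concepts/"
    }
  }
}
```

Because each such request is visible in the host application, you can audit exactly which URLs were fetched and what content came back.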

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}

Try it

Fetch the latest documentation for LangGraph using the configured llms.txt file.
Retrieve the content from the LangChain documentation URL provided in the index.
List the available documentation sources defined in my current llms.txt configuration.

Frequently Asked Questions

What are the key features of LLMS-TXT Documentation Server?

  • Provides a user-defined list of llms.txt files for context retrieval.
  • Includes a fetch_docs tool to read documentation from URLs.
  • Implements strict domain access controls for security.
  • Allows auditing of tool calls and returned context.
  • Supports both remote and local llms.txt file configurations.

What can I use LLMS-TXT Documentation Server for?

  • Developers needing to provide LLMs with accurate, up-to-date library documentation.
  • Users who want to audit exactly what documentation context is being fed to their AI.
  • Teams maintaining custom llms.txt indexes for internal project documentation.
  • Developers working with LangChain or LangGraph who need quick access to official docs.

How do I install LLMS-TXT Documentation Server?

There is no separate installation step: uvx downloads and runs mcpdoc on demand. For example, to start the server locally: uvx --from mcpdoc mcpdoc --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" "LangChain:https://python.langchain.com/llms.txt" --transport sse --port 8082 --host localhost

What MCP clients work with LLMS-TXT Documentation Server?

LLMS-TXT Documentation Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.

Turn this server into reusable context

Keep LLMS-TXT Documentation Server docs, env vars, and workflow notes in Conare so your agent carries them across sessions.

Need the old visual installer? Open Conare IDE.