mcp-filesystem-readonly
A read-only filesystem FastMCP server. Configure a root directory and let AI assistants browse its contents via MCP tools.

MCP (Model Context Protocol) is an open standard that lets AI assistants call external tools and services. This server implements MCP over HTTP so any MCP-compatible AI application can reach it.


Prerequisites

  • Docker — for the Docker Compose deployment path
  • uv — for the source deployment path (see Installing uv)
  • Node.js — required for the git commit hooks, which run commitlint to enforce Conventional Commits

Customising the Template

1. Copy the template

On GitHub — click Use this template → Create a new repository. This creates a clean copy with no fork relationship and no template history.

Without GitHub — clone, strip the history, and reinitialise:

git clone https://github.com/sesopenko/mcp-base.git my-project
cd my-project
rm -rf .git
git init
git add .
git commit -m "chore: bootstrap from mcp-base template"

2. Customise identity values

Edit project.env to set your own values (Docker image name, package name, project name, description), then run the setup script to substitute them throughout the repository:

bash scripts/apply-project-config.sh

The script is idempotent — safe to run multiple times.
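The script's actual contents aren't reproduced here, but the idempotent-substitution idea it describes can be sketched as follows. The placeholder token name and file names are invented for illustration only:

```shell
# Hypothetical sketch of idempotent substitution (not the real script).
# Works in a scratch directory so the sketch is self-contained.
set -eu
dir=$(mktemp -d)
printf 'PROJECT_NAME=my-project\n' > "$dir/project.env"
printf 'Welcome to __PROJECT_NAME__\n' > "$dir/README.md"
. "$dir/project.env"
# Once the token is replaced, sed finds no match and changes nothing,
# so running the substitution a second time is a no-op.
sed -i "s/__PROJECT_NAME__/${PROJECT_NAME}/g" "$dir/README.md"
sed -i "s/__PROJECT_NAME__/${PROJECT_NAME}/g" "$dir/README.md"
cat "$dir/README.md"
```

The second sed call demonstrates the idempotency: the file content is identical whether the substitution runs once or many times.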


Quick Start

Option A — Docker Compose

  1. Create a docker-compose.yml:

    services:
      mcp-filesystem-readonly:
        image: sesopenko/mcp-filesystem-readonly:latest
        ports:
          - "8080:8080"
        volumes:
          - ./config.toml:/config/config.toml:ro
          - /mnt/video:/mnt/video:ro
        restart: unless-stopped
    
  2. Copy the example config and edit it:

    cp config.toml.example config.toml
    
  3. Start the server:

    docker compose up -d
    

Option B — Run from Source

  1. Install uv if you haven't already.

  2. Install dependencies:

    uv sync
    
  3. Copy the example config and edit it:

    cp config.toml.example config.toml
    
  4. Start the server:

    uv run python -m mcp_base
    

Security

This server has no authentication on its MCP endpoint. It is designed for LAN use only.

Do not expose this server directly to the internet.

If you need to access it remotely, place it behind a reverse proxy that handles TLS termination and access control. Configuring a reverse proxy is outside the scope of this project.


Configuration

Create a config.toml in the working directory (or pass --config <path>):

[server]
host = "0.0.0.0"
port = 8080

[logging]
level = "info"

[filesystem]
roots = "/mnt/video"

[server]

Key    Default      Description
host   "0.0.0.0"    Address the MCP server listens on. 0.0.0.0 binds all interfaces.
port   8080         Port the MCP server listens on.

[logging]

Key    Default      Description
level  "info"       Log verbosity. One of: debug, info, warning, error.

[filesystem]

Key    Required     Description
roots  yes          Comma-separated list of absolute paths exposed via list_folder. Only paths within one of these roots can be listed.

Connecting an AI Application

This server uses the Streamable HTTP MCP transport. Clients communicate via HTTP POST with streaming responses — opening the endpoint in a browser will return a Not Acceptable error, which is expected.

Point your MCP-compatible AI application at the server's MCP endpoint:

http://<host>:<port>/mcp

For example, if the server is running on 192.168.1.10 with the default port:

http://192.168.1.10:8080/mcp

Consult your AI application's documentation for how to register an MCP server. Ensure it supports the Streamable HTTP transport (most modern MCP clients do).
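As a concrete example, Claude Code can register a Streamable HTTP server with its claude mcp add command; the server name and IP address below are illustrative and should be adjusted to your setup:

```shell
# Register the endpoint with Claude Code (adjust host/port to yours)
claude mcp add --transport http filesystem-readonly http://192.168.1.10:8080/mcp

# A plain GET without MCP headers should return the Not Acceptable
# response described above, confirming the server is reachable
curl -i http://192.168.1.10:8080/mcp
```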


Example System Prompt

XML is preferred over markdown for system prompts because explicit named tags give unambiguous semantic meaning — the AI always knows exactly what each block contains. Markdown headings require inference and are more likely to be misinterpreted.

Copy and adapt this prompt to give your AI assistant clear guidance on using the tools.

Tip — let an LLM write this for you. XML-structured system prompts are effective but unfamiliar to most developers and tedious to write by hand. A quick conversation with any capable LLM (describe the server's tools and ask for an XML-structured prompt) will usually produce a solid starting point.
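The prompt itself isn't reproduced here; the following is a minimal sketch of the XML style described above, assuming only the list_folder tool documented below. The tag names are illustrative, not a required schema:

```xml
<role>
  You are a file-browsing assistant with read-only access to a media
  library exposed through the list_folder MCP tool.
</role>
<tools>
  <tool name="list_folder">
    Lists the contents of a directory. Only paths inside the configured
    roots can be listed; never attempt to read or modify file contents.
  </tool>
</tools>
<constraints>
  If a path is outside the configured roots, tell the user so instead
  of retrying with path variations.
</constraints>
```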

Tools (1)

list_folder — Lists the contents of a directory within the configured roots.
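The server's implementation isn't shown here, but the core logic such a tool implies — list a directory only if it resolves to a path inside one of the configured roots — can be sketched in a few lines. The function signature and return shape are assumptions for illustration:

```python
import os


def list_folder(path, roots):
    """Return sorted entry names in `path`, refusing paths outside `roots`.

    Resolves symlinks and `..` components first, so a path like
    `/mnt/video/../etc` cannot escape a configured root.
    Sketch only; not the server's actual code.
    """
    real = os.path.realpath(path)
    allowed = any(
        os.path.commonpath([real, os.path.realpath(r)]) == os.path.realpath(r)
        for r in roots
    )
    if not allowed:
        raise PermissionError(f"{path} is outside the configured roots")
    return sorted(os.listdir(real))
```

Checking containment on the resolved path (rather than the raw string) is what makes the roots restriction meaningful against symlink and path-traversal tricks.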

Environment Variables

CONFIG_PATH — Path to the configuration file.

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "filesystem-readonly": {
      "command": "uv",
      "args": ["run", "mcp_base"],
      "env": {
        "CONFIG_PATH": "/path/to/config.toml"
      }
    }
  }
}

Try it

List the files in the /mnt/video directory to see what media is available.
Can you browse the contents of the root directory and tell me what folders are present?
Find all files within the configured filesystem roots and summarize the directory structure.

Frequently Asked Questions

What are the key features of MCP Filesystem Readonly?

Read-only access to local filesystem directories. Configurable root paths for security. Implements MCP over HTTP for broad compatibility. Built-in health check utility. Supports Docker and source-based deployment.

What can I use MCP Filesystem Readonly for?

Allowing an AI assistant to index or summarize local media libraries. Providing read-only access to project documentation folders for AI context. Enabling AI-assisted file discovery in a secure, restricted local environment.

How do I install MCP Filesystem Readonly?

Create a docker-compose.yml and config.toml as described in Quick Start, then run: docker compose up -d

What MCP clients work with MCP Filesystem Readonly?

MCP Filesystem Readonly works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
