mcp-filesystem-readonly

A read-only filesystem FastMCP server. Configure a root directory and let AI assistants browse its contents via MCP tools.
MCP (Model Context Protocol) is an open standard that lets AI assistants call external tools and services. This server implements MCP over HTTP so any MCP-compatible AI application can reach it.
Prerequisites
- Docker — for the Docker Compose deployment path
- uv — for the source deployment path (see Installing uv)
- Node.js — required for the git commit hooks, which use commitlint to enforce Conventional Commits
Customising the Template
1. Copy the template
On GitHub — click Use this template → Create a new repository. This creates a clean copy with no fork relationship and no template history.
Without GitHub — clone, strip the history, and reinitialise:
```bash
git clone https://github.com/sesopenko/mcp-base.git my-project
cd my-project
rm -rf .git
git init
git add .
git commit -m "chore: bootstrap from mcp-base template"
```
2. Customise identity values
Edit project.env to set your own values (Docker image name, package name, project name, description), then run the setup script to substitute them throughout the repository:
bash scripts/apply-project-config.sh
The script is idempotent — safe to run multiple times.
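For orientation, `project.env` is a plain KEY=value file; the exact keys depend on the template version, so the names below are illustrative only — check the file shipped with the template:

```
# project.env — illustrative keys, not necessarily the template's actual names
DOCKER_IMAGE_NAME=yourname/my-project
PACKAGE_NAME=my_project
PROJECT_NAME=my-project
PROJECT_DESCRIPTION="A short description of your MCP server"
```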
Quick Start
Option A — Docker Compose
Create a `docker-compose.yml`:

```yaml
services:
  mcp-filesystem-readonly:
    image: sesopenko/mcp-filesystem-readonly:latest
    ports:
      - "8080:8080"
    volumes:
      - ./config.toml:/config/config.toml:ro
      - /mnt/video:/mnt/video:ro
    restart: unless-stopped
```

Copy the example config and edit it:

```bash
cp config.toml.example config.toml
```

Start the server:

```bash
docker compose up -d
```
Option B — Run from Source
Install uv if you haven't already.
Install dependencies:

```bash
uv sync
```

Copy the example config and edit it:

```bash
cp config.toml.example config.toml
```

Start the server:

```bash
uv run python -m mcp_base
```
Security
This server has no authentication on its MCP endpoint. It is designed for LAN use only.
Do not expose this server directly to the internet.
If you need to access it remotely, place it behind a reverse proxy that handles TLS termination and access control. Configuring a reverse proxy is outside the scope of this project.
Configuration
Create a `config.toml` in the working directory (or pass `--config <path>`):
```toml
[server]
host = "0.0.0.0"
port = 8080

[logging]
level = "info"

[filesystem]
roots = "/mnt/video"
```
[server]

| Key | Default | Description |
|---|---|---|
| `host` | `"0.0.0.0"` | Address the MCP server listens on. `0.0.0.0` binds all interfaces. |
| `port` | `8080` | Port the MCP server listens on. |
[logging]

| Key | Default | Description |
|---|---|---|
| `level` | `"info"` | Log verbosity. One of: `debug`, `info`, `warning`, `error`. |
[filesystem]

| Key | Required | Description |
|---|---|---|
| `roots` | yes | Comma-separated list of absolute paths exposed via `list_folder`. Only paths within one of these roots can be listed. |
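The containment rule can be illustrated with `pathlib` — a minimal sketch of the idea, assuming the server does something along these lines (this is not its actual code):

```python
from pathlib import Path

# Roots as they might be parsed from the config's comma-separated string.
ROOTS = [Path("/mnt/video")]

def is_allowed(path: str) -> bool:
    """Return True only if path resolves to a location inside a configured root."""
    resolved = Path(path).resolve()  # normalises ".." segments before checking
    return any(resolved.is_relative_to(root) for root in ROOTS)

print(is_allowed("/mnt/video/movies"))       # True
print(is_allowed("/mnt/video/../../etc"))    # False: ".." escapes the root
```

The key point is that the check runs on the *resolved* path, so `..` tricks cannot escape a root.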
Connecting an AI Application
This server uses the Streamable HTTP MCP transport. Clients communicate via HTTP POST with streaming responses — opening the endpoint in a browser will return an HTTP 406 Not Acceptable error, which is expected.
Point your MCP-compatible AI application at the server's MCP endpoint:
http://<host>:<port>/mcp
For example, if the server is running on 192.168.1.10 with the default port:
http://192.168.1.10:8080/mcp
Consult your AI application's documentation for how to register an MCP server. Ensure it supports the Streamable HTTP transport (most modern MCP clients do).
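For the curious, the first message an MCP client POSTs to the endpoint is a JSON-RPC `initialize` request. A hedged sketch of that handshake payload (field values are illustrative; your client library builds this for you):

```python
import json

# JSON-RPC 2.0 "initialize" request — the opening message of an MCP session.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # an MCP spec revision; negotiated with the server
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

# Streamable HTTP clients must accept both plain JSON and SSE responses.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

body = json.dumps(initialize)
print(body)
```

POSTing `body` with those headers to `http://<host>:<port>/mcp` on a running server returns the server's capabilities; in practice, any MCP-compatible client handles this handshake automatically.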
Example System Prompt
XML-style tags are often preferred over markdown for system prompts because explicit named tags give each block an unambiguous semantic meaning — the AI knows exactly what each block contains — whereas markdown headings require inference and leave more room for misinterpretation.
Copy and adapt this prompt to give your AI assistant clear guidance on using the tools.
Tip — let an LLM write this for you. XML-structured system prompts are effective but unfamiliar to most developers and tedious to write by hand. A quick conversation with any capable LLM (describing your tools and how you want the assistant to use them) will usually produce a solid draft you can adapt.
Tools

- `list_folder` — Lists the contents of a directory within the configured roots.

Environment Variables

- `CONFIG_PATH` — Path to the configuration file.

Configuration

```json
{
  "mcpServers": {
    "filesystem-readonly": {
      "command": "uv",
      "args": ["run", "mcp_base"],
      "env": {
        "CONFIG_PATH": "/path/to/config.toml"
      }
    }
  }
}
```