
MCP Production — Model Context Protocol + OpenAI

Production-ready MCP server with FastAPI, Redis session memory, streaming, retries, rate limiting, and structured logging. Managed with uv.


Project Structure

mcp_production/
├── app/
│   ├── main.py                  # FastAPI app factory
│   ├── config.py                # Centralised settings (pydantic-settings)
│   ├── logger.py                # Structured JSON logging (structlog)
│   ├── api/
│   │   ├── routes.py            # All route handlers
│   │   └── schemas.py           # Pydantic request/response models
│   ├── core/
│   │   ├── mcp_loop.py          # Agentic loop (blocking + streaming)
│   │   └── openai_client.py     # OpenAI client with retry
│   ├── tools/
│   │   ├── base.py              # BaseTool + ToolRegistry
│   │   ├── weather.py           # get_weather tool
│   │   ├── calculator.py        # calculate tool (sympy)
│   │   └── wiki.py              # search_wiki tool
│   └── memory/
│       └── session.py           # Redis-backed session memory
├── tests/
│   ├── test_tools.py            # Tool unit tests
│   └── test_api.py              # API integration tests
├── scripts/
│   ├── run_dev.sh               # Dev server (hot reload)
│   ├── run_prod.sh              # Production server (multi-worker)
│   └── test.sh                  # Run test suite
├── pyproject.toml               # uv project + dependencies
├── .env.example                 # Environment variable template
├── docker-compose.yml           # Local dev stack (app + Redis)
└── Dockerfile                   # Multi-stage Docker build

Quick Start

1. Install uv

curl -LsSf https://astral.sh/uv/install.sh | sh

2. Clone and set up

git clone <repo>
cd mcp_production

# Install all dependencies
uv sync

# Copy and fill in env vars
cp .env.example .env
# Edit .env — add OPENAI_API_KEY at minimum

3. Start Redis

# Option A: Docker Compose (recommended)
docker-compose up -d redis

# Option B: Local Redis
brew install redis && redis-server

4. Run the server

# Development (hot reload)
bash scripts/run_dev.sh

# Or directly:
uv run uvicorn app.main:app --reload

The server starts at http://localhost:8000; interactive API docs are at http://localhost:8000/docs.


API Endpoints

`POST /api/v1/chat` — Blocking

curl -X POST http://localhost:8000/api/v1/chat \
  -H "Content-Type: application/json" \
  -d '{
    "message": "What is the weather in Tokyo and calculate 17 * 4?",
    "session_id": "user-123"
  }'

Response:

{
  "answer": "The weather in Tokyo is 22°C, sunny. And 17 * 4 = 68.",
  "session_id": "user-123",
  "turns": 2,
  "tools_called": ["get_weather", "calculate"],
  "total_tokens": 312
}
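For programmatic access, the blocking endpoint can be called from Python using only the standard library. This is a sketch that assumes the dev server from the Quick Start is running on localhost:8000:

```python
import json
import urllib.request

def build_payload(message: str, session_id: str) -> dict:
    """Request body for POST /api/v1/chat."""
    return {"message": message, "session_id": session_id}

def chat(message: str, session_id: str, base_url: str = "http://localhost:8000") -> dict:
    """Send a blocking chat request and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/chat",
        data=json.dumps(build_payload(message, session_id)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (with the server running):
#   result = chat("What is the weather in Tokyo?", "user-123")
#   print(result["answer"], result["tools_called"])
```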

`POST /api/v1/chat/stream` — Streaming SSE

curl -N -X POST http://localhost:8000/api/v1/chat/stream \
  -H "Content-Type: application/json" \
  -d '{"message": "Search Wikipedia for Python", "session_id": "user-123"}'

Events:

data: {"type": "tool_call",   "name": "search_wiki", "args": {"query": "Python"}}
data: {"type": "tool_result", "name": "search_wiki", "content": "Python is..."}
data: {"type": "token",       "content": "Python "}
data: {"type": "token",       "content": "is a..."}
data: {"type": "done",        "turns": 2, "tools": ["search_wiki"]}
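Each SSE line carries a JSON payload after the `data: ` prefix, so a client can decode the stream with a small helper. A sketch (the event field names follow the examples above):

```python
import json
from typing import Optional

def parse_sse_line(line: str) -> Optional[dict]:
    """Decode one 'data: {...}' line from the stream; ignore blanks and other lines."""
    line = line.strip()
    if not line.startswith("data: "):
        return None
    return json.loads(line[len("data: "):])

def render(event: dict) -> str:
    """Dispatch on event type the way a client UI might."""
    if event["type"] == "token":
        return event["content"]
    if event["type"] == "tool_call":
        return f"[calling {event['name']}]"
    return ""
```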

`DELETE /api/v1/session/{session_id}` — Clear History

curl -X DELETE http://localhost:8000/api/v1/session/user-123

`GET /api/v1/tools` — List Tools

curl http://localhost:8000/api/v1/tools

`GET /api/v1/health` — Health Check

curl http://localhost:8000/api/v1/health

Adding a New Tool

  1. Create app/tools/my_tool.py:
from app.tools.base import BaseTool

class MyTool(BaseTool):
    name = "my_tool"

    def schema(self) -> dict:
        return {
            "type": "function",
            "function": {
                "name": self.name,
                "description": "Does something useful.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "input": {"type": "string", "description": "Input value"}
                    },
                    "required": ["input"]
                }
            }
        }

    async def execute(self, input: str) -> str:
        return f"Result for: {input}"
  2. Register in app/tools/__init__.py:
from app.tools.my_tool import MyTool
registry.register(MyTool())

That's it — the tool is automatically included in all API calls.
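A quick unit test can exercise both `schema()` and `execute()` of a new tool. The sketch below inlines a stand-in copy of `MyTool` so it runs on its own; in the project you would import it from `app.tools.my_tool` instead, and `BaseTool` may add behavior not shown here:

```python
import asyncio

class MyTool:  # stand-in mirroring the example above; the real class subclasses BaseTool
    name = "my_tool"

    def schema(self) -> dict:
        return {
            "type": "function",
            "function": {
                "name": self.name,
                "description": "Does something useful.",
                "parameters": {
                    "type": "object",
                    "properties": {"input": {"type": "string"}},
                    "required": ["input"],
                },
            },
        }

    async def execute(self, input: str) -> str:
        return f"Result for: {input}"

def test_schema_and_execute():
    tool = MyTool()
    assert tool.schema()["function"]["name"] == "my_tool"
    assert asyncio.run(tool.execute(input="hello")) == "Result for: hello"
```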


Running Tests

bash scripts/test.sh

# Or with uv directly:
uv run pytest tests/ -v

Docker (Full Stack)

docker-compose up --build

Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| `OPENAI_API_KEY` | required | Your OpenAI API key |
| `OPENAI_MODEL` | `gpt-4o-mini` | Model to use |
| `REDIS_URL` | `redis://localhost:6379` | Connection string for the Redis instance |
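These variables map onto the pydantic-settings class in app/config.py. A sketch of what that class likely looks like; the field names and defaults here are assumptions based on the table above:

```python
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """Centralised settings, loaded from the environment and .env."""
    model_config = SettingsConfigDict(env_file=".env")

    openai_api_key: str                            # required; no default
    openai_model: str = "gpt-4o-mini"
    redis_url: str = "redis://localhost:6379"

# settings = Settings()  # raises a validation error if OPENAI_API_KEY is unset
```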

Tools

- `get_weather`: Retrieves current weather information for a specified location.
- `calculate`: Performs mathematical calculations using sympy.
- `search_wiki`: Searches Wikipedia for information on a given query.

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "mcp-production": {
      "command": "uv",
      "args": ["run", "app/main.py"],
      "env": {
        "OPENAI_API_KEY": "your-key-here",
        "REDIS_URL": "redis://localhost:6379"
      }
    }
  }
}

Try it

What is the weather in Tokyo and calculate 17 * 4?
Search Wikipedia for Python and summarize its history.
Calculate the square root of 144 and tell me if it's sunny in London.

Frequently Asked Questions

What are the key features of MCP Production?

- Streaming agentic chat over SSE
- Redis-backed session memory for conversation history
- Built-in tools for weather, math, and Wikipedia search
- Enterprise-grade features including rate limiting and structured logging
- Multi-worker production server support

What can I use MCP Production for?

- Building production-ready AI agents with persistent memory
- Integrating OpenAI models into FastAPI-based applications
- Creating custom tools for LLM-based automation workflows
- Deploying scalable agentic services with Redis session management

How do I install MCP Production?

Install MCP Production by running: git clone https://github.com/Riteshpcs1994/MCP && cd mcp_production && uv sync

What MCP clients work with MCP Production?

MCP Production works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
