AI Customer Support Agent MCP Server

Local setup required. This server must be cloned and prepared on your machine before you register it in Claude Code.
1. Set the server up locally

Run this once to clone and prepare the server before adding it to Claude Code.

Run in terminal
pip install -r requirements.txt
2. Register it in Claude Code

After the local setup is done, run this command to register the server with Claude Code.

Run in terminal
claude mcp add ai-customer-support -- python -m backend.mcp_server

Run the command from the project root you prepared in step 1 so that the backend.mcp_server module resolves (this matches the claude_desktop_config.json entry in the Configuration section).

README.md

An MCP-compatible tool server for automated customer service interactions.

AI Customer Support Agent

This project is a production-style starter for an AI customer support system built with FastAPI, a lightweight MCP-compatible tool server, and an OpenAI-compatible LLM client.

Features

  • Answers customer questions with a knowledge base connector
  • Retrieves order status from a database
  • Creates support tickets for unresolved issues
  • Summarizes conversations for human agents
  • Detects likely escalation scenarios
  • Exposes required tools through an MCP-style JSON-RPC endpoint

Project Structure

ai-support-agent/
├── backend/
│   ├── agent.py
│   ├── main.py
│   ├── mcp_server.py
│   ├── database/
│   │   └── models.py
│   └── tools/
│       ├── crm_tool.py
│       ├── kb_tool.py
│       ├── order_tool.py
│       └── ticket_tool.py
├── frontend/
│   └── simple_chat_ui.html
├── requirements.txt
└── README.md

Architecture

  • FastAPI backend: serves the /chat API, health endpoint, and the simple browser UI.
  • SupportAgent: orchestrates LLM responses and decides when to call tools.
  • MCP server: exposes get_order_status, search_knowledge_base, create_support_ticket, and get_customer_details over JSON-RPC at /mcp.
  • Connectors: isolated tool classes for orders, CRM, knowledge base, and ticketing.
  • Database: SQLAlchemy models for customers, orders, and support_tickets.

Tech Choices

  • Python 3.11+
  • FastAPI
  • SQLAlchemy
  • SQLite by default, with a clear upgrade path to PostgreSQL
  • OpenAI-compatible SDK via the openai Python package

Setup

  1. Create and activate a virtual environment.
  2. Install dependencies:
pip install -r requirements.txt
  3. Optional: configure an OpenAI-compatible provider.
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_MODEL="gpt-4.1-mini"
export OPENAI_BASE_URL="https://api.openai.com/v1"

If OPENAI_API_KEY is not set, the app still runs in a rules-based fallback mode so you can test the flows locally.

Run

uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000

Then open http://localhost:8000.

API Usage

`POST /chat`

Request:

{
  "customer_id": 1,
  "message": "Where is my order #45231?",
  "conversation_history": []
}

Response:

{
  "response": "Order #45231 for Noise-Cancelling Headphones is currently shipped. Carrier: FedEx. Tracking: ZX991245US. Estimated delivery: 2026-03-15.",
  "used_tools": [
    {
      "name": "get_order_status",
      "arguments": {
        "order_id": "45231"
      },
      "result": {
        "order_id": 45231,
        "customer_id": 1,
        "item_name": "Noise-Cancelling Headphones",
        "status": "shipped",
        "tracking_number": "ZX991245US",
        "shipping_carrier": "FedEx",
        "estimated_delivery": "2026-03-15",
        "total_amount": 199.99
      }
    }
  ],
  "escalated": false,
  "conversation_summary": "Customer 1 asked: Where is my order #45231?. Agent responded: ...",
  "llm_mode": false
}
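The /chat request above can be assembled and sent from a small Python client. This is an illustrative sketch, not part of the project; the helper names are made up here, and the base URL assumes the default host and port from the Run section.

```python
import json
import urllib.request


def build_chat_request(customer_id: int, message: str, history=None) -> dict:
    """Assemble a /chat payload matching the schema shown above."""
    return {
        "customer_id": customer_id,
        "message": message,
        "conversation_history": history or [],
    }


def send_chat(payload: dict, base_url: str = "http://localhost:8000") -> dict:
    """POST the payload to /chat and decode the JSON response."""
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


payload = build_chat_request(1, "Where is my order #45231?")
# With the server running, send_chat(payload)["response"] returns the agent's answer.
```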

`POST /mcp`

Example initialization request:

{
  "jsonrpc": "2.0",
  "id": "1",
  "method": "initialize",
  "params": {}
}

Example tool list request:

{
  "jsonrpc": "2.0",
  "id": "2",
  "method": "tools/list",
  "params": {}
}

Example tool call request:

{
  "jsonrpc": "2.0",
  "id": "3",
  "method": "tools/call",
  "params": {
    "name": "get_order_status",
    "arguments": {
      "order_id": "45231"
    }
  }
}
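For orientation, the three methods above can be served by a small JSON-RPC dispatcher. This is a hedged sketch, not the project's actual mcp_server.py; the tool registry here is a stub standing in for the real connectors.

```python
# Stub registry standing in for the real order/KB/ticket/CRM connectors.
TOOLS = {
    "get_order_status": lambda args: {"order_id": args["order_id"], "status": "shipped"},
}


def handle_rpc(request: dict) -> dict:
    """Dispatch a JSON-RPC 2.0 request for initialize, tools/list, and tools/call."""
    method = request.get("method")
    rid = request.get("id")
    if method == "initialize":
        result = {"protocolVersion": "2024-11-05", "capabilities": {"tools": {}}}
    elif method == "tools/list":
        result = {"tools": [{"name": name} for name in TOOLS]}
    elif method == "tools/call":
        params = request.get("params", {})
        tool = TOOLS.get(params.get("name"))
        if tool is None:
            return {"jsonrpc": "2.0", "id": rid,
                    "error": {"code": -32601, "message": "Unknown tool"}}
        result = tool(params.get("arguments", {}))
    else:
        return {"jsonrpc": "2.0", "id": rid,
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": rid, "result": result}
```

Errors are returned as JSON-RPC error objects rather than raised, which matches the structured error surfacing described under Error Handling.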

Database

The app creates and seeds a local SQLite database file named support_agent.db on startup with example customers, orders, and support tickets.

To migrate to PostgreSQL:

  • replace DATABASE_URL in backend/database/models.py
  • update the engine configuration
  • add migrations with Alembic for production deployments
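A minimal sketch of the engine swap, assuming DATABASE_URL is read from the environment; the variable name, fallback, and pool settings here are assumptions for illustration, not the project's current code.

```python
import os

from sqlalchemy import create_engine

# Fall back to the bundled SQLite file when DATABASE_URL is unset (assumed default).
DATABASE_URL = os.environ.get("DATABASE_URL", "sqlite:///./support_agent.db")

# pool_pre_ping guards against stale pooled connections once you move to
# PostgreSQL, e.g. DATABASE_URL=postgresql+psycopg2://user:pass@host/dbname
engine = create_engine(DATABASE_URL, pool_pre_ping=True)
```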

Example Agent Flow

User message:

Where is my order #45231?

Expected flow:

  1. Agent detects an order-tracking request.
  2. Agent calls get_order_status(order_id).
  3. Tool returns shipping carrier, tracking number, and estimated delivery.
  4. Agent responds with a concise customer-facing answer.

Error Handling

  • Invalid order IDs return 400
  • Unknown customers return 400
  • Unexpected backend failures return 500
  • Tool errors are surfaced in structured JSON-RPC format on the MCP endpoint

Notes

  • The knowledge base is intentionally simple and in-memory for easy extension.
  • The ticketing and CRM connectors are implemented as modular service classes so they can be swapped with real APIs later.
  • For a production deployment, add authentication, persistent conversation storage, rate limiting, and observability.

Tools (4)

  • get_order_status: Retrieves the status, tracking, and delivery details for a specific order.
  • search_knowledge_base: Searches the internal knowledge base for customer support information.
  • create_support_ticket: Creates a new support ticket for unresolved customer issues.
  • get_customer_details: Retrieves profile information for a specific customer.
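A tools/list call on the /mcp endpoint would describe these tools roughly as follows. This is a hedged sketch of the response shape following JSON-RPC and MCP conventions; the inputSchema details are assumptions, only one tool is shown.

```json
{
  "jsonrpc": "2.0",
  "id": "2",
  "result": {
    "tools": [
      {
        "name": "get_order_status",
        "description": "Retrieves the status, tracking, and delivery details for a specific order.",
        "inputSchema": {
          "type": "object",
          "properties": { "order_id": { "type": "string" } },
          "required": ["order_id"]
        }
      }
    ]
  }
}
```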

Environment Variables

  • OPENAI_API_KEY: API key for an OpenAI-compatible LLM provider
  • OPENAI_MODEL: The model name to use for the agent
  • OPENAI_BASE_URL: Base URL for the OpenAI-compatible API

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "ai-support-agent": {
      "command": "python",
      "args": ["-m", "backend.mcp_server"]
    }
  }
}

Try it

Check the status of order #45231 and let me know when it will arrive.
Search the knowledge base for our return policy regarding damaged items.
Create a support ticket for customer 1 regarding a missing package from their recent order.
Get the customer details for ID 1 and summarize their recent order history.

Frequently Asked Questions

What are the key features of AI Customer Support Agent?

Answers customer questions using a knowledge base connector. Retrieves real-time order status from a database. Creates support tickets for unresolved customer issues. Summarizes conversations for human agent review. Detects potential escalation scenarios.

What can I use AI Customer Support Agent for?

Automating routine order tracking inquiries for e-commerce stores. Providing instant answers to common FAQs via a knowledge base. Streamlining support workflows by automatically creating tickets for complex issues. Assisting human agents by summarizing long customer conversation histories.

How do I install AI Customer Support Agent?

Install AI Customer Support Agent by running: pip install -r requirements.txt

What MCP clients work with AI Customer Support Agent?

AI Customer Support Agent works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
