# Combined MCP Server

A production-grade MCP (Model Context Protocol) server that integrates Redshift database query capabilities with a vector-based knowledgebase.
## Features

### Redshift Tools
- `run_query` - Execute SQL with IAM authentication via `get_cluster_credentials`
- `list_schemas` - List database schemas
- `list_tables` - List tables in a schema
- `describe_table` - Get table structure
Large results (>100 rows) are automatically stored in S3 with 20 sample rows returned.
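The spill-to-S3 behavior described above can be sketched as follows; `package_result` and the `upload` hook are illustrative names, not the server's actual API:

```python
MAX_INLINE_ROWS = 100  # results larger than this are spilled to S3
SAMPLE_ROWS = 20       # how many rows to return inline alongside the S3 pointer

def package_result(rows, upload=None):
    """Return small results inline; spill large ones to S3 and return a sample.

    `upload` is a hypothetical callable that writes the full result set
    to S3 and returns its URI.
    """
    if len(rows) <= MAX_INLINE_ROWS:
        return {"rows": rows, "truncated": False}
    s3_uri = upload(rows) if upload else None
    return {"rows": rows[:SAMPLE_ROWS], "truncated": True, "s3_uri": s3_uri}
```

The `truncated` flag lets a client decide whether to fetch the full result set from `s3_uri` or work with the sample.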
### Knowledgebase Tools
- `build_vectorstore` - Build vector store from S3 markdown files
- `query_vectorstore` - Hybrid search (semantic + keyword) with Reciprocal Rank Fusion (RRF) reranking
- `get_vectorstore_status` - Check build status and cache stats
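RRF merges the semantic and keyword result lists by summed reciprocal ranks rather than by raw scores. A minimal sketch of the fusion step (the function name and `k` constant are illustrative; the server's actual implementation may differ):

```python
def rrf_merge(semantic_ids, keyword_ids, k=60):
    """Fuse two ranked lists of document ids with Reciprocal Rank Fusion.

    Each document scores 1 / (k + rank) per list it appears in, so documents
    ranked well by either retriever rise to the top of the fused list.
    """
    scores = {}
    for ranking in (semantic_ids, keyword_ids):
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```

Because RRF only uses ranks, it sidesteps the problem of semantic similarity scores and keyword (e.g. BM25) scores living on incomparable scales.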
## Quick Start

### Local Development
1. Install uv (if not already installed):

   ```bash
   curl -LsSf https://astral.sh/uv/install.sh | sh
   # Or on Windows: powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
   ```

2. Start infrastructure:

   ```bash
   docker-compose up -d postgres localstack
   ```

3. Install dependencies:

   ```bash
   uv pip install -e ".[dev]"
   ```

4. Configure environment:

   ```bash
   cp .env.example .env.local
   # Edit .env.local with your settings
   ```

5. Run the server:

   ```bash
   # With MCP Inspector
   mcp dev src/combined_mcp_server/main.py

   # Or directly
   python -m combined_mcp_server.main
   ```
### ECS Deployment
```bash
# Build container
docker build -t combined-mcp-server .

# Run with health checks
docker run -p 8080:8080 --env-file .env combined-mcp-server
```
Health endpoints:

- `GET /health` - Liveness probe
- `GET /ready` - Readiness probe
- `GET /status` - Detailed status
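One common way to back a `/ready` endpoint is to aggregate named dependency checks and return 503 until all of them pass. A sketch under that assumption (the check names and `readiness` helper are illustrative):

```python
def readiness(checks):
    """Run named dependency checks and aggregate them into a readiness response.

    `checks` maps a dependency name to a zero-argument callable that returns
    truthy when that dependency is reachable.
    """
    results = {name: bool(check()) for name, check in checks.items()}
    ready = all(results.values())
    return (200 if ready else 503), {"ready": ready, "checks": results}
```

Returning the per-dependency results in the body makes a failing ECS readiness probe easy to diagnose from the task logs.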
## Configuration
See .env.example for all configuration options. Key settings:
| Variable | Description |
|---|---|
| `REDSHIFT_CLUSTER_ID` | Redshift cluster identifier |
| `POSTGRES_SECRET_NAME` | Secrets Manager secret for pgvector DB |
| `KNOWLEDGEBASE_S3_BUCKET` | S3 bucket with markdown files |
| `BEDROCK_EMBEDDING_MODEL` | Titan embedding model ID |
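Since all four variables are required, failing fast at startup with a clear message beats a mid-request `KeyError`. A small sketch of that pattern (the `require_env` helper is ours, not part of the server):

```python
import os

def require_env(name, env=None):
    """Fetch a required setting, raising a descriptive error if it is unset."""
    env = os.environ if env is None else env
    value = env.get(name)
    if not value:
        raise RuntimeError(f"missing required environment variable: {name}")
    return value
```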
## Architecture
```
┌─────────────────────────────────────────────────────┐
│                Combined MCP Server                  │
├─────────────────────┬───────────────────────────────┤
│   Redshift Tools    │     Knowledgebase Tools       │
│  ─────────────────  │  ───────────────────────────  │
│  • run_query        │  • build_vectorstore          │
│  • list_schemas     │  • query_vectorstore          │
│  • list_tables      │  • get_vectorstore_status     │
│  • describe_table   │                               │
├─────────────────────┴───────────────────────────────┤
│                   Core Services                     │
│   AWS (Secrets Manager, S3, Bedrock, Redshift)      │
│   PostgreSQL + pgvector                             │
└─────────────────────────────────────────────────────┘
```
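Reading the diagram as code: each incoming tool call is routed to one of the two service groups. A toy dispatcher under that assumption (the routing table mirrors the diagram; the real server presumably registers handlers through the MCP SDK instead):

```python
REDSHIFT_TOOLS = {"run_query", "list_schemas", "list_tables", "describe_table"}
KNOWLEDGEBASE_TOOLS = {"build_vectorstore", "query_vectorstore", "get_vectorstore_status"}

def route(tool_name):
    """Map a tool name to the service group that handles it."""
    if tool_name in REDSHIFT_TOOLS:
        return "redshift"
    if tool_name in KNOWLEDGEBASE_TOOLS:
        return "knowledgebase"
    raise ValueError(f"unknown tool: {tool_name}")
```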
## Testing
```bash
# Unit tests
pytest tests/ -v

# With coverage
pytest tests/ -v --cov=combined_mcp_server

# Integration tests (requires Docker)
docker-compose up -d
pytest tests/ -v -m integration
```
## License
MIT
## MCP Client Configuration
```json
{
  "mcpServers": {
    "combined-mcp-server": {
      "command": "python",
      "args": ["-m", "combined_mcp_server.main"],
      "env": {
        "REDSHIFT_CLUSTER_ID": "your-cluster-id",
        "KNOWLEDGEBASE_S3_BUCKET": "your-bucket-name",
        "BEDROCK_EMBEDDING_MODEL": "amazon.titan-embed-text-v1"
      }
    }
  }
}
```