Bonnard MCP Server

Add it to Claude Code

Run this in a terminal.

claude mcp add bonnard -- npx @bonnard/cli init --self-hosted

Self-hosted semantic layer for AI agents.


Docs · CLI · Discord · Website


Bonnard is an agent-native semantic layer — one set of metric definitions, every consumer (AI agents, apps, dashboards) gets the same governed answer. This repo is the self-hosted Docker deployment: run Bonnard on your own infrastructure with no cloud account needed.

Quick Start

# 1. Scaffold project
npx @bonnard/cli init --self-hosted

# 2. Configure your data source
#    Edit .env with your database credentials

# 3. Start the server
docker compose up -d

# 4. Define your semantic layer
#    Add cube/view YAML files to bonnard/cubes/ and bonnard/views/

# 5. Deploy models to the server
bon deploy

# 6. Verify your semantic layer
bon schema

# 7. Connect AI agents
bon mcp

Requires Node.js 20+ and Docker.
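
The cube files in step 4 use Cube's YAML schema. A minimal sketch (the orders cube name, table, columns, and measures below are hypothetical; adapt them to your warehouse):

```yaml
# bonnard/cubes/orders.yml (hypothetical example)
cubes:
  - name: orders
    sql_table: public.orders

    measures:
      - name: count
        type: count
      - name: total_revenue
        type: sum
        sql: amount

    dimensions:
      - name: status
        type: string
        sql: status
      - name: created_at
        type: time
        sql: created_at
```

After bon deploy, bon schema should list the cube and its measures.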

What's Included

  • MCP server — AI agents query your semantic layer over the Model Context Protocol
  • Cube semantic layer — SQL-based metric definitions with caching, access control, and multi-database support
  • Cube Store — pre-aggregation cache for fast analytical queries
  • Admin UI — browse deployed models, views, and measures at http://localhost:3000
  • Deploy API — push model updates via bon deploy without restarting containers
  • Health endpoint — GET /health for uptime monitoring

Connecting AI Agents

Run bon mcp to see connection config for your setup. Examples below.

Claude Desktop / Cursor

{
  "mcpServers": {
    "bonnard": {
      "url": "https://bonnard.example.com/mcp",
      "headers": {
        "Authorization": "Bearer your-secret-token-here"
      }
    }
  }
}

Claude Code

{
  "mcpServers": {
    "bonnard": {
      "type": "url",
      "url": "https://bonnard.example.com/mcp",
      "headers": {
        "Authorization": "Bearer your-secret-token-here"
      }
    }
  }
}

CrewAI (Python)

MCPServerAdapter ships in the crewai-tools package and takes its server parameters as a dict:

from crewai_tools import MCPServerAdapter

mcp = MCPServerAdapter({
    "url": "https://bonnard.example.com/mcp",
    "transport": "streamable-http",
    "headers": {"Authorization": "Bearer your-secret-token-here"},
})

Production Deployment

Authentication

Protect your endpoints by setting ADMIN_TOKEN in .env:

ADMIN_TOKEN=your-secret-token-here

All API and MCP endpoints will require Authorization: Bearer <token>. The /health endpoint remains open for monitoring.
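
As a quick sanity check of the auth scheme, this Python sketch (the URL and token are the placeholders from the examples above) builds both kinds of requests without sending them:

```python
from urllib.request import Request

BASE = "https://bonnard.example.com"  # placeholder deployment URL
TOKEN = "your-secret-token-here"      # the value of ADMIN_TOKEN

# API and MCP endpoints require the Bearer token once ADMIN_TOKEN is set
mcp_req = Request(f"{BASE}/mcp", headers={"Authorization": f"Bearer {TOKEN}"})

# /health stays open so uptime monitors can poll it without credentials
health_req = Request(f"{BASE}/health")

print(mcp_req.get_header("Authorization"))  # Bearer your-secret-token-here
print(health_req.headers)                   # {}
```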

Restart after changing .env:

docker compose up -d

TLS with Caddy

Caddy provides automatic HTTPS via Let's Encrypt.

Create a Caddyfile next to your docker-compose.yml:

bonnard.example.com {
    reverse_proxy localhost:3000
}

Add Caddy to your docker-compose.yml:

  caddy:
    image: caddy:2
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
    restart: unless-stopped

Add the volume at the top level:

volumes:
  models: {}
  caddy_data: {}

Then remove the Bonnard port mapping (ports: - "3000:3000") since Caddy handles external traffic. Note that once the mapping is gone, a Caddy container cannot reach the app via localhost:3000; point reverse_proxy at the Bonnard service's compose name instead (for example bonnard:3000, substituting your actual service name).

Deploy to a VM

# Copy project files to your server
scp -r . user@your-server:~/bonnard/

# SSH in and start
ssh user@your-server
cd ~/bonnard
docker compose up -d

Configuration

Set these in .env:

  • ADMIN_TOKEN — Authentication token for API and MCP endpoints
  • CUBEJS_DB_TYPE — Database driver (postgres, duckdb, snowflake, bigquery, databricks, redshift, clickhouse); default: duckdb
  • CUBEJS_DB_* — Database connection settings (host, port, name, user, pass)
  • CUBEJS_DATASOURCES — Comma-separated list of data source names for multi-database setups
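
Putting the variables together, a .env for a single Postgres source might look like this (the host, database name, and credentials are placeholders):

```
ADMIN_TOKEN=your-secret-token-here
CUBEJS_DB_TYPE=postgres
CUBEJS_DB_HOST=db.internal
CUBEJS_DB_PORT=5432
CUBEJS_DB_NAME=analytics
CUBEJS_DB_USER=bonnard
CUBEJS_DB_PASS=change-me
```

Restart the stack (docker compose up -d) after editing so the containers pick up the new values.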

Try it

Query the semantic layer to get the total revenue for the last quarter.
List all available metrics and views defined in the Bonnard semantic layer.
Compare the user growth metrics between this month and last month using the semantic layer.
Fetch the top 10 products by sales volume from the connected data warehouse.

Frequently Asked Questions

What are the key features of Bonnard?

  • MCP server for AI agents to query semantic layers
  • SQL-based metric definitions with caching and access control
  • Multi-database support including Snowflake, BigQuery, and PostgreSQL
  • Cube Store pre-aggregation cache for fast analytical queries
  • Admin UI for browsing deployed models, views, and measures

What can I use Bonnard for?

  • Enabling AI agents to generate accurate, governed business reports from raw warehouse data
  • Standardizing metric definitions across multiple AI agents and dashboarding tools
  • Providing a secure, self-hosted semantic layer for sensitive enterprise data
  • Accelerating analytical query performance for AI agents using pre-aggregation caching

How do I install Bonnard?

Install Bonnard by running: npx @bonnard/cli init --self-hosted

What MCP clients work with Bonnard?

Bonnard works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
