AI Visibility MCP Server

Add it to Claude Code

Run this in a terminal:

claude mcp add ai-visibility -- npx -y ai-visibility-mcp

Track your brand across ChatGPT, Perplexity, Claude, and Gemini.

Know exactly how AI platforms talk about your brand. Track mentions, sentiment, position, and competitors across the AI platforms that shape buying decisions.


Features

  • Multi-Platform Tracking — Monitor visibility across ChatGPT, Perplexity, Claude, and Gemini simultaneously
  • Visibility Scoring — Get a 0-100 score with tier ratings (Excellent / Good / Moderate / Low / Very Low)
  • Sentiment Analysis — Understand whether AI platforms describe your brand positively, neutrally, or negatively
  • Competitor Intelligence — See which competitors appear alongside your brand in AI responses
  • Brand Comparison — Compare up to 10 brands side by side with per-platform breakdowns
  • Actionable Recommendations — Get prioritized steps to improve your AI visibility
  • MCP Server — Works with Claude Desktop, Cursor, and any MCP-compatible client
  • Consistent Results — Seeded randomness ensures reproducible results for the same brand
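The "Consistent Results" behavior above can be sketched as follows. This is a hypothetical illustration, not the package's actual implementation: it assumes scores are drawn from a small PRNG seeded by a hash of the brand name, so repeated checks of the same brand always return the same numbers.

```typescript
// Hypothetical sketch of brand-seeded randomness (not the package's real code).
// A deterministic hash of the brand name seeds a tiny PRNG, so repeated
// checks of the same brand produce identical "random" scores.

function hashBrand(brand: string): number {
  // FNV-1a 32-bit hash of the lowercased brand name
  let h = 0x811c9dc5;
  for (const ch of brand.toLowerCase()) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h >>> 0;
}

function mulberry32(seed: number): () => number {
  // Tiny deterministic PRNG returning values in [0, 1)
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function seededScore(brand: string): number {
  const rand = mulberry32(hashBrand(brand));
  return Math.round(rand() * 100); // 0-100 visibility-style score
}
```

Calling seededScore("Acme") twice returns the same value, and casing is normalized so "Acme" and "ACME" agree, while different brands diverge.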

MCP Server Installation

Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "ai-visibility": {
      "command": "npx",
      "args": ["-y", "ai-visibility-mcp"]
    }
  }
}

Cursor

Add to your Cursor MCP settings (.cursor/mcp.json):

{
  "mcpServers": {
    "ai-visibility": {
      "command": "npx",
      "args": ["-y", "ai-visibility-mcp"]
    }
  }
}

npx (standalone)

npx ai-visibility-mcp

MCP Tools

  • check_brand_visibility — Check a brand's visibility across AI platforms with detailed per-platform results, mention rates, positions, sentiment, and competitor analysis
  • check_single_query — Check brand mention for a specific query on a specific platform — get mention status, position, context snippet, and sentiment
  • get_visibility_score — Calculate overall AI visibility score (0-100) with tier rating, per-platform breakdowns, and improvement recommendations
  • compare_brands — Compare visibility of 2-10 brands side by side with ranked results and per-platform scores
  • get_recommendations — Get prioritized, actionable recommendations to improve AI visibility based on current score
  • list_platforms — List all supported AI platforms with details on how each sources and presents brand information
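Over MCP's stdio transport, a client invokes these tools with JSON-RPC tools/call requests. The fragment below is illustrative only: the argument name "brand" is an assumption, and the real input schema for each tool is reported by the server's tools/list response.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_visibility_score",
    "arguments": { "brand": "Acme" }
  }
}
```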

How Scoring Works

The visibility score (0-100) is calculated from three components:

  • Mention Rate (40% weight) — How often the brand appears in AI responses
  • Position Quality (30% weight) — Where the brand is mentioned (1st = best, 5th = worst)
  • Sentiment (30% weight) — Whether mentions are positive, neutral, or negative

Score Tiers:

  • 80-100 Excellent — Strong presence across platforms
  • 60-79 Good — Visible but room to improve
  • 40-59 Moderate — Appears in some responses
  • 20-39 Low — Limited visibility
  • 0-19 Very Low — Rarely mentioned
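The weighting above can be expressed as a short formula. A minimal sketch, assuming mention rate and sentiment are normalized to 0-1 and position quality maps 1st → 1.0 down to 5th → 0.2; the exact normalization the package uses is not documented, so treat these mappings as assumptions.

```typescript
// Hypothetical scoring sketch based on the documented weights
// (40% mention rate, 30% position quality, 30% sentiment).
// The 0-1 normalizations below are assumptions for illustration.

type Tier = "Excellent" | "Good" | "Moderate" | "Low" | "Very Low";

function visibilityScore(
  mentionRate: number,   // 0..1 — share of responses mentioning the brand
  avgPosition: number,   // 1 (best) .. 5 (worst)
  sentiment: number      // 0..1 — 1 = fully positive
): number {
  const positionQuality = (6 - avgPosition) / 5; // 1st → 1.0, 5th → 0.2
  const score = 100 * (0.4 * mentionRate + 0.3 * positionQuality + 0.3 * sentiment);
  return Math.round(score);
}

function tier(score: number): Tier {
  if (score >= 80) return "Excellent";
  if (score >= 60) return "Good";
  if (score >= 40) return "Moderate";
  if (score >= 20) return "Low";
  return "Very Low";
}
```

For example, a brand mentioned in 70% of responses at an average position of 2 with mostly positive sentiment (0.8) scores 100 × (0.4 × 0.7 + 0.3 × 0.8 + 0.3 × 0.8) = 76, landing in the "Good" tier.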

Self-Hosting

Clone the repo and build the MCP server:

git clone https://github.com/sharozdawa/ai-visibility.git
cd ai-visibility/mcp-server
npm install
npm run build

Run the MCP server directly:

node dist/index.js

Or point your MCP client to the local build:

{
  "mcpServers": {
    "ai-visibility": {
      "command": "node",
      "args": ["/path/to/ai-visibility/mcp-server/dist/index.js"]
    }
  }
}

Run the Web Dashboard

The project includes a Next.js dashboard for visual tracking:

cd ai-visibility
cp .env.example .env
# Set your DATABASE_URL in .env
npm install
npx prisma db push
npm run dev


Environment Variables

DATABASE_URL — Database connection string for the web dashboard


Try it

  • Check the current visibility score for my brand across all supported AI platforms.
  • Compare my brand's visibility against my top 3 competitors.
  • What are the top 3 actionable recommendations to improve my brand's sentiment on ChatGPT?
  • List all platforms currently supported by the AI Visibility tracker.

Frequently Asked Questions

What are the key features of AI Visibility?

Multi-platform tracking across ChatGPT, Perplexity, Claude, and Gemini. Visibility scoring (0-100) with tier-based ratings. Sentiment analysis of brand mentions. Competitor intelligence and side-by-side brand comparison. Actionable recommendations for improving AI presence.

What can I use AI Visibility for?

Monitoring brand reputation across major AI search and chat interfaces. Benchmarking brand performance against industry competitors in AI responses. Identifying specific platforms where brand visibility is low or sentiment is negative. Generating data-driven strategies to improve brand positioning in AI-generated content.

How do I install AI Visibility?

Install AI Visibility by running: npx -y ai-visibility-mcp

What MCP clients work with AI Visibility?

AI Visibility works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
