
🤖 Second Opinion MCP

🎯 Get instant second opinions from 17 AI platforms and 800,000+ models

OpenAI • Gemini • Grok • Claude • HuggingFace • DeepSeek • OpenRouter • Mistral • Together AI • Cohere • Groq • Perplexity • Replicate • AI21 Labs • Stability AI • Fireworks AI • Anyscale


🚀 What it does

This MCP server allows Claude to consult other AI models for different perspectives on:

  • Coding problems - Compare approaches across models
  • Creative writing - Get diverse style feedback
  • Problem solving - Validate logic and reasoning
  • Cross-model analysis - See how different AIs tackle the same task
  • Group discussions - Host AI debates with multiple models
  • Custom model access - Use any HuggingFace model via Inference API

✨ Version 5.0 Features & Improvements

🎭 **NEW: AI Personality System**

  • 5 Distinct Personalities: honest, friend, coach, wise, creative
  • Intelligent Model Matching: Each personality uses models best suited for their character
  • Always Available: Works with any configured AI provider
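
From an MCP client's perspective, selecting a personality would look something like the following tools/call request. The tool name and argument keys here are hypothetical illustrations, not the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_second_opinion",
    "arguments": {
      "prompt": "Review this function for edge cases.",
      "personality": "coach"
    }
  }
}
```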

🧠 **NEW: Intelligent Model Selection**

  • Quality-Based Ranking: 34+ models ranked by capability (Grok-4 → Gemini Pro → GPT-4.1)
  • Smart Defaults: Automatically selects the best available model
  • Personality Optimization: Different models for different personality types

🏗️ **NEW: Modular Architecture**

  • 5 Clean Files: Replaced 51k+ token monolith with maintainable modules
  • Professional Structure: client_manager.py, ai_providers.py, conversation_manager.py, mcp_server.py, main.py
  • JSON Configuration: Easy model priority updates via model_priority.json
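
A model_priority.json along these lines would drive the ranking. The field names and model identifiers below are assumptions for illustration, not the file's documented schema:

```json
{
  "default_priority": ["grok-4", "gemini-2.5-pro", "gpt-4.1"],
  "personality_overrides": {
    "creative": ["gemini-2.5-pro", "grok-4"],
    "honest": ["grok-4", "gpt-4.1"]
  }
}
```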

🚀 Major Platform Integrations

  • 🎭 Replicate: Access to open-source models including Llama 2, CodeLlama, Mistral, and more
  • 🌟 AI21 Labs: Jamba 1.5 models with advanced reasoning capabilities
  • 🎨 Stability AI: StableLM models including code-specialized variants
  • 🔥 Fireworks AI: Ultra-fast inference for popular open-source models
  • 🚀 Anyscale: Ray-powered LLM serving with enterprise-grade reliability

🆕 Enhanced Existing Platform Support

  • 🤖 Mistral AI: Direct access to Mistral's latest models including mistral-large-latest and codestral-latest
  • 🔗 Together AI: Access to 200+ open-source models with fast inference
  • 🧠 Cohere: Enterprise-grade language models with Command R+ and Command R
  • ⚡ Groq Fast: Ultra-fast inference API for lightning-quick responses
  • 🔍 Perplexity AI: Web-connected AI with real-time search capabilities

🔧 Previous Bug Fixes (v3.0)

  • Fixed HuggingFace Models: Completely rebuilt HuggingFace integration with advanced retry logic, better model format detection, and comprehensive error handling
  • Fixed Gemini Blank Responses: Enhanced Gemini conversation handling to prevent empty responses in long chats with smart fallback and retry mechanisms
  • Improved Error Handling: Better error messages with helpful suggestions for troubleshooting

🤖 HuggingFace Integration (Enhanced)

Access any of the 800,000+ models on HuggingFace Hub via their Inference API with improved reliability:

  • meta-llama/Llama-3.1-8B-Instruct - Fast and reliable
  • meta-llama/Llama-3.1-70B-Instruct - Powerful reasoning
  • mistralai/Mistral-7B-Instruct-v0.3 - Efficient French-developed model
  • Qwen/Qwen2.5-7B-Instruct - Alibaba's latest model
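
As a rough sketch of what a serverless Inference API call looks like: the helper below only assembles the request (so it can be inspected without network access); actually sending it requires a valid HUGGINGFACE_API_KEY. The endpoint shown is HuggingFace's public inference URL; parameter choices are illustrative:

```python
import json
import os
import urllib.request

API_BASE = "https://api-inference.huggingface.co/models"

def build_request(model_id: str, prompt: str) -> urllib.request.Request:
    """Assemble an authenticated POST request for a text-generation call."""
    url = f"{API_BASE}/{model_id}"
    headers = {
        "Authorization": f"Bearer {os.environ.get('HUGGINGFACE_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {"inputs": prompt, "parameters": {"max_new_tokens": 256}}
    ).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers)

if __name__ == "__main__":
    # Sends the request only when run directly, with a key configured.
    req = build_request("meta-llama/Llama-3.1-8B-Instruct", "Hello!")
    with urllib.request.urlopen(req, timeout=60) as resp:
        print(json.loads(resp.read()))
```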

🧠 DeepSeek Models

Get opinions from DeepSeek's powerful reasoning models:

  • deepseek-chat (DeepSeek-V3) - Fast and efficient
  • deepseek-reasoner (DeepSeek-R1) - Advanced reasoning

🤔 Grok 4 Thinking

Access xAI's thinking models with enhanced reasoning:

  • grok-4 - Latest flagship model
  • grok-3-thinking - Step-by-step reasoning model, last gen
  • grok-3-mini - Lightweight thinking model with reasoning_effort control

Environment Variables

  • OPENAI_API_KEY - API key for OpenAI models
  • GOOGLE_API_KEY - API key for Gemini models
  • XAI_API_KEY - API key for Grok models
  • HUGGINGFACE_API_KEY - API key for HuggingFace Inference API
  • OPENROUTER_API_KEY - API key for OpenRouter access
  • REPLICATE_API_TOKEN - API token for Replicate models
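
Only set keys for the providers you intend to use; a missing key simply leaves that provider disabled. A minimal sketch of that detection (variable names from the list above; the helper and provider labels are illustrative):

```python
import os

# Environment variable -> provider label, as listed above (subset shown).
PROVIDER_KEYS = {
    "OPENAI_API_KEY": "openai",
    "GOOGLE_API_KEY": "gemini",
    "XAI_API_KEY": "grok",
    "HUGGINGFACE_API_KEY": "huggingface",
    "OPENROUTER_API_KEY": "openrouter",
    "REPLICATE_API_TOKEN": "replicate",
}

def enabled_providers(env=os.environ) -> list[str]:
    """Return providers whose API key is present and non-empty."""
    return sorted(p for var, p in PROVIDER_KEYS.items() if env.get(var))
```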

Configuration

Add the server to your claude_desktop_config.json (include only the keys for the providers you use):

```json
{
  "mcpServers": {
    "second-opinion": {
      "command": "python",
      "args": ["path/to/main.py"],
      "env": {
        "OPENAI_API_KEY": "your_key",
        "ANTHROPIC_API_KEY": "your_key",
        "GOOGLE_API_KEY": "your_key",
        "XAI_API_KEY": "your_key",
        "HUGGINGFACE_API_KEY": "your_key",
        "OPENROUTER_API_KEY": "your_key",
        "MISTRAL_API_KEY": "your_key",
        "TOGETHER_API_KEY": "your_key",
        "COHERE_API_KEY": "your_key",
        "GROQ_API_KEY": "your_key",
        "PERPLEXITY_API_KEY": "your_key",
        "REPLICATE_API_TOKEN": "your_key",
        "AI21_API_KEY": "your_key",
        "STABILITY_API_KEY": "your_key",
        "FIREWORKS_API_KEY": "your_key",
        "ANYSCALE_API_KEY": "your_key"
      }
    }
  }
}
```

Try it

  • Ask Grok-4 for a second opinion on this Python script's efficiency.
  • Start a group discussion between Gemini Pro and Llama 3.1 about the ethics of AI.
  • Get a creative writing critique from a 'wise' personality using a HuggingFace model.
  • Compare how Mistral Large and GPT-4 handle this complex logic puzzle.
  • Use the 'coach' personality to review my project architecture via DeepSeek.

Frequently Asked Questions

What are the key features of Second Opinion MCP?

  • Consults over 17 AI platforms and 800,000+ models, including OpenAI, Gemini, and Grok
  • AI Personality System with 5 distinct characters: honest, friend, coach, wise, and creative
  • Intelligent Model Selection that ranks 34+ models by capability and quality
  • HuggingFace Integration for direct access to any model via the Inference API
  • Multi-AI group discussions and debates directly within the chat interface

What can I use Second Opinion MCP for?

  • Comparing coding approaches across different models to find the most efficient solution
  • Validating complex logic and reasoning by cross-referencing multiple AI perspectives
  • Hosting AI debates to explore different sides of a creative or philosophical topic
  • Accessing specialized open-source models from Replicate or Together AI for niche tasks
  • Getting diverse style feedback on creative writing using specific AI personalities

How do I install Second Opinion MCP?

Start the server with python main.py, then register it in your MCP client's configuration (see the Configuration section above).

What MCP clients work with Second Opinion MCP?

Second Opinion MCP works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
