Ask Gemini MCP Server

Add it to Claude Code

Run this in a terminal (GEMINI_API_KEY is required):

claude mcp add -e "GEMINI_API_KEY=${GEMINI_API_KEY}" ask-gemini -- npx -y ask-gemini-mcp

Ask Gemini MCP

MCP server that connects any AI client to Google Gemini CLI

An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: `ask-gemini-mcp`. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.

Why?

  • Get a second opinion — Ask Gemini to review your coding approach before committing to it
  • Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
  • Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
  • Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models

Quick Start

Claude Code

# Project scope (available in current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp

# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp

Claude Desktop

Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

Other config file locations:
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/claude/claude_desktop_config.json

Cursor

Add to .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml (or .codex/config.toml in your project):

[mcp_servers.gemini-cli]
command = "npx"
args = ["-y", "ask-gemini-mcp"]

Or via CLI:

codex mcp add gemini-cli -- npx -y ask-gemini-mcp

OpenCode

Add to opencode.json in your project (or ~/.config/opencode/opencode.json for global):

{
  "mcp": {
    "gemini-cli": {
      "type": "local",
      "command": ["npx", "-y", "ask-gemini-mcp"]
    }
  }
}

Any MCP Client (STDIO Transport)

{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}

Prerequisites

  • Node.js (the server is launched via npx)
  • A GEMINI_API_KEY environment variable for Gemini CLI authentication

Tools

Tool         Purpose
ask-gemini   Send prompts to Gemini CLI. Supports @ file syntax, model selection, sandbox mode, and changeMode for structured edits.
fetch-chunk  Retrieve subsequent chunks from cached large responses.
ping         Connection test to verify MCP setup without using Gemini tokens.
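
For clients that expose raw MCP traffic, a tools/call request to ask-gemini might look like the sketch below. The standard MCP JSON-RPC envelope is shown; the prompt and model argument names are assumptions inferred from the feature list above, not a confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask-gemini",
    "arguments": {
      "prompt": "Review @src/auth.ts for security issues",
      "model": "gemini-3-flash-preview"
    }
  }
}
```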

Usage Examples

File analysis (@ syntax):

  • ask gemini to analyze @src/main.js and explain what it does
  • use gemini to summarize @. (the current directory)

Code review:

  • ask gemini to review the changes in @src/auth.ts for security issues
  • use gemini to compare @old.js and @new.js

General questions:

  • ask gemini about best practices for React state management

Sandbox mode:

  • use gemini sandbox to create and run a Python script

Models

Model                   Use Case
gemini-3.1-pro-preview  Default; best-quality reasoning
gemini-3-flash-preview  Faster responses; large codebases

The server automatically falls back to Flash when Pro quota is exceeded.

Contributing

Contributions are welcome! See open issues for things to work on.

License

MIT License. See LICENSE for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.


Environment Variables

GEMINI_API_KEY (required): used for authentication by the underlying Google Gemini CLI.
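
Clients configured via JSON can pass the key through an env block. A minimal sketch for claude_desktop_config.json, using the standard MCP server config shape (replace the placeholder with your own key):

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"],
      "env": {
        "GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}
```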


Frequently Asked Questions

What are the key features of Ask Gemini?

Leverages Gemini's 1M+ token context window for large-codebase analysis. Supports AI-to-AI collaboration by connecting your primary AI to Gemini. Includes a sandbox mode for creating and running Python scripts. Falls back automatically to the Gemini Flash model when the Pro quota is exceeded. Supports file-based context via the @ syntax.

What can I use Ask Gemini for?

Getting a second opinion on coding approaches before committing changes. Critiquing architecture proposals and generating alternative suggestions. Analyzing diffs or modified files to catch issues your primary AI missed. Reading and summarizing entire large codebases that exceed standard context limits.

How do I install Ask Gemini?

There is no separate install step: MCP clients launch the server on demand via npx -y ask-gemini-mcp. See Quick Start for per-client configuration.

What MCP clients work with Ask Gemini?

Ask Gemini works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
