Ollama MCP Server

A bridge to use Ollama as an MCP server from Claude Code.

A Japanese version of this README is also available.

Features

  • ollama_generate: Single-turn text generation (supports vision models with image input)
  • ollama_chat: Multi-turn chat conversations (supports vision models with image input)
  • ollama_list: List available models
  • ollama_show: Show model details
  • ollama_pull: Download models
  • ollama_embeddings: Generate text embeddings
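
These tools wrap Ollama's HTTP API. As a rough sketch of what happens under the hood (the exact request shape this server sends is an assumption), ollama_generate corresponds to a standard Ollama /api/generate call:

# Single-turn generation against a local Ollama instance
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'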

Supported Vision Models

  • llava - General-purpose vision model
  • llama3.2-vision - Meta's multimodal model
  • deepseek-ocr - OCR-specialized vision model
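
Vision models must be downloaded before use, as shown below:

# Pull a vision-capable model
ollama pull llava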

Prerequisites

  1. Ollama installed and running

    # Install Ollama (macOS)
    brew install ollama
    
    # Start Ollama server
    ollama serve
    
  2. At least one model downloaded

    ollama pull llama3.2
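
  3. (Optional) Verify the model is available

    # Lists all locally downloaded models
    ollama list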
    

Installation

cd ollama-mcp-server
npm install
npm run build
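
Optionally, you can smoke-test the built server with the MCP Inspector before registering it with Claude Code (assumes npx is available):

# Opens an interactive UI for exercising the server's tools
npx @modelcontextprotocol/inspector node dist/index.js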

Claude Code Configuration

Method 1: Using CLI (Recommended)

# Add to local scope (current project)
claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js

# Add to user scope (all projects)
claude mcp add --transport stdio ollama --scope user -- node /path/to/ollama-mcp-server/dist/index.js

To add environment variables:

claude mcp add --transport stdio ollama \
  --env OLLAMA_BASE_URL=http://localhost:11434 \
  -- node /path/to/ollama-mcp-server/dist/index.js

Method 2: Manual Configuration

Project scope (.mcp.json in project root):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp-server/dist/index.js"],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

User scope (~/.claude.json):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp-server/dist/index.js"],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

Verify Installation

# List configured MCP servers
claude mcp list

# Inside Claude Code
/mcp

Auto-approve Tool Calls (Optional)

By default, Claude Code asks for confirmation each time an Ollama tool is called. To skip confirmations, add the following to ~/.claude/settings.json:

{
  "permissions": {
    "allow": [
      "mcp__ollama__ollama_generate",
      "mcp__ollama__ollama_chat",
      "mcp__ollama__ollama_list",
      "mcp__ollama__ollama_show",
      "mcp__ollama__ollama_pull",
      "mcp__ollama__ollama_embeddings"
    ]
  }
}

Environment Variables

Variable          Default                  Description
OLLAMA_BASE_URL   http://localhost:11434   Ollama server URL
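
For example, to use an Ollama instance running on another machine, set the variable when registering the server (the address below is illustrative):

# Point the MCP server at a remote Ollama host
claude mcp add --transport stdio ollama \
  --env OLLAMA_BASE_URL=http://192.168.1.50:11434 \
  -- node /path/to/ollama-mcp-server/dist/index.js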

Usage Examples

From Claude Code:

List Models

List available Ollama models

Text Generation

Generate "3 features of Rust" using Ollama's llama3.2 model

Chat

I'd like to have Ollama do a code review

Vision / Image Analysis

Analyze this image using llava: /path/to/image.jpg
Use deepseek-ocr to extract text from this document: /path/to/document.png
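
Under the hood, images are passed to Ollama as base64-encoded strings. The direct Ollama API equivalent (assuming llava is already pulled) looks like:

# Multimodal generation: image content goes in the "images" array
curl http://localhost:11434/api/generate -d '{
  "model": "llava",
  "prompt": "Describe this image",
  "images": ["<base64-encoded image data>"],
  "stream": false
}'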

Troubleshooting

Cannot connect to Ollama

# Check if Ollama is running
curl http://localhost:11434/api/tags

# If not running
ollama serve

No models available

ollama pull llama3.2

MCP server not showing up

# Verify server is registered
claude mcp list

# Check server health
claude mcp get ollama
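
If the entry looks misconfigured, remove it and register it again:

# Remove the stale registration, then re-add it
claude mcp remove ollama
claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js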

License

MIT


