Local LLM MCP Server

A Model Context Protocol (MCP) server that bridges local LLMs running in LM Studio with Claude Desktop and other MCP clients. Keep your sensitive data private by running AI tasks locally while seamlessly integrating with cloud-based AI assistants.

🌟 Features

🔒 Privacy-First Design

  • Local Processing: All sensitive data stays on your machine
  • No Cloud Exposure: Private analysis, code review, and content processing happens locally
  • Privacy Levels: Configurable privacy protection (strict, moderate, minimal)
  • No Telemetry: Zero usage tracking or data collection

🤖 Dynamic Multi-Model Support

  • Auto-Discovery: Automatically detects all models loaded in LM Studio
  • Flexible Selection: Use different models for different tasks
  • Runtime Switching: Change default models during your session
  • Per-Request Override: Specify model for individual requests
  • Smart Initialization: First available model auto-selected as default
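
The discovery flow above can be pictured against the OpenAI-compatible `/v1/models` endpoint that LM Studio exposes. A minimal sketch (the `pickDefaultModel` helper is illustrative, not this server's actual code) of how the first listed model becomes the default:

```typescript
// Sketch: pick a default model from an LM Studio /v1/models response.
// The response shape follows the OpenAI-compatible API LM Studio serves;
// pickDefaultModel is an illustrative helper, not this server's real code.
interface ModelsResponse {
  data: { id: string }[];
}

function pickDefaultModel(response: ModelsResponse): string | null {
  // "Smart Initialization": the first available model becomes the default.
  return response.data.length > 0 ? response.data[0].id : null;
}

// Example response as LM Studio might return it:
const sample: ModelsResponse = {
  data: [{ id: "llama-3.2-3b-instruct" }, { id: "qwen2.5-7b-instruct" }],
};
console.log(pickDefaultModel(sample)); // "llama-3.2-3b-instruct"
```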

🛠️ Comprehensive Tool Suite

Local Reasoning - General-purpose AI tasks with complete privacy

  • Complex problem solving and multi-step reasoning
  • Question answering and task planning
  • Context-aware responses

Private Analysis - 7 analysis types for sensitive content, including:

  • Sentiment Analysis (domain-aware)
  • Entity Extraction (people, orgs, locations, domain-specific)
  • Content Classification
  • Summarization with key points
  • Privacy Scanning (PII, GDPR compliance)
  • Security Auditing (vulnerabilities, misconfigurations)
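
As a rough illustration of what a local privacy scan does, a first pass might flag email addresses and phone-like patterns before anything leaves the machine. This is a toy sketch, not the server's actual scanner:

```typescript
// Toy PII-scan sketch: flag email addresses and US-style phone numbers.
// Illustrative only -- the server's real privacy scanner is more thorough.
function scanForPII(text: string): string[] {
  const findings: string[] = [];
  const emailRe = /[\w.+-]+@[\w-]+\.[\w.]+/g;
  const phoneRe = /\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b/g;
  for (const m of text.match(emailRe) ?? []) findings.push(`email: ${m}`);
  for (const m of text.match(phoneRe) ?? []) findings.push(`phone: ${m}`);
  return findings;
}

// Flags both the email address and the phone number:
console.log(scanForPII("Contact jane.doe@example.com or 555-867-5309."));
```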

Secure Rewriting - Transform text while maintaining privacy

  • Style adaptation (formal, casual, professional)
  • Sensitive information removal
  • Privacy-preserving transformations

Code Analysis - Local code review and security

  • Security vulnerability detection
  • Code quality assessment
  • Bug detection and optimization suggestions

Template Completion - Intelligent form and document filling
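
Template completion can be pictured as filling placeholder slots, where the local model supplies each value. In this toy sketch the values are given directly, and the `{{field}}` syntax is an assumption for illustration, not the server's documented format:

```typescript
// Toy sketch of template filling: replace {{field}} placeholders with values.
// In the real tool a local model generates the values; here they are supplied.
function fillTemplate(template: string, values: Record<string, string>): string {
  // Unknown placeholders are left intact rather than dropped.
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) => values[key] ?? match);
}

console.log(fillTemplate("Dear {{name}}, your appointment is on {{date}}.", {
  name: "Alex",
  date: "Friday",
}));
// "Dear Alex, your appointment is on Friday."
```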

🎯 Domain-Specific Intelligence

Specialized analysis for:

  • Medical: Healthcare context, HIPAA compliance, clinical terminology
  • Legal: Legal terminology, regulatory compliance, confidentiality
  • Financial: Financial regulations, market analysis, data protection
  • Technical: Software development, engineering contexts
  • Academic: Scholarly research, methodology, citations
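
One way domain-specific intelligence can work is by adding domain guidance to the system prompt sent to the local model. A minimal sketch, assuming prompt-level steering (the guidance strings below are illustrative, not the server's actual prompts):

```typescript
// Sketch: map an analysis domain to extra system-prompt guidance.
// The domain names come from the README; the prompt text is illustrative.
type Domain = "medical" | "legal" | "financial" | "technical" | "academic";

const domainGuidance: Record<Domain, string> = {
  medical: "Use clinical terminology and respect HIPAA constraints.",
  legal: "Use precise legal terminology and preserve confidentiality.",
  financial: "Consider financial regulations and data-protection rules.",
  technical: "Assume a software-engineering context.",
  academic: "Follow scholarly conventions for methodology and citations.",
};

function buildSystemPrompt(domain: Domain, base: string): string {
  return `${base}\n${domainGuidance[domain]}`;
}

console.log(buildSystemPrompt("legal", "Analyze the following document."));
```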

🚀 Quick Start

Prerequisites

  1. Node.js 18+

    node --version  # Should be >= 18.0.0
    
  2. LM Studio

    • Download from lmstudio.ai
    • Load at least one model (e.g., Llama 3.2, Qwen, Mistral)
    • Start the local server (Server tab → Start Server)
    • Default URL: http://localhost:1234

Installation

```bash
git clone https://github.com/georgepok/local-llm-mcp-server.git
cd local-llm-mcp-server
npm install
npm run build
```

Configure Claude Desktop

Edit your Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Add this configuration:

```json
{
  "mcpServers": {
    "local-llm": {
      "command": "node",
      "args": ["/absolute/path/to/local-llm-mcp-server/dist/index.js"]
    }
  }
}
```

Important: Use the absolute path to your installation.

Start Using

  1. Restart Claude Desktop - The server starts automatically
  2. Discover Models - Read resource local://models to see available models
  3. Try It - Ask Claude to use the local_reasoning tool with a simple prompt

The server automatically:

  • Discovers all models loaded in LM Studio
  • Sets the first model as default
  • Provides full capability documentation via local://capabilities
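
Reading `local://models` uses the standard MCP `resources/read` method. A sketch of the JSON-RPC request an MCP client sends (the `id` is arbitrary):

```typescript
// Sketch: the JSON-RPC request an MCP client sends to read local://models.
// Method and params follow the MCP resources/read specification.
const readModelsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "resources/read",
  params: { uri: "local://models" },
};

console.log(JSON.stringify(readModelsRequest));
```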

Convenience Scripts

For easier server management, use the included scripts:

```bash
# Start in local mode (stdio - for Claude Desktop)
npm run start:local

# Start in remote mode (HTTP - for network access)
npm run start:remote

# Start in secure mode (HTTPS - encrypted network access)
npm run start:https

# Start in dual mode (stdio + HTTP for both local and remote)
npm run start:dual

# Generate SSL certificates for HTTPS
npm run generate:certs

# Stop all running servers
npm run stop
```

See SCRIPTS_GUIDE.md for detailed usage.

🌐 Remote Network Access

Access the server from other devices on your home network or connect Claude Desktop remotely!

Connect Claude Desktop Remotely

Quick Start (3 steps):

# 1. Start HTTPS server
npm run start:https

# 2. Add to claude_desktop_config.json:
{
  "mcpServers": {
    "local-llm-remote": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://localhost:301

Tools (5)

  • local_reasoning - General-purpose AI tasks with complete privacy, including complex problem solving and task planning.
  • private_analysis - Performs 7 analysis types including sentiment, entity extraction, classification, and privacy scanning.
  • secure_rewriting - Transforms text style or removes sensitive information while maintaining privacy.
  • code_analysis - Local code review for security vulnerabilities, quality assessment, and bug detection.
  • template_completion - Intelligent form and document filling using local models.
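
MCP clients invoke any of these tools through the standard `tools/call` method. A sketch of the request shape (the `prompt` argument name is an assumption about this server's tool schema, not confirmed by the README):

```typescript
// Sketch: a JSON-RPC tools/call request for the local_reasoning tool.
// Method, name, and arguments follow the MCP tools/call specification;
// the "prompt" argument name is an assumed example for this server.
const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "local_reasoning",
    arguments: { prompt: "Summarize the attached project plan." },
  },
};

console.log(JSON.stringify(callRequest));
```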


Try it

  • Use the local_reasoning tool to analyze this sensitive project plan without sending it to the cloud.
  • Run a private_analysis on this document to check for GDPR compliance and PII.
  • Perform a local code_analysis on this snippet to find security vulnerabilities.
  • Rewrite this professional email using the secure_rewriting tool to remove all specific names and locations.
  • Check local://models to see which LM Studio models are currently available for reasoning.

Frequently Asked Questions

What are the key features of Local LLM MCP Server?

  • Auto-discovery of all models loaded in LM Studio for dynamic runtime switching.
  • Privacy-first design with configurable protection levels (strict, moderate, minimal).
  • Domain-specific intelligence for medical, legal, financial, and technical contexts.
  • Multi-mode operation supporting stdio, HTTP, and encrypted HTTPS connections.
  • Comprehensive tool suite for reasoning, analysis, rewriting, and code auditing.

What can I use Local LLM MCP Server for?

  • Enterprises needing to perform AI analysis on sensitive data without cloud exposure.
  • Developers requiring local code reviews and security audits for proprietary source code.
  • Healthcare or legal professionals processing documents with strict HIPAA or confidentiality requirements.
  • Users wanting to leverage local GPU power for AI tasks within the Claude Desktop interface.

How do I install Local LLM MCP Server?

Install Local LLM MCP Server by running: git clone https://github.com/georgepok/local-llm-mcp-server.git && cd local-llm-mcp-server && npm install && npm run build

What MCP clients work with Local LLM MCP Server?

Local LLM MCP Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
