JIT Tool Synthesis MCP Server

Local setup required. This server must be cloned and built on your machine before you register it in Claude Code.
Step 1: Set the server up locally

Run this once, from inside the cloned repository, to install dependencies and build the server before adding it to Claude Code.

Run in terminal
npm install
npm run build
Step 2: Register it in Claude Code

After the local setup is done, run this command to point Claude Code at the built server.

Run in terminal
claude mcp add -e "LLM_API_KEY=${LLM_API_KEY}" jit-tool-synthesis -- node "<FULL_PATH_TO_JIT_TOOL_SYNTHESIS>/dist/index.js"

Replace <FULL_PATH_TO_JIT_TOOL_SYNTHESIS> with the full path to the folder you prepared in step 1.

Required: LLM_API_KEY (plus 2 optional variables; see Environment Variables below)

JIT Tool Synthesis v3

LLM-powered on-demand tool generation with human-in-the-loop approval and safe execution.

Overview

This system generates TypeScript tools dynamically using an LLM, requires human approval before execution, and runs them in a sandboxed environment.
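The sandboxed execution described above can be sketched with Node's built-in vm module. This is a minimal illustration, not the actual sandbox.ts, which may use stricter isolation and a different interface:

```typescript
import vm from "node:vm";

// Minimal sketch: evaluate generated tool source in a fresh VM context
// that exposes only `input` and `result`, with a hard timeout.
function runInSandbox(source: string, input: unknown): unknown {
  const context: Record<string, unknown> = { input, result: undefined };
  vm.createContext(context); // contextify so the code sees only these globals
  vm.runInContext(source, context, { timeout: 1000 });
  return context.result;
}

const doubled = runInSandbox("result = input * 2", 21); // → 42
```

Note that node:vm alone is not a hardened security boundary; a production sandbox typically adds pattern blocking and resource limits, as described under Security.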

Architecture

┌─────────────┐     ┌──────────────┐     ┌─────────────┐
│ Synthesizer │────▶│   Approval   │────▶│  Sandbox    │
│   (LLM)     │     │ (Human Gate) │     │ (Execution) │
└─────────────┘     └──────────────┘     └─────────────┘
       │                   │                    │
       ▼                   ▼                    ▼
  Generates TS        Waits for            Runs in
  tool code          human approval       isolated env

Components

File            Purpose
synthesizer.ts  Generates tool code using any OpenAI-compatible LLM
approval.ts     Human-in-the-loop gate — requires approval before execution
sandbox.ts      Safe execution environment for generated code
registry.ts     Tool persistence and storage
server.ts       MCP server integration
config.ts       Runtime configuration management

Provider-Agnostic

This tool works with any OpenAI-compatible LLM API:

  • OpenRouter — 100+ models (Claude, GPT, Llama, etc.)
  • OpenAI — GPT-4o and other GPT models
  • Ollama — Local models (Llama, Qwen, etc.)
  • LM Studio — Local models with GUI
  • Groq — Fast inference
  • Any other OpenAI-compatible API

Setup

# Install dependencies
npm install

# Copy environment template
cp .env.example .env

Configure Your LLM Provider

Edit .env with your provider details:

# Option 1: OpenRouter (default - 100+ models)
LLM_API_KEY=your-openrouter-key
LLM_BASE_URL=https://openrouter.ai/api/v1
LLM_MODEL=anthropic/claude-3-5-sonnet-20241022

# Option 2: OpenAI direct
LLM_API_KEY=sk-...
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL=gpt-4o

# Option 3: Ollama (local)
LLM_BASE_URL=http://localhost:11434/v1
LLM_MODEL=llama3.1

# Option 4: Groq
LLM_API_KEY=gsk_...
LLM_BASE_URL=https://api.groq.com/openai/v1
LLM_MODEL=llama-3.1-70b-versatile
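At startup, the server resolves these variables roughly as follows. This is a sketch using the names and defaults from this README's Environment Variables table; the actual config.ts may differ:

```typescript
// Resolve provider settings from the environment, falling back to the
// documented defaults (OpenRouter endpoint, Claude 3.5 Sonnet model).
const baseURL = process.env.LLM_BASE_URL ?? "https://openrouter.ai/api/v1";
const apiKey = process.env.LLM_API_KEY ?? ""; // may stay empty for local providers (Ollama, LM Studio)
const model = process.env.LLM_MODEL ?? "anthropic/claude-3-5-sonnet-20241022";
```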

Usage

Start the MCP Server

npm run build
node dist/server.js

Runtime Configuration

You can change the LLM provider without restarting:

# View current config
get_config

# Change model at runtime
set_config model=openai/gpt-4o

MCP Tools

Tool Description
synthesize_tool Generate a new tool from natural language
approve_tool Activate a pending tool
reject_tool Discard a pending tool
execute_tool Run an approved tool
list_generated_tools List all approved tools
get_tool View tool details
remove_tool Delete a tool
list_pending List tools waiting for approval
get_config View LLM configuration
set_config Change LLM provider/model at runtime
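A client invokes any of these through the standard MCP tools/call request. A hypothetical payload for synthesize_tool is shown below; the argument name "description" is an assumption, so check the tool's published input schema:

```typescript
// Hypothetical JSON-RPC payload for calling synthesize_tool over MCP.
// The "description" argument name is an assumption, not taken from this README.
const synthesizeRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "synthesize_tool",
    arguments: { description: "create a weather fetcher for a given city" },
  },
};
```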

Workflow

  1. Request — User asks for a tool (e.g., "create a weather fetcher")
  2. Synthesize — LLM generates tool code
  3. Approve — Human reviews and approves the code
  4. Execute — Tool runs in sandboxed environment
  5. Store — Approved tools persist in registry
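The approval lifecycle above can be sketched as a small state machine. This is a hypothetical illustration; the real registry.ts and approval.ts likely track more state:

```typescript
// Hypothetical sketch of the pending -> approved lifecycle.
type ToolStatus = "pending" | "approved" | "rejected";

interface GeneratedTool {
  id: string;
  code: string;
  status: ToolStatus;
}

class ToolRegistry {
  private tools = new Map<string, GeneratedTool>();

  addPending(id: string, code: string): void {
    this.tools.set(id, { id, code, status: "pending" });
  }

  approve(id: string): void {
    const tool = this.tools.get(id);
    if (tool) tool.status = "approved";
  }

  reject(id: string): void {
    const tool = this.tools.get(id);
    if (tool) tool.status = "rejected";
  }

  // Only approved tools are eligible for execution.
  canExecute(id: string): boolean {
    return this.tools.get(id)?.status === "approved";
  }
}

const registry = new ToolRegistry();
registry.addPending("weather-fetcher", "result = input");
const beforeApproval = registry.canExecute("weather-fetcher"); // false
registry.approve("weather-fetcher");
const afterApproval = registry.canExecute("weather-fetcher"); // true
```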

Environment Variables

Variable      Description                 Default
LLM_API_KEY   API key for your provider   none (required for cloud)
LLM_BASE_URL  API endpoint                https://openrouter.ai/api/v1
LLM_MODEL     Model to use                anthropic/claude-3-5-sonnet-20241022

Also supported (legacy): OPENROUTER_API_KEY, OPENAI_API_KEY, OPENAI_BASE_URL, SYNTHESIZER_MODEL

Security

  • Generated code runs in isolated VM sandbox
  • Blocked patterns prevent dangerous code (process, require, eval, etc.)
  • API keys not stored in config file
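The blocked-pattern check can be illustrated with a simple regex scan. The actual pattern list in sandbox.ts is not shown in this README, so the patterns below are assumptions based on the examples given:

```typescript
// Assumed pattern list based on the examples above (process, require, eval);
// the real sandbox.ts may block more.
const BLOCKED_PATTERNS: RegExp[] = [
  /\bprocess\b/, // host process access
  /\brequire\s*\(/, // module loading
  /\beval\s*\(/, // dynamic code evaluation
  /\bchild_process\b/, // spawning subprocesses (an assumed addition)
];

function isCodeSafe(source: string): boolean {
  return !BLOCKED_PATTERNS.some((pattern) => pattern.test(source));
}

const safe = isCodeSafe("result = input * 2"); // true
const unsafe = isCodeSafe("require('fs').rmSync('/')"); // false
```

A scan like this runs before the code ever reaches the sandbox, so obviously dangerous tools are rejected without being executed.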

Status

Production Ready — Phase 1 complete.



Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "jit-tool-synthesis": {
      "command": "node",
      "args": ["/path/to/jit-tool-synthesis/dist/server.js"],
      "env": {
        "LLM_API_KEY": "your-key-here"
      }
    }
  }
}

Try it

Create a tool that fetches the current weather for a given city.
List all the tools I have generated so far.
Approve the pending tool with ID 123.
Execute the weather fetcher tool for New York.
Change the LLM model to gpt-4o.

Frequently Asked Questions

What are the key features of JIT Tool Synthesis?

  • Dynamic TypeScript tool generation using LLMs
  • Human-in-the-loop approval gate for security
  • Isolated VM sandbox for safe code execution
  • Provider-agnostic support for OpenAI-compatible APIs
  • Runtime configuration management for LLM switching

What can I use JIT Tool Synthesis for?

  • Rapidly prototyping custom automation scripts via natural language
  • Extending Claude's capabilities with persistent, user-defined tools
  • Safely executing generated code in an isolated environment
  • Managing a library of custom tools for repetitive tasks

How do I install JIT Tool Synthesis?

Install JIT Tool Synthesis by running: npm install && npm run build

What MCP clients work with JIT Tool Synthesis?

JIT Tool Synthesis works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
