HistGradientBoostingClassifier MCP Server


Add it to Claude Code

Run this in a terminal:
claude mcp add --transport http histgradientboosting-classifier https://web-production-a620a.up.railway.app

Train, predict, and manage sklearn's HistGradientBoostingClassifier models.


A Model Context Protocol (MCP) server that provides tools for training, predicting, and managing sklearn's HistGradientBoostingClassifier models.

Features

This MCP server exposes the following tools:

  • create_classifier: Create a new HistGradientBoostingClassifier with custom parameters
  • train_model: Train a classifier on provided data
  • predict: Make class predictions on new data
  • predict_proba: Get class probabilities for predictions
  • score_model: Evaluate model accuracy on test data
  • get_model_info: Get detailed information about a model
  • list_models: List all available models
  • delete_model: Remove a model from memory
  • save_model: Serialize a model to base64 string
  • load_model: Load a model from serialized string

Installation

pip install -r requirements.txt

Local Development

Run the server locally:

uv run --with mcp server.py

The server will start on http://localhost:8000 by default.

Railway Deployment

Prerequisites

  1. A Railway account (sign up at https://railway.app)
  2. Railway CLI installed (optional, can use web interface)
  3. Git repository with your code (GitHub, GitLab, or Bitbucket)

Deploy via Railway Web Interface

  1. Go to https://railway.app and create a new project
  2. Click "New Project" → "Deploy from GitHub repo"
  3. Select your repository containing this MCP server
  4. Railway will automatically detect the Python project and use the Procfile
  5. The server will be deployed and you'll get a public URL (e.g., https://your-app.railway.app)

Deploy via Railway CLI

# Install Railway CLI
npm i -g @railway/cli

# Login to Railway
railway login

# Initialize project (in your project directory)
railway init

# Link to existing project or create new one
railway link

# Deploy
railway up

Environment Variables

No environment variables are required for basic operation. Railway automatically provides:

  • PORT: The port your application should listen on

The server automatically binds to 0.0.0.0 to accept external connections.
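The PORT handling above typically looks like the following in the server's startup code (a minimal sketch; the actual `server.py` may differ):

```python
import os

# Railway injects PORT at runtime; fall back to 8000 for local development
port = int(os.environ.get("PORT", "8000"))
host = "0.0.0.0"  # bind on all interfaces so Railway's proxy can reach the app

print(f"listening on {host}:{port}")
```
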

Verifying Deployment

Once deployed, your MCP server will be available at your Railway URL. You can test it by:

  1. Visiting https://your-app.railway.app in a browser (should show MCP server info or 404, which is normal)
  2. Using the MCP Inspector: npx -y @modelcontextprotocol/inspector and connecting to your Railway URL
  3. Connecting from an MCP client using the streamable-http transport

Current Deployment URL: https://web-production-a620a.up.railway.app

Usage

Once deployed, the MCP server will be accessible at your Railway URL. You can connect to it using any MCP-compatible client.

Example: Using with Claude Desktop

Add to your Claude Desktop MCP configuration (~/Library/Application Support/Claude/claude_desktop_config.json on Mac):

{
  "mcpServers": {
    "histgradientboosting": {
      "url": "https://web-production-a620a.up.railway.app",
      "transport": "streamable-http"
    }
  }
}

Example API Calls

The server exposes tools that can be called via MCP protocol. Here's what each tool does:

Create a classifier:

create_classifier(
    model_id="my_model",
    learning_rate=0.1,
    max_iter=100,
    max_leaf_nodes=31
)

Train the model:

train_model(
    model_id="my_model",
    X=[[1, 2], [3, 4], [5, 6]],
    y=[0, 1, 0]
)

Make predictions:

predict(
    model_id="my_model",
    X=[[2, 3], [4, 5]]
)

Get probabilities:

predict_proba(
    model_id="my_model",
    X=[[2, 3], [4, 5]]
)

Model Storage

Currently, models are stored in-memory. This means:

  • Models persist only during the server's lifetime
  • Restarting the server will lose all models
  • For production use, consider implementing persistent storage (database, file system, or cloud storage)
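One simple workaround is to round-trip the save_model output to disk between sessions. Assuming save_model wraps a pickle in base64 text (the README doesn't specify the exact encoding, so treat this as an illustration):

```python
import base64
import pickle
import tempfile
from pathlib import Path

def serialize(model) -> str:
    """Mimic save_model: pickle the object and wrap it in base64 text."""
    return base64.b64encode(pickle.dumps(model)).decode("ascii")

def deserialize(blob: str):
    """Mimic load_model: reverse the base64 + pickle wrapping."""
    return pickle.loads(base64.b64decode(blob.encode("ascii")))

# Stand-in for a trained classifier; any picklable object round-trips the same way
model = {"params": {"learning_rate": 0.1}, "classes": [0, 1]}

path = Path(tempfile.gettempdir()) / "my_model.b64"
path.write_text(serialize(model))          # persist before shutdown
restored = deserialize(path.read_text())   # reload after a restart
```
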

API Reference

HistGradientBoostingClassifier Parameters

All standard sklearn HistGradientBoostingClassifier parameters are supported:

  • loss: Loss function (default: 'log_loss')
  • learning_rate: Learning rate/shrinkage (default: 0.1)
  • max_iter: Maximum boosting iterations (default: 100)
  • max_leaf_nodes: Maximum leaves per tree (default: 31)
  • max_depth: Maximum tree depth (default: None)
  • min_samples_leaf: Minimum samples per leaf (default: 20)
  • l2_regularization: L2 regularization (default: 0.0)
  • max_features: Feature subsampling proportion (default: 1.0)
  • max_bins: Maximum histogram bins (default: 255)
  • early_stopping: Enable early stopping (default: 'auto')
  • validation_fraction: Validation set fraction (default: 0.1)
  • n_iter_no_change: Early stopping patience (default: 10)
  • random_state: Random seed (default: None)
  • verbose: Verbosity level (default: 0)

See the [sklearn documentation](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.HistGradientBoostingClassifier.html) for full parameter details.


Try it

  • Create a new HistGradientBoostingClassifier named 'sales_model' with a learning rate of 0.05.
  • Train the 'sales_model' using the provided dataset X and target labels y.
  • Predict the class for the new input data [[10, 20], [30, 40]] using 'sales_model'.
  • Evaluate the accuracy of 'sales_model' using my test dataset.
  • List all currently loaded models and get details for 'sales_model'.

Frequently Asked Questions

What are the key features of HistGradientBoostingClassifier MCP Server?

  • Create and configure HistGradientBoostingClassifier instances
  • Train models on provided datasets
  • Perform class predictions and probability estimations
  • Evaluate model performance with scoring tools
  • Serialize and deserialize models for persistence

What can I use HistGradientBoostingClassifier MCP Server for?

  • Rapid prototyping of machine learning models within an AI assistant workflow
  • Automated model training and evaluation pipelines
  • Integrating scikit-learn classification capabilities into chat-based interfaces
  • Managing multiple model versions in-memory during data analysis sessions

How do I install HistGradientBoostingClassifier MCP Server?

Install HistGradientBoostingClassifier MCP Server by running: pip install -r requirements.txt

What MCP clients work with HistGradientBoostingClassifier MCP Server?

HistGradientBoostingClassifier MCP Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
