Hue MCP Server

An MCP (Model Context Protocol) server that exposes HueClientRest functionality, allowing AI assistants to interact with Hadoop Hue for executing SQL queries and managing HDFS files.

What is This?

This server enables AI assistants (like GitHub Copilot, Claude Desktop, or other MCP-compatible clients) to:

  • Execute SQL queries on Hadoop Hue using Hive, SparkSQL, or Impala
  • Manage HDFS files (list, upload, download)
  • Export query results to CSV files
  • Browse and manage directory structures

The Model Context Protocol (MCP) is an open standard for connecting AI assistants to external tools and data sources, making them more powerful and context-aware.

Features

  • SQL Query Execution: Execute queries using Hive, SparkSQL, or Impala dialects
  • Result Export: Save query results to CSV files with automatic retry on large datasets
  • HDFS Operations: List, upload, and download files from HDFS
  • Directory Management: Check directory existence and browse file structures
  • Robust Error Handling: Built-in retry mechanisms and detailed error reporting
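
The retry behavior mentioned above can be sketched as a small backoff wrapper. The function name `with_retry` and its parameters are hypothetical illustrations, not the server's actual API; the real retry logic for large result sets may differ.

```python
import time

def with_retry(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on failure.

    A generic sketch of the retry pattern; not the server's actual code.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Demonstration: a function that fails twice, then succeeds
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = with_retry(flaky, attempts=3, base_delay=0.01)
print(result)  # ok
```

Exponential backoff (doubling the delay each attempt) is a common choice here because transient load on a busy Hue server often clears within a few seconds.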

Prerequisites

Before installing this MCP server, you need:

  1. Python 3.10 or higher - Download Python
  2. Astral uv - Fast Python package installer and environment manager
  3. Visual Studio Code - For MCP integration with GitHub Copilot
  4. GitHub Copilot subscription - Required for VS Code MCP integration
  5. Access to a Hadoop Hue server - You'll need the host URL, username, and password

Dependencies

This project uses the following key dependencies:

  • Astral uv - An extremely fast Python package and project manager, written in Rust. It's 10-100x faster than pip and offers more robust dependency resolution.
  • mcp[cli] - The official Python SDK for the Model Context Protocol, including CLI tools
  • hueclientrest - Python client library for interacting with Hadoop Hue REST API
  • pydantic - Data validation using Python type annotations

Installation

Step 1: Install Astral uv

uv is a modern, fast Python package manager that we use for dependency management.

On Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

On macOS/Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

After installation, restart your terminal or add uv to your PATH as instructed by the installer.

Verify installation:

uv --version

Step 2: Clone and Install the Project

# Clone the repository
git clone <your-repo-url>
cd hueclientrest-mpc

# Install dependencies and create virtual environment
uv sync

The uv sync command will:

  • Create a virtual environment (.venv)
  • Install all dependencies from pyproject.toml
  • Set up the project for development

Alternative: Using pip

If you prefer pip over uv:

pip install -e .

However, uv is strongly recommended for better performance and dependency management.

Configuration

Environment Variables

The server requires the following environment variables to connect to your Hue server:

Variable          Required  Description
HUE_HOST          Yes       Hue server URL (e.g., https://hue.example.com)
HUE_USERNAME      Yes       Username for Hue authentication
HUE_PASSWORD      Yes       Password for Hue authentication
HUE_VERIFY_SSL    No        Verify SSL certificates (default: true)
HUE_SSL_WARNINGS  No        Show SSL warnings (default: false)
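
For illustration, the required/optional split in the table can be enforced with a few lines of standard-library Python at startup. This is a sketch, not the server's actual configuration code; `load_hue_config` is a hypothetical name.

```python
import os

def load_hue_config(environ=os.environ):
    """Read the HUE_* variables, failing fast if a required one is missing."""
    required = ("HUE_HOST", "HUE_USERNAME", "HUE_PASSWORD")
    missing = [name for name in required if not environ.get(name)]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    truthy = {"1", "true", "yes"}
    return {
        "host": environ["HUE_HOST"],
        "username": environ["HUE_USERNAME"],
        "password": environ["HUE_PASSWORD"],
        # Optional flags fall back to the defaults from the table above
        "verify_ssl": environ.get("HUE_VERIFY_SSL", "true").lower() in truthy,
        "ssl_warnings": environ.get("HUE_SSL_WARNINGS", "false").lower() in truthy,
    }

config = load_hue_config({
    "HUE_HOST": "https://hue.example.com",
    "HUE_USERNAME": "alice",
    "HUE_PASSWORD": "secret",
})
print(config["verify_ssl"])  # True (the documented default)
```

Failing fast at startup with a clear message is preferable to a connection error deep inside a query call.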

Setting Up Environment Variables

Option 1: Using .env file (Recommended for local development)

# Create a .env file in the project root
HUE_HOST=https://your-hue-server.com
HUE_USERNAME=your_username
HUE_PASSWORD=your_password
HUE_VERIFY_SSL=true
HUE_SSL_WARNINGS=false
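
For clarity about what the file format above implies, here is a minimal sketch of parsing such a .env file; `parse_env_file` is a hypothetical helper, and it deliberately ignores quoting and other edge cases.

```python
def parse_env_file(path):
    """Parse simple KEY=VALUE lines from a .env file into a dict."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines, comments, and malformed lines
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# Demonstration with a throwaway file
import tempfile
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write("HUE_HOST=https://your-hue-server.com\n")
    fh.write("# a comment\n")
    fh.write("HUE_VERIFY_SSL=true\n")
    env_path = fh.name

settings = parse_env_file(env_path)
print(settings["HUE_HOST"])  # https://your-hue-server.com
```

In practice a maintained loader such as python-dotenv handles quoting, interpolation, and export syntax; this sketch exists only to show the file format.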

Option 2: System environment variables

On Windows (PowerShell):

$env:HUE_HOST="https://your-hue-server.com"
$env:HUE_USERNAME="your_username"
$env:HUE_PASSWORD="your_password"

On macOS/Linux:

export HUE_HOST="https://your-hue-server.com"
export HUE_USERNAME="your_username"
export HUE_PASSWORD="your_password"

VS Code Integration with GitHub Copilot

Prerequisites for VS Code Integration

  1. Visual Studio Code - Download VS Code
  2. GitHub Copilot extension - Install from VS Code marketplace
  3. GitHub Copilot subscription - Required for MCP support
  4. This MCP server installed and configured

Step 1: Locate Your MCP Configuration File

The MCP configuration file location depends on your operating system:

  • Windows: %APPDATA%\Code\User\mcp.json
    • Full path: C:\Users\<YourUsername>\AppData\Roaming\Code\User\mcp.json
  • macOS: ~/Library/Application Support/Code/User/mcp.json
  • Linux: ~/.config/Code/User/mcp.json
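
A minimal mcp.json entry for this server might look like the following. The "servers"/"type" layout assumed here matches current VS Code MCP configuration but may vary by VS Code version, and the directory path and credentials are placeholders to replace with your own:

```json
{
  "servers": {
    "hue": {
      "type": "stdio",
      "command": "uv",
      "args": ["--directory", "/path/to/hueclientrest-mpc", "run", "hue-mcp-server"],
      "env": {
        "HUE_HOST": "https://hue.example.com",
        "HUE_USERNAME": "your_username",
        "HUE_PASSWORD": "your_password"
      }
    }
  }
}
```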

Tools (5)

  • execute_query - Execute SQL queries on Hadoop Hue using Hive, SparkSQL, or Impala dialects
  • export_to_csv - Save query results to CSV files with automatic retry on large datasets
  • list_hdfs_files - List files and browse directory structures in HDFS
  • upload_hdfs_file - Upload files from local system to HDFS
  • download_hdfs_file - Download files from HDFS to local system

Claude Desktop Configuration

Add the following to your claude_desktop_config.json:

{
  "mcpServers": {
    "hue": {
      "command": "uv",
      "args": ["--directory", "/path/to/hueclientrest-mpc", "run", "hue-mcp-server"],
      "env": {
        "HUE_HOST": "https://hue.example.com",
        "HUE_USERNAME": "your_username",
        "HUE_PASSWORD": "your_password",
        "HUE_VERIFY_SSL": "true"
      }
    }
  }
}

Try it

  • Run a Hive query to count the number of rows in the production_logs table.
  • List all files in the /user/data/reports directory on HDFS.
  • Execute a SparkSQL query to find the top 10 customers and export the results to a file named top_customers.csv.
  • Download the file /tmp/analysis_results.parquet from HDFS to my local machine.
  • Check if the directory /user/analytics/2023 exists in HDFS.

Frequently Asked Questions

What are the key features of Hue MCP Server?

SQL Query Execution using Hive, SparkSQL, or Impala dialects. HDFS Operations including listing, uploading, and downloading files. Query Result Export to CSV with automatic retry for large datasets. Directory Management and file structure browsing. Robust Error Handling with built-in retry mechanisms.

What can I use Hue MCP Server for?

Data analysts needing to query Hadoop data directly from an AI assistant. Data engineers managing HDFS file structures and transfers via chat. Automating the export of large SQL result sets to CSV for local analysis. Quickly browsing Hadoop directory structures without opening the Hue web UI. Integrating Hadoop data access into AI-driven development workflows.

How do I install Hue MCP Server?

Install Hue MCP Server by running: git clone <your-repo-url> && cd hueclientrest-mpc && uv sync

What MCP clients work with Hue MCP Server?

Hue MCP Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
