RocketRide MCP Server

Add it to Claude Code

Run this in a terminal:
claude mcp add rocketride-server -- docker run -i --rm ghcr.io/rocketride-org/rocketride-engine:latest

Self-hosted, open-source AI pipeline platform for MCP tools

RocketRide is a high-performance data processing engine built on a C++ core with a Python-extensible node system. With 50+ pipeline nodes, native AI/ML support, and SDKs for TypeScript, Python, and MCP, it lets you process, transform, and analyze data at scale — entirely on your own infrastructure.

Key Capabilities

  • Stay in your IDE — Build, debug, test, and scale heavy AI and data workloads with an intuitive visual builder in the environment you're used to, instead of switching to a browser.
  • High-performance C++ engine — Native multithreading. No bottleneck. Purpose-built for throughput, not prototypes.
  • Multi-agent workflows — Orchestrate and scale agents with built-in support for CrewAI and LangChain.
  • 50+ pipeline nodes — Python-extensible, with 13 LLM providers, 8 vector databases, OCR, NER, PII anonymization, and more.
  • TypeScript, Python & MCP SDKs — Integrate pipelines into native applications or expose them as tools for AI assistants.
  • One-click deploy — Run on Docker, on-prem, or RocketRide Cloud (👀coming soon). Our architecture is made for production, not demos.

⚡ Quick Start

  1. Install the extension for your IDE. Search for RocketRide in the extension marketplace:

    Not seeing your IDE? Open an issue · Download directly

  2. Click the RocketRide (🚀) extension in your IDE

  3. Deploy a server — you'll be prompted on how you want to run the server. Choose the option that fits your setup:

    • Local (Recommended) — This pulls the server directly into your IDE without any additional setup.
    • On-Premises — Run the server on your own hardware for full control and data residency. Pull the image and deploy to Docker or clone this repo and build from source.
    • RocketRide Cloud (👀coming soon) — Managed hosting with our proprietary model server. No infrastructure to maintain.
  4. Create a .pipe file and start building

🔧 Building your first pipe

  1. All pipelines use the *.pipe file format. Each pipeline and its configuration is a JSON object, but the extension renders it on our visual builder canvas inside your IDE.

  2. All pipelines begin with a source node: webhook, chat, or dropper. For usage details, examples, and inspiration 💡 on how to build pipelines, check out our guides and documentation.

  3. Connect input and output lanes by type to properly wire your pipeline. Some nodes, such as agents or LLMs, can also be invoked as tools by a parent node.

  4. You can run a pipeline from the canvas by pressing the ▶️ button on the source node or from the Connection Manager directly.

  5. View all available and running pipelines below the Connection Manager. Selecting a running pipeline opens in-depth analytics: trace call trees, token usage, memory consumption, and more to optimize your pipelines before scaling and deploying.

  6. 📦 Deploy your pipelines to RocketRide.ai cloud or run them on your own infrastructure.

    • Docker — Pull the RocketRide server image, create a container, and start it. Requires Docker to be installed.

      docker pull ghcr.io/rocketride-org/rocketride-engine:latest
      docker create --name rocketride-engine -p 5565:5565 ghcr.io/rocketride-org/rocketride-engine:latest
      docker start rocketride-engine
      
    • RocketRide Cloud (👀coming soon) — Managed hosting with our proprietary model server and batched processing. The cheapest option to run AI workflows and pipelines at scale (seriously).

  7. Run your pipelines as standalone processes or integrate them into your existing Python and TypeScript applications with our SDKs.
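Since each .pipe file is a JSON object (step 1), here is a minimal sketch of what one might look like. The exact schema is not documented in this README, so every field name below (`nodes`, `edges`, `type`) is an illustrative assumption, not the real RocketRide format:

```python
import json

# Hypothetical minimal .pipe file: a chat source node wired to an LLM node.
# Field names ("nodes", "edges", "type") are assumptions for illustration,
# not the documented RocketRide schema.
pipe = {
    "name": "hello-pipeline",
    "nodes": [
        {"id": "in", "type": "chat"},                       # source node
        {"id": "llm", "type": "llm", "provider": "openai"}, # processing node
    ],
    "edges": [
        {"from": "in", "to": "llm"},  # wire output lane to input lane
    ],
}

# Write it as JSON so the extension can render it on the canvas.
with open("hello.pipe", "w") as f:
    json.dump(pipe, f, indent=2)
```

Opening the resulting hello.pipe in the IDE extension should render the two nodes on the visual builder canvas, assuming the real schema is close to this shape.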

Tools (1)

execute_pipeline — Executes a defined .pipe file pipeline and returns the processed data.

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "rocketride": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "ghcr.io/rocketride-org/rocketride-engine:latest"]
    }
  }
}
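The example prompts below reference local files (e.g. ./recordings/meeting.mp3). Because this configuration runs the server inside a container, those paths only resolve if the directory is mounted into it. A sketch of the same config with a Docker volume mount added; the in-container mount point /data is an assumption, not a documented RocketRide path:

```json
{
  "mcpServers": {
    "rocketride": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/path/to/your/files:/data",
        "ghcr.io/rocketride-org/rocketride-engine:latest"
      ]
    }
  }
}
```

With a mount like this, prompts would need to use the container-side path (e.g. /data/meeting.mp3) rather than the host path.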

Try it

Run the audio transcription pipeline on the file located at ./recordings/meeting.mp3
Execute the advanced RAG pipeline using the query 'How do I configure the RocketRide engine?'
Process the video file at ./data/demo.mp4 using the frame grabber pipeline
Run the PII anonymization pipeline on the text file in my current directory
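Under the hood, each of these prompts becomes an MCP tools/call request against execute_pipeline, sent as JSON-RPC 2.0 over the server's stdio transport. A minimal sketch of building that payload; the argument names (`pipeline_path`, `input`) are assumptions, since the tool's parameter schema is not shown here:

```python
import json

def make_tool_call(request_id, pipeline_path, input_path):
    """Build an MCP tools/call JSON-RPC request for execute_pipeline.

    The argument names "pipeline_path" and "input" are hypothetical;
    a client should consult the schema returned by tools/list for the
    real parameter names.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "execute_pipeline",
            "arguments": {
                "pipeline_path": pipeline_path,
                "input": input_path,
            },
        },
    }

# The transcription example above, serialized for the stdio transport:
msg = make_tool_call(1, "transcribe.pipe", "./recordings/meeting.mp3")
print(json.dumps(msg))
```

In practice an MCP client (Claude Desktop, Claude Code, etc.) builds and sends this for you; the sketch only shows what crosses the wire.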

Frequently Asked Questions

What are the key features of RocketRide?

High-performance C++ engine with native multithreading. 50+ pipeline nodes including LLM providers and vector databases. Multi-agent workflow orchestration with CrewAI and LangChain support. Visual builder canvas for creating and debugging pipelines. Native MCP SDK support for integration with AI assistants.

What can I use RocketRide for?

Automating complex data transformation and analysis workflows. Building custom RAG pipelines for local document sets. Integrating multi-agent AI workflows directly into IDE environments. Performing large-scale media processing like audio transcription and video frame extraction.

How do I install RocketRide?

Install RocketRide by pulling the server image and creating a container:

  docker pull ghcr.io/rocketride-org/rocketride-engine:latest
  docker create --name rocketride-engine -p 5565:5565 ghcr.io/rocketride-org/rocketride-engine:latest

What MCP clients work with RocketRide?

RocketRide works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
