# ICON MCP v108 MCP Server

This is an MCP (Model Context Protocol) server that provides access to the ICON MCP v108 API. It enables AI agents and LLMs to interact with ICON MCP v108 through standardized tools.

## Features

- 🔧 **MCP Protocol**: Built on the Model Context Protocol for seamless AI integration
- 🌐 **Full API Access**: Provides tools for interacting with ICON MCP v108 endpoints
- 🐳 **Docker Support**: Easy deployment with Docker and Docker Compose
- ⚡ **Async Operations**: Built with FastMCP for efficient async handling

## API Documentation

### Available Tools

This server provides the following tools:

- **example_tool**: Placeholder tool (to be implemented)

> **Note:** Replace `example_tool` with actual ICON MCP v108 API tools based on the API documentation.

## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/icon-mcp-v108-mcp-server.git
   cd icon-mcp-v108-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```

### Using Docker Compose

1. Create a `.env` file with your configuration:

   ```
   PORT=8000
   ```

2. Start the server:

   ```bash
   docker-compose up
   ```
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Usage

### Health Check

Test if the server is running:

```bash
python mcp_health_check.py
```
### Using with CrewAI

```python
import asyncio

from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter


async def main() -> None:
    # Connect to the MCP server
    with create_mcp_adapter(url="http://localhost:8000/mcp/") as tools:
        # List the available tools
        for tool in tools:
            print(f"Available tool: {tool.name}")

        # Example usage: call the placeholder tool
        result = await tools[0].example_tool(query="test")
        print(result)


asyncio.run(main())
```
## Development

### Testing the Server

1. Start the server locally
2. Run the health check: `python mcp_health_check.py`
3. Test individual tools using the CrewAI adapter

### Adding New Tools

To add new tools, edit `server.py` and:

1. Create API client functions for ICON MCP v108 endpoints
2. Add `@mcp.tool()`-decorated functions
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the `capabilities` array

## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server (note that JSON does not allow comments, so the `capabilities` array lists the implemented tool names directly):

```json
{
  "github_url": "https://github.com/Traia-IO/icon-mcp-v108-mcp-server",
  "mcp_server": {
    "name": "icon-mcp-v108-mcp",
    "description": "V108 another icon mcp",
    "server_type": "streamable-http",
    "capabilities": [
      "example_tool"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["icon mcp v108", "api"],
  "ref": "main"
}
```

> **Important:** Always update the `capabilities` array when you add or remove tools!

### Google Cloud Run

This server is designed to be deployed on Google Cloud Run. The deployment will:

1. Build a container from the `Dockerfile`
2. Deploy to Cloud Run with the specified configuration
3. Expose the `/mcp` endpoint for client connections
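A deploy driven by the `deployment_params.json` values above roughly corresponds to this `gcloud` invocation. This is a sketch of the equivalent manual command, not the actual deployment pipeline:

```bash
gcloud run deploy icon-mcp-v108-mcp \
  --source . \
  --project traia-mcp-servers \
  --region us-central1 \
  --set-env-vars STAGE=MAINNET,LOG_LEVEL=INFO
```

`--source .` has Cloud Build produce the container from the repository's `Dockerfile` before deploying it.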

### Environment Variables

- `PORT`: Server port (default: `8000`)
- `STAGE`: Environment stage (default: `MAINNET`; options: `MAINNET`, `TESTNET`)
- `LOG_LEVEL`: Logging level (default: `INFO`)
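Inside `server.py`, these variables would typically be read with the stdlib `os.getenv`, falling back to the documented defaults. This is a sketch of that pattern, not the server's actual code:

```python
import os

# Read configuration from the environment, with the documented defaults
PORT = int(os.getenv("PORT", "8000"))
STAGE = os.getenv("STAGE", "MAINNET")
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")

# Fail fast on an unsupported stage rather than at first request
if STAGE not in ("MAINNET", "TESTNET"):
    raise ValueError(f"Unsupported STAGE: {STAGE!r}")

print(PORT, STAGE, LOG_LEVEL)
```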

## Troubleshooting

1. **Server not starting**: Check Docker logs with `docker logs <container-id>`
2. **Connection errors**: Ensure the server is running on the expected port
3. **Tool errors**: Check the server logs for detailed error messages

## Contributing

1. Fork the repository
2. Create a feature branch
3. Implement new tools or improvements
4. Update the README and `deployment_params.json`
5. Submit a pull request

## License

MIT License


## Configuration

### claude_desktop_config.json

```json
{
  "mcpServers": {
    "icon-mcp-v108": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-p", "8000:8000", "icon-mcp-v108-mcp-server"],
      "env": {
        "PORT": "8000",
        "STAGE": "MAINNET"
      }
    }
  }
}
```


## Frequently Asked Questions

### What are the key features of ICON MCP v108?

It is built on the Model Context Protocol for seamless AI integration, provides tools for interacting with ICON MCP v108 endpoints, deploys easily with Docker and Docker Compose, and handles requests asynchronously via FastMCP.

### What can I use ICON MCP v108 for?

Integrating ICON MCP v108 API capabilities into CrewAI agent workflows, deploying a standardized API interface for LLMs on Google Cloud Run, and asynchronous interaction with ICON blockchain or protocol services via AI agents.

### How do I install ICON MCP v108?

Install ICON MCP v108 by running `git clone https://github.com/Traia-IO/icon-mcp-v108-mcp-server.git && cd icon-mcp-v108-mcp-server && ./run_local_docker.sh`.

### What MCP clients work with ICON MCP v108?

ICON MCP v108 works with any MCP-compatible client, including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
