# MCP Stock Analyzer

A Model Context Protocol (MCP) server for dynamically analyzing global/Indian stocks and local, offline NSE BhavData files using AI.
## Core Capabilities

- Global Stocks: Fetches live prices, price history, and fundamentals, and resolves tickers via yfinance.
- Local BhavData: The AI writes SQL statements on the fly to query large local datasets.
- ⚠️ Critical Warning: Do not use the "Add Context" button (or drag-and-drop) to upload huge BhavData CSVs directly into your chat. This will instantly exhaust your AI's token limits and cost a fortune. Instead, paste the absolute file path (e.g., "Analyze C:\downloads\bhav.csv") as plain text in your prompt, and let this MCP server query it securely in the background.
- Visual Dashboards: The AI creates fully interactive, locally rendered HTML charting dashboards on demand.
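The Local BhavData flow can be sketched with the standard library alone: load the CSV into an in-memory SQLite table, then run whatever SQL the AI generates against it. The sample rows and helper below are illustrative, not the server's actual implementation (real NSE bhav copies carry more columns than shown here):

```python
import csv
import io
import sqlite3

# Hypothetical BhavData sample; real NSE bhav copies have more columns.
SAMPLE = """SYMBOL,SERIES,OPEN,HIGH,LOW,CLOSE,TOTTRDQTY
RELIANCE,EQ,2900.0,2950.0,2890.0,2940.5,1200000
TCS,EQ,3800.0,3825.0,3770.0,3790.2,450000
INFY,EQ,1500.0,1540.0,1495.0,1532.8,800000
"""

def load_bhav_csv(text: str) -> sqlite3.Connection:
    """Load a bhav CSV into an in-memory SQLite table named 'bhav'."""
    conn = sqlite3.connect(":memory:")
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    cols = ", ".join(f'"{c}"' for c in header)
    conn.execute(f"CREATE TABLE bhav ({cols})")
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f"INSERT INTO bhav VALUES ({placeholders})", data)
    return conn

conn = load_bhav_csv(SAMPLE)
# An AI-generated query might look like this: top symbols by closing price.
top = conn.execute(
    "SELECT SYMBOL, CLOSE FROM bhav ORDER BY CAST(CLOSE AS REAL) DESC LIMIT 2"
).fetchall()
print(top)  # [('TCS', '3790.2'), ('RELIANCE', '2940.5')]
```

Because only the query result travels back to the model, a multi-megabyte bhav copy never enters the chat context.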
## System Architecture

```mermaid
graph TD
    User([End User])
    LLM_Client[AI Client / IDE<br/>e.g., Cline, Claude, Ollama]

    subgraph "Python MCP Server (.venv)"
        Server[server.py<br/>FastMCP Entrypoint]
        GlobalMod[global_stocks.py<br/>yfinance API]
        BhavMod[bhavdata_analyzer.py<br/>SQLite & Pandas]
        DashMod[dashboard_generator.py<br/>Chart.js & HTML]
    end

    subgraph "External Providers"
        YFinance[(Yahoo Finance API)]
        LocalDisk[("User's Local C: Drive .csv files")]
        CDN[(Chart.js CDN)]
    end

    User -->|Sends Prompt & Files| LLM_Client
    LLM_Client -->|Calls JSON Tools via stdio| Server
    Server -->|Routes query| GlobalMod
    Server -->|Routes query| BhavMod
    Server -->|Routes query| DashMod
    GlobalMod <-->|Fetches real-time/historical data| YFinance
    BhavMod <-->|Loads/Runs SQL on| LocalDisk
    DashMod -->|Embeds| CDN
    DashMod -->|Outputs temp HTML file to| LocalDisk
    Server -.->|Returns result context| LLM_Client
    LLM_Client -.->|Streams final answer to| User
```
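The "Calls JSON Tools via stdio" edge is plain JSON-RPC 2.0. As a rough sketch, the client writes a request like the following to the server's stdin. The `tools/call` method name comes from the MCP specification; the tool name and its `ticker` argument are assumptions about this server's tool schema:

```python
import json

# Illustrative JSON-RPC 2.0 request an MCP client sends over stdio to invoke
# a tool. The "ticker" argument is a hypothetical parameter for this sketch.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "analyze_global_stocks",
        "arguments": {"ticker": "AAPL"},
    },
}
wire_line = json.dumps(request)  # serialized JSON written to the server's stdin
print(wire_line)
```

The server replies on stdout with a matching `id`, and the client feeds the result back into the model's context.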
## Installation & Quick Setup

To avoid package or import errors, set up an isolated environment:

- Open a terminal and navigate to the project folder (d:\Projects\MCPAgentStockAnalyzer).
- Run these commands to install the dependencies into a .venv:

```shell
python -m venv .venv
.\.venv\Scripts\activate
pip install -r requirements.txt
```

(If you use VS Code, .vscode/settings.json is already configured to select this virtual environment.)
## Connecting to LLM Clients

To let your AI interact with this server, add its configuration to your client's config file. This is the master configuration block you will reuse for every client listed below:

```json
"StockAnalyzer": {
  "command": "d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe",
  "args": ["d:/Projects/MCPAgentStockAnalyzer/src/server.py"]
}
```
### 1. Claude Desktop (Mac / Windows)

- Open the Claude Desktop application.
- Go to Settings > Settings file, or navigate to %APPDATA%\Claude\claude_desktop_config.json.
- Add the StockAnalyzer block above inside the "mcpServers": { ... } object.
- Restart Claude Desktop.
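If you prefer to script that edit, a small helper along these lines can merge the block into a client config file. This is a sketch, not part of the project, and it is demonstrated against a throwaway path rather than your real claude_desktop_config.json:

```python
import json
import tempfile
from pathlib import Path

# The server entry from the master configuration block above.
SERVER_ENTRY = {
    "StockAnalyzer": {
        "command": "d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe",
        "args": ["d:/Projects/MCPAgentStockAnalyzer/src/server.py"],
    }
}

def add_mcp_server(config_path: Path) -> dict:
    """Merge SERVER_ENTRY into the 'mcpServers' object of a client config,
    creating the file or the object if either is missing."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {}).update(SERVER_ENTRY)
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Demo against a throwaway path instead of the real Claude config file.
demo_path = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
merged = add_mcp_server(demo_path)
print("StockAnalyzer" in merged["mcpServers"])  # True
```

Using `setdefault` preserves any MCP servers you have already configured instead of overwriting the whole object.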
### 2. Cline (VS Code Extension)

- In VS Code, click the Cline icon in the sidebar.
- Open the MCP server settings near the bottom of the panel.
- Paste the configuration block directly into your mcp_settings.json.
### 3. Antigravity (Local IDE Agent)

- Open your ~/.gemini/antigravity/ folder (or the active brain/project.gemini folder).
- Edit the mcp_config.json file.
- Drop the StockAnalyzer block into "mcpServers". Keep chatting; the server hot-reloads dynamically.
### 4. GitHub Copilot

GitHub Copilot currently integrates with the Claude and OpenAI engines on newer IDE builds via marketplace extensions. If you use Copilot Chat, rely on an editor such as VS Code or Cursor with extensible tool/plugin settings (similar to Cline's mcp_settings.json) that can bridge custom MCP server definitions.
### 5. Claude Code (CLI)

If you're using Anthropic's claude-code CLI, register the server with its `mcp add` subcommand:

```shell
claude mcp add StockAnalyzer d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe d:/Projects/MCPAgentStockAnalyzer/src/server.py
```
## Running Completely Free (Local LLMs)

You do not need a paid Claude or OpenAI API key to use this server. You can point clients such as Cline or Cursor at a local engine.

### Using Ollama

- Download Ollama.
- Open a terminal and run a fast coder model:

```shell
ollama run qwen2.5-coder:7b
```

- Point Cline's settings to the base URL http://localhost:11434/v1.
### Using LM Studio

- Download LM Studio.
## Tools (3)

- analyze_global_stocks: Fetches live prices, history, fundamentals, and ticker resolution via yfinance.
- query_bhavdata: Executes AI-generated SQL statements to extract data from local NSE BhavData CSV files.
- generate_dashboard: Creates interactive HTML graphing dashboards for visual performance tracking.

## Configuration

```json
{
  "StockAnalyzer": {
    "command": "d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe",
    "args": ["d:/Projects/MCPAgentStockAnalyzer/src/server.py"]
  }
}
```
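For a feel of what generate_dashboard produces, here is a minimal sketch that writes a self-contained HTML page embedding Chart.js from a CDN. The CDN URL, function name, and sample data are assumptions for illustration, not the module's actual code:

```python
import json
import tempfile
from pathlib import Path

# Assumed CDN location; dashboard_generator.py may embed Chart.js differently.
CHART_JS_CDN = "https://cdn.jsdelivr.net/npm/chart.js"

def render_dashboard(title: str, labels: list, values: list) -> str:
    """Return a self-contained HTML page with one Chart.js line chart."""
    return f"""<!DOCTYPE html>
<html><head><title>{title}</title>
<script src="{CHART_JS_CDN}"></script></head>
<body><canvas id="chart"></canvas>
<script>
new Chart(document.getElementById("chart"), {{
  type: "line",
  data: {{ labels: {json.dumps(labels)},
           datasets: [{{ label: {json.dumps(title)}, data: {json.dumps(values)} }}] }}
}});
</script></body></html>"""

html = render_dashboard("RELIANCE close", ["Mon", "Tue", "Wed"],
                        [2901.5, 2940.2, 2933.0])
out = Path(tempfile.gettempdir()) / "dashboard.html"
out.write_text(html)  # open this file in any browser to view the chart
```

Writing the chart data inline with `json.dumps` keeps the page fully local: only the Chart.js library itself is fetched from the CDN when the browser opens the file.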