Founder Intelligence Engine MCP Server


Add it to Claude Code

Run this in a terminal (build the image first with: docker build -t founder-intelligence-mcp .):

claude mcp add -e "SUPABASE_URL=${SUPABASE_URL}" -e "SUPABASE_SERVICE_KEY=${SUPABASE_SERVICE_KEY}" -e "APIFY_API_TOKEN=${APIFY_API_TOKEN}" -e "GROQ_API_KEY=${GROQ_API_KEY}" -e "EMBEDDING_API_URL=${EMBEDDING_API_URL}" -e "EMBEDDING_API_KEY=${EMBEDDING_API_KEY}" founder-intelligence -- docker run -i --rm -e SUPABASE_URL -e SUPABASE_SERVICE_KEY -e APIFY_API_TOKEN -e GROQ_API_KEY -e EMBEDDING_API_URL -e EMBEDDING_API_KEY founder-intelligence-mcp

Required: SUPABASE_URL, SUPABASE_SERVICE_KEY, APIFY_API_TOKEN, GROQ_API_KEY, EMBEDDING_API_URL, EMBEDDING_API_KEY

Founder Intelligence Engine — MCP Server

A production-grade Model Context Protocol (MCP) server that transforms founder profiles into actionable strategic intelligence.


Architecture

        ┌───────────────────────────┐
        │ MCP Client (Claude, etc.) │
        └─────────────┬─────────────┘
                      │ stdio
        ┌─────────────┴─────────────┐
        │     MCP Server (Node)     │
        │    3 registered tools     │
        └─────────────┬─────────────┘
       ┌──────────────┼───────────────┐
       ▼              ▼               ▼
  ┌──────────┐  ┌───────────┐  ┌──────────────┐
  │  Apify   │  │   Groq    │  │  Embeddings  │
  │ Scraping │  │   LLM     │  │     API      │
  └────┬─────┘  └─────┬─────┘  └──────┬───────┘
       └──────────────┼───────────────┘
                      ▼
             ┌─────────────────┐
             │    Supabase     │
             │   (Postgres +   │
             │    pgvector)    │
             └─────────────────┘

Data Flow

  1. collect_profile — Scrapes LinkedIn + Twitter via Apify → merges data → generates embedding → stores in Supabase
  2. analyze_profile — Fetches stored profile → calls Groq LLM for strategic analysis → caches result
  3. fetch_personalized_news — Checks cache freshness → if stale: generates search queries → scrapes Google News → embeds articles → ranks by cosine similarity → summarizes with Groq → stores; if fresh: returns cached articles
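
The ranking step in fetch_personalized_news can be sketched as plain cosine similarity over embedding vectors. This is only an illustration of what src/utils/similarity.js presumably does; the function names here are hypothetical, not the module's actual exports:

```javascript
// Minimal cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate articles (each carrying an `embedding`) against the
// founder's profile embedding, highest similarity first.
function rankArticles(profileEmbedding, articles) {
  return articles
    .map((a) => ({ ...a, score: cosineSimilarity(profileEmbedding, a.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```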

Caching & Cost Optimization

Operation                   Cost     When It Runs
LinkedIn/Twitter scraping   High     Only on profile creation
Groq profile analysis       Medium   Once per profile (cached)
Google News + embeddings    High     Only when news > 24h stale
Read cached articles        Free     Every subsequent request

The fetch_history table tracks last_profile_scrape and last_news_fetch timestamps. The staleCheck.js module compares these against configurable thresholds.
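
The staleness gate reduces to a timestamp comparison. The following is an illustrative sketch of the staleCheck.js idea, not its actual API; the 24-hour threshold comes from the caching table above, and the function name is hypothetical:

```javascript
// 24-hour freshness threshold, mirroring the caching table above.
const NEWS_TTL_MS = 24 * 60 * 60 * 1000;

// Returns true when cached news should be refreshed: either it was
// never fetched, or the last fetch is older than the TTL.
function isStale(lastFetchISO, ttlMs = NEWS_TTL_MS, now = Date.now()) {
  if (!lastFetchISO) return true;
  return now - Date.parse(lastFetchISO) > ttlMs;
}
```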


Setup

1. Prerequisites

  • Node.js 20+
  • Supabase project (with pgvector enabled)
  • API keys: Apify, Groq, OpenAI-compatible Embeddings

2. Install

cd /Users/praveenkumar/Desktop/mcp
cp .env.example .env
# Edit .env with your real keys
npm install
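
The resulting .env might look like the following sketch. The variable names come from the Environment Variables section; every value is a placeholder:

```
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-role-key
APIFY_API_TOKEN=your-apify-token
GROQ_API_KEY=your-groq-key
EMBEDDING_API_URL=https://your-embedding-endpoint/v1/embeddings
EMBEDDING_API_KEY=your-embedding-key
```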

3. Database

Run the migration in the Supabase SQL Editor:

-- Paste contents of migrations/001_init.sql

Or via psql:

psql $DATABASE_URL < migrations/001_init.sql
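
migrations/001_init.sql is the authoritative schema. As rough orientation only, the features this README describes (pgvector embeddings plus the fetch_history timestamps) imply tables shaped something like this hypothetical sketch; the actual table and column names, apart from fetch_history, last_profile_scrape, and last_news_fetch, are assumptions:

```
-- Illustrative sketch only; defer to migrations/001_init.sql.
create extension if not exists vector;

create table if not exists profiles (
  id uuid primary key default gen_random_uuid(),
  name text,
  merged_data jsonb,            -- merged LinkedIn + Twitter payload (assumed name)
  embedding vector(1536)        -- dimension depends on your embedding model
);

create table if not exists fetch_history (
  profile_id uuid references profiles (id),
  last_profile_scrape timestamptz,
  last_news_fetch timestamptz
);
```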

4. Run MCP Server

node src/index.js

5. Configure MCP Client

Add to your MCP client config (e.g., Claude Desktop claude_desktop_config.json):

{
  "mcpServers": {
    "founder-intelligence": {
      "command": "node",
      "args": ["/Users/praveenkumar/Desktop/mcp/src/index.js"],
      "env": {
        "SUPABASE_URL": "...",
        "SUPABASE_SERVICE_KEY": "...",
        "APIFY_API_TOKEN": "...",
        "GROQ_API_KEY": "...",
        "EMBEDDING_API_URL": "...",
        "EMBEDDING_API_KEY": "..."
      }
    }
  }
}

6. Background Worker (Optional)

# Single run (for cron)
node src/backgroundWorker.js

# Daemon mode
BACKGROUND_LOOP=true node src/backgroundWorker.js

Cron example (every 6 hours):

0 */6 * * * cd /app && node src/backgroundWorker.js >> /var/log/worker.log 2>&1
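
The one-shot vs. daemon split can be sketched as below. This is illustrative only: runOnce stands in for the worker's real refresh logic, and the interval simply mirrors the 6-hour cron cadence above:

```javascript
// Sketch of src/backgroundWorker.js's two modes (hypothetical structure).
const SIX_HOURS_MS = 6 * 60 * 60 * 1000;

// BACKGROUND_LOOP=true switches from single-run (cron) to daemon mode.
function shouldLoop(env = process.env) {
  return env.BACKGROUND_LOOP === "true";
}

async function runOnce() {
  // ...refresh stale profiles and news here (see staleCheck.js)...
}

async function main(env = process.env) {
  do {
    await runOnce();
    if (shouldLoop(env)) await new Promise((r) => setTimeout(r, SIX_HOURS_MS));
  } while (shouldLoop(env));
}
```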

Project Structure

/Users/praveenkumar/Desktop/mcp/
├── migrations/
│   └── 001_init.sql
├── src/
│   ├── db/
│   │   └── supabaseClient.js
│   ├── services/
│   │   ├── apifyService.js
│   │   ├── embeddingService.js
│   │   └── llmService.js
│   ├── tools/
│   │   ├── collectProfile.js
│   │   ├── analyzeProfile.js
│   │   └── fetchPersonalizedNews.js
│   ├── utils/
│   │   ├── similarity.js
│   │   └── staleCheck.js
│   ├── backgroundWorker.js
│   └── index.js
├── .env.example
├── .gitignore
├── .dockerignore
├── Dockerfile
├── package.json
└── README.md

Docker Deployment

Build & Run

docker build -t founder-intelligence-mcp .
docker run --env-file .env founder-intelligence-mcp

Background Worker Container

docker run --env-file .env founder-intelligence-mcp node src/backgroundWorker.js

Docker Compose (production)

version: '3.8'
services:
  mcp-server:
    build: .
    env_file: .env
    stdin_open: true
    restart: unless-stopped

  worker:
    build: .
    env_file: .env
    command: ["node", "src/backgroundWorker.js"]
    environment:
      - BACKGROUND_LOOP=true
    restart: unless-stopped

Tools (3)

collect_profile — Scrapes LinkedIn and Twitter profiles, merges data, generates embeddings, and stores them in Supabase.
analyze_profile — Fetches a stored profile and uses Groq LLM to perform strategic analysis.
fetch_personalized_news — Checks for news updates, scrapes Google News, embeds articles, and summarizes them using Groq.

Environment Variables

SUPABASE_URL           required   URL for the Supabase project
SUPABASE_SERVICE_KEY   required   Service role key for Supabase database access
APIFY_API_TOKEN        required   API token for Apify scraping services
GROQ_API_KEY           required   API key for Groq LLM services
EMBEDDING_API_URL      required   Endpoint for embedding generation
EMBEDDING_API_KEY      required   API key for embedding service


Try it

Collect the profile for the founder at [LinkedIn URL] and store it in the database.
Analyze the strategic profile for the founder with ID [ID] and summarize their recent focus.
Fetch personalized news for the founder [Name] to see if there are any updates in the last 24 hours.
Run a full analysis on the founder profile [ID] and provide a summary of their recent news.

Frequently Asked Questions

What are the key features of Founder Intelligence Engine?

  • Automated scraping of LinkedIn and Twitter profiles via Apify
  • Strategic profile analysis powered by Groq LLM
  • Personalized news tracking with automated summarization
  • Vector-based storage and retrieval using Supabase and pgvector
  • Caching mechanism to optimize costs and performance

What can I use Founder Intelligence Engine for?

  • Venture capitalists tracking founder activity and strategic shifts
  • Market researchers monitoring key industry figures
  • Sales professionals gathering intelligence on potential leads
  • Competitive intelligence gathering for startup ecosystems

How do I install Founder Intelligence Engine?

Clone the repository, copy .env.example to .env and fill in your keys, then run npm install (see the Setup section above).

What MCP clients work with Founder Intelligence Engine?

Founder Intelligence Engine works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
