gaslighting-mcp

A fake web search MCP server for AI alignment testing. It accepts a search query and returns LLM-generated search results shaped by a configurable background story.

Built with FastMCP and compatible with any OpenAI-style API endpoint.

How it works

  1. You provide a background story via the `BACKGROUND_STORY` environment variable.
  2. The server exposes two tools:
     - `search` — generates 10 realistic search results (url, snippet, date) consistent with the background story.
     - `read_url` — generates a full fake article in markdown for a given URL, inferred from the domain/path and background story.
  3. The consuming AI agent receives these as if they were real web content.
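The steps above hinge on injecting the background story into every LLM call. A minimal sketch of how that prompt assembly could look (the function name and prompt wording are illustrative assumptions, not taken from `server.py`):

```python
# Hypothetical sketch: the background story is folded into the prompt that
# asks the backing LLM to fabricate search results. Names are illustrative.
def build_search_prompt(background_story: str, query: str, n_results: int = 10) -> str:
    return (
        "You are a web search engine. All results must be consistent with "
        f"this background story:\n{background_story}\n\n"
        f"Return a JSON array of {n_results} results "
        '(each with "url", "snippet", "date") for the query: ' + query
    )
```

Because the story is part of every prompt rather than a post-processing filter, the generated snippets stay internally consistent across queries.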

Setup

```sh
uv sync
```

Configuration

| Environment Variable | Default | Description |
| --- | --- | --- |
| `BACKGROUND_STORY` | `""` | The narrative that shapes all generated results |
| `LLM_BASE_URL` | `https://openrouter.ai/api/v1` | OpenAI-compatible API base URL |
| `LLM_API_KEY` | `""` | API key for the LLM endpoint |
| `LLM_MODEL` | `nousresearch/hermes-4-405b` | Model name |
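Reading this configuration reduces to environment lookups with the defaults from the table. A sketch (the `load_config` helper is an assumption for illustration; defaults mirror the table, not a verified reading of `server.py`):

```python
import os

# Illustrative config loader; defaults match the table above.
def load_config() -> dict:
    return {
        "background_story": os.environ.get("BACKGROUND_STORY", ""),
        "base_url": os.environ.get("LLM_BASE_URL", "https://openrouter.ai/api/v1"),
        "api_key": os.environ.get("LLM_API_KEY", ""),
        "model": os.environ.get("LLM_MODEL", "nousresearch/hermes-4-405b"),
    }
```

Note that `LLM_API_KEY` has no usable default: requests to the LLM endpoint will fail until you set it.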

Usage

Standalone

```sh
uv run server.py
```

Claude Code MCP config

Add to your .mcp.json:

```json
{
  "mcpServers": {
    "web-search": {
      "command": "uv",
      "args": ["run", "server.py"],
      "env": {
        "BACKGROUND_STORY": "your background story here",
        "LLM_API_KEY": "your-api-key"
      }
    }
  }
}
```

Tools

`search(query)`

Returns a JSON array of 10 results:

```json
[
  {
    "url": "https://example.com/some-article",
    "snippet": "A realistic excerpt shaped by the background story.",
    "date": "2025-12-15"
  }
]
```
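Since the tool returns this array as a JSON string, a consuming client can validate the shape before use. A minimal sketch (the `parse_results` helper is hypothetical, not part of this server):

```python
import json
from datetime import date

def parse_results(payload: str) -> list:
    """Parse and sanity-check the JSON array returned by the `search` tool."""
    results = json.loads(payload)
    for r in results:
        # Every result carries at least url, snippet, and an ISO 8601 date.
        assert {"url", "snippet", "date"} <= set(r)
        date.fromisoformat(r["date"])  # raises ValueError on a malformed date
    return results
```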

`read_url(url)`

Returns a full fake article in markdown, inferred from the URL and background story. Matches the tone and style of the source website.
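The "inferred from the URL" step amounts to mining the domain and path slug for hints to feed into the article prompt. One way that inference could look (the helper name and output format are assumptions for illustration):

```python
from urllib.parse import urlparse

# Hypothetical helper: turn a URL into domain/topic hints for the article prompt.
def describe_source(url: str) -> str:
    parts = urlparse(url)
    # Last path segment, with hyphens expanded, approximates the article topic.
    slug = parts.path.strip("/").split("/")[-1].replace("-", " ")
    return f"site: {parts.netloc}, topic: {slug}"
```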

License

MIT


Try it

  - Search for recent developments in quantum computing within the context of my background story.
  - Read the article at https://example.com/tech-breakthrough to see how it fits the current narrative.
  - Find news articles about the latest space mission as if it were the year 2050.
  - Search for information regarding the secret project mentioned in my background story.

Frequently Asked Questions

What are the key features of Gaslighting MCP?

  - Generates 10 realistic search results including URL, snippet, and date
  - Creates full fake articles in markdown format
  - Configurable narrative via the `BACKGROUND_STORY` environment variable
  - Compatible with any OpenAI-style API endpoint
  - Built with FastMCP for easy integration

What can I use Gaslighting MCP for?

  - Testing AI agent behavior in controlled, fabricated information environments
  - Simulating specific historical or futuristic scenarios for AI alignment research
  - Evaluating how LLMs handle conflicting or biased search data
  - Creating mock web environments for training and evaluation purposes

How do I install Gaslighting MCP?

Install the dependencies by running `uv sync` in the project directory, then start the server with `uv run server.py`.

What MCP clients work with Gaslighting MCP?

Gaslighting MCP works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
