AI HR Leave Management MCP Server

Local setup required. This server has to be cloned and prepared on your machine before you register it in Claude Code.
Step 1: Set the server up locally

Run this once to clone and prepare the server before adding it to Claude Code.

Run in terminal
git clone https://github.com/anjalimahapatra2004/mcp_tool.git
cd mcp_tool
pip install -r requirements.txt
Step 2: Register it in Claude Code

After the local setup is done, run this command to point Claude Code at the prepared server.

Run in terminal
claude mcp add -e "GROQ_API_KEY=${GROQ_API_KEY}" -e "GROQ_MODEL=${GROQ_MODEL}" ai-hr-leave -- python "<FULL_PATH_TO_MCP_TOOL>/mcp_server/server.py"

Replace <FULL_PATH_TO_MCP_TOOL> with the actual folder you cloned in step 1.

Required: GROQ_API_KEY, GROQ_MODEL (plus 2 optional)
README.md

Real MCP Protocol based HR chatbot using Groq LLM, FastAPI, and Streamlit.

AI HR Leave Management Chatbot

Real MCP Protocol + Groq LLM + FastAPI + Streamlit: a genuine Model Context Protocol (MCP) based HR chatbot.

Architecture

User → Streamlit → FastAPI → Groq LLM → MCP Client → MCP Server → Tools
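The flow above can be walked through with a toy, pure-Python sketch. Every layer here is a stub with illustrative names (not the project's actual code): the Groq LLM step is replaced by a keyword check, and the MCP client/server hop is a direct function call instead of stdio.

```python
# Toy walkthrough of: User -> Streamlit -> FastAPI -> Groq LLM -> MCP Client -> MCP Server -> Tools.
# All names and logic are illustrative stand-ins, not the project's real code.

# --- MCP server side: the tools (stubbed) ---
TOOLS = {
    "check_leave_balance": lambda employee_id: {"employee_id": employee_id, "balance": 12},
}

def mcp_server_call(tool_name, arguments):
    """Stand-in for the MCP server handling a tools/call request."""
    return TOOLS[tool_name](**arguments)

# --- Backend side ---
def groq_decide_tool(message):
    """Stub for the Groq LLM choosing which tool to invoke from the user's message."""
    if "balance" in message.lower() or "remaining" in message.lower():
        return "check_leave_balance", {"employee_id": "E123"}
    return None, {}

def backend_chat(message):
    """Stand-in for the FastAPI endpoint: the LLM picks a tool, the MCP client runs it."""
    tool, args = groq_decide_tool(message)
    if tool is None:
        return "No tool needed."
    result = mcp_server_call(tool, args)  # in the real app this hop goes over stdio
    return f"You have {result['balance']} leave days remaining."

print(backend_chat("How many leave days do I have remaining?"))
# You have 12 leave days remaining.
```

In the real stack the `mcp_server_call` hop is an MCP `tools/call` message over the stdio transport, and `groq_decide_tool` is a chat-completion request with tool definitions attached.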

Tech Stack

  • MCP SDK — Real stdio transport protocol
  • Groq LLM — llama-3.3-70b-versatile
  • FastAPI — Backend REST API
  • Streamlit — Frontend UI

Project Structure

mcp_chatbot/
├── .env
├── requirements.txt
├── mcp_server/server.py   ← MCP Tools
├── backend/agent.py       ← MCP Client + Groq
├── backend/app.py         ← FastAPI
├── frontend/app.py        ← Streamlit UI
└── utils/config.py

Setup

1. Install dependencies

pip install -r requirements.txt

2. Configure `.env`

GROQ_API_KEY=your_groq_api_key_here
GROQ_MODEL=llama-3.3-70b-versatile
FASTAPI_HOST=127.0.0.1
FASTAPI_PORT=8000
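A loader for these settings, in the spirit of what `utils/config.py` might contain (illustrative; the real file may differ), can fall back to the documented defaults using only the standard library:

```python
import os

def load_config(env=None):
    """Read the .env settings above, falling back to the documented defaults."""
    env = os.environ if env is None else env
    return {
        "GROQ_API_KEY": env.get("GROQ_API_KEY", ""),  # required; empty means misconfigured
        "GROQ_MODEL": env.get("GROQ_MODEL", "llama-3.3-70b-versatile"),
        "FASTAPI_HOST": env.get("FASTAPI_HOST", "127.0.0.1"),
        "FASTAPI_PORT": int(env.get("FASTAPI_PORT", "8000")),
    }

cfg = load_config({"GROQ_API_KEY": "sk-demo"})
print(cfg["GROQ_MODEL"])    # llama-3.3-70b-versatile
print(cfg["FASTAPI_PORT"])  # 8000
```

Passing a plain dict instead of `os.environ` makes the loader easy to test without touching the process environment.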

3. Run Backend

python backend/app.py

4. Run Frontend

streamlit run frontend/app.py

MCP Tools

Tool                  Description
apply_leave           Apply leave request
check_leave_balance   Check remaining leaves
get_leave_history     Past leave records
cancel_leave          Cancel last leave
get_holidays          Upcoming holidays
get_employee_info     Employee details
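To make the tool behaviors concrete, here is an in-memory sketch of the logic behind three of them (`apply_leave`, `check_leave_balance`, `cancel_leave`). The storage scheme and the 20-day default allowance are assumptions for illustration; the real `mcp_server/server.py` may differ.

```python
# Illustrative in-memory leave logic; the real server's data model may differ.
DEFAULT_BALANCE = 20  # assumed annual allowance, for illustration only

_balances = {}  # employee_id -> remaining days
_history = {}   # employee_id -> list of leave records

def check_leave_balance(employee_id):
    """Report remaining leave days for an employee."""
    return {"employee_id": employee_id,
            "remaining": _balances.get(employee_id, DEFAULT_BALANCE)}

def apply_leave(employee_id, dates):
    """Deduct the requested dates from the balance and record the leave."""
    remaining = _balances.get(employee_id, DEFAULT_BALANCE)
    if len(dates) > remaining:
        return {"status": "rejected", "reason": "insufficient balance"}
    _balances[employee_id] = remaining - len(dates)
    _history.setdefault(employee_id, []).append({"dates": dates})
    return {"status": "approved", "remaining": _balances[employee_id]}

def cancel_leave(employee_id):
    """Cancel the most recent leave, restoring its days to the balance."""
    records = _history.get(employee_id, [])
    if not records:
        return {"status": "nothing to cancel"}
    last = records.pop()
    _balances[employee_id] = _balances.get(employee_id, DEFAULT_BALANCE) + len(last["dates"])
    return {"status": "cancelled", "remaining": _balances[employee_id]}

print(apply_leave("E123", ["2025-07-04"]))
# {'status': 'approved', 'remaining': 19}
```

In the real server each function would be registered as an MCP tool so the LLM can invoke it by name with structured arguments.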

Publish to GitHub (initialize and push)

git init
git add .
git commit -m "Initial commit - AI HR MCP Chatbot"
git branch -M main
git remote add origin https://github.com/anjalimahapatra2004/mcp_tool.git
git push -u origin main


Environment Variables

GROQ_API_KEY   (required) API key for Groq LLM access
GROQ_MODEL     (required) The Groq model to use (e.g., llama-3.3-70b-versatile)
FASTAPI_HOST   (optional) Host address for the backend server
FASTAPI_PORT   (optional) Port for the backend server

Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "hr-leave": {
      "command": "python",
      "args": ["path/to/mcp_server/server.py"]
    }
  }
}
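Since the server needs GROQ_API_KEY and GROQ_MODEL (see Environment Variables above), the same entry can pass them through the `env` field that Claude Desktop's server config supports. The key value below is a placeholder:

```json
{
  "mcpServers": {
    "hr-leave": {
      "command": "python",
      "args": ["path/to/mcp_server/server.py"],
      "env": {
        "GROQ_API_KEY": "your_groq_api_key_here",
        "GROQ_MODEL": "llama-3.3-70b-versatile"
      }
    }
  }
}
```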

Try it

How many leave days do I have remaining?
Apply for a leave request for next Friday.
Can you show me my past leave history?
What are the upcoming company holidays?
Get my employee information details.

Frequently Asked Questions

What are the key features of AI HR Leave Management?

  • Automated leave request processing
  • Real-time leave balance tracking
  • Access to historical leave records
  • Company holiday schedule retrieval
  • Employee information management

What can I use AI HR Leave Management for?

  • Employees checking their remaining vacation days without HR intervention
  • Streamlining the leave application process via natural language
  • Quickly verifying upcoming company holidays for planning
  • Reviewing past leave history for personal record keeping

How do I install AI HR Leave Management?

Clone the repository, then install its dependencies by running: pip install -r requirements.txt

What MCP clients work with AI HR Leave Management?

AI HR Leave Management works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
