Proxy server for OpenAI Codex and Google Gemini CLI tools via MCP
codex-gemini-mcp
A proxy server that lets AI agents (Claude, Cursor, etc.) call the OpenAI Codex CLI and the Google Gemini CLI directly as MCP tools.
Key Features
- `ask_codex` — the agent asks Codex for code generation, refactoring, or debugging
- `ask_gemini` — the agent asks Gemini for analysis, summarization, or code review
- Background execution — long-running tasks run in the background and are managed with status checks (`check_job_status`), waiting (`wait_for_job`), termination (`kill_job`), and listing (`list_jobs`)
- Multi-model orchestration — a single agent can use Codex and Gemini at the same time to spread the workload
A single package provides two MCP server binaries, codex-mcp and gemini-mcp, both operating over stdio transport.
Requirements
- Node.js 20+
- `codex` CLI installed (`npm i -g @openai/codex`)
- `gemini` CLI installed (`npm i -g @google/gemini-cli`)

The MCP servers run each CLI as-is, so first make sure you have logged in/authenticated in a local terminal and can run the `codex` / `gemini` CLIs directly.
Install
Install from npm (if published):
npm i -g @donghae0414/codex-gemini-mcp
Using npx without a global install:
npx -y -p @donghae0414/codex-gemini-mcp codex-mcp
npx -y -p @donghae0414/codex-gemini-mcp gemini-mcp
Install from source (development/testing):
npm install
npm run build
npm link
Per-client MCP configuration examples

With a global install:
{
"mcpServers": {
"codex-mcp": {
"command": "codex-mcp",
"args": []
},
"gemini-mcp": {
"command": "gemini-mcp",
"args": []
}
}
}
Without a global install, using npx:
{
"mcpServers": {
"codex-mcp": {
"command": "npx",
"args": ["-y", "-p", "@donghae0414/codex-gemini-mcp", "codex-mcp"]
},
"gemini-mcp": {
"command": "npx",
"args": ["-y", "-p", "@donghae0414/codex-gemini-mcp", "gemini-mcp"]
}
}
}
opencode (opencode.json):
{
"mcp": {
"codex-mcp": {
"type": "local",
"command": ["npx", "-y", "-p", "@donghae0414/codex-gemini-mcp", "codex-mcp"]
},
"gemini-mcp": {
"type": "local",
"command": ["npx", "-y", "-p", "@donghae0414/codex-gemini-mcp", "gemini-mcp"]
}
}
}
Client configuration file locations (for reference):
- Claude Code: project-root `.mcp.json` (per-project) or `~/.claude.json` (global)
- Claude Desktop (macOS): `~/Library/Application Support/Claude/claude_desktop_config.json`
- Claude Desktop (Windows): `%APPDATA%\Claude\claude_desktop_config.json`
- Claude Desktop (Linux): `~/.config/Claude/claude_desktop_config.json`
- opencode: `~/.config/opencode/opencode.json`
Environment variables may not be injected automatically from your shell profile (`.zshrc`, etc.), so pass them through the config file's `env` block when possible.
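As a sketch, most MCP clients accept a per-server `env` block in the same config shown above (the exact shape can vary by client; the model value here is just the documented default, repeated for illustration):

```json
{
  "mcpServers": {
    "codex-mcp": {
      "command": "codex-mcp",
      "args": [],
      "env": {
        "MCP_CODEX_DEFAULT_MODEL": "gpt-5.3-codex"
      }
    }
  }
}
```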
Default Models
Default models are hardcoded in src/config.ts and can be overridden with environment variables.
| Provider | Default model | Environment variable override |
|---|---|---|
| codex | `gpt-5.3-codex` | `MCP_CODEX_DEFAULT_MODEL` |
| gemini | `gemini-3-pro-preview` | `MCP_GEMINI_DEFAULT_MODEL` |
Model selection priority: request parameter `model` > environment variable > hardcoded default
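That priority order could be sketched like this (a hypothetical illustration, not the actual src/config.ts code; `envOverride` stands in for `process.env.MCP_CODEX_DEFAULT_MODEL`):

```typescript
// Resolve a model name using the documented priority:
// request parameter > environment variable > hardcoded default.
function resolveModel(
  requestModel: string | undefined,
  envOverride: string | undefined,
  hardcodedDefault: string,
): string {
  return requestModel ?? envOverride ?? hardcodedDefault;
}
```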
Local development
npm install
npm run build
npm run start:codex
npm run start:gemini
Development mode:
npm run dev:codex
npm run dev:gemini
Runtime Files
- Default runtime directory: `<cwd>/.codex-gemini-mcp/`
  - Background job state: `jobs/`
  - Background job input/output (content): `prompts/`
  - Structured logging (JSONL): `logs/`
- Runtime path overrides:
  - `MCP_RUNTIME_DIR`: runtime root directory
  - `MCP_LOG_DIR`: log directory
Cleanup (when using the default path):
rm -rf .codex-gemini-mcp
Security / Privacy Notes
- Requests with `background: true` (the default) store the prompt/response in `.codex-gemini-mcp/prompts/*content*.json`.
- Secrets placed in a prompt (tokens, passwords, personal data, etc.) can therefore remain in local files.
- Logging omits body text by default, but enabling the flags below can include text in the logs:
  - `MCP_LOG_PREVIEW=1`
  - `MCP_LOG_FULL_TEXT=1`
Tool Schemas
ask_codex
- `prompt` (string, required)
- `model` (string, optional): only the pattern `[A-Za-z0-9][A-Za-z0-9._:-]*` (max 128 characters) is accepted
- `working_directory` (string, optional): working directory (cwd) for the CLI process
- `background` (boolean, optional, default `true`)
- `reasoning_effort` (string, optional: `minimal|low|medium|high|xhigh`)
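The model-name restriction above amounts to a single regex check; a minimal sketch (hypothetical, not the package's actual validation code):

```typescript
// First character must be alphanumeric; up to 127 further characters
// from [A-Za-z0-9._:-], for a total maximum length of 128.
const MODEL_PATTERN = /^[A-Za-z0-9][A-Za-z0-9._:-]{0,127}$/;

function isValidModel(model: string): boolean {
  return MODEL_PATTERN.test(model);
}
```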
ask_gemini
- `prompt` (string, required)
- `model` (string, optional): only the pattern `[A-Za-z0-9][A-Za-z0-9._:-]*` (max 128 characters) is accepted
- `working_directory` (string, optional): working directory (cwd) for the CLI process
- `background` (boolean, optional, default `true`)
wait_for_job
- `job_id` (string, required, 8-character hex)
- `timeout_ms` (number, optional, default 3600000, max 3600000; values above 3600000 are capped at 3600000)
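The timeout behavior above (default and cap both at one hour) could be sketched as follows; this is an illustration, not the package's actual code:

```typescript
// Both the default and the upper bound are 3,600,000 ms (one hour);
// any larger requested value is silently capped.
const MAX_TIMEOUT_MS = 3_600_000;

function clampTimeout(timeoutMs: number = MAX_TIMEOUT_MS): number {
  return Math.min(timeoutMs, MAX_TIMEOUT_MS);
}
```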
check_job_status
- `job_id` (string, required, 8-character hex)
kill_job
- `job_id` (string, required, 8-character hex)
- `signal` (string, optional: `SIGTERM|SIGINT`, default `SIGTERM`)
list_jobs
- `status_filter` (string, optional: `active` (spawned/running) | `completed` | `failed` (failed/timeout) | `all`, default `active`)
- `limit` (number, optional, default 50)
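The `status_filter` grouping above maps each filter value onto one or more underlying job states; a hypothetical sketch of that mapping (not the package's actual implementation):

```typescript
// Each filter value selects a group of underlying job states;
// null means "no filtering" (the `all` case).
const STATUS_GROUPS: Record<string, string[] | null> = {
  active: ["spawned", "running"],
  completed: ["completed"],
  failed: ["failed", "timeout"],
  all: null,
};

function matchesFilter(status: string, filter: string = "active"): boolean {
  const group = STATUS_GROUPS[filter];
  return group === null || (group?.includes(status) ?? false);
}
```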
Runtime Notes
- `ask_codex`: invokes `codex exec --ephemeral` (when `reasoning_effort` is set, `-c model_reasoning_effort=...` is added)
- `ask_gemini`: invokes `gemini --prompt <text>`
- `ask_*` default to `background: true` when `background` is not specified
- Calls with `background: true` store state and input/output (content) files under `.codex-gemini-mcp/jobs` and `.codex-gemini-mcp/prompts`
- Structured logging (JSONL): `.codex-gemini-mcp/logs/mcp-YYYY-MM-DD.jsonl`
- Default: metadata only is logged (body text is excluded)