LunaRoute is a high-performance local proxy for AI coding assistants like Claude Code, OpenAI Codex CLI, and OpenCode. Get complete visibility into every LLM interaction with zero-overhead passthrough, comprehensive session recording, and powerful debugging capabilities.
```bash
# Add to your Claude Code skills
git clone https://github.com/erans/lunaroute
```
*(ASCII art banner: a crescent moon beside the "LunaRoute" logotype)*
Your AI Coding Assistant's Best Friend
A blazing-fast local proxy for AI coding assistants that gives you complete visibility into every LLM interaction. Zero configuration, sub-millisecond overhead, and powerful debugging capabilities.
```bash
eval $(lunaroute-server env)
```
That's it! This single command:

- Starts the LunaRoute server
- Points Claude Code at the proxy (`ANTHROPIC_BASE_URL`)
- Points Codex CLI at the proxy (`OPENAI_BASE_URL`)

Start coding with your AI assistant immediately - both APIs are ready to use!
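The exact output of `lunaroute-server env` isn't shown in this section, but for `eval` to apply it, the command has to print shell export statements roughly like these (values taken from the manual setup below):

```bash
# Assumed shape of `lunaroute-server env` output; the URLs match the
# manual setup shown later in this README.
export ANTHROPIC_BASE_URL=http://localhost:8081
export OPENAI_BASE_URL=http://localhost:8081/v1
```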
If you prefer to run the server manually:
```bash
# Terminal 1: Start the server
lunaroute-server

# Terminal 2: Point your AI tools to it
export ANTHROPIC_BASE_URL=http://localhost:8081     # For Claude Code
export OPENAI_BASE_URL=http://localhost:8081/v1     # For Codex CLI
```
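To confirm the proxy is actually relaying traffic before wiring up a coding assistant, you can send a request through it by hand. The snippet below is a hypothetical smoke test: it assumes your OpenAI key is exported as `OPENAI_API_KEY` and that LunaRoute forwards authentication headers upstream unchanged.

```bash
# Hypothetical smoke test: send one chat completion through the proxy.
# Assumes OPENAI_API_KEY is set and that LunaRoute passes the
# Authorization header through to OpenAI unchanged.
curl -s "$OPENAI_BASE_URL/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Say hi"}]}'
```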
That's it. No API keys to configure, no YAML files to write, nothing. LunaRoute automatically:
- Routes each request by model name (`gpt-*` → OpenAI, `claude-*` → Anthropic); a conceptual sketch follows this list
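As a rough illustration (a conceptual sketch, not LunaRoute's actual code), prefix-based routing amounts to choosing an upstream from the model name the client sent; the provider URLs below are simply the public API endpoints.

```bash
# Conceptual sketch only, not LunaRoute's implementation: pick an
# upstream provider based on the model name prefix in the request.
route_upstream() {
  case "$1" in
    gpt-*)    echo "https://api.openai.com"    ;;  # OpenAI models
    claude-*) echo "https://api.anthropic.com" ;;  # Anthropic models
    *)        echo "unknown-provider"          ;;  # anything else
  esac
}

route_upstream "gpt-4o"        # -> https://api.openai.com
route_upstream "claude-opus-4" # -> https://api.anthropic.com
```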
Stop flying blind. LunaRoute records every interaction with zero configuration:

```bash
# Literally just one command
eval $(lunaroute-server env)
```
What you get instantly:
- Support for both OpenAI `/v1/chat/completions` and Anthropic `/v1/messages` (see the example request below)
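Both endpoints named above can be exercised directly through the proxy. As with the earlier smoke test, this is a hypothetical request: it assumes your Anthropic key is exported as `ANTHROPIC_API_KEY` and passed upstream unchanged.

```bash
# Hypothetical request to the Anthropic-native endpoint via the proxy.
# Assumes ANTHROPIC_API_KEY is set and forwarded unchanged by LunaRoute.
curl -s "$ANTHROPIC_BASE_URL/v1/messages" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-sonnet-4-20250514", "max_tokens": 64, "messages": [{"role": "user", "content": "Say hi"}]}'
```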