ccproxy

by starbased-co


Build mods for Claude Code: Hook any request, modify any response, /model "with-your-custom-model", intelligent model routing using your logic or ours, and even use your Claude subscription as an API

116 stars · 15 forks
Python
Added 12/27/2025
CLI Tools · ai, ai-gateway, ai-proxy, ai-tools, anthropic, claude, claude-ai, claude-api, claude-code, claude-max, claudecode, gemini, gemini-cli, litellm, llm, llm-gateway, llm-proxy, llmops, openai, openrouter
Installation
# Add to your Claude Code skills
git clone https://github.com/starbased-co/ccproxy
README.md

ccproxy - Claude Code Proxy

Join starbased HQ for questions, sharing setups, and contributing to development.

ccproxy unlocks the full potential of your Claude MAX subscription by enabling Claude Code to seamlessly use unlimited Claude models alongside other LLM providers like OpenAI, Gemini, and Perplexity.

It works by intercepting Claude Code's requests through a LiteLLM Proxy Server, letting you route each type of request to the most suitable model: keep your unlimited Claude for everyday coding, send large contexts to Gemini's 2M-token window, and route web searches to Perplexity, all while Claude Code thinks it's talking to the standard API.
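For a sense of how that routing is expressed, LiteLLM's proxy `config.yaml` maps public model names to backend providers. A minimal sketch (the model names and entries below are illustrative, not ccproxy's shipped defaults):

```yaml
model_list:
  # Everyday coding stays on the Claude subscription (auth handled by ccproxy)
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514

  # Large contexts go to Gemini's long-context window
  - model_name: long-context
    litellm_params:
      model: gemini/gemini-1.5-pro
      api_key: os.environ/GEMINI_API_KEY

  # Web-search-style queries go to Perplexity
  - model_name: web-search
    litellm_params:
      model: perplexity/sonar
      api_key: os.environ/PERPLEXITY_API_KEY
```

The `os.environ/VAR` syntax is LiteLLM's way of reading keys from the environment at startup rather than storing them in the file.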

New ✨: Use your subscription without Claude Code! The Anthropic SDK and LiteLLM SDK examples in examples/ allow you to use your logged in claude.ai account for arbitrary API requests:

# Streaming with litellm.acompletion()
import litellm

response = await litellm.acompletion(
    messages=[{"role": "user", "content": "Count from 1 to 5."}],
    model="claude-haiku-4-5-20251001",
    max_tokens=200,
    stream=True,
    api_base="http://127.0.0.1:4000",
    api_key="sk-proxy-dummy",  # key is not real, `ccproxy` handles real auth
)

⚠️ Note: While core functionality is complete, real-world testing and community input are welcome. Please open an issue to share your experience, report bugs, or suggest improvements, or better yet, submit a PR!

Installation

Important: ccproxy must be installed with LiteLLM in the same environment so that LiteLLM can import the ccproxy handler.

Recommended: Install as uv tool

# Install from PyPI
uv tool install claude-ccproxy --with 'litellm[proxy]'

# Or install from GitHub (latest)
uv tool install git+https://github.com/starbased-co/ccproxy.git --with 'litellm[proxy]'

This installs:

  • ccproxy command (for managing the proxy)
  • litellm bundled in the same environment (so it can import ccproxy's handler)

Alternative: Install with pip

# Install both packages in the same virtual environment
pip install git+https://github.com/starbased-co/ccproxy.git
pip install 'litellm[proxy]'

Note: With pip, both packages must be in the same virtual environment.

Verify Installation

ccproxy --help
# Should show ccproxy commands

which litellm
# Should point to litellm in ccproxy's environment

Usage

Run the automated setup:

# This will create all necessary configuration files in ~/.ccproxy
ccproxy install

tree ~/.ccproxy
# ~/.ccproxy
# ├── ccproxy.yaml
# └── config.yaml

# ccproxy.py is auto-generated when you start the proxy

# Start the proxy server
ccproxy start --detach

# Start Claude Code
ccproxy run claude
# Or add to your .zshr...