A Go CLI proxy that lets you use your OpenCode Go subscription with Claude Code.
oc-go-cc sits between Claude Code and OpenCode Go, intercepting Anthropic API requests, transforming them to OpenAI format, and forwarding them to OpenCode Go's endpoint. Claude Code thinks it's talking to Anthropic — but your requests go to affordable open models instead.
OpenCode Go gives you access to powerful open coding models for $5/month (then $10/month). This proxy makes those models work seamlessly with Claude Code's interface — no patches, no forks, just set two environment variables and go.
Install with Homebrew:

```sh
brew tap samueltuyizere/tap
brew install oc-go-cc
```
Or build from source:

```sh
git clone https://github.com/samueltuyizere/oc-go-cc.git
cd oc-go-cc
make build
# Binary is at bin/oc-go-cc

# Optionally install to $GOPATH/bin
make install
```
Alternatively, download the latest release for your platform from the Releases page:
| Platform | File |
| --------------------- | ---------------------------- |
| macOS (Apple Silicon) | oc-go-cc_darwin-arm64 |
| macOS (Intel) | oc-go-cc_darwin-amd64 |
| Linux (x86_64) | oc-go-cc_linux-amd64 |
| Linux (ARM64) | oc-go-cc_linux-arm64 |
| Windows (x86_64) | oc-go-cc_windows-amd64.exe |
| Windows (ARM64) | oc-go-cc_windows-arm64.exe |
```sh
# Example: macOS Apple Silicon
curl -L -o oc-go-cc https://github.com/samueltuyizere/oc-go-cc/releases/latest/download/oc-go-cc_darwin-arm64
chmod +x oc-go-cc
sudo mv oc-go-cc /usr/local/bin/
```
Initialize the proxy:

```sh
oc-go-cc init
```

This creates a default config at ~/.config/oc-go-cc/config.json.
Set your OpenCode Go API key and start the proxy:

```sh
export OC_GO_CC_API_KEY=sk-opencode-your-key-here
oc-go-cc serve
```
You'll see output like:
```text
Starting oc-go-cc v0.1.0
Listening on 127.0.0.1:3456
Forwarding to: https://opencode.ai/zen/go/v1/chat/completions

Configure Claude Code with:
export ANTHROPIC_BASE_URL=http://127.0.0.1:3456
export ANTHROPIC_AUTH_TOKEN=unused
```
To run the proxy in the background (detached from terminal):
```sh
oc-go-cc serve --background
# or
oc-go-cc serve -b
```
This starts the server as a background daemon and returns immediately. Logs are written to ~/.config/oc-go-cc/oc-go-cc.log.
To start the proxy automatically when you log in:
```sh
oc-go-cc autostart enable
```
This creates a launchd plist on macOS. To disable:
```sh
oc-go-cc autostart disable
```
Check status:
```sh
oc-go-cc autostart status
```
In a separate terminal (or the same one before running claude):
```sh
export ANTHROPIC_BASE_URL=http://127.0.0.1:3456
export ANTHROPIC_AUTH_TOKEN=unused
claude
```
That's it. Claude Code will now route all requests through oc-go-cc to OpenCode Go.
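To sanity-check the proxy end to end without Claude Code, you can POST a minimal Anthropic-style request to it directly. A hedged sketch: the model name and prompt are placeholders, and the auth header value is assumed to be ignored since the proxy injects your real OpenCode Go key.

```go
// Send one Anthropic-format request through the local proxy and print the reply.
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Same shape Claude Code sends to POST /v1/messages.
	body := []byte(`{
		"model": "claude-sonnet-4",
		"max_tokens": 256,
		"messages": [{"role": "user", "content": "Say hello in one line."}]
	}`)

	req, err := http.NewRequest("POST", "http://127.0.0.1:3456/v1/messages", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	// Assumption: the proxy ignores client auth and injects OC_GO_CC_API_KEY upstream.
	req.Header.Set("x-api-key", "unused")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
```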
```text
┌─────────────┐     Anthropic API      ┌─────────────┐      OpenAI API      ┌─────────────┐
│ Claude Code ├───────────────────────►│  oc-go-cc   ├─────────────────────►│ OpenCode Go │
│    (CLI)    │   POST /v1/messages    │   (Proxy)   │  /chat/completions   │ (Upstream)  │
│             │◄───────────────────────┤             │◄─────────────────────┤             │
└─────────────┘     Anthropic SSE      └─────────────┘      OpenAI SSE      └─────────────┘
```
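In Go terms, the middle box is just an HTTP handler that rewrites the body in both directions. A bare-bones, hypothetical sketch (real oc-go-cc adds streaming, model routing, and error handling; the bearer-token header is an assumption about the upstream, and toOpenAI/toAnthropic stand in for the mapping table below):

```go
// Bare-bones shape of the proxy in the diagram above (hypothetical sketch).
package main

import (
	"bytes"
	"io"
	"net/http"
	"os"
)

const upstream = "https://opencode.ai/zen/go/v1/chat/completions"

func main() {
	http.HandleFunc("/v1/messages", func(w http.ResponseWriter, r *http.Request) {
		anthropicBody, _ := io.ReadAll(r.Body)

		// 1. Anthropic request JSON -> OpenAI request JSON (see table below).
		openaiBody := toOpenAI(anthropicBody)

		req, _ := http.NewRequest("POST", upstream, bytes.NewReader(openaiBody))
		// Assumption: upstream expects a bearer token.
		req.Header.Set("Authorization", "Bearer "+os.Getenv("OC_GO_CC_API_KEY"))
		req.Header.Set("Content-Type", "application/json")

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()

		// 2. OpenAI response JSON -> Anthropic response JSON.
		openaiResp, _ := io.ReadAll(resp.Body)
		w.Write(toAnthropic(openaiResp))
	})
	http.ListenAndServe("127.0.0.1:3456", nil)
}

// Placeholders for the transformations described in the mapping table.
func toOpenAI(b []byte) []byte    { return b }
func toAnthropic(b []byte) []byte { return b }
```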
oc-go-cc translates between the two wire formats as follows:

| Anthropic | OpenAI |
| ------------------------------------------------------------ | --------------------------------------- |
| system (string or array) | messages[0] with role: "system" |
| content: [{"type":"text","text":"..."}] | content: "..." |
| tool_use content blocks | tool_calls array |
| tool_result content blocks | role: "tool" messages |
| thinking content blocks | reasoning_content |
| stop_reason: "end_turn" | finish_reason: "stop" |
| stop_reason: "tool_use" | finish_reason: "tool_calls" |
| SSE message_start / content_block_delta / message_stop | SSE role / delta.content / [DONE] |
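As an illustration of the request side of this table, here is a trimmed Go sketch of the text-only path (hypothetical types, not oc-go-cc's actual structs; tool calls, thinking blocks, and streaming are omitted):

```go
// Simplified view of the Anthropic -> OpenAI request mapping above.
package transform

type AnthropicBlock struct {
	Type string `json:"type"` // "text", "tool_use", "tool_result", "thinking"
	Text string `json:"text,omitempty"`
}

type AnthropicMessage struct {
	Role    string           `json:"role"`
	Content []AnthropicBlock `json:"content"`
}

type OpenAIMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// ToOpenAI flattens the Anthropic system prompt and content blocks
// into the flat OpenAI chat message list.
func ToOpenAI(system string, msgs []AnthropicMessage) []OpenAIMessage {
	var out []OpenAIMessage
	if system != "" {
		// Anthropic system prompt -> messages[0] with role "system".
		out = append(out, OpenAIMessage{Role: "system", Content: system})
	}
	for _, m := range msgs {
		// Anthropic content blocks -> a single concatenated string.
		var text string
		for _, b := range m.Content {
			if b.Type == "text" {
				text += b.Text
			}
		}
		out = append(out, OpenAIMessage{Role: m.Role, Content: text})
	}
	return out
}

// StopReason maps the OpenAI finish_reason back to Anthropic's stop_reason.
func StopReason(finishReason string) string {
	switch finishReason {
	case "stop":
		return "end_turn"
	case "tool_calls":
		return "tool_use"
	default:
		return finishReason
	}
}
```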
DeepSeek V4 Pro and Flash use the OpenAI-compatible /chat/completions endpoint through OpenCode Go. They support thinking mode and configurable reasoning effort.
For Claude Code and other agentic coding workflows, configure DeepSeek V4 models with:
```json
{
  "provider": "opencode-go",
  "model_id": "deepseek-v4-pro",
  "max_tokens": 8192,
  "reasoning_effort": "max",
  "thinking": {
    "type": "enabled"
  }
}
```
oc-go-cc forwards these fields to OpenCode Go as OpenAI Chat Completions parameters:
- `reasoning_effort`: controls DeepSeek V4 thinking effort (high or max)
- `thinking`: enables or disables DeepSeek V4 thinking mode

DeepSeek V4 thinking responses are returned as OpenAI `reasoning_content` and transformed back into Anthropic thinking blocks for Claude Code.
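A minimal sketch of that reverse step, assuming a non-streaming response (the streaming path applies the same idea delta by delta; these types are hypothetical, not oc-go-cc's actual code):

```go
// Sketch: turn an OpenAI response carrying reasoning_content back into
// Anthropic content blocks (simplified, non-streaming case).
package transform

type ContentBlock struct {
	Type     string `json:"type"` // "thinking" or "text"
	Thinking string `json:"thinking,omitempty"`
	Text     string `json:"text,omitempty"`
}

// FromReasoning prepends a thinking block when the upstream model
// returned reasoning_content alongside the normal completion text.
func FromReasoning(reasoningContent, content string) []ContentBlock {
	var blocks []ContentBlock
	if reasoningContent != "" {
		blocks = append(blocks, ContentBlock{Type: "thinking", Thinking: reasoningContent})
	}
	blocks = append(blocks, ContentBlock{Type: "text", Text: content})
	return blocks
}
```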
Location: ~/.config/oc-go-cc/config.json
Override it with the OC_GO_CC_CONFIG environment variable.
```json
{
  "api_key": "${OC_GO_CC_API_KEY}",
  "host": "127.0.0.1",
  "port": 3456,
  "models": {
    "default": {
      "provider": "opencode-go",
      "model_id": "kimi-k2.6",
      "temperature": 0.7,
      "max_tokens": 4096
    },
    "background": {
      "provider": "opencode-go",
      "model_id": "qwen3.5-plus",
      "temperature": 0.5,
      "max_tokens": 2048
    },
    "think": {
      "provider": "opencode-go",
      "model_id": "glm-5.1",
      "temperature": 0.7,
      "max_tokens": 8192
    },
    "long_context": {
      "provider": "opencode-go",
      "model_id": "minimax-m2.7",
      "temperature": 0.7,
      "max_tokens": 16384,
      "context_threshold": 60000
    },
    "deepseek_v4_max": {
      "provider": "opencode-go",
      "model_id": "deepseek-v4-pro",
      "temperature": 0.1,
      "max_tokens": 8192,
      "reasoning_effort": "max",
      "thinking": {
        "type": "enabled"
      }
    }
  },
  "fallbacks": {
    "default": [
      { "provider": "opencode-go", "model_id": "glm-5" },
      { "provider": "opencode-go", "model_id": "qwen3.6-plus" }
    ],
    "think": [{ "provider": "opencode-go", "model_id": "glm-5" }],
    "long_context": [{ "provider": "opencode-go", "model_id": "minimax-m2.5" }]
  },
  "opencode_go": {
    "base_url": "https://opencode.ai/zen/go/v1/chat/completions",
    "timeout_ms": 300000
  },
  "logging": {
    "level": "info",
    "requests": true
  }
}
```
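A plausible reading of the `models`, `context_threshold`, and `fallbacks` fields (hedged; the exact routing logic lives in oc-go-cc itself): requests normally go to the `default` model, prompts larger than `context_threshold` tokens are routed to `long_context`, and if the chosen model errors, the `fallbacks` entries are tried in order. In sketch form:

```go
// Hypothetical sketch of how the config fields above could drive routing.
package routing

import "errors"

type ModelRef struct {
	Provider string
	ModelID  string
}

type Config struct {
	Models           map[string]ModelRef
	ContextThreshold int                   // tokens; from models.long_context.context_threshold
	Fallbacks        map[string][]ModelRef // alias -> ordered fallback models
}

// Pick chooses a model alias for the request, then walks the fallback
// chain for that alias until one upstream call succeeds.
func Pick(cfg Config, promptTokens int, call func(ModelRef) error) error {
	alias := "default"
	if promptTokens > cfg.ContextThreshold {
		alias = "long_context" // large prompts go to the long-context model
	}
	candidates := append([]ModelRef{cfg.Models[alias]}, cfg.Fallbacks[alias]...)
	var lastErr error
	for _, m := range candidates {
		if lastErr = call(m); lastErr == nil {
			return nil
		}
	}
	return errors.Join(errors.New("all models failed"), lastErr)
}
```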
Environment variables override config file values. Config values also support ${VAR} interpolation.
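In Go, this kind of interpolation is typically a thin wrapper over the standard library's os.Expand; a sketch of the likely behavior (not oc-go-cc's actual code):

```go
// Sketch: expand ${VAR} references in config values from the environment.
package config

import "os"

// ExpandValue replaces ${VAR} (and $VAR) with the environment value,
// leaving unset variables as empty strings, mirroring os.ExpandEnv.
func ExpandValue(v string) string {
	return os.Expand(v, os.Getenv)
}
```

So with the default config, ExpandValue("${OC_GO_CC_API_KEY}") resolves the api_key field from the environment at load time.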
| Variable | Description | Default |
| ----------------------- | ------------------------------------------- | ------------------------------------------------ |
| OC_GO_CC_API_KEY | OpenCode Go API key (required) | — |
| OC_GO_CC_CONFIG | Custom config file path | ~/.config/oc-go-cc/config.json |
| OC_GO_CC_HOST | Proxy listen host | 127.0.0.1 |
| OC_GO_CC_PORT | Proxy listen port | 3456 |
| OC_GO_CC_OPENCODE_URL | OpenCode Go API endpoint | https://opencode.ai/zen/go/v1/chat/completions |
| OC_GO_CC_LOG_LEVEL | Log level: debug, info, warn, error | info |