Trae-Proxy by arch3rPro
An intelligent API proxy tool designed to intercept and redirect OpenAI API requests to custom backend services. It supports multi-backend configuration, dynamic model mapping, and streaming response handling.
A high-performance, low-latency API proxy middleware designed for large language model applications, capable of seamlessly intercepting and redirecting OpenAI API requests to any custom backend service. It supports multi-backend load balancing, intelligent routing, dynamic model mapping, and streaming response handling, allowing you to break through official API limitations and freely choose and integrate various LLM services while maintaining complete compatibility with existing applications.
Trae-Proxy can be installed and run in two ways: with Docker, or manually with Python.
# Clone the repository
git clone https://github.com/arch3rpro/trae-proxy.git
cd trae-proxy
# Start the service
docker-compose up -d
# View logs
docker-compose logs -f
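To confirm the deployment came up, you can probe the proxy from the Docker host. This is a quick sanity check, assuming the container publishes port 443 locally; -k skips TLS verification, since the self-signed CA has not been trusted yet at this stage:

# Check that the container is running
docker-compose ps

# Probe the proxy directly; -k skips certificate verification
curl -k https://localhost/v1/models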
# Install dependencies
pip install -r requirements.txt
# Generate certificates
python generate_certs.py
# Start the proxy server
python trae_proxy.py
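Note that binding to port 443 normally requires elevated privileges on Linux; this is standard socket behavior rather than a documented requirement of this project. If the server exits with a permission error, try:

# Ports below 1024 usually require root on Linux
sudo python trae_proxy.py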
Trae-Proxy uses a YAML configuration file, config.yaml:
# Trae-Proxy configuration file
# Proxy domain configuration
domain: api.openai.com
# Backend API configuration list
apis:
  - name: "deepseek-r1"
    endpoint: "https://api.deepseek.com"
    custom_model_id: "deepseek-reasoner"
    target_model_id: "deepseek-reasoner"
    stream_mode: null
    active: true
  - name: "kimi-k2"
    endpoint: "https://api.moonshot.cn"
    custom_model_id: "kimi-k2-0711-preview"
    target_model_id: "kimi-k2-0711-preview"
    stream_mode: null
    active: true
  - name: "qwen3-coder-plus"
    endpoint: "https://dashscope.aliyuncs.com/compatible-mode"
    custom_model_id: "qwen3-coder-plus"
    target_model_id: "qwen3-coder-plus"
    stream_mode: null
    active: true
# Proxy server configuration
server:
  port: 443
  debug: true
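The custom_model_id/target_model_id pairs suggest that each apis entry maps an inbound model name to a backend: when a client requests custom_model_id, the proxy forwards the call to that entry's endpoint with the model rewritten to target_model_id. As a sketch (assuming the certificate and hosts steps described below are already done, and that your-backend-key stands in for the selected backend's API key; check how your deployment actually handles credentials), a request for kimi-k2-0711-preview would be served by api.moonshot.cn:

# This request targets api.openai.com but is answered by Moonshot;
# the Authorization header carries the backend's key (an assumption)
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-backend-key" \
  -d '{"model": "kimi-k2-0711-preview", "messages": [{"role": "user", "content": "Hello"}]}'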
Copy the CA certificate from the server to your local machine:
# Copy CA certificate from server
scp user@your-server-ip:/path/to/trae-proxy/ca/api.openai.com.crt .
Trust the certificate on your local machine. On macOS, double-click the api.openai.com.crt file, which will open "Keychain Access"; mark the certificate as trusted. On Windows, double-click the api.openai.com.crt file and install it as a trusted root certificate.

Next, redirect api.openai.com to the proxy server. On Windows, edit C:\Windows\System32\drivers\etc\hosts as administrator and add:

your-server-ip api.openai.com

On Linux/macOS:

sudo vim /etc/hosts

and add the same line:

your-server-ip api.openai.com
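If you also want command-line tools on Linux to trust the proxy without per-command flags, the CA can be added to the system trust store. A sketch for Debian/Ubuntu (paths and tooling differ on other distributions):

# Debian/Ubuntu: install the CA into the system trust store
sudo cp api.openai.com.crt /usr/local/share/ca-certificates/
sudo update-ca-certificates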
curl https://api.openai.com/v1/models
If configured correctly, you should see the model list returned by the proxy server.
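Alternatively, if you want to test from a machine without touching /etc/hosts or the system trust store, curl can pin both the address and the CA for a single request (replace your-server-ip accordingly):

# --resolve forces api.openai.com to the proxy's IP for this request;
# --cacert trusts the proxy's CA for this request only
curl --cacert api.openai.com.crt \
     --resolve api.openai.com:443:your-server-ip \
     https://api.openai.com/v1/models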
trae-proxy/
├── trae_proxy.py # Main proxy server
├── trae_proxy_cli.py # Command-line management tool
├── generate_certs.py # Certificate generation tool
├── config.yaml # Configuration file
├── docker-compose.yml # Docker deployment configuration
├── requirements.txt # Python dependencies
└── ca/ # Certificates and keys directory
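The subcommands of trae_proxy_cli.py are not documented here; assuming it exposes a standard help flag, a reasonable first step is:

# List the management CLI's available commands (assumes a standard --help flag)
python trae_proxy_cli.py --help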
+------------------+    +--------------+    +------------------+
|                  |    |              |    |                  |
|                  |    |              |    |                  |
| DeepSeek API     +--->+              +--->+ Trae IDE         |
|                  |    |              |    |                  |
| Moonshot API     +--->+              +--->+ VSCode           |
|                  |    |              |    |                  |
| Aliyun API       +--->+  Trae-Proxy  +--->+ JetBrains        |
|                  |    |              |    |                  |
| Self-hosted LLM  +--->+              +--->+ OpenAI Clients   |
|                  |    |              |    |                  |
| Other API Svcs   +--->+              |    |                  |
|                  |    |              |    |                  |
|                  |    |              |    |                  |
+------------------+    +--------------+    +------------------+
  Backend Services        Proxy Server          Client Apps
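Since the proxy advertises streaming response handling, a streamed request should behave the same as against the upstream API. A minimal check, assuming the certificate and hosts setup above and a backend API key:

# Request a streamed completion; -N disables curl's output buffering
# so server-sent event chunks appear as they arrive
curl -N https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-backend-key" \
  -d '{"model": "deepseek-reasoner", "stream": true, "messages": [{"role": "user", "content": "Count to 5"}]}'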
This project is licensed under the MIT License - see the LICENSE file for details.