# MCP-Bridge

A middleware that provides an OpenAI-compatible endpoint which can call MCP tools.

MCP-Bridge acts as a bridge between the OpenAI API and MCP (Model Context Protocol) tools, allowing developers to leverage MCP tools through the OpenAI API interface.
> [!NOTE]
> Looking for new maintainers to assist with the project. Reach out in the Discord or open an issue if you are interested.
>
> Additionally, Open WebUI natively supports MCP (Model Context Protocol) starting in v0.6.31, so MCP-Bridge should now be considered soft deprecated.
MCP-Bridge is designed to facilitate the integration of MCP tools with the OpenAI API. It provides a set of endpoints for interacting with MCP tools in a way that is compatible with the OpenAI API, so you can use any client with any MCP tool without that client needing explicit MCP support. For example, Open WebUI can be used with the official MCP fetch tool.
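As an illustration, a request to the bridge is just a standard OpenAI-style chat completion payload. The base URL, port, and model name below are placeholders for this sketch, not documented MCP-Bridge defaults; adjust them to your deployment.

```python
import json

# Assumed address of a local MCP-Bridge deployment; adjust to your setup.
BRIDGE_URL = "http://localhost:8000/v1/chat/completions"

# A plain OpenAI-style chat completion payload. MCP-Bridge forwards it to the
# configured inference server and executes any MCP tool calls (for example,
# the fetch tool) that the model emits, without the client being MCP-aware.
payload = {
    "model": "my-model",  # placeholder; use whatever your inference server serves
    "messages": [
        {"role": "user", "content": "Fetch https://example.com and summarize it."}
    ],
}

body = json.dumps(payload)
# POST `body` to BRIDGE_URL with any HTTP or OpenAI-compatible client.
```

Because the endpoint shape matches the OpenAI API, existing SDKs work by simply pointing their base URL at the bridge.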

Working features:

- non-streaming chat completions with MCP
- streaming chat completions with MCP
- non-streaming completions without MCP
- MCP tools
- MCP sampling
- SSE bridge for external clients

Planned features:

- streaming completions (not implemented yet)
- MCP resources
The recommended way to install MCP-Bridge is with Docker; see the example compose.yml file for a reference setup. Note that this requires an inference engine with tool call support. I have tested this with vLLM with success, though Ollama should also be compatible.
- Clone the repository
- Edit the compose.yml file

You will need to add a reference to the config.json file in the compose.yml file. Pick any one of the following options; see below for an example of each:
```yaml
environment:
  - MCP_BRIDGE__CONFIG__FILE=config.json # mount the config file for this to work
  - MCP_BRIDGE__CONFIG__HTTP_URL=http://10.88.100.170:8888/config.json
  - MCP_BRIDGE__CONFIG__JSON={"inference_server":{"base_url":"http://example.com/v1","api_key":"None"},"mcp_servers":{"fetch":{"command":"uvx","args":["mcp-server-fetch"]}}}
```
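For the file-based option, the same configuration can be written as a standalone config.json. This is simply the inline JSON above pretty-printed; the `base_url` is a placeholder to replace with your inference server's address:

```json
{
  "inference_server": {
    "base_url": "http://example.com/v1",
    "api_key": "None"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```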
The mount point for using the config file would look like:

```yaml
volumes:
  - ./config.json:/mcp_bridge/config.json
```
Then build and start the service:

```shell
docker-compose up --build -d
```
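Once the container is up, you can sanity-check the bridge with any OpenAI-style client. A minimal sketch, assuming the bridge listens on port 8000 and exposes the usual OpenAI-style `/v1/models` route (check your compose.yml port mapping before relying on either):

```python
import urllib.request

def build_models_request(base_url: str = "http://localhost:8000/v1") -> urllib.request.Request:
    """Build a GET request for the OpenAI-style model listing endpoint."""
    return urllib.request.Request(f"{base_url}/models", method="GET")

req = build_models_request()
# With the container running, urllib.request.urlopen(req) should return the
# model list from your inference server as JSON. The call is omitted here
# because this snippet does not assume a live service.
```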