The open context engine for AI agents, supporting 15+ data sources. Built on Rust and Apache DataFusion.
# Add to your Claude Code skills

```shell
git clone https://github.com/Canner/wren-engine
```

Wren Engine is the open foundation behind Wren AI: a semantic, governed, agent-ready context layer for business data.
https://github.com/user-attachments/assets/037f2317-d8e5-41f2-9563-1e0bce4ef50c
AI agents can already call tools, browse docs, and write code. What they still struggle with is business context.
Enterprise data is not just rows in a warehouse. It is definitions, metrics, relationships, permissions, lineage, and intent. An agent that can connect to PostgreSQL or Snowflake still does not know what "net revenue", "active customer", or "pipeline coverage" actually mean in your company.
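To make the gap concrete, here is a minimal sketch of what an explicit business definition looks like compared to a raw table. The names and structure below are illustrative only, not Wren Engine's actual MDL schema:

```python
# Hypothetical sketch of a governed metric definition an agent can reason over.
# Field names ("model", "expression", "filters") are illustrative, not Wren's MDL.
NET_REVENUE = {
    "name": "net_revenue",
    "model": "orders",
    "expression": "SUM(amount - discount - refund)",
    "filters": ["status = 'completed'"],
}

def metric_to_sql(metric: dict) -> str:
    """Render an explicit metric definition into a governed SQL query."""
    where = f" WHERE {' AND '.join(metric['filters'])}" if metric["filters"] else ""
    return (
        f"SELECT {metric['expression']} AS {metric['name']} "
        f"FROM {metric['model']}{where}"
    )

print(metric_to_sql(NET_REVENUE))
# SELECT SUM(amount - discount - refund) AS net_revenue
#   FROM orders WHERE status = 'completed'
```

An agent handed only a `orders` table has to guess which columns and filters make up "net revenue"; an agent handed the definition above does not.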
This is not just our thesis. In Your Data Agents Need Context, a16z argues that data agents break down when they only have connectivity and SQL generation, but lack business definitions, source-of-truth context, and the operational knowledge that explains how a company actually runs.
Wren Engine exists to solve that gap.
It gives AI agents a context engine they can reason over.
This is the open source context engine for teams building the next generation of agent experiences.
We believe the future of AI is not tool calling alone. It is context-rich systems where agents can reason, retrieve, plan, and act on top of a shared understanding of business reality.
Wren Engine is our open source contribution to that future.
It is the semantic and execution foundation beneath Wren AI, and it is designed to be useful well beyond a single product.
If Wren AI is the full vision, Wren Engine is the open core that makes that vision interoperable.
Wren Engine turns business data into agent-usable context.
At a high level, it is the practical open source path from text-to-SQL toward context-aware data agents.
That means your agent is no longer asking, "Which raw table should I query?"
It is asking, "Which business concept, metric, or governed slice of context do I need to answer this task correctly?"
Wren Engine is especially useful for the open source community building agent-native workflows in MCP-compatible tools such as Claude Code.
If your environment can speak MCP, call HTTP APIs, or embed a semantic service, Wren Engine can become the context layer behind your agent.
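For MCP-speaking environments, the wire format is a JSON-RPC 2.0 `tools/call` request. The sketch below shows that framing; the tool name and arguments are placeholders, not the tools Wren Engine's MCP server actually exposes (see mcp-server/README.md for those):

```python
import json

def build_mcp_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 `tools/call` request as MCP clients frame it."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(request)

# "ask" is a hypothetical tool name used only to illustrate the framing.
payload = build_mcp_tool_call("ask", {"question": "What was net revenue last quarter?"})
```

Any client that can emit requests like this, or call plain HTTP APIs, can sit in front of Wren Engine's context layer.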
Use it to power context-aware agent experiences in those environments.
This is especially important in developer-facing agent environments, where the assistant may understand your codebase but still lacks the business context required to answer data questions correctly.
Wren Engine is built to work across modern data stacks, including warehouses, databases, and file-based sources.
Current open source connector support spans these sources; see the connector API docs in the project documentation for the latest connection schemas and capabilities.
If you want to use Wren Engine from Claude Code or other AI agents, start with the MCP server in the mcp-server module; its README covers setup and what the server includes.
People often compare Wren Engine to catalog services like DataHub, raw database MCP servers, BI semantic tools, or text-to-SQL agents.
The simple difference is:
| Tool type | What it gives the agent | What Wren Engine adds |
| --- | --- | --- |
| Data catalog services | Tables, columns, lineage, owners, descriptions | Business models, metrics, relationships, and governed query planning |
| Raw database or schema access | Direct access to schemas and SQL execution | A business layer above raw tables so the agent does not have to guess intent |
| BI or semantic tools | Curated metrics and entities for analytics workflows | An open context layer designed for MCP and agent workflows |
| Text-to-SQL agents | Fast SQL generation from natural language | Better accuracy by grounding generation in explicit business definitions |
Many teams will want both: their existing catalog or database tooling alongside Wren Engine's context layer. Why that matters: without Wren, an agent may know where the data is but still not know how to answer the question correctly.
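The shift a context layer produces can be sketched as a lookup step: instead of guessing a raw table, the agent resolves a business concept to a governed definition first. Everything below is illustrative; the concept names and SQL are placeholders, not Wren Engine internals:

```python
# Illustrative context store mapping business concepts to governed definitions.
CONTEXT = {
    "active customer": {
        "definition": "customers with at least one order in the last 90 days",
        "sql": (
            "SELECT COUNT(*) FROM customers "
            "WHERE last_order_at >= now() - interval '90 days'"
        ),
    },
}

def resolve(concept: str) -> str:
    """Return the governed SQL for a business concept instead of guessing tables."""
    entry = CONTEXT.get(concept.lower())
    if entry is None:
        # Surfacing "no governed definition" beats silently inventing a query.
        raise KeyError(f"No governed definition for {concept!r}")
    return entry["sql"]
```

A catalog tells the agent the `customers` table exists; the context layer tells it what "active customer" means there.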
This repository contains the core engine modules:
| Module | What it does |
| --- | --- |
| wren-core | Rust context engine powered by Apache DataFusion for MDL analysis, planning, and optimization |
| wren-core-base | Shared manifest and modeling types |
| wren-core-py | PyO3 bindings that expose the engine to Python |
| ibis-server | FastAPI server for query execution, validation, metadata, and connectors |
| mcp-server | MCP server for AI agents and MCP-compatible clients |
Supporting modules include wren-core-legacy, example, mock-web-server, and benchmarking utilities.
- [wren-core/README.md](./wren-core/README.md)
- [wren-core-py/README.md](./wren-core-py/README.md)
- [ibis-server/README.md](./ibis-server/README.md)
- [mcp-server/README.md](./mcp-server/README.md)