by fim-ai
LLM-powered Agent Runtime with Dynamic DAG Planning & Concurrent Execution
```shell
# Add to your Claude Code skills
git clone https://github.com/fim-ai/fim-one
```
English | 中文 | 日本語 | 한국어 | Deutsch | Français
The AI connector hub for global teams with complex enterprise stacks. Embed as a Copilot, or connect every system as a Hub, including the ones your global platform can't reach.
Website · Docs · Changelog · Report Bug · Discord · Twitter · Product Hunt
> [!TIP]
> Skip the setup and try FIM One on Cloud. A managed version is live at cloud.fim.ai: no Docker, no API keys, no config. Sign in and start connecting your systems in seconds. Early access, feedback welcome.
Global enterprises run a sprawl of systems that don't talk to each other: ERP, CRM, OA, HR, finance, databases, IM platforms across regions. FIM One is the hub that orchestrates across all of them, including the China-only systems (Feishu, WeCom, DingTalk, DM, Kingbase, etc.) that your global AI platform can't reach.
| Mode       | What it is                                            | Access                  |
| ---------- | ----------------------------------------------------- | ----------------------- |
| Standalone | General-purpose AI assistant: search, code, KB        | Portal                  |
| Copilot    | AI embedded in a host system's UI                     | iframe / widget / embed |
| Hub        | Central AI orchestration across all connected systems | Portal / API            |
```mermaid
graph LR
    ERP <--> Hub["FIM One Hub"]
    Database <--> Hub
    Lark <--> Hub
    Hub <--> CRM
    Hub <--> OA
    Hub <--> API[Custom API]
```
Dashboard – stats, activity trends, token usage, and quick access to agents and conversations.

Agent Chat – ReAct reasoning with multi-step tool calling against a connected database.

DAG Planner – LLM-generated execution plan with parallel steps and live status tracking.

Using Agents
Using Planner Mode
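The planner idea above (an LLM emits a plan as a DAG, and steps whose dependencies are met run concurrently) can be sketched in a few lines. This is an illustrative sketch, not FIM One's actual planner API; `Step`, `run_step`, and `run_dag` are hypothetical names:

```python
import asyncio
from dataclasses import dataclass, field

@dataclass
class Step:
    """One node in an LLM-generated plan (illustrative, not FIM One's API)."""
    id: str
    deps: set[str] = field(default_factory=set)

async def run_step(step: Step, results: dict) -> None:
    # Stand-in for a real tool call; just records completion.
    await asyncio.sleep(0)
    results[step.id] = f"done:{step.id}"

async def run_dag(steps: list[Step]) -> dict:
    """Execute steps in dependency order, running all ready steps in parallel."""
    results: dict[str, str] = {}
    pending = {s.id: s for s in steps}
    while pending:
        # A step is ready once every one of its dependencies has a result.
        ready = [s for s in pending.values() if s.deps.issubset(results)]
        if not ready:
            raise RuntimeError("cycle or unsatisfiable dependency in plan")
        await asyncio.gather(*(run_step(s, results) for s in ready))
        for s in ready:
            del pending[s.id]
    return results

plan = [Step("fetch_a"), Step("fetch_b"), Step("join", {"fetch_a", "fetch_b"})]
results = asyncio.run(run_dag(plan))
# fetch_a and fetch_b run concurrently in the first wave; join runs after both.
```

Live status tracking in the UI would hang off the same loop: each wave's completions are known the moment `asyncio.gather` returns.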
```shell
git clone https://github.com/fim-ai/fim-one.git
cd fim-one
cp example.env .env
# Edit .env: set LLM_API_KEY (and optionally LLM_BASE_URL, LLM_MODEL)
docker compose up --build -d
```
Open http://localhost:3000 – on first launch you'll create an admin account. That's it.
```shell
docker compose up -d      # start
docker compose down       # stop
docker compose logs -f    # view logs
```
Prerequisites: Python 3.11+, uv, Node.js 18+, pnpm.
```shell
git clone https://github.com/fim-ai/fim-one.git && cd fim-one
cp example.env .env        # Edit: set LLM_API_KEY
uv sync --all-extras
cd frontend && pnpm install && cd ..
./start.sh dev             # hot reload: Python --reload + Next.js HMR
```
| Command | What starts | URL |
| ---------------- | --------------------------------- | ------------------------------ |
| ./start.sh | Next.js + FastAPI | localhost:3000 (UI) + :8000 |
| ./start.sh dev | Same, with hot reload | Same |
| ./start.sh dev:api | API only, dev mode (hot reload) | localhost:8000 |
| ./start.sh dev:ui | Frontend only, dev mode (HMR) | localhost:3000 |
| ./start.sh api | FastAPI only (headless) | localhost:8000/api |
For production deployment (Docker, reverse proxy, zero-downtime updates), see the Deployment Guide.
- `FeishuGateHook` gates sensitive tool calls behind a human approval card posted to a Feishu group. Extensible to audit logging, read-only-mode guards, and rate limits (v0.9).
- `AUTO_ROUTING` execution mode.
- `read_uploaded_file` tool with intelligent document processing: PDFs, DOCX, and PPTX files get vision-aware processing with embedded image extraction when the model supports vision. Smart PDF mode extracts text from text-rich pages and renders scanned pages as images.
- Code execution backend (`CODE_EXEC_BACKEND=docker`).
- RAG answers with `[N]` citations.
- `BaseChannel` abstraction for outbound messaging across Slack, Microsoft Teams, Discord, Feishu (Lark), WeCom, and DingTalk. First shipping implementation is Feishu; Slack / Teams / WeCom / Email are next on the v0.9 roadmap.
- `GateHook` (Feishu today, Slack/Teams next) posts an Approve / Reject card to your group when a sensitive tool call fires; the tool blocks until a group member taps a verdict. Human-in-the-loop approval without a custom workflow engine.

Deep dive: Architecture · Hook System · Channels · Execution Modes · Why FIM One · Competitive Landscape
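The gate-hook pattern described above, where a tool call blocks until a human verdict arrives from a chat platform, can be sketched minimally. Everything here is hypothetical: FIM One's real `GateHook` API is not shown in this README, and the print stands in for posting an Approve / Reject card:

```python
import threading

class ApprovalGate:
    """Illustrative gate: blocks a tool call until a human verdict arrives.
    A pattern sketch, not FIM One's actual GateHook implementation."""

    def __init__(self) -> None:
        self._verdict = None
        self._decided = threading.Event()

    def request(self, tool: str, args: dict) -> str:
        # The real hook would post an Approve / Reject card to a
        # Feishu (or Slack / Teams) group here instead of printing.
        print(f"Approval needed for {tool}({args})")
        self._decided.wait()  # block the tool call until someone decides
        return self._verdict

    def decide(self, verdict: str) -> None:
        # Called from the chat-platform callback when a member taps a button.
        self._verdict = verdict
        self._decided.set()

gate = ApprovalGate()
# Simulate a group member approving shortly after the card is posted.
threading.Timer(0.01, gate.decide, args=("approve",)).start()
verdict = gate.request("delete_rows", {"table": "orders"})
print(verdict)  # approve
```

The point of the design is that the blocking happens outside the LLM loop: the model never sees the approval machinery, only the tool's eventual result.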
```mermaid
graph TB
    subgraph app["Application Layer"]
        a["Portal · API · iframe · Feishu · Slack · WeCom · DingTalk · Teams · Email · Contract Systems · Custom Webhooks"]
    end
    subgraph mid["FIM One"]
        direction LR
        m1["Connectors<br/>+ MCP Hub"] ~~~ m2["Orch Engine<br/>ReAct / DAG"] ~~~ m3["RAG /<br/>Knowledge"] ~~~ m5["Hook System<br/>+ Channels"] ~~~ m4["Auth /<br/>Admin"]
    end
    subgraph biz["Business Systems"]
        b["ERP · CRM · OA · Finance · Databases · Contract Mgmt · Custom APIs"]
    end
    app --> mid --> biz
```
Each connector and channel is a standardized bridge: the agent doesn't know or care whether it's talking to SAP, a custom contract system, or a Feishu group. The Hook System runs platform code outside the LLM loop for approvals, audit logging, and rate limits.
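To make the bridge idea concrete, here is a minimal sketch of a uniform outbound-channel interface. The names are illustrative and do not match FIM One's real `BaseChannel` signature; the string returns stand in for actual API calls:

```python
from abc import ABC, abstractmethod

class OutboundChannel(ABC):
    """Illustrative channel interface: agent code only ever sees send()."""

    @abstractmethod
    def send(self, target: str, text: str) -> str: ...

class FeishuChannel(OutboundChannel):
    def send(self, target: str, text: str) -> str:
        # A real implementation would call the Feishu OpenAPI here.
        return f"feishu:{target}:{text}"

class SlackChannel(OutboundChannel):
    def send(self, target: str, text: str) -> str:
        # A real implementation would call Slack's chat.postMessage here.
        return f"slack:{target}:{text}"

def notify(channel: OutboundChannel, target: str, text: str) -> str:
    # Agent-side code is identical whatever system sits behind the channel.
    return channel.send(target, text)

receipt = notify(FeishuChannel(), "ops-group", "Deploy approved")
print(receipt)  # feishu:ops-group:Deploy approved
```

Swapping `FeishuChannel` for `SlackChannel` changes nothing on the agent side, which is exactly the property the standardized-bridge design is after.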