Manifold is an experimental platform for long-horizon workflow automation with teams of AI assistants.

```shell
# Add to your Claude Code skills
git clone https://github.com/intelligencedev/manifold
```
It supports OpenAI, Google, and Anthropic models, along with OpenAI-compatible APIs for self-hosted open-weight models served through llama.cpp or vLLM.
> [!WARNING]
> Manifold is an experimental frontier AI platform. Do not deploy it in production environments that require strong stability guarantees.
Manifold is built for workflows that go beyond one-shot prompts. It gives you a workspace where specialists, tools, projects, and workflows can work together on multi-step objectives over extended periods.
Use a traditional chat interface to assign objectives to specialists.

Specialists can collaborate across multiple turns. Manifold is designed to take advantage of the long-horizon capabilities of frontier models and can work on complex objectives for hours.

Design agent workflows with a visual flow editor. MCP tools are exposed as nodes automagically. Saved workflows become tools that can be invoked by specialists or inserted as nodes into other workflows. It's workflows all the way down.


Manifold supports image generation with OpenAI and Google models, as well as local image generation through a custom ComfyUI MCP client.

Example ComfyUI-generated image using a custom workflow.
Define and configure AI agents, then build your own team of experts.

Configure projects as agent workspaces.
Each project is isolated to its own root path. Agents only load skills from that project's .skills/ folder, so every project that needs reusable skills must define its own .skills directory inside the project root.
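As a concrete sketch of that isolation (the paths and skill file below are illustrative, not part of Manifold), each project root carries its own `.skills/` directory:

```shell
# Two isolated project roots, each with its own .skills/ directory.
# Agents working in project-a never load skills from project-b.
mkdir -p /tmp/manifold-demo/project-a/.skills
mkdir -p /tmp/manifold-demo/project-b/.skills
printf 'Summarize changelogs tersely.\n' > /tmp/manifold-demo/project-a/.skills/summarize.md
ls /tmp/manifold-demo/project-a/.skills
```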

Manifold includes built-in tools for agent workflows and supports MCP to extend agent capabilities. You can configure multiple MCP servers and enable tools individually to manage context size more precisely.
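The real schema lives in `config.yaml.example`; purely to illustrate the idea of per-tool enablement (the keys, structure, and server entry below are hypothetical, not Manifold's actual schema), a configuration might look like:

```yaml
# Hypothetical sketch only - consult config.yaml.example for the real schema.
mcpServers:
  filesystem:
    command: "npx -y @modelcontextprotocol/server-filesystem /workspace"
    enabledTools:        # enabling tools individually keeps context size down
      - read_file
      - list_directory
```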

Create, iterate on, and version prompts that can be assigned to agents. Configure datasets and run experiments to understand how prompt changes affect agent behavior.

The recommended first-run path is Docker-based and does not require a local Go, Node, or pnpm toolchain.
For a basic local deployment, you need:

- Docker with Docker Compose
- An API key for at least one supported LLM provider (for example `OPENAI_API_KEY`)
- A writable `WORKDIR` path for agent workspaces

Optional local tooling is only needed if you are developing Manifold itself:

- Go and Node.js for building agentd and the frontend
- pnpm for running the frontend outside Docker

```shell
cp example.env .env
cp config.yaml.example config.yaml
# Edit .env and set at minimum:
# OPENAI_API_KEY=...
# WORKDIR=/absolute/path/to/your/manifold-workdir
docker compose up -d pg-manifold manifold
```
Then open http://localhost:32180.
Manifold can also run without external database or telemetry services when you build agentd locally. Enable the embedded Postgres runtime and keep ClickHouse/OTLP unset:
```yaml
databases:
  embedded: true
  defaultDSN: ""

obs:
  otlp: ""
  local:
    enabled: true
  clickhouse:
    dsn: ""
```
With that configuration, agentd starts a bundled PostgreSQL process for durable state and serves metrics, logs, and traces from bounded process-local telemetry. You still need an LLM provider, which can be a remote API key or a local OpenAI-compatible endpoint.
For the full deployment walkthrough, see:
`make build-manifold` builds agentd with the embedded frontend using the stable UI feature gate. Stable builds do not render frontend features that are still in active development.
To build the same backend and embedded frontend with beta UI links enabled, use either command:

```shell
make build-manifold-beta
# or
make build-manifold FEATURE_GATE=beta
```
The build passes `FEATURE_GATE` through to Vite as `VITE_MANIFOLD_FEATURE_GATE`.
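Conceptually, that passthrough amounts to exporting a Vite-visible variable before the frontend build (a simplified sketch, not the actual Makefile):

```shell
# FEATURE_GATE defaults to the stable gate when unset
FEATURE_GATE="${FEATURE_GATE:-stable}"
# Vite only exposes VITE_-prefixed variables to frontend code
export VITE_MANIFOLD_FEATURE_GATE="$FEATURE_GATE"
echo "$VITE_MANIFOLD_FEATURE_GATE"
```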