The AI Operating System for Delphi. 100% native framework with RAG 2.0 for knowledge retrieval, autonomous agents with semantic memory, visual workflow orchestration, and universal LLM connector. Supports OpenAI, Claude, Gemini, Ollama, and more. Enterprise-grade AI for Delphi 10.3+
# Add to your Claude Code skills
git clone https://github.com/gustavoeenriquez/MakerAi

🌐 Official Website: https://makerai.cimamaker.com
Free Pascal / Lazarus port available – full port of the MakerAI Suite for FPC 3.2+ (12 LLM drivers, RAG, Agents, MCP, Embeddings). See the `fpc` branch.
Most AI libraries for Delphi stop at wrapping REST calls. MakerAI is different.
Yes, MakerAI includes native, provider-specific components that give you direct, full-fidelity access to each provider's API – every model parameter, every response field, every streaming event, exactly as the provider defines it.
But on top of that, MakerAI is a complete AI application ecosystem that lets you build production-grade intelligent systems entirely in Delphi:
Whether you need a simple one-provider integration or a multi-agent, multi-provider, retrieval-augmented production system, MakerAI covers the full stack – natively in Delphi.
The biggest architectural change in v3.3 is the TAiCapabilities system, which replaces scattered per-provider flags with a unified, declarative model of what each model can do and what a session needs:
- **ModelCaps** – what the model natively supports (e.g. `[cap_Image, cap_Reasoning]`)
- **SessionCaps** – what capabilities the current session requires
- When SessionCaps exceeds ModelCaps, MakerAI automatically activates bridges (tool-assisted OCR, vision bridges, etc.) without changing your code
- **ThinkingLevel** – unified reasoning depth control (`tlLow`, `tlMedium`, `tlHigh`) across all providers that support extended thinking

| Provider | New / Updated Models |
|----------|----------------------|
| OpenAI | gpt-5.2, gpt-image-1, o3, o3-mini |
| Claude | claude-opus-4-6, claude-sonnet-4-6, claude-3-7-sonnet |
| Gemini | gemini-3.0, gemini-2.5-flash, gemini-2.5-flash-image |
| Grok | grok-4, grok-3, grok-imagine-image |
| Mistral | Magistral (reasoning), mistral-ocr-latest |
| DeepSeek | deepseek-reasoner (extended thinking) |
| Kimi | kimi-k2.5 (extended thinking) |
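A minimal sketch of how a session might declare its needs under this model. The identifiers `SessionCaps`, `ThinkingLevel`, `cap_Image`, `cap_Reasoning`, and `tlMedium` come from the release notes above; their exact placement as properties of the connection component is an assumption:

```pascal
// Hedged sketch of the TAiCapabilities system (property placement assumed).
AiConn.DriverName := 'Claude';
AiConn.Model := 'claude-sonnet-4-6';

// Declare what this session needs. If SessionCaps exceeds the model's
// ModelCaps, MakerAI activates a bridge (e.g. tool-assisted OCR)
// automatically, without any change to the calling code.
AiConn.SessionCaps := [cap_Image, cap_Reasoning];

// Unified reasoning depth across all providers with extended thinking.
AiConn.ThinkingLevel := tlMedium;
```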
- `TAiFileCheckpointer` – persists agent graph state to disk; resume workflows after crashes or restarts
- `TAiWaitApprovalTool` – suspends a node and waits for human approval before continuing
- `TAIAgentManager.OnSuspend` event for building approval UIs
- `ResumeThread(ThreadID, NodeName, Input)` to continue suspended workflows
- `uMakerAi.RAG.Graph.Documents.pas` – full document lifecycle management (ingest, chunk, embed, link) directly into the knowledge graph
- `reasoning_content` is now correctly preserved and re-sent in multi-turn tool-call conversations for all providers that require it (DeepSeek-reasoner, Kimi k2.5, Groq reasoning models)
- `TAiEmbeddingsConnection` – abstract connector for swappable embedding providers
- `TAiAudioPushStream` – push-based audio streaming utility
- MCP concurrent tool calls – race condition (`uMakerAi.MCPClient.Core.pas`): when a model responded with two or more tools from the same MCP server in a single turn, `ParseChat` launched all tool calls as parallel `TTask`s. Since `TMCPClientStdIo` shares a single process/pipe per instance (no synchronization), concurrent calls corrupted the JSON-RPC communication, causing intermittent failures. Fixed by adding `FCallLock: TCriticalSection` to `TMCPClientCustom` – calls to the same server are now serialized, while calls to different servers still run in parallel.
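The approval flow could be wired roughly as follows. This is a hypothetical sketch: only `TAIAgentManager`, `OnSuspend`, `TAiWaitApprovalTool`, `TAiFileCheckpointer`, and `ResumeThread(ThreadID, NodeName, Input)` come from the release notes; the event handler signature and parameter names are assumptions:

```pascal
// Hypothetical human-in-the-loop approval handler
// (event signature and AThreadID/ANodeName parameters are assumptions).
procedure TMainForm.AgentManagerSuspend(Sender: TObject;
  const AThreadID, ANodeName: string);
begin
  // A TAiWaitApprovalTool node suspended the graph. State is already
  // persisted by TAiFileCheckpointer, so this survives app restarts.
  if MessageDlg('Approve step "' + ANodeName + '"?',
       mtConfirmation, [mbYes, mbNo], 0) = mrYes then
    AIAgentManager1.ResumeThread(AThreadID, ANodeName, 'approved')
  else
    AIAgentManager1.ResumeThread(AThreadID, ANodeName, 'rejected');
end;
```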
- `EAggregateException` on tool errors – Claude driver (`uMakerAi.Chat.Claude.pas`): the local `_CreateTask` procedure in `TAiClaudeChat.ParseChat` lacked the try/except present in the base class. Any exception raised inside a tool task (MCP timeout, network error, etc.) escaped unhandled, causing `TTask.WaitForAll` to wrap it in an `EAggregateException` and crash the application. Fixed to match base-class behavior: exceptions are caught, reported via `OnError`, and the tool receives an error response so the conversation can continue.
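The general pattern behind this fix, shown as a standalone sketch rather than MakerAI source (`ExecuteTool` and `SetToolError` are hypothetical helpers):

```pascal
// Requires System.Threading and System.SysUtils.
// Catching inside the task body keeps TTask.WaitForAll from wrapping the
// exception in an EAggregateException and crashing the waiting thread.
Tasks[i] := TTask.Run(
  procedure
  begin
    try
      ExecuteTool(ToolCall); // may raise: MCP timeout, network error, ...
    except
      on E: Exception do
        SetToolError(ToolCall, E.Message); // tool receives an error
                                           // response; conversation continues
    end;
  end);
```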
┌───────────────────────────────────────────────────────────────────┐
│                      Your Delphi Application                      │
└─────┬──────────────────┬─────────────────┬───────────────────────┘
      │                  │                 │
┌─────▼────┐  ┌──────────▼──────────┐  ┌───▼───────────────────────┐
│  ChatUI  │  │       Agents        │  │       Design-Time         │
│   FMX    │  │  TAIAgentManager    │  │     Property Editors      │
│  Visual  │  │   TAIBlackboard     │  │ Object Inspector support  │
│  Comps   │  │ Checkpoint/Approve  │  └───────────────────────────┘
└─────┬────┘  └──────────┬──────────┘
      │                  │
┌─────▼──────────────────▼──────────────────────────────────────────┐
│              TAiChatConnection – Universal Connector              │
│         Switch provider at runtime via DriverName property        │
└───────────────────────────────┬───────────────────────────────────┘
                                │
┌───────────────────────────────▼───────────────────────────────────┐
│    Native Provider Drivers (direct API access, full fidelity)     │
│    OpenAI · Claude · Gemini · Grok · Mistral · DeepSeek · Kimi    │
│    Groq · Cohere · Ollama · LM Studio · GenericLLM                │
└───────────────────────────────┬───────────────────────────────────┘
                                │
      ┌─────────────────────────┼─────────────────┐
      │                         │                 │
┌─────▼──────┐  ┌───────────────▼────┐  ┌─────────▼──────────┐
│ ChatTools  │  │ RAG                │  │ MCP                │
│ PDF/Vision │  │ Vector (VQL)       │  │ Server (HTTP/SSE   │
│ Speech/STT │  │ Graph (GQL)        │  │ StdIO/Direct)      │
│ Web Search │  │ PostgreSQL/SQLite  │  │ Client             │
│ Shell      │  │ HNSW · BM25 · RRF  │  │ TAiFunctions bridge│
│ ComputerUse│  │ Rerank · Documents │  └────────────────────┘
└────────────┘  └────────────────────┘
MakerAI gives you two ways to work with each provider, which you can mix freely:
Full, provider-specific access to every API feature. Use when you need complete control:
| Component | Provider | Latest Models |
|-----------|----------|---------------|
| TAiOpenChat | OpenAI | gpt-5.2, o3, o3-mini |
| TAiClaudeChat | Anthropic | claude-opus-4-6, claude-sonnet-4-6 |
| TAiGeminiChat | Google | gemini-3.0, gemini-2.5-flash |
| TAiGrokChat | xAI | grok-4, grok-3 |
| TAiMistralChat | Mistral AI | Magistral, mistral-large |
| TAiDeepSeekChat | DeepSeek | deepseek-reasoner, deepseek-chat |
| TAiKimiChat | Moonshot | kimi-k2.5 |
| TAiGroqChat | Groq | llama-3.3, deepseek-r1 |
| TCohereChat | Cohere | command-r-plus |
| TAiOllamaChat | Ollama | Any local model |
| TAiLMStudioChat | LM Studio | Any local model |
| TAiGenericChat | OpenAI-compatible | Any OpenAI-API endpoint |
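A direct, provider-specific setup might look like the following. This is a hypothetical sketch: the component class name and the `@ENV_VAR` key resolution appear elsewhere in this README, but the placement of the `Model` and `ApiKey` properties on the chat component is an assumption, not the documented API:

```pascal
// Hypothetical sketch of native-component usage (property names assumed).
Chat := TAiClaudeChat.Create(Self);
Chat.Model := 'claude-sonnet-4-6';
Chat.ApiKey := '@ANTHROPIC_API_KEY'; // same env-var resolution as the connector

// A native component exposes every field of the Anthropic API directly,
// so Claude-only parameters are available without a generic abstraction.
```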
Provider-agnostic code. Switch models or providers by changing one property:
AiConn.DriverName := 'OpenAI';
AiConn.Model := 'gpt-5.2';
AiConn.ApiKey := '@OPENAI_API_KEY'; // resolved from environment variable
// Switch to Gemini without changing anything else
AiConn.DriverName := 'Gemini';
AiConn.Model := 'gemini-3.0-flash';
AiConn.ApiKey := '@GEMINI_API_KEY';
| Feature | OpenAI (gpt-5.2) | Claude (4.6) | Gemini (3.0) | Grok (4) | Mistral | DeepSeek | Ollama |
|:--------|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| Text Generation | β | β | β | β | β | β | β |
| Streaming (SSE) | β | β | β | β | β | β | β |
| Function Calling | β | β | β | β | β | β | β |
| JSON Mode / Schema | β | β | β | β | β | β | β |
| Image Input | β | β | β | β | β | β | β |
| PDF / Files | β | β | β | ⚠️ | β | β | ⚠️ |
| Image Generation | β | β | β | β | β | β | β |
| Video Generation | β | β | β | β | β | β | β |
| Extended Thinking | β | β | β | β | β | β | ⚠️ |
| Speech (TTS/STT) | β | β | β | β | β | β | ⚠️ |
| Web Search | β | β | β | β | β | β | β |
| Com