# YoMo

Serverless AI Agent Framework with Geo-distributed Edge AI Infra.

YoMo is an open-source LLM Function Calling Framework for building scalable and ultra-fast AI Agents. We care about: Empowering Exceptional Customer Experiences in the Age of AI.
We believe that seamless and responsive AI interactions are key to delivering outstanding customer experiences. YoMo is built with this principle at its core, focusing on speed, reliability, and scalability.
## Features
| Feature | Description |
| ------- | ----------- |
| Low-Latency MCP | Guaranteed by implementing atop the QUIC protocol. Experience significantly faster communication between AI agents and the MCP server. |
| Enhanced Security | TLS v1.3 encryption is applied to every data packet by design, ensuring robust security for your AI agent communications. |
| Strongly-Typed Language | Build robust AI agents with complete confidence through type-safe function calling, enhanced error detection, and seamless integration capabilities. Type safety prevents runtime errors, simplifies testing, and enables IDE auto-completion. Currently supports TypeScript and Go. |
| Effortless Serverless DevOps | Streamline the entire lifecycle of your LLM tools, from development to deployment. Significantly reduces operational overhead, allowing you to focus exclusively on creating innovative AI agent functionalities. |
| Geo-Distributed Architecture | Bring AI inference and tools closer to your users with our globally distributed architecture, resulting in significantly faster response times and a superior user experience for your AI agents. |
## Getting Started
Let's build a simple AI agent with LLM Function Calling to provide weather information:
### Step 1. Install CLI

```bash
curl -fsSL https://get.yomo.run | sh
```

Verify the installation:

```bash
yomo version
```
### Step 2. Start the server

Create a configuration file `my-agent.yaml`:

```yaml
name: my-agent
host: 0.0.0.0
port: 9000

auth:
  type: token
  token: SECRET_TOKEN

bridge:
  ai:
    server:
      addr: 0.0.0.0:9000 ## OpenAI API compatible endpoint
      provider: vllm     ## llm to use

    providers:
      vllm:
        api_endpoint: http://127.0.0.1:8000/v1
        model: meta-llama/Llama-4-Scout-17B-16E-Instruct

      ollama:
        api_endpoint: http://localhost:11434
```

Launch the server:

```bash
yomo serve -c my-agent.yaml
```
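Once the server is running, the address configured under `bridge.ai.server.addr` exposes an OpenAI-API-compatible endpoint. As a quick sanity check (a sketch only, assuming the bridge serves the standard `/v1/chat/completions` route and accepts the configured token as a Bearer credential; the Authorization header is an assumption and may not be required by your setup), you can send a chat request with `curl`:

```bash
curl http://127.0.0.1:9000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer SECRET_TOKEN" \
  -d '{
    "model": "meta-llama/Llama-4-Scout-17B-16E-Instruct",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}]
  }'
```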
### Step 3. Implement the LLM Function Calling
Create a type-safe function ...
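This step is truncated here. As a rough illustration only, the sketch below shows what a type-safe weather tool might look like in Go, following YoMo's Go serverless-function conventions (`Description`, `InputSchema`, `Handler`, `DataTags`, and the `serverless.Context` helpers `ReadLLMArguments`/`WriteLLMResult`). The `City` parameter, the hard-coded weather result, and the data tag `0x10` are placeholders, not values from the original document.

```go
package main

import (
	"fmt"

	"github.com/yomorun/yomo/serverless"
)

// Parameter describes the arguments the LLM must supply when it calls this tool.
type Parameter struct {
	City string `json:"city" jsonschema:"description=The city to get the weather for"`
}

// Description tells the LLM when this function should be called.
func Description() string {
	return "Get the current weather for a given city."
}

// InputSchema exposes the typed parameters to the LLM as a JSON schema.
func InputSchema() any {
	return &Parameter{}
}

// Handler runs when the LLM decides to invoke this tool.
func Handler(ctx serverless.Context) {
	var p Parameter
	ctx.ReadLLMArguments(&p)

	// A real implementation would query a weather API here; this is a placeholder.
	result := fmt.Sprintf("The weather in %s is sunny, 25°C.", p.City)

	ctx.WriteLLMResult(result)
}

// DataTags declares the tags this function observes (placeholder value).
func DataTags() []uint32 {
	return []uint32{0x10}
}
```

A real function would call an external weather API inside `Handler`; once built, the function is connected to the agent with the YoMo CLI (the exact `yomo run` invocation may vary by CLI version).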