by schmitech
One API for 20+ LLM providers, your databases, and your files — self-hosted, open-source AI gateway with RAG, voice, and guardrails.
```shell
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d
```
Then test it:
```shell
curl -X POST http://localhost:3000/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: local-test' \
  -d '{
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": false
  }'
```
That's it. ORBIT is listening on port 3000 with an admin panel at `localhost:3000/admin` (default login: `admin` / `admin123`).
For GPU acceleration:

```shell
docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d
```
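To watch tokens arrive as they are generated, the same smoke test can be rerun with streaming enabled. A sketch only: it assumes the `/v1/chat` endpoint honors `"stream": true`, which the `stream` flag in the example above suggests.

```shell
# Same smoke test, but streaming ("stream": true is an assumption based on
# the flag exposed in the non-streaming example above).
# -N disables curl's output buffering so chunks print as they arrive.
curl -sN -X POST http://localhost:3000/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: local-test' \
  -d '{
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": true
  }' || echo "ORBIT is not reachable on port 3000"
```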
The public sandbox hosts one chat workspace per adapter. Pick a demo to see ORBIT in action.
| Demo | Data Source | Try it |
| :--- | :--- | :--- |
| Simple Chat | LLM | simple-chat |
| Multimodal Chat | LLM + Files | chat-with-files |
| Multi-Source Chat | SQLite + PostgreSQL + DuckDB | composite-multi-source-explorer |
| Customer Orders | PostgreSQL | intent-sql-postgres |
| HR Database | SQLite | intent-sql-sqlite-hr |
| DuckDB Analytics | DuckDB | intent-duckdb-analytics |
| EV Population Stats | DuckDB | intent-duckdb-ev-population |
| JSONPlaceholder REST API | HTTP (JSON) | intent-http-jsonplaceholder |
| Paris Open Data API | HTTP (JSON) | intent-http-paris-opendata |
| MFlix Sample Collection | MongoDB | intent-mongodb-mflix |
| SpaceX GraphQL | GraphQL | intent-graphql-spacex |
Adapter wiring and sample domains live in `config/adapters/` and `examples/intent-templates/`.
- **LLM Providers:** OpenAI, Anthropic, Google Gemini, Cohere, Groq, DeepSeek, Mistral, AWS Bedrock, Azure, Together, Ollama, vLLM, llama.cpp
- **Data Sources:** PostgreSQL, MySQL, MongoDB, Elasticsearch, DuckDB, SQLite, HTTP/REST APIs, GraphQL
- **Vector Stores:** Chroma, Qdrant, Pinecone, Milvus, Weaviate
| Without ORBIT | With ORBIT |
| :--- | :--- |
| One SDK per provider, rewrites when you switch | One OpenAI-compatible API across all providers |
| Separate pipelines for retrieval and inference | Unified model + retrieval + tooling gateway |
| Fragile glue scripts between data sources and LLMs | Production-ready connectors with policy controls |
| No visibility into what models are doing | Built-in RBAC, rate limiting, and audit logging |
Using ORBIT in production? Let us know and we'll add your project here.
| Client | Description |
| :--- | :--- |
| Web Chat | React UI |
| CLI | `pip install schmitech-orbit-client` |
| Mobile | iOS & Android (Expo) |
| Node SDK | Or use any OpenAI-compatible SDK |
```shell
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d
```
Starts ORBIT + Ollama with SmolLM2, auto-pulls models, and exposes the API on port 3000. The web admin UI is at /admin on the same host. Connect orbitchat from your host:
```shell
ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key"}' npx orbitchat
```
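`ORBIT_ADAPTER_KEYS` is a JSON object mapping adapter names to API keys. A sketch of a multi-adapter mapping, assuming the variable accepts more than one entry (its JSON-object shape suggests so); the adapter names come from the demo table above, and both keys are the `default-key` placeholder:

```shell
# Map two adapters (names from the demo table above) to API keys;
# "default-key" is a placeholder — substitute your own keys in practice.
export ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key","intent-sql-sqlite-hr":"default-key"}'

# Sanity-check that the mapping is valid JSON before launching the UI:
echo "$ORBIT_ADAPTER_KEYS" | python3 -m json.tool

# npx orbitchat   # then launch orbitchat with the mapping in its environment
```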
See the full Docker Guide for GPU mode, volumes, and configuration.
```shell
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 schmitech/orbit:basic
```

If Ollama runs on your host, add `-e OLLAMA_HOST=host.docker.internal:11434` so the container can reach it. This image includes only the `simple-chat` adapter.
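Putting the two together, the full run command with the host-Ollama override looks like this (container name, port, and image tag as above):

```shell
# orbit:basic reaching an Ollama server on the Docker host;
# host.docker.internal resolves to the host from inside the container.
docker run -d --name orbit-basic -p 3000:3000 \
  -e OLLAMA_HOST=host.docker.internal:11434 \
  schmitech/orbit:basic
```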
```shell
curl -L https://github.com/schmitech/orbit/releases/download/v2.6.5/orbit-2.6.5.tar.gz -o orbit-2.6.5.tar.gz
tar -xzf orbit-2.6.5.tar.gz && cd orbit-2.6.5
cp env.example .env && ./install/setup.sh
source venv/bin/activate
./bin/orbit.sh start && cat ./logs/orbit.log
```
Contributions are welcome! Check the issues for good first tasks, or open a new one to discuss your idea.
If you find ORBIT useful, a star helps others discover the project.
Apache 2.0 — see LICENSE.