by bitsky-tech
Bridgic is the next-generation agent development framework for building intelligent systems.
# Add to your Claude Code skills

```shell
git clone https://github.com/bitsky-tech/bridgic
```
By redefining the boundary between workflows and agents, Bridgic introduces a unified orchestration and runtime model that lets developers seamlessly transition between predictable workflows and autonomous, creative agents within one system.
✨ The name "Bridgic" embodies our core philosophy — "Bridging Logic and Magic", where:
- Logic represents deterministic and predictable execution flows, forming the foundation of reliable systems.
- Magic refers to the autonomous parts that can make dynamic decisions and solve problems creatively.
```mermaid
graph LR
    subgraph " "
        A["Deterministic Workflows<br/>(Logic)"]
        B["Autonomous Agents<br/>(Magic)"]
    end
    A --> B
    B --> A
    style A fill:#f9f9f9,stroke:#333,stroke-width:2px
    style B fill:#f9f9f9,stroke:#333,stroke-width:2px
```
Bridgic requires Python 3.9 or newer. Install it with pip:

```shell
pip install bridgic
```

Verify the installation:

```shell
python -c "from bridgic.core import __version__; print(f'Bridgic version: {__version__}')"
```
Alternatively, with uv:

```shell
uv add bridgic
uv run python -c "from bridgic.core import __version__; print(f'Bridgic version: {__version__}')"
```
This section demonstrates Bridgic's core capabilities through practical examples.
You'll learn how to build intelligent systems with Bridgic, from a simple chatbot to an autonomous agentic system. Along the way you will see features such as worker orchestration, dynamic routing, dynamic topology changes, and parameter resolution.
The Part I examples include implementations in both the standard APIs and ASL, showing how ASL simplifies workflow definition with declarative syntax.
Before diving into the examples, set up your LLM instance.
```python
import os

from bridgic.llms.openai import OpenAILlm, OpenAIConfiguration

# Read the endpoint settings from the environment.
_api_key = os.environ.get("OPENAI_API_KEY")
_api_base = os.environ.get("OPENAI_API_BASE")
_model_name = os.environ.get("OPENAI_MODEL_NAME")

llm = OpenAILlm(
    api_key=_api_key,
    api_base=_api_base,
    configuration=OpenAIConfiguration(model=_model_name),
    timeout=120,
)
```
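Note that `os.environ.get` quietly returns `None` for any unset variable, which only fails later when the client is used. A fail-fast check surfaces misconfiguration up front. This is plain Python rather than a Bridgic API; `require_env` and the variable name below are illustrative:

```python
import os

def require_env(*names: str) -> dict:
    """Return the requested environment variables, raising if any are unset or empty."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in names}

# Demo with a placeholder variable set just for this example.
os.environ["EXAMPLE_MODEL_NAME"] = "demo-model"
settings = require_env("EXAMPLE_MODEL_NAME")
print(settings)
```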
Each example in this part provides two implementations: one using the standard APIs and one using ASL.
Core features demonstrated: worker orchestration and sub-automa composition.
```python
from typing import List, Dict, Optional

from bridgic.core.model.types import Message
from bridgic.core.automa import GraphAutoma, worker, RunningOptions


class DivideConquerWorkflow(GraphAutoma):
    """Break down a query into sub-queries and answer each one."""

    @worker(is_start=True)
    async def break_down_query(self, user_input: str) -> List[str]:
        """Break down the query into a list of sub-queries."""
        llm_response = await llm.achat(
            messages=[
                Message.from_text(
                    text="Break down the query into multiple sub-queries and only return the sub-queries",
                    role="system",
                ),
                Message.from_text(text=user_input, role="user"),
            ]
        )
        return [item.strip() for item in llm_response.message.content.split("\n") if item.strip()]

    @worker(dependencies=["break_down_query"], is_output=True)
    async def query_answer(self, queries: List[str]) -> Dict[str, str]:
        """Generate answers for each sub-query."""
        answers = []
        for query in queries:
            response = await llm.achat(
                messages=[
                    Message.from_text(text="Answer the given query briefly", role="system"),
                    Message.from_text(text=query, role="user"),
                ]
            )
            answers.append(response.message.content)
        return {query: answer for query, answer in zip(queries, answers)}
```
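Framework details aside, the data flow of the workflow above — split the question, then answer each part — can be sketched in plain asyncio with a stubbed model. The `stub_llm` helper and its canned replies are hypothetical stand-ins, not part of Bridgic:

```python
import asyncio
from typing import Dict, List

async def stub_llm(system: str, user: str) -> str:
    """Stand-in for llm.achat: returns deterministic canned replies."""
    if "sub-queries" in system.lower():
        # Pretend the model split the question into two lines.
        return "What is a workflow?\nWhat is an agent?"
    return f"Brief answer to: {user}"

async def break_down_query(user_input: str) -> List[str]:
    raw = await stub_llm("Break down the query into multiple sub-queries", user_input)
    return [line.strip() for line in raw.split("\n") if line.strip()]

async def query_answer(queries: List[str]) -> Dict[str, str]:
    answers = [await stub_llm("Answer the given query briefly", q) for q in queries]
    return dict(zip(queries, answers))

async def main() -> Dict[str, str]:
    # query_answer runs only after break_down_query, mirroring the dependency.
    queries = await break_down_query("Explain workflows vs agents")
    return await query_answer(queries)

qa_pairs = asyncio.run(main())
for q, a in qa_pairs.items():
    print(f"{q} -> {a}")
```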
```python
class QuestionSolverBot(GraphAutoma):
    """A bot that solves questions by breaking them down and merging answers."""

    def __init__(self, name: Optional[str] = None, running_options: Optional[RunningOptions] = None):
        super().__init__(name=name, running_options=running_options)
        # Add DivideConquerWorkflow as a sub-automa
        divide_conquer = DivideConquerWorkflow()
        self.add_worker(
            key="divide_conquer_workflow",
            worker=divide_conquer,
            is_start=True,
        )
        # Set dependency: merge_answers depends on divide_conquer_workflow
        self.add_dependency("merge_answers", "divide_conquer_workflow")

    @worker(is_output=True)
    async def merge_answers(self, qa_pairs: Dict[str, str], user_input: str) -> str:
        """Merge individual answers into a unified response."""
        answers_text = "\n".join(f"Q: {q}\nA: {a}" for q, a in qa_pairs.items())
        response = await llm.achat(
            messages=[
                Message.from_text(
                    text="Merge the following question-answer pairs into one unified answer to the original question",
                    role="system",
                ),
                Message.from_text(text=f"Original question: {user_input}\n\n{answers_text}", role="user"),
            ]
        )
        return response.message.content
```
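The composition pattern here — a sub-workflow whose output feeds a merge step — reduces to the following plain-asyncio sketch. The stub functions are hypothetical stand-ins for the Bridgic workers, not framework APIs:

```python
import asyncio
from typing import Dict

async def divide_and_answer(user_input: str) -> Dict[str, str]:
    """Stand-in for the DivideConquerWorkflow sub-automa."""
    sub_queries = [f"{user_input} (part {i})" for i in (1, 2)]
    return {q: f"answer to {q}" for q in sub_queries}

async def merge_answers(qa_pairs: Dict[str, str], user_input: str) -> str:
    """Stand-in for the merge_answers output worker."""
    merged = "; ".join(qa_pairs.values())
    return f"Unified answer to '{user_input}': {merged}"

async def question_solver(user_input: str) -> str:
    # merge_answers runs only after the sub-workflow, mirroring add_dependency().
    qa_pairs = await divide_and_answer(user_input)
    return await merge_answers(qa_pairs, user_input)

result = asyncio.run(question_solver("What is Bridgic?"))
print(result)
```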