Workflow
Many real-world tasks are too complex for a single LLM call. The solution is task decomposition: break the task into a chain of multiple Nodes.

Don't make each task too coarse, or it will still be too complex for a single LLM call. Don't make it too granular either, or each LLM call will lack context and produce results that are inconsistent across nodes.
You usually need a few iterations to find the sweet spot. If the task has too many edge cases, consider using Agents instead.
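The core idea can be sketched without any framework: each step is a small, focused prompt, and the output of one step becomes the input of the next. This is a minimal sketch assuming a hypothetical `call_llm` helper (stubbed here so the snippet runs standalone; swap in your real LLM client).

```python
import asyncio

# Hypothetical stand-in for a real LLM client; replace with your own call.
async def call_llm(prompt: str) -> str:
    return f"[LLM output for: {prompt[:40]}]"

# Each step is small enough for one call but carries context forward.
async def outline(topic: str) -> str:
    return await call_llm(f"Create a detailed outline for an article about {topic}")

async def draft(outline_text: str) -> str:
    return await call_llm(f"Write content based on this outline: {outline_text}")

async def refine(draft_text: str) -> str:
    return await call_llm(f"Review and improve this draft: {draft_text}")

async def run_chain(topic: str) -> str:
    # The chain: topic -> outline -> draft -> refined article.
    return await refine(await draft(await outline(topic)))

if __name__ == "__main__":
    print(asyncio.run(run_chain("AI Safety")))
```

A framework like the one below adds shared memory and explicit node wiring on top of this same chaining pattern.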
Example: Article Writing
```python
import asyncio
from brainyflow import Node, Flow, Memory

# Assume call_llm is defined elsewhere
# async def call_llm(prompt: str) -> str: ...

class GenerateOutline(Node):
    async def prep(self, memory): return memory.topic
    async def exec(self, topic): return await call_llm(f"Create a detailed outline for an article about {topic}")
    async def post(self, memory, prep_res, exec_res):
        memory.outline = exec_res
        self.trigger('default')

class WriteSection(Node):
    async def prep(self, memory): return memory.outline
    async def exec(self, outline): return await call_llm(f"Write content based on this outline: {outline}")
    async def post(self, memory, prep_res, exec_res):
        memory.draft = exec_res
        self.trigger('default')

class ReviewAndRefine(Node):
    async def prep(self, memory): return memory.draft
    async def exec(self, draft): return await call_llm(f"Review and improve this draft: {draft}")
    async def post(self, memory, prep_res, exec_res):
        memory.final_article = exec_res
        # No trigger needed if this is the end of the flow

# Connect nodes
outline = GenerateOutline()
write = WriteSection()
review = ReviewAndRefine()
outline >> write >> review

# Create and run flow
writing_flow = Flow(start=outline)

async def main():
    memory = {"topic": "AI Safety"}
    await writing_flow.run(memory)  # Pass memory object
    print("Final Article:", memory.get("final_article", "Not generated"))  # Access memory object

if __name__ == "__main__":
    asyncio.run(main())
```
For dynamic cases, consider using Agents.