# Memory

In BrainyFlow, the `Memory` object is the central mechanism for state management and communication between nodes in a flow. It's designed to be flexible yet robust, providing both shared global state and isolated local state for different execution paths.
## Creating Memory

The proxied memory instance is created automatically when you pass the initial memory object to a `Flow`. Alternatively, you can create it explicitly using the `createMemory` function (in TypeScript) or the standard `Memory()` constructor (in Python):

```python
from brainyflow import Memory

global_store = {"initial_config": "abc"}
local_store_for_start_node = {"start_node_specific": 123}  # Optional

memory_instance = Memory(global_store, local_store_for_start_node)
# or just: memory_instance = Memory(global_store)
```
## Memory Scopes: Global vs. Local

BrainyFlow's `Memory` object manages two distinct scopes:

- **Global Store (`memory`)**: A single object shared across all nodes within a single `flow.run()` execution. Changes made here persist throughout the flow. Think of it as the main shared state.
- **Local Store (`memory.local`)**: An object specific to a particular execution path within the flow. It's created when a node `trigger`s a successor. Changes here are isolated to that specific branch and its descendants. Accessing `memory.local` directly (e.g., `memory.local.someKey`) lets you read from or write to only the local store of the current memory instance.

This dual-scope system allows for both shared application state (global) and controlled, path-specific data propagation (local).
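To make the two-scope lookup concrete, here is a minimal, self-contained sketch in plain Python. It is not the actual BrainyFlow implementation - the class name `MemorySketch` and its `read` method are hypothetical - but it models the described behavior: a read resolves against the local store first and falls back to the global store.

```python
from typing import Any, Optional

class MemorySketch:
    """Toy stand-in for BrainyFlow's Memory proxy (illustration only)."""

    def __init__(self, global_store: dict, local_store: Optional[dict] = None):
        self._global = global_store     # shared across the whole flow
        self.local = local_store or {}  # isolated to one execution path

    def read(self, key: str) -> Any:
        # Local store shadows the global store, like lexical scoping.
        if key in self.local:
            return self.local[key]
        return self._global[key]

memory = MemorySketch({"config": "prod", "topic": "global"}, {"topic": "local"})
print(memory.read("config"))  # "prod"  (found only in the global store)
print(memory.read("topic"))   # "local" (local value shadows the global one)
```

In real BrainyFlow code you would write `memory.config` rather than `memory.read("config")`; the proxy performs this same lookup behind the attribute access.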
### Real-World Analogies

Think of the memory system like a river delta:

- **Global Store**: The main river, carrying essential data that all branches need.
- **Local Store**: The smaller streams and tributaries that branch off. They carry data relevant only to their own path, but can still draw on the main river's water (the global store).

Or, consider programming scopes:

- **Global Store**: Like variables declared in the outermost scope of a program, accessible everywhere.
- **Local Store**: Like variables declared inside a function or block. They are only accessible within that block and any nested blocks (downstream nodes in the flow). If a local variable has the same name as a global one, the local variable "shadows" the global one within its scope.

This model gives you the flexibility to share data across your entire flow (global) or isolate context to specific execution paths (local).
## Accessing Memory (Reading)

Nodes access data in either scope through the `memory` proxy instance passed to their `prep` and `post` methods. When you read a property (e.g., `memory.someValue`), the proxy automatically performs a lookup:

1. It checks the local store (`memory.local`) first.
2. If the property is not found locally, it checks the global store (`memory`).
```typescript
import { Memory, Node } from 'brainyflow'

interface MyGlobal {
  config?: object
  commonData?: string
  pathSpecificData?: string // Can be global or shadowed by local
}

interface MyLocal {
  pathSpecificData?: string // Can shadow global properties
}

class MyNode extends Node<MyGlobal, MyLocal> {
  async prep(memory: Memory<MyGlobal, MyLocal>): Promise<void> {
    // Reads from the global store (assuming not set locally)
    const config = memory.config
    const common = memory.commonData

    // Reads 'pathSpecificData' from the local store if it exists there,
    // otherwise falls back to the global store.
    const specific = memory.pathSpecificData

    // To read ONLY from the local store:
    const onlyLocal = memory.local.pathSpecificData
  }
  // ... exec, post ...
}
```
As a rule of thumb, when reading from memory you should always prefer `memory.someValue` and let the `Memory` proxy figure out where to fetch the value from. Although you can access the entire local store via `memory.local` - or a single value via `memory.local.someValue` - that approach adds little value and is a pattern that can safely be avoided, unless you want to be very explicit about your design choice.

As you will see in the next section, it's when writing that you need to be more careful about where you place your data.
```typescript
async post(memory: Memory<MyGlobal, MyLocal> /*, ... */): Promise<void> {
  const allLocalData = memory.local // Access the internal local store object directly
  console.log('Current local store:', allLocalData)
}
```
Writing to Memory
Writing to Global Store: Assigning a value directly to a property on the
memory
object (e.g.,memory.someValue = 'new data'
) writes to the global store. The proxy automatically removes the property from the local store first if it exists there.Writing to Local Store: You can write directly to the local store of the current memory instance using
memory.local.someValue = 'new data'
. This affects only the current node's local context and any downstream nodes that inherit this specific memory clone. However, the most convenient way to populate the local store for newly created branches (i.e., for successor nodes) is by providing theforkingData
argument inthis.trigger(action[, forkingData])
.
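Modeling the two stores as plain dicts, the write semantics above can be sketched as follows. This is an illustration only, not the real API - the helper names `write_global` and `write_local` are hypothetical stand-ins for `memory.key = value` and `memory.local.key = value`:

```python
global_store = {"status": "old"}
local_store = {"status": "shadowed"}

def write_global(key, value):
    # Models `memory.key = value`:
    # any local shadow is removed first, then the global store is updated.
    local_store.pop(key, None)
    global_store[key] = value

def write_local(key, value):
    # Models `memory.local.key = value`: only the local store is touched.
    local_store[key] = value

write_global("status", "new")
print(global_store["status"], "status" in local_store)  # new False
write_local("note", "branch-only")
print(local_store)  # {'note': 'branch-only'}
```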
## Deleting from Memory

- **Deleting from global and local (via the main proxy)**: `del memory.someKey` (Python) or `delete memory.someKey` (TypeScript) attempts to delete the key from the global store and also from the current local store.
- **Deleting from local only (via `memory.local`)**: `del memory.local.someKey` (Python) or `delete memory.local.someKey` (TypeScript) deletes the key only from the current local store.
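Again using plain dicts as a sketch of the two stores (hypothetical helper names, not the real API), the two delete paths behave like this:

```python
global_store = {"temp": 1, "keep": 2}
local_store = {"temp": 10, "branch": "a"}

def delete_via_proxy(key):
    # Models `del memory.key`: removed from BOTH stores if present.
    global_store.pop(key, None)
    local_store.pop(key, None)

def delete_local_only(key):
    # Models `del memory.local.key`: removed only from the local store.
    local_store.pop(key, None)

delete_via_proxy("temp")
print("temp" in global_store, "temp" in local_store)  # False False
delete_local_only("branch")
print(global_store)  # {'keep': 2}  (untouched by the local-only delete)
```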
## Checking for Existence (`in` operator)

- `'key' in memory`: Checks whether `'key'` exists in either the local store or the global store.
- `'key' in memory.local`: Checks whether `'key'` exists only in the local store.
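A quick dict-based sketch of these two checks (the helper names `contains` and `contains_local` are hypothetical; in real code you write `'key' in memory` and `'key' in memory.local`):

```python
global_store = {"config": {}}
local_store = {"file": "a.txt"}

def contains(key):
    # Models `'key' in memory`: true if found in either store.
    return key in local_store or key in global_store

def contains_local(key):
    # Models `'key' in memory.local`: true only if found locally.
    return key in local_store

print(contains("config"))        # True  (global store only)
print(contains("file"))          # True  (local store)
print(contains_local("config"))  # False (not in the local store)
```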
Note that you can add types to the memory stores in Python too, just like in TypeScript. This is optional, but helps keep your code organized:
```python
from typing import List, TypedDict
from brainyflow import Memory, Node

class GlobalStore(TypedDict, total=False):
    fileList: List[str]
    config: dict
    results: dict

class DataWriterLocalStore(TypedDict, total=False):
    processedCount: int
    file: str
    branch_id: str

# Assume exec returns a dict like {"files": [...], "count": ...}
class DataWriterNode(Node[GlobalStore, DataWriterLocalStore]):
    async def post(self, memory, prep_res, exec_res) -> None:
        # --- Writing to the Global Store ---
        # Accessible to all nodes in the flow and outside of it
        memory.fileList = exec_res["files"]
        print(f"Memory updated globally: fileList={memory.fileList}")

        # --- Writing to the Local Store ---
        # Accessible to this node and all its descendants
        memory.local.processedCount = exec_res["count"]
        print(f"Memory updated locally: processedCount={memory.processedCount}")

        # --- Triggering with Local Data (Forking Data) ---
        # 'file' will be added to the local store of the memory clone
        # passed to the node(s) triggered by the 'process_file' action.
        for file_item in exec_res["files"]:
            self.trigger('process_file', {"file": file_item})

# Example processor node (triggered by 'process_file')
class FileProcessorNode(Node):
    async def prep(self, memory):
        # Reads 'file' from the local store first, then the global store
        file_to_process = memory.file
        print(f"Processing file (fetched from local memory): {file_to_process}")
        return file_to_process
    # ... exec, post ...
```
## Best Practices

- **Read in `prep()`**: Gather necessary input data from `memory` at the beginning of a node's execution.
- **Write global state in `post()`**: Update the shared global store by assigning to `memory` properties (e.g., `memory.results = ...`) in the `post()` phase, after processing is complete.
- **Set local state via `forkingData`**: Pass branch-specific context to successors by providing the `forkingData` argument to `this.trigger()` within the parent's `post()` method. This populates the `local` store for the next node(s).
- **Read transparently**: Always read data via the `memory` proxy (e.g., `memory.someValue`). It handles the local-then-global lookup automatically. Avoid reading directly from `memory.local` or other internal properties unless strictly needed.
## When to Use The Memory

- **Ideal for**: Sharing data results, large content, or information needed by multiple components
- **Benefits**: Separates data from computation logic (separation of concerns)
- **Global Memory**: Use for application-wide state, configuration, and final results
- **Local Memory**: Use for passing contextual data down a specific execution path
## Technical Concepts

The memory system in BrainyFlow implements several established computer science patterns:

- **Lexical Scoping**: Local memory "shadows" global memory, similar to how local variables in functions can shadow global variables
- **Context Propagation**: Local memory propagates down the execution tree, similar to how context flows in React or middleware systems
- **Transparent Resolution**: The system automatically resolves properties from the appropriate memory scope
## Remember

- **Reading**: Always read via the `memory` proxy (e.g., `memory.value`). It checks local, then global.
- **Writing to global**: Direct assignment (`memory.property = value`) writes to the global store (and removes `property` from the local store if it was there).
- **Writing to local (current node and successors)**: Assignment via `memory.local.property = value` writes only to the current memory instance's local store, which is inherited by its descendants.
- **Creating local state for successors**: Use `trigger(action, forkingData)` in `post()` to populate the `local` store for the next node(s) in a specific branch.
- **Lifecycle**: Read from `memory` in `prep`, compute in `exec` (no memory access), then write global state to `memory` and trigger successors (potentially with `forkingData` for local state) in `post`.
- **Cloning**: When a flow proceeds to a new node, or when `memory.clone()` is called, the global store is shared by reference while the local store is deeply cloned. Any `forkingData` provided to `clone` is also deeply cloned and merged into the new local store.
## Advanced: `memory.clone()`

The `memory.clone(forkingData?)` method is primarily used internally by the `Flow` execution logic when transitioning between nodes. However, you can also call it manually if you need a new `Memory` instance that shares the same global store but has an independent, optionally modified, local store.

This cloning mechanism is fundamental to how BrainyFlow isolates state between different branches of execution within a flow.
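The cloning semantics can be sketched in plain Python (again, `MemorySketch` is a hypothetical illustration, not the BrainyFlow implementation): the global store is shared by reference, the local store is deep-copied, and `forkingData` is deep-copied and merged into the new local store.

```python
import copy

class MemorySketch:
    """Toy model of Memory.clone() semantics (illustration only)."""

    def __init__(self, global_store, local_store=None):
        self._global = global_store
        self.local = local_store or {}

    def clone(self, forking_data=None):
        # Global store: shared by reference across all clones.
        # Local store: deep-copied so branches stay isolated.
        new_local = copy.deepcopy(self.local)
        # forkingData is also deep-copied and merged into the new local store.
        new_local.update(copy.deepcopy(forking_data or {}))
        return MemorySketch(self._global, new_local)

parent = MemorySketch({"results": []}, {"depth": 0})
child = parent.clone({"file": "a.txt"})

child._global["results"].append("done")  # visible to parent (shared global)
child.local["depth"] = 1                 # invisible to parent (isolated local)

print(parent._global["results"])  # ['done']
print(parent.local["depth"])      # 0
print(child.local)                # {'depth': 1, 'file': 'a.txt'}
```

This is exactly why a branch can publish results globally while keeping its path-specific context (like the file it is processing) to itself.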