The Callback system in GraphAI provides a powerful mechanism for streaming data between nodes, particularly useful for handling streaming LLM outputs or other incremental data processing.
The Graph automatically creates a callback when needed, but you can also create and customize one:
```python
from graphai import Callback

# Create a callback with default settings
callback = Callback()

# Create a callback with custom settings
callback = Callback(
    identifier="custom_id",  # Used for special tokens
    special_token_format="<{identifier}:{token}:{params}>",  # Format for special tokens
    token_format="{token}",  # Format for regular tokens
)
```
To use callbacks in a node, mark it with `stream=True`:
```python
from graphai import node

@node(stream=True)
async def streaming_node(input: dict, callback):
    """This node receives a callback parameter because stream=True."""
    # Process input
    for chunk in process_chunks(input["data"]):
        # Stream output chunks
        await callback.acall(chunk)
    # Return final result
    return {"result": "streaming complete"}
```
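To illustrate the pattern end to end, here is a runnable sketch that drives a streaming node with a stand-in callback. The `FakeCallback` class and the use of `str.split` in place of `process_chunks` are assumptions for illustration; the `@node` decorator is omitted so the sketch runs without graphai installed.

```python
import asyncio

# Stand-in for graphai's Callback, used only to illustrate the
# streaming pattern (not the real implementation): it records each
# token passed to acall() instead of forwarding it to a stream.
class FakeCallback:
    def __init__(self):
        self.chunks = []

    async def acall(self, token):
        self.chunks.append(token)

# Same shape as the streaming node above, minus the @node decorator.
# Splitting on whitespace stands in for process_chunks().
async def streaming_node(input: dict, callback):
    for chunk in input["data"].split():
        await callback.acall(chunk)
    return {"result": "streaming complete"}

callback = FakeCallback()
result = asyncio.run(streaming_node({"data": "hello streaming world"}, callback))
print(callback.chunks)  # ['hello', 'streaming', 'world']
print(result)           # {'result': 'streaming complete'}
```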
Important points:

- The `stream=True` parameter tells GraphAI to inject a `callback` argument into the node.
The callback also exposes lifecycle methods for marking node boundaries and closing the stream:

```python
# Mark the start of a node
await callback.start_node(node_name="my_node")

# Mark the end of a node
await callback.end_node(node_name="my_node")

# Close the callback stream
await callback.close()
```
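The typical ordering of these lifecycle calls can be sketched with a stand-in callback that simply records each event (the `LifecycleRecorder` class below is an assumption for illustration, not the real graphai Callback):

```python
import asyncio

# Records lifecycle events in order, to show the typical call
# sequence around a streaming node.
class LifecycleRecorder:
    def __init__(self):
        self.events = []

    async def start_node(self, node_name):
        self.events.append(f"start:{node_name}")

    async def acall(self, token):
        self.events.append(f"token:{token}")

    async def end_node(self, node_name):
        self.events.append(f"end:{node_name}")

    async def close(self):
        self.events.append("close")

async def run():
    cb = LifecycleRecorder()
    await cb.start_node(node_name="my_node")  # node begins
    await cb.acall("chunk")                   # streamed output
    await cb.end_node(node_name="my_node")    # node finishes
    await cb.close()                          # stream is closed
    return cb.events

events = asyncio.run(run())
print(events)  # ['start:my_node', 'token:chunk', 'end:my_node', 'close']
```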
GraphAI uses special tokens to mark events in the stream:
```
<graphai:node_name:start>  # Marks the start of a node
<graphai:node_name>        # Identifies the node
<graphai:node_name:end>    # Marks the end of a node
<graphai:END>              # Marks the end of the stream
```
These tokens can be customized using the `special_token_format` parameter:
```python
from graphai import Callback

callback = Callback(
    # Custom identifier (default is "graphai")
    identifier="myapp",
    # Custom format for special tokens
    special_token_format="<[{identifier}|{token}|{params}]>",
    # Custom format for regular tokens
    token_format="TOKEN: {token}",
)
```
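With those custom formats, the rendered tokens can be previewed with plain `str.format`. How the real Callback maps node names and event names onto the `{token}` and `{params}` fields is an assumption here; this only shows the string mechanics.

```python
# The custom format strings from the snippet above.
special_token_format = "<[{identifier}|{token}|{params}]>"
token_format = "TOKEN: {token}"

# Render one special token and one regular token.
special = special_token_format.format(
    identifier="myapp", token="my_node", params="start"
)
regular = token_format.format(token="Hello")

print(special)  # <[myapp|my_node|start]>
print(regular)  # TOKEN: Hello
```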