Callback Basics
At its core, the Callback is an asyncio-based system that:
- Provides a queue for passing streaming data between components
- Handles special tokens to mark node start/end events
- Structures streaming content for easy consumption by downstream processes
- Can be integrated with any async compatible streaming system
Creating a Callback
The Graph automatically creates a callback when needed, but you can also create and customize one:
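A minimal sketch of both approaches (the `graph.get_callback()` helper and the `Callback` import path are assumptions and may differ across graphai versions):

```python
from graphai import Graph
from graphai.callback import Callback  # import path may vary by version

graph = Graph()

# let the graph create and hand you its callback
callback = graph.get_callback()

# or construct one yourself to customize it before use
custom_callback = Callback()  # constructor arguments are optional by assumption
```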
Callback In Nodes
To use callbacks in a node, mark it with `stream=True`:
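A sketch of a streaming node (the exact node signature and the `acall` method name are assumptions):

```python
from graphai import node

@node(stream=True)
async def generate_report(input: dict, callback):
    # `callback` is injected by GraphAI because of stream=True
    for chunk in ["Report ", "section ", "one."]:
        await callback.acall(chunk)  # stream each chunk to consumers
    return {"output": "Report section one."}
```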
- The `stream=True` parameter tells GraphAI to inject a callback
- The node must have a `callback` parameter
- The callback can be used to stream output chunks
Streaming from LLMs
A common use case is streaming output from an LLM:
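A sketch using the OpenAI async client (the node signature and `callback.acall` are assumptions; any async streaming LLM client follows the same pattern):

```python
from graphai import node
from openai import AsyncOpenAI

client = AsyncOpenAI()

@node(stream=True)
async def llm_node(input: dict, callback):
    response = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": input["query"]}],
        stream=True,
    )
    chunks = []
    async for chunk in response:
        token = chunk.choices[0].delta.content or ""
        if token:
            chunks.append(token)
            await callback.acall(token)  # forward each token as it arrives
    return {"output": "".join(chunks)}
```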
Callback Methods
The Callback provides several key methods:
Streaming Content
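For pushing content onto the stream, the core call looks roughly like this (the `acall` name is an assumption, consistent with the streaming examples above):

```python
async def stream_greeting(callback):
    # acall pushes a chunk of content onto the callback's queue
    await callback.acall("Hello, ")
    await callback.acall("world!")
```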
Node Management
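For marking which node a stretch of the stream belongs to, the callback exposes node-management calls along these lines (method names and arguments are assumptions; they emit the special tokens described below):

```python
async def stream_with_node_markers(callback):
    # start_node/end_node emit special tokens marking node boundaries
    await callback.start_node(node_name="summarizer")
    await callback.acall("summary text produced by the summarizer node")
    await callback.end_node(node_name="summarizer")
```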
Consuming a Callback Stream
You can consume a callback’s stream using its async iterator:
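A sketch of a consumer (it assumes `graph.get_callback()` and an async `graph.execute` that accepts an `input` dict):

```python
import asyncio

async def stream_answer(graph, query: str):
    callback = graph.get_callback()
    # run the graph concurrently so tokens arrive while we iterate
    task = asyncio.create_task(graph.execute(input={"input": query}))
    async for token in callback:
        print(token, end="", flush=True)
    await task  # surface any errors from graph execution
```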
Special Tokens
GraphAI uses special tokens to mark events in the stream, such as node start and end. Their format can be customized with the `special_token_format` parameter.
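A sketch of overriding the format (the default template and the available placeholder names vary by graphai version, so treat the values below as illustrative):

```python
from graphai.callback import Callback  # import path may vary by version

# By default, special tokens are short marker strings wrapped in a template,
# e.g. something like "<node_name:start>" and "<node_name:end>".
callback = Callback(special_token_format="<{token}>")  # template value assumed
```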
Example: Web Server with Streaming
Here’s how to use callbacks with a FastAPI server:
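A sketch of a streaming endpoint (it assumes a graph built elsewhere in a hypothetical `my_graph` module, plus `graph.get_callback()` and an async `graph.execute`):

```python
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

from my_graph import graph  # hypothetical module holding a compiled graphai Graph

app = FastAPI()

@app.post("/chat")
async def chat(query: str):
    callback = graph.get_callback()

    async def token_stream():
        # run the graph in the background and yield tokens as they arrive
        task = asyncio.create_task(graph.execute(input={"input": query}))
        async for token in callback:
            yield token
        await task  # surface any errors from graph execution

    return StreamingResponse(token_stream(), media_type="text/plain")
```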
Callback Configuration
You can customize the callback’s behavior:
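A sketch of constructing a customized callback (the parameter names shown are assumptions; check the `Callback` constructor in your installed graphai version):

```python
from graphai.callback import Callback  # import path may vary by version

callback = Callback(
    identifier="support-bot",          # assumed: label identifying this stream
    special_token_format="<{token}>",  # assumed: template for special tokens
)
```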
Advanced: Custom Processing of Special Tokens
You can implement custom processing of special tokens:
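A sketch of a consumer that separates special tokens from regular content (it assumes the default angle-bracket token format; adjust the check if you change `special_token_format`):

```python
async def process_stream(callback):
    async for token in callback:
        if token.startswith("<") and token.endswith(">"):
            # special token: treat it as a node event rather than content
            print(f"\n[event] {token}")
        else:
            print(token, end="", flush=True)
```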
Best Practices
- Use async whenever possible: The callback system is built on asyncio
- Close the callback when done: Always call `await callback.close()` when finished
- Keep streaming chunks small: Don’t stream large objects; break them into manageable chunks
- Handle special tokens correctly: When consuming streams, handle special tokens appropriately
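A sketch that follows these practices on the consumer side (whether `close()` must be called by the consumer or is already handled by the graph may depend on your graphai version):

```python
import asyncio

async def run(graph, query: str):
    callback = graph.get_callback()
    task = asyncio.create_task(graph.execute(input={"input": query}))
    try:
        async for token in callback:
            print(token, end="", flush=True)
        await task
    finally:
        await callback.close()  # release the queue even if consumption fails
```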