Get Started
Quickstart
This guide will help you build a simple LLM-powered agent using GraphAI and the OpenAI API. By the end, you’ll have a functional agent that can:
- Determine whether to search for information or use memory
- Execute the appropriate action
- Generate a response to the user’s query
Prerequisites
- Python 3.9+
- An OpenAI API key
- Basic understanding of async Python
Installation
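The original install command isn't shown here; assuming the library is published on PyPI as `graphai-lib` (check the project's README for the canonical package name):

```shell
pip install graphai-lib openai
```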
Building a Simple Agent
Let’s build a simple agent that can route user questions to either search or memory retrieval.
Step 1: Set Up Your Dependencies
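The original snippet isn't reproduced here. At minimum the agent needs the pieces below; the third-party module paths are assumptions and are shown as comments, while the environment-variable convention for the API key is standard for the OpenAI SDK:

```python
import os
import asyncio

# Third-party imports used in the following steps (module names are
# assumptions -- check each package's docs for the exact paths):
#   from openai import AsyncOpenAI
#   from graphai import Graph, node, router
#   from pydantic import BaseModel, Field

# Read the API key from the environment rather than hard-coding it.
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY", "")
```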
Step 2: Define Your Tool Schemas
We’ll create Pydantic models for our tools:
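The original models aren't reproduced here; a minimal sketch, assuming pydantic v2 and hypothetical tool names `SearchTool` and `MemoryTool`:

```python
from pydantic import BaseModel, Field


class SearchTool(BaseModel):
    """Search the web for up-to-date information."""

    query: str = Field(description="A natural-language search query")


class MemoryTool(BaseModel):
    """Recall information from the agent's memory store."""

    query: str = Field(description="What to look up in memory")
```

The docstrings and field descriptions matter: they are what the LLM sees when deciding which tool to call. With the OpenAI Python SDK (v1+), models like these can be converted into tool definitions with the SDK's `pydantic_function_tool` helper.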
Step 3: Define Your Nodes
GraphAI uses the concept of nodes to process information. Let’s define our nodes:
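The original node definitions aren't reproduced here. As a plain-Python sketch of the five nodes (in GraphAI they would be registered with the library's node decorators; the external calls are stubbed so the flow runs without API keys, and the `"next"` key is this sketch's own convention for naming the successor node):

```python
import asyncio


async def node_start(state: dict) -> dict:
    # Receives the initial input and forwards it to the router.
    return {"next": "node_router"}


async def node_router(state: dict) -> dict:
    # The real agent asks an LLM to pick a tool; a keyword heuristic
    # stands in for that call here.
    choice = "search" if "latest" in state["input"].lower() else "memory"
    return {"next": choice}


async def search(state: dict) -> dict:
    # Would call a web-search API in a real agent.
    return {"context": f"search results for {state['input']!r}", "next": "llm_node"}


async def memory(state: dict) -> dict:
    # Would query a vector store in a real agent.
    return {"context": f"recalled notes about {state['input']!r}", "next": "llm_node"}


async def llm_node(state: dict) -> dict:
    # Would call the OpenAI chat API with the query and retrieved context.
    return {"output": f"Answer built from: {state['context']}", "next": "node_end"}


async def node_end(state: dict) -> dict:
    # Returns the final output; no successor.
    return {}
```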
Step 4: Set Up the Graph
The Graph connects all the nodes and defines the flow of information:
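GraphAI's real `Graph` class handles this wiring; since its exact API isn't shown here, the stand-in below sketches the idea only: nodes registered by name, with each node's return value naming its successor.

```python
import asyncio


class SketchGraph:
    """Minimal stand-in for GraphAI's Graph -- not the library's real API,
    just enough structure to show how nodes connect into a flow."""

    def __init__(self) -> None:
        self.nodes: dict = {}

    def add_node(self, name, fn) -> None:
        self.nodes[name] = fn

    async def execute(self, state: dict, start: str = "node_start") -> dict:
        current = start
        while current:
            update = await self.nodes[current](state) or {}
            state.update(update)
            current = state.pop("next", None)  # each node names its successor
        return state


# Register two trivial nodes to show the wiring pattern.
graph = SketchGraph()


async def node_start(state):
    return {"next": "node_end"}


async def node_end(state):
    return {"output": state.get("input", "")}


graph.add_node("node_start", node_start)
graph.add_node("node_end", node_end)
```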
Step 5: Execute the Graph
Now we can run our agent with a user query:
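The original execution snippet isn't reproduced here. As an end-to-end sketch under the same assumptions as the earlier steps (stubbed tools, a heuristic in place of the LLM router call, and a minimal dispatch loop standing in for GraphAI's graph execution):

```python
import asyncio

# Each node returns a state update plus the name of the next node.


async def node_start(state):
    return {"next": "node_router"}


async def node_router(state):
    # An LLM call decides this in the real agent; stubbed as a heuristic.
    return {"next": "search" if "latest" in state["input"].lower() else "memory"}


async def search(state):
    return {"context": "stubbed search results", "next": "llm_node"}


async def memory(state):
    return {"context": "stubbed memory", "next": "llm_node"}


async def llm_node(state):
    # Would call the OpenAI chat API here.
    return {"output": f"Answer using {state['context']}", "next": "node_end"}


async def node_end(state):
    return {}


NODES = {f.__name__: f for f in (node_start, node_router, search, memory,
                                 llm_node, node_end)}


async def run_agent(query: str) -> dict:
    state, current = {"input": query}, "node_start"
    while current:
        state.update(await NODES[current](state) or {})
        current = state.pop("next", None)
    return state
```

Running `asyncio.run(run_agent("what is the latest GraphAI release?"))` walks the full start → router → search → LLM → end path and returns the final state dict.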
How It Works
- The `node_start` node receives the initial input and passes it to the router.
- The `node_router` node uses an LLM to decide whether to use search or memory based on the query.
- The chosen node (either `search` or `memory`) retrieves information.
- The `llm_node` generates a response using the retrieved information.
- The `node_end` node returns the final output.
This simple example demonstrates GraphAI’s flexibility. By changing the node implementations, you can easily modify the agent’s behavior without changing its overall structure.
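For instance, since a node is just an async callable, changing the agent's behavior is a matter of registering a different implementation under the same name (a pure-Python sketch; the registry dict here is a stand-in for the graph's node registration):

```python
import asyncio


async def search_stub(state):
    return {"context": "stubbed search results"}


async def search_verbose(state):
    # Drop-in replacement: same signature, different behavior.
    return {"context": f"detailed results for {state['input']!r}"}


nodes = {"search": search_stub}
nodes["search"] = search_verbose  # swap the implementation; wiring unchanged

result = asyncio.run(nodes["search"]({"input": "GraphAI"}))
```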