In this guide, you will learn how to build fullstack AI agents using Pydantic AI and the AG-UI protocol. We will also cover how to integrate the AG-UI + Pydantic AI agents with CopilotKit so you can chat with the agent and stream its responses in the frontend.
Before we jump in, here is what we will cover:
Here’s a preview of what we will be building:
The Agent User Interaction Protocol (AG-UI) is an open-source, lightweight, event-based protocol developed by CopilotKit. It enables smooth, real-time interaction between your application's frontend and AI agents.

AG-UI makes it easy to handle event-driven communication, state synchronization, tool usage, and streaming responses from the AI so they appear incrementally, like a live conversation.

To pass information back and forth between the frontend and the AI agent, AG-UI uses different types of events. Here is a quick rundown:
- Lifecycle events: `RUN_STARTED` (an agent run has begun) and `RUN_FINISHED` (the run is complete).
- Text message events: `TEXT_MESSAGE_START` (a new message is starting), `TEXT_MESSAGE_CONTENT` (a chunk of text to append), and `TEXT_MESSAGE_END` (the message is complete).
- State management events: `STATE_SNAPSHOT` (a full picture of the current state) and `STATE_DELTA` (only the changes since the last update).
- Tool call events: `TOOL_CALL_START` (a tool call is starting), `TOOL_CALL_ARGS` (the arguments it needs), and `TOOL_CALL_END` (the tool call is done).

If you want to dive deeper into how AG-UI works and its setup, check out the docs here: AG-UI docs.
Now that we have learned what the AG-UI protocol is, let us see how to integrate it with the Pydantic AI agent framework.
Let’s get started!
To fully understand this tutorial, you need to have a basic understanding of React or Next.js.
We'll also make use of the following:
To get started, clone the Open AG UI Demo repository that consists of a Python-based backend (agent) and a Next.js frontend (frontend).
Next, navigate to the backend directory:
cd agent
Then install the dependencies using Poetry:
poetry install
After that, create a .env
file with your OpenAI API key:
OPENAI_API_KEY="your-OpenAI-key-here"
Then run the agent using the command below:
poetry run python main.py
Let us now see how to integrate the AG-UI protocol with the Pydantic AI agent framework.
Using Pydantic's BaseModel
, define an AgentState
class that holds all data the agent needs. In our stock example, we track things like available_cash
, investment_portfolio
, and tool_logs
, as shown in the agent/stock.py
file.
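A minimal sketch of such a state model is shown below. The field names (`available_cash`, `investment_portfolio`, `tool_logs`) come from the article, but the exact types and defaults in `agent/stock.py` may differ:

```python
from typing import Any

from pydantic import BaseModel, Field


class AgentState(BaseModel):
    """Shared state exchanged between the agent and the frontend.

    Field names follow the article; the exact shapes in agent/stock.py
    may differ.
    """

    # Funds not yet invested.
    available_cash: float = 100000.0
    # Holdings, e.g. {"ticker": "AAPL", "amount": 10}.
    investment_portfolio: list[dict[str, Any]] = Field(default_factory=list)
    # Progress messages rendered in the chat UI while tools run.
    tool_logs: list[dict[str, Any]] = Field(default_factory=list)
```

Because this is a plain Pydantic model, it serializes cleanly to JSON, which is what AG-UI's state events carry over the wire.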
Once you have defined the agent state, initialize your Pydantic AI agent and pass the AgentState
to its shared state container, as shown in the agent/stock.py
file.
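Under recent Pydantic AI versions, the shared state container is `StateDeps` from `pydantic_ai.ag_ui`; the sketch below assumes that API and an OpenAI model name, so check it against your installed version and `agent/stock.py`:

```python
from pydantic import BaseModel

from pydantic_ai import Agent
from pydantic_ai.ag_ui import StateDeps  # AG-UI shared state container


class AgentState(BaseModel):
    available_cash: float = 100000.0


# Typing the agent's dependencies as StateDeps[AgentState] lets the AG-UI
# integration inject and synchronize the shared state on every run.
# The model name is an assumption for this sketch.
agent = Agent(
    "openai:gpt-4o",
    deps_type=StateDeps[AgentState],
)
```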
First, define the core agent tool, such as the Stock Data Tool, as shown in the agent/stock.py
file.
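To keep this sketch self-contained, the tool body below returns static data instead of calling a market-data API; in `agent/stock.py` the real tool is registered on the agent (via Pydantic AI's tool decorator) and fetches live prices:

```python
# Illustrative stand-in for the Stock Data Tool. The static prices are
# made-up sample values; a real implementation would query a market-data API.
PRICES = {"AAPL": 227.5, "MSFT": 415.2, "GOOG": 166.8}


def get_stock_price(ticker: str) -> dict:
    """Return the latest known price for a ticker, or an error entry."""
    price = PRICES.get(ticker.upper())
    if price is None:
        return {"ticker": ticker.upper(), "error": "unknown ticker"}
    return {"ticker": ticker.upper(), "price": price}
```

Returning a plain dict keeps the tool result easy to stream to the frontend as part of a `TOOL_CALL_END` event.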
To configure AG-UI state management events, first define a JSON Patch object for making incremental updates to the application state, as shown in the agent/stock.py
file.
Then configure the AG-UI state management events to update the frontend with UI changes, as shown in the agent/stock.py
file.
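The mechanics can be sketched with plain dictionaries: a `STATE_DELTA` event carries only an RFC 6902 JSON Patch, not the whole state. The helper below handles just single-segment paths to keep the idea visible (the real code applies full JSON Patch documents), and the field names mirror the state model assumed earlier:

```python
import copy


def apply_json_patch(state: dict, patch: list) -> dict:
    """Apply a minimal subset of RFC 6902 (add/replace/remove on
    top-level paths). Real implementations handle nested paths too."""
    new_state = copy.deepcopy(state)
    for op in patch:
        key = op["path"].lstrip("/")
        if op["op"] in ("add", "replace"):
            new_state[key] = op["value"]
        elif op["op"] == "remove":
            new_state.pop(key, None)
    return new_state


# A STATE_DELTA event ships only the patch; a STATE_SNAPSHOT ships everything.
state = {"available_cash": 100000.0, "tool_logs": []}
delta_event = {
    "type": "STATE_DELTA",
    "delta": [{"op": "replace", "path": "/available_cash", "value": 90000.0}],
}
state = apply_json_patch(state, delta_event["delta"])
```

Sending deltas instead of snapshots keeps payloads small when only one field (here, the cash balance after a purchase) changes.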
To configure Human-in-the-Loop functionality, define an @agent.instructions
async function that instructs the Pydantic AI agent to call the frontend action tool by name to ask the user for feedback, as shown in the agent/stock.py
file.
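The heart of that instructions function is the prompt text it returns. The sketch below shows only that text-building part as a plain function; the frontend action name `render_standard_charts_and_table` is a hypothetical placeholder and must match whatever name your frontend registers:

```python
def hitl_instructions(frontend_action: str = "render_standard_charts_and_table") -> str:
    """Build instruction text directing the agent to ask the user for
    approval via a named frontend action before acting."""
    return (
        "Before executing any investment changes, call the frontend action "
        f"'{frontend_action}' with your proposed plan and wait for the "
        "user's response. Only proceed if the user accepts; otherwise "
        "revise the plan."
    )
```

In `agent/stock.py`, a string like this is returned from the decorated async instructions function so it is injected into every run.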
Finally, set up the FastAPI server and configure the /pydantic-agent
endpoint that processes requests using the Pydantic AI agent with AG-UI integration, as shown in the agent/stock.py
file.
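A sketch of that server is below. It assumes Pydantic AI's `handle_ag_ui_request` helper, which decodes the AG-UI run input from the request and streams AG-UI events back; verify the helper name and model string against your installed Pydantic AI version and `agent/stock.py`:

```python
from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from pydantic_ai import Agent
from pydantic_ai.ag_ui import handle_ag_ui_request

agent = Agent("openai:gpt-4o")  # model name is an assumption
app = FastAPI()


@app.post("/pydantic-agent")
async def run_agent(request: Request) -> Response:
    # Decodes the AG-UI run input, runs the agent, and streams AG-UI
    # events back to the client as Server-Sent Events.
    return await handle_ag_ui_request(agent, request)
```

Running `poetry run python main.py` (or `uvicorn`) then exposes the agent at `/pydantic-agent` for the frontend to connect to.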
Congratulations! You have integrated a Pydantic AI agent workflow with the AG-UI protocol. Let’s now see how to add a frontend to the AG-UI + Pydantic AI agent workflow.
In this section, you will learn how to create a connection between your AG-UI + Pydantic AI agent workflow and a frontend using CopilotKit.
Let’s get started.
First, navigate to the frontend directory:
Next, create a .env
file with your OpenAI API key:
Then install the dependencies:
After that, start the development server:
Navigate to http://localhost:3000, and you should see the AG-UI + Pydantic agent frontend up and running.
Let’s now see how to build the frontend UI for the AG-UI + Pydantic AI agent using CopilotKit.
Before creating an HttpAgent instance, let’s first understand what the HttpAgent is.
HttpAgent is a client from the AG-UI Library that bridges your frontend application with any AG-UI-compatible AI agent’s server.
To create an HttpAgent instance, define it in an API route as shown in the src/app/api/copilotkit/route.ts
file.
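A sketch of that route is below, following CopilotKit's Next.js App Router runtime pattern. The agent key `pydanticAgent` and the `http://localhost:8000` backend URL are assumptions for this sketch; match them to your setup and to `src/app/api/copilotkit/route.ts`:

```tsx
// src/app/api/copilotkit/route.ts (sketch)
import { HttpAgent } from "@ag-ui/client";
import {
  CopilotRuntime,
  ExperimentalEmptyAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import { NextRequest } from "next/server";

// Point the HttpAgent at the backend's AG-UI endpoint.
const pydanticAgent = new HttpAgent({
  url: "http://localhost:8000/pydantic-agent",
});

const runtime = new CopilotRuntime({
  agents: { pydanticAgent },
});

export const POST = async (req: NextRequest) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime,
    // No separate LLM adapter is needed when the agent does the reasoning.
    serviceAdapter: new ExperimentalEmptyAdapter(),
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};
```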
To set up the CopilotKit Provider, the [<CopilotKit>](https://docs.copilotkit.ai/reference/components/CopilotKit)
component must wrap the Copilot-aware parts of your application.
For most use cases, it's appropriate to wrap the CopilotKit provider around the entire app, e.g., in your layout.tsx
, as shown below in the src/app/layout.tsx
file.
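A minimal version of that layout is sketched below; the `agent` prop is an assumption and must match the agent key registered in your CopilotRuntime route:

```tsx
// src/app/layout.tsx (sketch)
import { CopilotKit } from "@copilotkit/react-core";
import "@copilotkit/react-ui/styles.css";

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        {/* runtimeUrl points at the API route; agent selects which
            registered agent handles chat requests. */}
        <CopilotKit runtimeUrl="/api/copilotkit" agent="pydanticAgent">
          {children}
        </CopilotKit>
      </body>
    </html>
  );
}
```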
CopilotKit ships with several built-in chat components, which include CopilotPopup, CopilotSidebar, and CopilotChat.
To set up a Copilot chat component, define it as shown in the src/app/components/prompt-panel.tsx
file.
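A minimal chat panel using `CopilotChat` might look like this (the labels are placeholders, not the exact strings in `prompt-panel.tsx`):

```tsx
import { CopilotChat } from "@copilotkit/react-ui";

export function PromptPanel() {
  return (
    <CopilotChat
      // Labels customize the chat header and opening message.
      labels={{
        title: "Stock Agent",
        initial: "Hi! Ask me to analyze or invest in a stock.",
      }}
    />
  );
}
```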
In CopilotKit, CoAgents maintain a shared state that seamlessly connects your frontend UI with the agent's execution. This shared state system allows you to:
You can learn more about CoAgents’ shared state here on the CopilotKit docs.
To sync your AG-UI + Pydantic AI agent state with the frontend, use the CopilotKit useCoAgent hook to share the AG-UI + Pydantic AI agent state with your frontend, as shown in the src/app/page.tsx
file.
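A sketch of that hook usage is below. The `AgentState` type mirrors the backend model's field names (keeping them identical is what lets `STATE_SNAPSHOT` and `STATE_DELTA` events map cleanly); the agent name must match the runtime registration:

```tsx
import { useCoAgent } from "@copilotkit/react-core";

// Mirrors the backend AgentState; shapes here are assumptions.
type AgentState = {
  available_cash: number;
  investment_portfolio: { ticker: string; amount: number }[];
  tool_logs: { message: string; status: string }[];
};

export function useStockAgent() {
  const { state, setState } = useCoAgent<AgentState>({
    name: "pydanticAgent", // must match the agent key in the runtime
    initialState: {
      available_cash: 100000,
      investment_portfolio: [],
      tool_logs: [],
    },
  });
  return { state, setState };
}
```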
Then render the AG-UI + Pydantic AI agent's state in the chat UI, which is useful for informing the user about the agent's state in a more in-context way.
To render the AG-UI + Pydantic AI agent's state in the chat UI, you can use the useCoAgentStateRender hook, as shown in the src/app/page.tsx
file.
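For example, rendering the agent's `tool_logs` inside the chat could look like this sketch (the state shape is the same assumed `AgentState` as on the backend):

```tsx
import { useCoAgentStateRender } from "@copilotkit/react-core";

type AgentState = {
  tool_logs: { message: string; status: string }[];
};

export function AgentProgress() {
  // Renders in-chat progress whenever the agent's shared state updates.
  useCoAgentStateRender<AgentState>({
    name: "pydanticAgent",
    render: ({ state }) => (
      <ul>
        {state.tool_logs?.map((log, i) => (
          <li key={i}>
            {log.message} ({log.status})
          </li>
        ))}
      </ul>
    ),
  });
  return null;
}
```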
If you execute a query in the chat, you should see the AG-UI + Pydantic AI agent’s state task execution rendered in the chat UI, as shown below.
Human-in-the-loop (HITL) allows agents to request human input or approval during execution, making AI systems more reliable and trustworthy. This pattern is essential when building AI applications that need to handle complex decisions or actions that require human judgment.
You can learn more about Human in the Loop here on CopilotKit docs.
To implement Human-in-the-Loop (HITL) in the frontend, you need to use the CopilotKit useCopilotAction hook with the renderAndWaitForResponse
method, which allows returning values asynchronously from the render function, as shown in the src/app/page.tsx
file.
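A sketch of that approval flow is below. The action name is a hypothetical placeholder and must match the name the agent is instructed to call; the response strings are likewise assumptions:

```tsx
import { useCopilotAction } from "@copilotkit/react-core";

export function useInvestmentApproval() {
  useCopilotAction({
    name: "render_standard_charts_and_table", // hypothetical; match the backend
    renderAndWaitForResponse: ({ respond, status }) => (
      <div>
        <p>Proceed with this investment plan?</p>
        {/* respond resolves the agent's pending tool call asynchronously. */}
        <button
          disabled={status === "complete"}
          onClick={() => respond?.("ACCEPTED")}
        >
          Accept
        </button>
        <button
          disabled={status === "complete"}
          onClick={() => respond?.("REJECTED")}
        >
          Reject
        </button>
      </div>
    ),
  });
}
```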
When an agent triggers frontend actions by tool/action name to request human input or feedback during execution, the end-user is prompted with a choice (rendered inside the chat UI). Then the user can choose by pressing a button in the chat UI, as shown below.
To stream your AG-UI + Pydantic AI agent responses or results in the frontend, pass the agent’s state field values to the frontend components, as shown in the src/app/page.tsx
file.
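Since the shared state is ordinary React state, components re-render as the agent streams updates; a sketch (using the same assumed `AgentState` shape as earlier):

```tsx
type AgentState = {
  available_cash: number;
  investment_portfolio: { ticker: string; amount: number }[];
};

// Pass shared-state fields straight into ordinary React components;
// they re-render as STATE_DELTA updates arrive.
export function PortfolioSummary({ state }: { state: AgentState }) {
  return (
    <section>
      <h2>Available cash: ${state.available_cash.toLocaleString()}</h2>
      <ul>
        {state.investment_portfolio.map((holding) => (
          <li key={holding.ticker}>
            {holding.ticker}: {holding.amount}
          </li>
        ))}
      </ul>
    </section>
  );
}
```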
If you query your agent and approve its feedback request, you should see the agent’s response or results streaming in the UI, as shown below.
In this guide, we have walked through the steps of integrating Pydantic AI agents with AG-UI protocol and then adding a frontend to the agents using CopilotKit.
While we’ve explored a couple of features, we have barely scratched the surface of the countless use cases for CopilotKit, ranging from building interactive AI chatbots to building agentic solutions. In essence, CopilotKit lets you add a ton of useful AI capabilities to your products in minutes.
Hopefully, this guide makes it easier for you to integrate AI-powered Copilots into your existing application.
Follow CopilotKit on Twitter and say hi, and if you'd like to build something cool, join the Discord community.
Subscribe to our blog and get updates on CopilotKit in your inbox.