By Bonnie and Nathan Tarbert
October 3, 2025

TL;DR

In this guide, you will learn how to build fullstack AI agents with Pydantic AI and the AG-UI protocol. We will also cover how to integrate the AG-UI + Pydantic AI agents with CopilotKit so you can chat with the agent and stream its responses in the frontend.

Before we jump in, here is what we will cover:

  • What is AG-UI protocol?
  • Integrating Pydantic AI agents with AG-UI protocol
  • Integrating a frontend to the AG-UI + Pydantic AI agent using CopilotKit

Here’s a preview of what we will be building:

What is AG-UI protocol?

Let's break down the Agent User Interaction Protocol, or AG-UI for short. Built by CopilotKit, it's an open-source, lightweight, event-based protocol that enables smooth, real-time interaction between your app's frontend (the user interface) and AI agents (smart bots that can carry out tasks for you).

AG-UI makes it easy to handle things like event-driven chats, keeping track of states (like what's happening right now), using tools, and even streaming responses from the AI so they show up bit by bit, just like in a live conversation.

To pass info back and forth between the frontend and the AI agent, AG-UI uses different kinds of events. Here's a simple rundown:

  • Lifecycle events: These are like signals for when the agent's job kicks off or wraps up. For example, there's RUN_STARTED (hey, we're beginning!) and RUN_FINISHED (all done!).
  • Text message events: These handle sending the AI's responses in a streaming way to the frontend. You'll see stuff like TEXT_MESSAGE_START (starting a new message), TEXT_MESSAGE_CONTENT (here's some text to add), and TEXT_MESSAGE_END (message complete).
  • State management events: These keep everything in sync between the frontend and the AI, so no one's out of the loop. Examples include STATE_SNAPSHOT (a full picture of the current state) and STATE_DELTA (just the changes since last time).
  • Tool call events: These are for when the agent needs to use a tool (like fetching data or running a function). They include TOOL_CALL_START (starting the tool), TOOL_CALL_ARGS (passing in the details it needs), and TOOL_CALL_END (tool's done, here's the result).
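
Putting the event types above together, here is a minimal sketch (plain Python dicts rather than the actual AG-UI SDK objects) of the sequence an agent emits while streaming a single reply; the field names follow the event types listed above, but the payload shapes are simplified:

```python
# Ordered event stream for one agent run that streams a single assistant
# message. "messageId" ties the TEXT_MESSAGE_* chunks together.
events = [
    {"type": "RUN_STARTED", "threadId": "t1", "runId": "r1"},
    {"type": "TEXT_MESSAGE_START", "messageId": "m1", "role": "assistant"},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "Hello, "},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "world!"},
    {"type": "TEXT_MESSAGE_END", "messageId": "m1"},
    {"type": "RUN_FINISHED", "threadId": "t1", "runId": "r1"},
]

# The frontend reconstructs the streamed message by concatenating the deltas
# as they arrive, which is what makes the reply show up bit by bit.
message = "".join(e["delta"] for e in events if e["type"] == "TEXT_MESSAGE_CONTENT")
print(message)  # Hello, world!
```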

If you want to dive deeper into how AG-UI works and its setup, check out the AG-UI docs.

Now that we have learned what the AG-UI protocol is, let us see how to integrate it with the Pydantic AI agent framework.

Check out the AG-UI ⭐️ GitHub

Let’s get started!

Prerequisites

To fully understand this tutorial, you need to have a basic understanding of React or Next.js.

We'll also make use of the following:

  • Python - a popular programming language for building AI agents; make sure it is installed on your computer.
  • Pydantic AI - a Python agent framework designed to make it less painful to build production-grade applications with Generative AI.
  • OpenAI API Key - an API key to enable you to perform various tasks using the GPT models; for this tutorial, ensure you have access to the GPT-4 model.
  • CopilotKit - an open-source copilot framework for building custom AI chatbots, in-app AI agents, and text areas.

Integrating Pydantic AI agents with AG-UI protocol

To get started, clone the Open AG UI Demo repository that consists of a Python-based backend (agent) and a Next.js frontend (frontend).

Next, navigate to the backend directory:

cd agent

Then install the dependencies using Poetry:

poetry install

After that, create a .env file with your OpenAI API key:

OPENAI_API_KEY="your-OpenAI-key-here"


Then run the agent using the command below:

poetry run python main.py

Let us now see how to integrate the AG-UI protocol with the Pydantic AI agent framework.

Step 1: Define your Pydantic AI agent state

Using Pydantic's BaseModel, define an AgentState class that holds all the data the agent needs. In our stock example, we track things like available_cash, investment_portfolio, and tool_logs, as shown in the agent/stock.py file.
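
As a minimal sketch, assuming pydantic is installed: the field names below follow the ones mentioned above, while PortfolioItem and the nested field shapes are illustrative, not the demo's exact definitions:

```python
from pydantic import BaseModel, Field


class PortfolioItem(BaseModel):
    # Illustrative shape for a single holding; the demo's fields may differ.
    ticker: str
    amount: float


class AgentState(BaseModel):
    """Shared state the agent and frontend keep in sync over AG-UI."""
    available_cash: float = 10_000.0
    investment_portfolio: list[PortfolioItem] = Field(default_factory=list)
    tool_logs: list[dict] = Field(default_factory=list)


state = AgentState()
print(state.available_cash)  # 10000.0
```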

Step 2: Initialize your Pydantic AI agent

Once you have defined the agent state, initialize your Pydantic AI agent and pass the AgentState to its shared state container, as shown in the agent/stock.py file.

Step 3: Define your Pydantic AI agent tools

First, define the core agent tool, such as the Stock Data Tool, as shown in the agent/stock.py file.

Step 4: Configure AG-UI state management events

To configure AG-UI state management events, first define a JSON Patch object for making incremental updates to the application state, as shown in the agent/stock.py file.

Then configure the AG-UI state management events to update the frontend with UI changes, as shown in the agent/stock.py file.
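
To make the STATE_DELTA idea concrete, here is a sketch using plain dicts and a hand-rolled applier (a real implementation would use a JSON Patch library): instead of re-sending the whole state, the event carries RFC 6902 operations describing only what changed:

```python
# Current state, as the frontend last saw it (from a STATE_SNAPSHOT event).
state = {"available_cash": 10_000.0, "tool_logs": []}

# A STATE_DELTA event carries JSON Patch operations for incremental updates.
delta = [{"op": "replace", "path": "/available_cash", "value": 9_500.0}]


def apply_patch(doc: dict, ops: list[dict]) -> dict:
    """Tiny JSON Patch applier handling top-level 'replace' ops only."""
    for op in ops:
        if op["op"] == "replace":
            key = op["path"].lstrip("/")
            doc[key] = op["value"]
    return doc


state = apply_patch(state, delta)
print(state["available_cash"])  # 9500.0
```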

Step 5: Configure Human-in-the-Loop (HITL) functionality

To configure Human-in-the-Loop functionality, define an @agent.instructions async function that instructs the Pydantic AI agent to call the frontend action tool by name to ask the user for feedback, as shown in the agent/stock.py file.

Step 6: Set Up the FastAPI Server and Mount the Agent

Finally, set up the FastAPI server and configure the /pydantic-agent endpoint, which processes requests using the Pydantic AI agent with AG-UI integration, as shown in the agent/stock.py file.

Congratulations! You have integrated a Pydantic AI agent workflow with the AG-UI protocol. Let’s now see how to add a frontend to the AG-UI + Pydantic AI agent workflow.

Integrating a frontend to the AG-UI + Pydantic AI agent workflow using CopilotKit

In this section, you will learn how to create a connection between your AG-UI + Pydantic AI agent workflow and a frontend using CopilotKit.

Let’s get started.

First, navigate to the frontend directory:

cd frontend

Next, create a .env file with your OpenAI API key:

OPENAI_API_KEY="your-OpenAI-key-here"

Then install the dependencies:

npm install

After that, start the development server:

npm run dev

Navigate to http://localhost:3000, and you should see the AG-UI + Pydantic agent frontend up and running.

Let’s now see how to build the frontend UI for the AG-UI + Pydantic AI agent using CopilotKit.

Step 1: Create an HttpAgent instance

Before creating an HttpAgent instance, let’s first understand what the HttpAgent is.

HttpAgent is a client from the AG-UI library that bridges your frontend application with any AG-UI-compatible AI agent server.

To create an HttpAgent instance, define it in an API route as shown in the src/app/api/copilotkit/route.ts file.

Step 2: Set up CopilotKit provider

To set up the CopilotKit Provider, the <CopilotKit> component (see https://docs.copilotkit.ai/reference/components/CopilotKit) must wrap the Copilot-aware parts of your application.

For most use cases, it's appropriate to wrap the CopilotKit provider around the entire app, e.g., in your layout.tsx, as shown below in the src/app/layout.tsx file.

Step 3: Set up a Copilot chat component

CopilotKit ships with several built-in chat components, which include CopilotPopup, CopilotSidebar, and CopilotChat.

To set up a Copilot chat component, define it as shown in the src/app/components/prompt-panel.tsx file.

Step 4: Sync AG-UI + Pydantic AI agent state with the frontend using CopilotKit hooks

In CopilotKit, CoAgents maintain a shared state that seamlessly connects your frontend UI with the agent's execution. This shared state system allows you to:

  • Display the agent's current progress and intermediate results
  • Update the agent's state through UI interactions
  • React to state changes in real-time across your application

You can learn more about CoAgents’ shared state here on the CopilotKit docs.

To sync your AG-UI + Pydantic AI agent state with the frontend, use the CopilotKit useCoAgent hook to share the AG-UI + Pydantic AI agent state with your frontend, as shown in the src/app/page.tsx file.

Then render the AG-UI + Pydantic AI agent's state in the chat UI, which is useful for informing the user about the agent's state in a more in-context way.

To render the AG-UI + Pydantic AI agent's state in the chat UI, you can use the useCoAgentStateRender hook, as shown in the src/app/page.tsx file.

If you execute a query in the chat, you should see the AG-UI + Pydantic AI agent's task execution state rendered in the chat UI, as shown below.

Step 5: Implementing Human-in-the-Loop (HITL) in the frontend

Human-in-the-loop (HITL) allows agents to request human input or approval during execution, making AI systems more reliable and trustworthy. This pattern is essential when building AI applications that need to handle complex decisions or actions that require human judgment.

You can learn more about Human in the Loop here on CopilotKit docs.

To implement Human-in-the-Loop (HITL) in the frontend, you need to use the CopilotKit useCopilotAction hook with the renderAndWaitForResponse method, which allows returning values asynchronously from the render function, as shown in the src/app/page.tsx file.

When an agent triggers frontend actions by tool/action name to request human input or feedback during execution, the end-user is prompted with a choice (rendered inside the chat UI). Then the user can choose by pressing a button in the chat UI, as shown below.

Step 6: Streaming AG-UI + Pydantic AI agent responses in the frontend

To stream your AG-UI + Pydantic AI agent responses or results in the frontend, pass the agent’s state field values to the frontend components, as shown in the src/app/page.tsx file.

If you query your agent and approve its feedback request, you should see the agent’s response or results streaming in the UI, as shown below.

Conclusion

In this guide, we have walked through the steps of integrating Pydantic AI agents with AG-UI protocol and then adding a frontend to the agents using CopilotKit.

While we’ve explored a couple of features, we have barely scratched the surface of the countless use cases for CopilotKit, ranging from building interactive AI chatbots to building agentic solutions—in essence, CopilotKit lets you add a ton of useful AI capabilities to your products in minutes.

Hopefully, this guide makes it easier for you to integrate AI-powered Copilots into your existing application.

Follow CopilotKit on Twitter and say hi, and if you'd like to build something cool, join the Discord community.
