
How CopilotKit + Mastra Enable Real-Time Agent Interaction

By Nathan Tarbert
September 18, 2025

One of the strongest use cases for copilots is co-creation: working side by side with a user to draft, edit, or build something interactively. The other major use case we see when talking to developers, and even large teams, is workflow copilots, which are really just a series of co-creation steps chained together.
Together, these two patterns represent the majority of what developers are building with agents today.

That’s why we built the AG-UI Canvas Template, powered by CopilotKit + Mastra.

We'll get back to the Canvas, but first let's talk about agentic apps and where you should start.

Mastra: The TypeScript Agent Framework

Mastra is an open-source TypeScript framework for building agentic backends.

It gives developers the building blocks to go from simple LLM calls to full agent workflows. Some of the core capabilities include:

  • Multi-LLM support: Easily switch between providers like OpenAI, Anthropic, Gemini, and local models. Stream responses with type-safe APIs.
  • Agents with tools: Extend agents with typed functions and external API integrations. Agents can reason, then call into your systems to get work done.
  • Workflows: Durable, graph-based workflows with branching, looping, and error handling. Workflows persist state across steps and can even pause for human input.
  • RAG support: Out-of-the-box chunking, embedding, and vector search for retrieval-augmented generation.
  • Observability: Built-in OpenTelemetry tracing for debugging and monitoring agents in production.
  • Evaluations: Automated evals (model-graded, rule-based, statistical) so you can test and trust your agent outputs.

This makes Mastra a strong foundation for reasoning, execution, and orchestration at scale.
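To make that concrete, here's a minimal sketch of a Mastra agent with one typed tool. The weather tool, the model choice, and all the names are illustrative assumptions rather than part of the Canvas template, and the exact API surface may vary by Mastra version:

```typescript
import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// A typed tool the agent can call into. The weather example is illustrative.
const getWeather = createTool({
  id: "get-weather",
  description: "Fetch the current weather for a city",
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ city: z.string(), tempC: z.number() }),
  execute: async ({ context }) => {
    // Call your real API here; a stub keeps the sketch self-contained.
    return { city: context.city, tempC: 21 };
  },
});

// The agent reasons with the model, then calls the tool when it needs data.
export const weatherAgent = new Agent({
  name: "weatherAgent",
  instructions: "Answer weather questions. Use the get-weather tool for live data.",
  model: openai("gpt-4o-mini"),
  tools: { getWeather },
});
```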

But reasoning is only half the story. For agents to feel useful to end users, they need a way to collaborate in real time, inside the product itself.

The UI Layer: Where CopilotKit Comes In

CopilotKit provides AG-UI (the Agent–User Interaction Protocol), a standard way to bring backend agents into the front end.

Instead of showing Mastra’s outputs in a static chat bubble, CopilotKit renders them as interactive React components: editable, clickable, and collaborative. Users can guide, correct, and co-create with the agent, all inside the app.
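Here's a rough sketch of what that looks like in React. It assumes a Next.js app with a backend agent exposed as canvasAgent; the state shape and all names are hypothetical:

```tsx
"use client";
import { CopilotKit, useCoAgent } from "@copilotkit/react-core";
import { CopilotSidebar } from "@copilotkit/react-ui";
import "@copilotkit/react-ui/styles.css";

// Hypothetical state shape shared between the Mastra agent and the UI.
type CanvasState = { title: string; bullets: string[] };

function Canvas() {
  // useCoAgent keeps this component in sync with the agent's state over AG-UI.
  const { state, setState } = useCoAgent<CanvasState>({
    name: "canvasAgent", // must match the agent name exposed by the backend
    initialState: { title: "", bullets: [] },
  });

  return (
    <div>
      <input
        value={state.title}
        onChange={(e) => setState({ ...state, title: e.target.value })}
      />
      <ul>
        {state.bullets.map((b, i) => (
          <li key={i}>{b}</li>
        ))}
      </ul>
    </div>
  );
}

export default function Page() {
  return (
    <CopilotKit runtimeUrl="/api/copilotkit" agent="canvasAgent">
      <Canvas />
      <CopilotSidebar />
    </CopilotKit>
  );
}
```

Because the state is shared over AG-UI, edits the user makes in the input above are visible to the agent on its next step, and agent updates re-render the component.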

Getting Started in Under 2 Minutes

The fastest way to try Mastra + CopilotKit together is with our starter template.

Just run the setup command from the template repo, linked at the bottom of this post.

This scaffolds a full example app, with Mastra agents on the backend and CopilotKit + AG-UI on the frontend, in less than two minutes.

You’ll have an interactive agent running locally with no extra setup.

Developers can now pair Mastra’s robust agent framework with CopilotKit’s interaction layer to ship agents that feel natural to work with.

Zero to AG-UI Canvas Template

The Canvas template shows this in action: a Mastra agent reasoning in the background while CopilotKit streams updates to the frontend, letting the user interact, adjust, and co-create continuously.
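Under the hood, that wiring amounts to exposing your Mastra agents through CopilotKit's runtime endpoint. Here's a simplified sketch, assuming a Next.js route handler and the @ag-ui/mastra integration package; the template's actual wiring may differ:

```typescript
// app/api/copilotkit/route.ts
import {
  CopilotRuntime,
  ExperimentalEmptyAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import { MastraAgent } from "@ag-ui/mastra";
import { NextRequest } from "next/server";
import { mastra } from "@/mastra"; // your Mastra instance with registered agents

// Expose every locally registered Mastra agent to the CopilotKit frontend.
const runtime = new CopilotRuntime({
  agents: MastraAgent.getLocalAgents({ mastra }),
});

export const POST = async (req: NextRequest) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime,
    serviceAdapter: new ExperimentalEmptyAdapter(),
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};
```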

This template alone covers two-thirds of the most common copilot use cases, and it’s the fastest way to see how Mastra + CopilotKit fit together in practice.

Check it out on GitHub: https://go.copilotkit.ai/ag-ui-canvas-mastra

Start building your first interactive Mastra-powered agentic app today!

Want to learn more?

Happy Building!

