
Intro to AI Agents
By Eli Berman and Nathan Tarbert
August 7, 2025

The agent landscape can easily feel like alphabet soup (LLMs, RAG, MCP, A2A, AG-UI, LangGraph, Tavily, etc.).

How do you make sense of it all?

We've put together this guide for anyone onboarding into the agent ecosystem: a map of the moving pieces and how they fit together, so you can navigate the AI agent landscape with confidence.


What are AI Agents?
We like to think of them as mini-coworkers embedded in your product, able to carry out complex tasks on their own.


Visualizing the layers, and where agents fit within them, is one of the most useful ways to understand how the pieces connect.
Consider four layers (a minimal sketch of how they interact follows the list):

  • LLM: the brain of the agent
  • Frameworks: agent memory, planning, and recursion
  • Tools: extend the agent's capabilities via APIs and services
  • Application or UX layer: where users interact with agents
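
To make the layering concrete, here's a minimal sketch in TypeScript. The LLM client, tool shape, UI emitter, and the "CALL <name>:" convention are all assumptions for illustration, not any specific framework's API:

```typescript
type Message = { role: "user" | "assistant" | "tool"; content: string };

interface LLM { complete(history: Message[]): Promise<string>; }       // layer 1: the brain
interface Tool { name: string; run(input: string): Promise<string>; }  // layer 3: actions
interface UI { emit(event: string, payload: string): void; }           // layer 4: user-facing

// Layer 2 (the framework) is the loop wiring the others together:
// it holds memory, plans, and decides when to call a tool.
async function runAgent(llm: LLM, tools: Tool[], ui: UI, task: string) {
  const memory: Message[] = [{ role: "user", content: task }];
  for (let step = 0; step < 5; step++) {                    // bounded planning loop
    const reply = await llm.complete(memory);
    memory.push({ role: "assistant", content: reply });
    ui.emit("assistant-message", reply);                    // surface progress to the user

    // Assumed convention: the model requests a tool with "CALL <name>: <input>".
    const tool = tools.find((t) => reply.startsWith(`CALL ${t.name}:`));
    if (!tool) return reply;                                // no tool call => task is done
    const input = reply.slice(`CALL ${tool.name}:`.length).trim();
    memory.push({ role: "tool", content: await tool.run(input) }); // feed result back in
  }
  return memory[memory.length - 1].content;                 // step budget exhausted
}
```

Real frameworks (LangGraph, CopilotKit, and friends) are production versions of that loop, with far better planning, persistence, and error handling.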

Think of the protocol side of the agent stack as a triangle with three points:

  • MCP: giving agents context and tools (see the server sketch below)
  • A2A or ACP: agents communicating with agents
  • AG-UI: bringing agents to the frontend
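
As a taste of the MCP corner of the triangle, here's a small tool server following the pattern documented for the MCP TypeScript SDK (exact method names can vary between SDK versions, and the weather result is stubbed):

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// A single-tool MCP server: any MCP-capable agent can discover and
// call "get_weather" without custom integration code.
const server = new McpServer({ name: "weather-demo", version: "1.0.0" });

server.tool(
  "get_weather",
  { city: z.string() },                                      // input schema via zod
  async ({ city }) => ({
    content: [{ type: "text", text: `Sunny in ${city}` }],   // stubbed result
  }),
);

await server.connect(new StdioServerTransport());            // serve over stdio (ESM top-level await)
```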

What is AG-UI?

AG-UI takes agents from backend automation to user-facing applications, bringing structure, interactivity, standardization, and reactivity to the agent-powered frontend.

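To see what that standardization looks like in practice, here's a simplified sketch. The event names mirror AG-UI's event types, but the transport (a plain async generator rather than SSE or WebSockets) and the payload shapes are simplified assumptions:

```typescript
// The backend emits a stream of typed events; the frontend renders
// them as they arrive, so the UI stays in sync with the agent.
type AgUiEvent =
  | { type: "RUN_STARTED"; runId: string }
  | { type: "TEXT_MESSAGE_CONTENT"; messageId: string; delta: string }
  | { type: "STATE_DELTA"; delta: Record<string, unknown> }
  | { type: "RUN_FINISHED"; runId: string };

async function* agentRun(prompt: string): AsyncGenerator<AgUiEvent> {
  yield { type: "RUN_STARTED", runId: "run-1" };
  for (const delta of ["Working on: ", prompt, " ... done."]) {
    yield { type: "TEXT_MESSAGE_CONTENT", messageId: "msg-1", delta }; // streamed text
  }
  yield { type: "STATE_DELTA", delta: { progress: 1 } };               // shared UI state
  yield { type: "RUN_FINISHED", runId: "run-1" };
}

// A frontend consuming the stream reacts to each event in real time.
async function main() {
  for await (const ev of agentRun("summarize my inbox")) {
    if (ev.type === "TEXT_MESSAGE_CONTENT") process.stdout.write(ev.delta);
  }
}
main();
```
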

The next question is: what can you build with agents?

Backend Automations:

  • Trigger agents via API or cron for background workflows (see the sketch after these lists)
  • Replace static scripts with dynamic decision-making logic

Fullstack Agentic Applications:

  • Agents embedded directly in your UI, interacting live with users
  • Real-time reasoning + frontend control via AG-UI or a similar protocol
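
Here's a minimal sketch of the backend automation pattern using Node's built-in HTTP server; the route, port, and the stubbed runAgent are arbitrary choices for the example:

```typescript
import { createServer } from "node:http";

// Stand-in for a real agent run (e.g. the loop sketched earlier).
const runAgent = async (task: string) => `agent handled: ${task}`;

// A cron job or upstream service POSTs here; the agent replaces a
// hard-coded script with per-item decision making.
createServer(async (req, res) => {
  if (req.method === "POST" && req.url === "/jobs/triage-tickets") {
    const summary = await runAgent("triage all open support tickets");
    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify({ summary }));
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);
```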

Four Concepts Worth Mastering:

Retrieval-Augmented Generation (RAG):

  • An AI approach that combines the strengths of information retrieval and generative language models, grounding answers in retrieved documents (see the sketch below).
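
A toy RAG sketch, with word overlap standing in for a real embedding model and vector store (both assumptions here): retrieve the most relevant document, then put it in the prompt so generation is grounded in it.

```typescript
const docs = [
  "AG-UI streams agent events to the frontend.",
  "MCP gives agents context and tools.",
];

// Toy "embedding": the set of words in the text. A real system would
// use an embedding model and cosine similarity in a vector store.
const embed = (text: string) =>
  new Set(text.toLowerCase().split(/\W+/).filter(Boolean));

const overlap = (a: Set<string>, b: Set<string>) =>
  [...a].filter((w) => b.has(w)).length;

function retrieve(query: string): string {
  const q = embed(query);
  return docs.reduce((best, doc) =>
    overlap(embed(doc), q) > overlap(embed(best), q) ? doc : best);
}

// The retrieved passage goes into the prompt, so the model answers
// from the document instead of from parametric memory alone.
const prompt = (query: string) =>
  `Context: ${retrieve(query)}\n\nQuestion: ${query}\nAnswer using only the context above.`;

console.log(prompt("How do agent events reach the frontend?"));
```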

Context Engineering:

  • Designing systems that decide what information an AI model sees before it generates a response (see the sketch below).
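
A hedged sketch of the core move in context engineering: ranking candidate context by priority and packing it into a fixed budget. The character budget and priority scores are assumptions for illustration.

```typescript
type ContextItem = { text: string; priority: number };

function buildContext(items: ContextItem[], budgetChars = 2000): string {
  const chosen: string[] = [];
  let used = 0;
  // Highest-priority items first; skip anything that would blow the budget.
  for (const item of [...items].sort((a, b) => b.priority - a.priority)) {
    if (used + item.text.length > budgetChars) continue;
    chosen.push(item.text);
    used += item.text.length;
  }
  return chosen.join("\n---\n");
}

// Usage: system rules beat retrieved docs, which beat older chat history.
console.log(buildContext([
  { text: "System rules: be concise.", priority: 3 },
  { text: "Retrieved doc: AG-UI event reference...", priority: 2 },
  { text: "Older chat history...", priority: 1 },
]));
```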

Prompt Engineering:

  • Designing input prompts to interact effectively with AI models, especially LLMs like GPT (see the sketch below).
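
A toy illustration: the same question wrapped with a role, constraints, and a worked example typically yields more reliable output than the bare question. The template itself is just an assumption for illustration.

```typescript
function buildPrompt(question: string): string {
  return [
    "You are a precise technical assistant.",                   // role
    "Answer in at most three sentences.",                       // constraint
    "If you are unsure, say so explicitly.",                    // guardrail
    "",
    "Example:",                                                 // one-shot example
    "Q: What does MCP standardize?",
    "A: How agents discover and call external tools and context.",
    "",
    `Q: ${question}`,
    "A:",
  ].join("\n");
}

console.log(buildPrompt("What does AG-UI standardize?"));
```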

Vibe Coding:

  • Using AI to generate, refine, and debug code.

The more you master these layers, the more control you have over agentic behavior!


Get your hands on these tools and start building!


Finally, why does the agent stack matter?
Autonomous agents will:

  • Replace entire SaaS workflows
  • Supercharge dev productivity
  • Build entirely new UX paradigms

The agent stack and ecosystem turns raw AI into usable, interactive products by connecting reasoning (LLMs), memory and planning (frameworks), actions (tools), and user interfaces (apps).

