TL;DR
It’s been a huge month for AG-UI, the open protocol for interactive AI agents. Today, we’re expanding support with two new frameworks:
- LlamaIndex: enabling retrieval-augmented agents to connect with users in real time
- Agno: bringing intuitive multi-agent workflows into the AG-UI ecosystem
Developers now have even more power to build rich, interactive agent experiences.
🚀 AG-UI Hits a Major Milestone
One month after launch, AG-UI continues to grow and evolve. Today’s update brings two new integrations that expand what's possible with agentic interfaces:
- Agno: A modular agent orchestration framework now compatible with AG-UI, so you can expose your Agno agents to users through real-time interfaces.
- LlamaIndex: Build production agents that can find information, synthesize insights, generate reports, and take actions over the most complex enterprise data.
These additions make it easier than ever to build high-quality user-facing AI with your favorite agent frameworks.
What Is AG-UI?
AG-UI (Agent-User Interaction Protocol) is a lightweight spec that connects backend AI agents with frontend applications. It turns agents into live, interactive participants inside your UI, not just silent executors behind an API.
It’s the difference between a black box backend and a fully visible, controllable copilot.
Think: Cursor vs. Devin.
Why It Matters
Today’s agents are often disconnected from the user. If you want real-time interaction, you’re left wiring up:
- Streaming outputs
- Tool invocation feedback
- Shared session state
- UI rendering logic
- Messaging pipelines
AG-UI solves this by giving agents and apps a shared event protocol, so they can speak the same language out of the box.
How It Works
AG-UI defines 16+ event types that power live agent behavior, from tool calls to token streaming to UI state updates.
Agents can either:
- Emit AG-UI events directly, as sketched below
- Or use an adapter to convert outputs into AG-UI format
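To make that concrete, here's a minimal sketch of an agent backend emitting an AG-UI-style event sequence over SSE. The event names mirror the spec's token-streaming, tool-call, and lifecycle events, but treat them as illustrative; the AG-UI docs define the exact catalogue.

```ts
// Minimal sketch: an agent backend streaming AG-UI-style events over SSE.
// Event names follow the spec's style but are illustrative, not canonical.
import { createServer, ServerResponse } from "node:http";

type AgUiEvent =
  | { type: "RUN_STARTED"; runId: string }
  | { type: "TOOL_CALL_START"; toolCallId: string; toolName: string }
  | { type: "TOOL_CALL_END"; toolCallId: string }
  | { type: "TEXT_MESSAGE_CONTENT"; messageId: string; delta: string }
  | { type: "RUN_FINISHED"; runId: string };

// Each event goes out as one SSE `data:` frame of JSON.
function emit(res: ServerResponse, event: AgUiEvent): void {
  res.write(`data: ${JSON.stringify(event)}\n\n`);
}

createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });

  // A hard-coded run standing in for real agent output.
  emit(res, { type: "RUN_STARTED", runId: "run_1" });
  emit(res, { type: "TOOL_CALL_START", toolCallId: "tc_1", toolName: "search_docs" });
  emit(res, { type: "TOOL_CALL_END", toolCallId: "tc_1" });
  for (const delta of ["Here ", "are ", "the ", "results."]) {
    emit(res, { type: "TEXT_MESSAGE_CONTENT", messageId: "msg_1", delta });
  }
  emit(res, { type: "RUN_FINISHED", runId: "run_1" });
  res.end();
}).listen(8000);
```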
Clients subscribe to an event stream (via SSE or WebSockets), render the events live, and send back user input or control signals.
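On the client side, consuming the stream can be as simple as opening an SSE connection and switching on the event type. A minimal sketch, assuming the illustrative endpoint and event names from the server above (runs in the browser, or in Node 22+ which ships EventSource):

```ts
// Minimal client sketch: open the SSE stream and react to events as they
// arrive. Endpoint and event names come from the illustrative server above.
const source = new EventSource("http://localhost:8000");

source.onmessage = (msg) => {
  const event = JSON.parse(msg.data);
  switch (event.type) {
    case "TEXT_MESSAGE_CONTENT":
      // Append the streamed token to the chat transcript.
      console.log("token:", event.delta);
      break;
    case "TOOL_CALL_START":
      // Surface tool activity in the UI instead of hiding it.
      console.log("calling tool:", event.toolName);
      break;
    case "RUN_FINISHED":
      source.close(); // This agent run is done; stop listening.
      break;
  }
};
```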
This unlocks dynamic, real-time interaction between agents and users.
Adoption & Growth
In less than 30 days, AG-UI has gained serious traction:
- Integrated with: LangChain, CrewAI, Mastra, AG2, Agno, LlamaIndex
- In progress: AWS, A2A, ADK, AgentOps, Human Layer (Slack)
- 3,700+ GitHub stars
- Thousands of developers building interactive agents
Build with AG-UI
Getting started is fast:
```sh
npx create-ag-ui-app
```
Let me know what you are building!
Follow CopilotKit on Twitter to say hi, and join our active Discord community!