Continuing the series on A2A and MCP, this blog focuses on their combined implementation. We've previously explored the introduction and individual implementation of both A2A and MCP. Now, we'll bring them together in a single, cohesive setup.
Quick Recap
MCP: Model Context Protocol
Model Context Protocol (MCP) is a set of rules that defines how communication between an MCP client and an MCP server should be managed in order to utilize third-party services effectively.
A2A: Agent-to-Agent Protocol
Agent-to-Agent Protocol (A2A) is an open protocol that allows agents to interoperate across different ecosystems, frameworks, or organizations. It supports a dynamic, multi-agent environment where agents can communicate, discover each other, and collaborate autonomously.
I have developed an A2A Registry for the AI community; do check it out and share it in your network.
You may have seen separate implementations of A2A and MCP in various blogs, but there are very few resources that cover their combined usage. This blog aims to bridge that gap and guide you through a practical integration of both protocols.
What is the General Flow of Combined Implementation of A2A and MCP?
Before diving into the actual implementation, it's important to understand the basic flow of how A2A and MCP can work together in a combined setup.
In this setup, we start with a host agent, an agent deployed within our own infrastructure. This agent is configured to communicate with its own MCP servers, enabling it to access required third-party services.
On the other side, we have remote agents that are deployed on external infrastructures. These agents can be discovered using the A2A protocol, typically via the /.well-known/agent.json endpoint. The host agent uses this URL to find and establish communication with the remote agents.
Just like the host agent, remote agents also have their own MCP servers configured, depending on their specific needs. This setup allows agents across different environments to discover and communicate with each other, while still leveraging their respective MCP servers to interact with services.
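To make the discovery step concrete, here is a minimal sketch of how a host agent could fetch a remote agent's card from that endpoint. This is illustrative code, not part of the tutorial repo; the helper name and base URL are placeholders.

import requests

def fetch_agent_card(base_url: str) -> dict:
    """Fetch the A2A agent card exposed at /.well-known/agent.json."""
    response = requests.get(f"{base_url}/.well-known/agent.json", timeout=10)
    response.raise_for_status()
    return response.json()

# Example: discover the Stock Report Agent started later in this post.
card = fetch_agent_card("http://0.0.0.0:10000")
print(card["name"], [skill["id"] for skill in card.get("skills", [])])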
What Will We Implement?
The goal of this implementation is to build a system that can respond to natural language queries about companies by dynamically retrieving:
- Current or historical stock prices
- Recent news or contextual information from the web
Rather than hardcoding API calls, we'll leverage the A2A and MCP protocols to enable AI agents to make those requests on demand.
For example, when asked:
“Tell me about Apple’s stock and what the company is up to,”
the system will:
- Break down the question into manageable subtasks
- Route those tasks to the appropriate agents
- Have each agent query the relevant external service via MCP
- Synthesize the results into a single, up-to-date response
Next, we’ll go through the components required to bring this system to life.
Implementation Components
To access real-world information, the project defines two MCP servers that wrap public APIs. These servers act as bridges between agents and external services, translating agent requests into API calls and returning the data in a structured format.
MCP Server Layer
Each MCP server interfaces with a specific external data provider. By adhering to the MCP standard, these servers allow any agent to use them without needing to handle low-level API integrations directly. In this system, we define two MCP servers, one for web search and another for financial data. Once configured, these servers run independently and handle agent requests.
1. Search MCP Server
Path: mcp_server/sse/search_server.py
- Integrates with the Serper.dev API (Google Search).
- Used to retrieve recent news or general company information.
2. Stock MCP Server
Path: mcp_server/sse/stocks_server.py
- Integrates with the FinnHub API (Stock Market Data).
- Used to fetch recent or historical stock prices for companies.
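Both servers follow the same pattern: declare a handful of tools and serve them over a transport that agents can reach. The following is a stripped-down sketch of the stock server, assuming the official MCP Python SDK (FastMCP) and the finnhub-python client; the tool names match the ones listed in step 3 below, while the implementation details are simplified assumptions rather than the repo's exact code.

import os
import finnhub
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("stocks")  # the tutorial's server listens on port 8181; FastMCP's host/port are configurable
finnhub_client = finnhub.Client(api_key=os.environ["FINNHUB_API_KEY"])

@mcp.tool()
def get_symbol_from_query(query: str) -> str:
    """Return the most likely ticker symbol for a company name."""
    matches = finnhub_client.symbol_lookup(query).get("result", [])
    return matches[0]["symbol"] if matches else "UNKNOWN"

@mcp.tool()
def get_price_of_stock(symbol: str) -> dict:
    """Return the current quote (price, change, high, low, open, previous close)."""
    return finnhub_client.quote(symbol)

if __name__ == "__main__":
    mcp.run(transport="sse")  # expose the tools over SSE so agents can connect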
Agents Layer
Agents are designed with clear specialization. A central planning agent handles incoming user queries, identifies what sub-information is required, and then delegates tasks to other agents. For example, one agent may perform web searches, while another focuses on financial analysis. These agents communicate with their respective MCP servers to fetch data.
1. Host Agent
- Acts as the main coordinator.
- Receives the user's query, breaks it into subtasks, and routes them to the right agents.
2. Google Search Agent
- Communicates with the Search MCP Server.
- Responsible for retrieving relevant news or web results about the company.
3. Stock Report Agent
- Communicates with the Stock MCP Server.
- Gathers and interprets financial data about the company’s stock performance.
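Under the hood, when the Host Agent delegates work to one of these remote agents, the exchange is a JSON-RPC call over HTTP. Here is a rough, hypothetical sketch following the early A2A tasks/send method; the exact payload shape depends on the A2A version the repo targets.

import uuid
import requests

def send_task(agent_url: str, text: str) -> dict:
    """Hypothetical helper: send a single A2A task to a remote agent."""
    payload = {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),
        "method": "tasks/send",
        "params": {
            "id": str(uuid.uuid4()),  # task id
            "message": {
                "role": "user",
                "parts": [{"type": "text", "text": text}],
            },
        },
    }
    response = requests.post(agent_url, json=payload, timeout=60)
    response.raise_for_status()
    return response.json()

# e.g. send_task("http://0.0.0.0:10000", "What is the current price of AAPL?")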
With the architecture and components in place, let's dive into the actual implementation.
Implementation
Before we jump into the code, make sure the following prerequisites are satisfied.
Prerequisites
If you're using the free tier of the Gemini API, you'll need to update the model name across the project. Replace gemini-2.5-pro-exp-03-25 with an available free-tier model like gemini-2.0-flash.
Update the following files accordingly:
- a2a_servers/agent_servers/gsearch_report_agent_server.py
- a2a_servers/agent_servers/host_agent_server.py
- a2a_servers/agent_servers/stock_report_agent_server.py
- adk_agents_testing/hierarchical_agents.py
- adk_agents_testing/single_agent_search_mcp.py
Note:
In this specific implementation, we won’t be using the Google Search agent.
However, if you plan to include it, you’ll need a Serper API key, which you can obtain from serper.dev.
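For context, the Serper integration behind the Search MCP Server boils down to a single HTTP call. A rough sketch of that call is shown below; the endpoint and header follow Serper's public API, while the wrapper function is just an illustration.

import os
import requests

def serper_search(query: str) -> dict:
    """Illustrative helper: run a Google search through the Serper.dev API."""
    response = requests.post(
        "https://google.serper.dev/search",
        headers={
            "X-API-KEY": os.environ["SERPER_DEV_API_KEY"],
            "Content-Type": "application/json",
        },
        json={"q": query},
        timeout=15,
    )
    response.raise_for_status()
    return response.json()  # organic results, knowledge graph, news, etc.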
Follow Along
To get started with the implementation, follow these steps:
1. Clone the Git Repository
git clone https://github.com/Tsadoq/a2a-mcp-tutorial.git
2. Create a .env File
In the root folder of the project, create a .env file and add the following environment variables:
FINNHUB_API_KEY=<your_api_key>
GOOGLE_API_KEY=<your_api_key>
SERPER_DEV_API_KEY=<your_api_key> # Required only if you plan to use the Google Search agent
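To confirm the keys are picked up before launching anything, a quick check with python-dotenv (assuming the project reads its configuration from this file) looks like this:

import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current working directory

for key in ("FINNHUB_API_KEY", "GOOGLE_API_KEY"):  # add SERPER_DEV_API_KEY if you use the search agent
    if not os.getenv(key):
        raise SystemExit(f"Missing required environment variable: {key}")
print("Environment looks good.")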
3. Start the Stock MCP Server
Run the stock MCP server using the following command:
uv run mcp_server/sse/stocks_server.py
This server provides two tools:
- get_symbol_from_query: Retrieves the stock symbol for a given company name.
- get_price_of_stock: Fetches the actual stock price using the provided symbol.
This MCP server is primarily used by the Stock Report Agent to gather financial data.
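Before wiring the agent in, you can sanity-check the running server directly with the MCP Python client. This is a minimal sketch assuming the server's SSE endpoint is http://localhost:8181/sse (port 8181 is the one referenced in the next step):

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Connect to the SSE endpoint of the running stock MCP server.
    async with sse_client("http://localhost:8181/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("get_symbol_from_query", {"query": "Apple"})
            print(result.content)

asyncio.run(main())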
4. Start the Stock Report Agent
uv run a2a_servers/agent_servers/stock_report_agent_server.py
- This agent will be available at http://0.0.0.0:10000 and exposes its capabilities via the /.well-known/agent.json endpoint. Its configuration looks like this:
{
"name": "stock_report_agent",
"description": "An agent that provides US stock prices and info.",
"url": "http://0.0.0.0:10000",
"version": "1.0.0",
"capabilities": {
"streaming": false,
"pushNotifications": false,
"stateTransitionHistory": true
},
"defaultInputModes": [
"text"
],
"defaultOutputModes": [
"text"
],
"skills": [
{
"id": "SKILL_STOCK_REPORT",
"name": "stock_report",
"description": "Provides stock prices and info."
}
]
}
This agent gains access to the tools defined in the Stock MCP Server through the return_sse_mcp_tools_stocks function. This function connects to the MCP server (running on port 8181, as started earlier) to retrieve the available tools.
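For reference, a helper like this is usually a thin wrapper around the ADK's MCP toolset. The sketch below assumes google-adk's MCPToolset with SSE connection parameters; the actual function in the repo may differ in its details.

from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, SseServerParams

async def return_sse_mcp_tools_stocks():
    """Sketch: fetch the stock tools from the MCP server running on port 8181."""
    tools, exit_stack = await MCPToolset.from_server(
        connection_params=SseServerParams(url="http://localhost:8181/sse"),
    )
    # The exit_stack must be closed on shutdown to release the SSE connection.
    return tools, exit_stack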
5. Start the Google Search Agent (Required for Host Agent Startup)
Even though we are not actively using the Google Search Agent in this implementation, it still needs to be running. The Host Agent requires all configured agents to be live before it initializes.
Run the following commands:
uv run mcp_server/sse/search_server.py
uv run a2a_servers/agent_servers/google_search_agent_server.py
6. Start the Host Agent
uv run a2a_servers/agent_servers/host_agent_server.py
This is the central coordinating agent. It receives the user query, breaks it down into subtasks, and delegates those tasks to the relevant agents (e.g., Stock Report Agent and Google Search Agent).
7. Start the Local Host Client
To send queries to the system, start the local A2A client:
uv run a2a_servers/run_from_local_client.py
This script acts as the A2A client, sending user queries to the Host Agent, which we initialized in the earlier steps.
Visualizing the Workflow
The full process can be visualized as follows:
User (Query)
↓
run_from_local_client.py (A2A Client)
↓
Host Agent (plans the task)
├──> Stock Report Agent → MCP Stock Server → FinnHub API
└──> Google Search Agent → MCP Search Server → Serper.dev
↓
Host Agent compiles and sends final report
Deployment Context
In this setup:
- The Host Agent is deployed on our infrastructure and is accessed locally via the A2A client.
- The Stock Report Agent and Google Search Agent are conceptually treated as remote agents. Ideally, they would be deployed on their respective remote domains and accessed via their /.well-known/agent.json endpoints.
However, for this local implementation, we're simulating this environment by deploying all agents on different ports of the same machine (localhost).
Querying the System
Now that everything is up and running, let’s try out a few queries using the local client.
Example 1: Discovering Available Agents
Enter your query: what agents do you have access to ?
Answer:
I have access to the following agents:
- google_search_agent: An agent that handles search queries and can read pages online.
- stock_report_agent: An agent that provides US stock prices and info.
In this case, the Host Agent responds directly using its discovery capabilities. No tasks are sent to the remote agents.
Note:
The A2A client can handle only one request at a time, so you'll need to rerun it manually for each query.
Example 2: Retrieve Stock Symbol
Enter your query: which symbol is used to retrieve stock details of Apple company ?
Answer:
AAPL is the symbol used to retrieve stock details of Apple Inc. I used the stock_report_agent to get this information.
Example 3: Retrieve Stock Price
Enter your query: What is the current stock value of APPL ?
Answer:
The current stock price of AAPL is 195.27, with a change of -6.09 and a percentage change of -3.0244. The day's high was 197.7, and the day's low was 193.46. The day's open price was 193.665, and the previous close price was 201.36.
This result matched the real-time market data at the time of the query.
You can observe request and response logs in the terminal windows running the Stock MCP Server and Stock Report Agent.
You can also experiment with combined queries by enabling the Google Search Agent. This allows the Host Agent to delegate tasks to multiple agents and aggregate richer, contextual responses.
This demo illustrates how AI agents can be enhanced with real-time data retrieval using A2A and MCP protocols. By modularizing responsibilities and allowing agents to independently query relevant data, we create a scalable, interoperable system.
For a detailed explanation of the underlying implementation, check out the full GitHub README.
Conclusion
Combining A2A (Agent-to-Agent) and MCP (Model Context Protocol) unlocks a powerful way for agents to collaborate, delegate tasks, and fetch real-time data dynamically. This architecture not only simplifies the process of connecting to third-party services but also builds a flexible and scalable system where agents can reason, plan, and act across different domains.
This blog demonstrated how to set up such a system from scratch and showed the real-time capabilities in action. From stock price lookups to planning multi-agent workflows, this is a solid foundation to build and extend upon.
If you found this helpful, don’t forget to share and follow for more agent-powered insights. Got an idea or workflow in mind? Join the discussion in the comments or reach out on Twitter or LinkedIn.
- Source Code Credit: Matteo Villosio
- Diagram Credit: Git Diagram