This post describes how to use Model Context Protocol tools with Semantic Kernel. Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. MCP standardizes the connection between AI models and various data sources and tools.
The Model Context Protocol is significant because it enhances the way AI models interface with data and tools, promoting interoperability, flexibility, and improved contextual understanding. Its potential applications span various domains, including data integration and knowledge management, making it a valuable component in the development of advanced AI solutions.
The sample described in this post focuses on connecting an AI model to an MCP tool via function calling.
For more information on the Model Context Protocol (MCP), please refer to the documentation.
The sample described below uses the official ModelContextProtocol package and is influenced by these samples.
The sample shows:
- How to connect to an MCP server using ModelContextProtocol
- How to retrieve the list of tools the MCP server makes available
- How to convert the MCP tools to Semantic Kernel functions so they can be added to a Kernel instance
- How to invoke the MCP tools from Semantic Kernel in response to LLM function calling requests
Full source code for the sample is available in the Semantic Kernel repository.
Build a Kernel with OpenAI Chat Completion
The sample uses OpenAI, so you must provide a valid API key. You can do this via user secrets or an environment variable.
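For example, assuming the `OpenAI:ApiKey` configuration key used in the code below, the key could be stored with the .NET user-secrets tool (run from a project that has a `UserSecretsId`) or supplied as an environment variable. The `sk-...` value is a placeholder for your own key:

```shell
# Option 1: store the API key as a user secret (run from the sample project directory)
dotnet user-secrets set "OpenAI:ApiKey" "sk-..."

# Option 2: set it as an environment variable
# (the ':' separator in configuration keys becomes '__' in environment variables)
export OpenAI__ApiKey="sk-..."
```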
// Prepare and build kernel
var builder = Kernel.CreateBuilder();
builder.Services.AddLogging(c => c.AddDebug().SetMinimumLevel(Microsoft.Extensions.Logging.LogLevel.Trace));
builder.Services.AddOpenAIChatCompletion(
    modelId: config["OpenAI:ChatModelId"] ?? "gpt-4o",
    apiKey: config["OpenAI:ApiKey"]!);
Kernel kernel = builder.Build();
Create an MCP Client
MCP follows a client-server architecture where a host application can connect to an MCP server using an MCP client.
For this sample the architecture can be represented as follows:
- MCP Hosts: Programs like IDEs, or AI tools that want to access data through MCP
- MCP Clients: Protocol clients that maintain 1:1 connections with servers
- MCP Servers: Lightweight programs that each expose specific capabilities through the standardized Model Context Protocol
The following code creates an MCP client configured to connect to a local GitHub server using the stdio transport. The code starts the GitHub server using the npx command.
What is npx?
npx is a command-line tool that comes with Node.js (starting from version 5.2.0) and is a part of the npm (Node Package Manager) ecosystem. It is used to execute Node.js packages directly from the command line without needing to install them globally on your system.
// Create an MCPClient for the GitHub server
await using IMcpClient mcpClient = await McpClientFactory.CreateAsync(new StdioClientTransport(new()
{
    Name = "GitHub",
    Command = "npx",
    Arguments = ["-y", "@modelcontextprotocol/server-github"],
}));
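For reference, the same server that the stdio transport launches can be started manually from a terminal using the command and arguments shown above (note that the GitHub server may also require a personal access token via an environment variable; see its documentation for details):

```shell
# Download (if needed) and run the GitHub MCP server, communicating over stdio
npx -y @modelcontextprotocol/server-github
```

The MCP client does the equivalent of this automatically, attaching itself to the process's standard input and output streams.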
Retrieve the MCP Tools
A Model Context Protocol (MCP) server can expose executable functionality to clients as tools. Tools can be invoked by LLMs enabling them to interact with external systems, perform computations, and take actions in the real world.
The following code lists the tools exposed by the server and prints out each tool name and description.
// Retrieve the list of tools available on the GitHub server
var tools = await mcpClient.ListToolsAsync().ConfigureAwait(false);
foreach (var tool in tools)
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}
The GitHub MCP server exposes the following tools:
create_or_update_file: Create or update a single file in a GitHub repository
search_repositories: Search for GitHub repositories
create_repository: Create a new GitHub repository in your account
get_file_contents: Get the contents of a file or directory from a GitHub repository
push_files: Push multiple files to a GitHub repository in a single commit
create_issue: Create a new issue in a GitHub repository
create_pull_request: Create a new pull request in a GitHub repository
fork_repository: Fork a GitHub repository to your account or specified organization
create_branch: Create a new branch in a GitHub repository
list_commits: Get list of commits of a branch in a GitHub repository
list_issues: List issues in a GitHub repository with filtering options
update_issue: Update an existing issue in a GitHub repository
add_issue_comment: Add a comment to an existing issue
search_code: Search for code across GitHub repositories
search_issues: Search for issues and pull requests across GitHub repositories
search_users: Search for users on GitHub
get_issue: Get details of a specific issue in a GitHub repository
For more information about the GitHub MCP server see the documentation.
Convert MCP Tools to Kernel Functions
Semantic Kernel provides a KernelFunction abstraction which is used to represent a tool that an LLM can invoke. To use MCP server tools with Semantic Kernel, it is necessary to first adapt each tool to be a KernelFunction.
The code below shows how the MCP tools can be converted to Semantic Kernel functions and added to the Kernel. This functionality was added in Microsoft.SemanticKernel.Core 1.44.0.
kernel.Plugins.AddFromFunctions("GitHub", tools.Select(aiFunction => aiFunction.AsKernelFunction()));
Invoke the MCP Tools via Function Calling
Once the MCP tools have been converted to Semantic Kernel functions and added to Kernel.Plugins, they are available for use with function calling.
// Enable automatic function calling
OpenAIPromptExecutionSettings executionSettings = new()
{
    Temperature = 0,
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { RetainArgumentTypes = true })
};
// Test using GitHub tools
var prompt = "Summarize the last four commits to the microsoft/semantic-kernel repository?";
var result = await kernel.InvokePromptAsync(prompt, new(executionSettings)).ConfigureAwait(false);
Console.WriteLine($"\n\n{prompt}\n{result}");
Here are some sample results:
Summarize the last four commits to the microsoft/semantic-kernel repository?
Here are the summaries of the last four commits to the `microsoft/semantic-kernel` repository:
1. **Commit f8ee3ac** by Mark Wallace on 2025-03-04:
- **Title:** .Net: Demo showing how to integrate MCP tools with Semantic Kernel
- **Description:** This commit introduces a demo for integrating MCP tools with the Semantic Kernel. It includes necessary changes to ensure the code builds cleanly without errors or warnings and follows the contribution guidelines. [View Commit](https://github.com/microsoft/semantic-kernel/commit/f8ee3ac408532a805ae3e9d7cc912c1df5e2796a)
2. **Commit 7c8dccc** by Roger Barreto on 2025-03-04:
- **Title:** .Net: Add missing Ollama Connector Aspire Friendly Extensions
- **Description:** This commit addresses issue #10532 by adding missing extensions for the Ollama Connector to ensure compatibility and functionality. [View Commit](https://github.com/microsoft/semantic-kernel/commit/7c8dccc2c62d6641aa2cc1c976d6a681c8b2200b)
3. **Commit 7787725** by dependabot[bot] on 2025-03-04:
- **Title:** Python: Bump google-cloud-aiplatform from 1.80.0 to 1.82.0 in /python
- **Description:** This automated commit updates the `google-cloud-aiplatform` dependency from version 1.80.0 to 1.82.0, incorporating new features and bug fixes as detailed in the release notes. [View Commit](https://github.com/microsoft/semantic-kernel/commit/7787725a1ae0b91325bd7602358ed6173b610076)
4. **Commit 022f05e** by Ross Smith on 2025-03-04:
- **Title:** .Net: dotnet format issues with SDK 9.0
- **Description:** This commit resolves formatting issues related to BOM encoding in the .NET codebase, ensuring compatibility with SDK version 9.0. It addresses problems in the `ActivityExtensions.cs` file and removes access modifiers on interface members. [View Commit](https://github.com/microsoft/semantic-kernel/commit/022f05e795ba80c7c5d52af2b1c3271dcc7e80bd)
Invoke the MCP Tools via an Agent
The MCP tools can also be invoked by an Agent. In the code below, the same Kernel instance, which already has the MCP tools added, is reused. The code also enables automatic function calling for the Agent. The same GitHub-related query can be sent as a message to the agent, and the MCP tools will be used to provide content for the response.
// Define the agent
ChatCompletionAgent agent =
    new()
    {
        Instructions = "Answer questions about GitHub repositories.",
        Name = "GitHubAgent",
        Kernel = kernel,
        Arguments = new KernelArguments(new PromptExecutionSettings() { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { RetainArgumentTypes = true }) }),
    };
// Respond to user input, invoking functions where appropriate.
ChatMessageContent response = await agent.InvokeAsync("Summarize the last four commits to the microsoft/semantic-kernel repository?").FirstAsync();
Console.WriteLine($"\n\nResponse from GitHubAgent:\n{response.Content}");
What’s Next?
We encourage you to try out this integration possibility and let us know how it goes. Please create issues to request enhancements to the sample and provide your feedback. Full source code for the sample is available in the Semantic Kernel repository.
Next up, the team is looking at building samples showing:
Thanks @Mark!!
At last there is a super streamlined way to consume and expose Kernel "plugins" ;) - MCP!
Mark, can you clarify here what is happening in regards to the mcp client-host-server? as far as I understand, this seems to be happening:
- An MCP client is created.
- This MCP client is also downloading an MCP server implementation, in the shape of an npx package.
- Which runs it locally as a process, in fact it is acting as a host of the server.
- And the rest is history, once this is done we simply use it...
I didn’t use SK, but rather hoped for the lighter Microsoft.Extensions.AI. May I ask if you know whether there is any information on the integration of Microsoft.Extensions.AI and MCP? Or could you provide ideas? Thank you!
What is the Semantic Kernel team’s vision? Does MCP replace the concept of Plugins? I understand that Plugins are similar to the MCP Server layer that abstracts only the annotated functions for use in LLM. What is the advantage of replacing the Semantic Kernel Plugins layer with an MCP Server?
MCP doesn’t replace the Plugin concept; instead, MCP tools are consumable as Plugins. We also want to investigate integrating MCP prompts and resources via Plugins as well.
One very useful aspect of MCP is that the tools are designed to be consumed by an LLM, so this should provide increased reliability for function calling scenarios.
Based on this article, I created a project + NuGet package which can be used to integrate the Model Context Protocol (MCP), based on the official MCP C# SDK, with Microsoft SemanticKernel:
https://github.com/StefH/McpDotNet.Extensions.SemanticKernel
Great article! I spent some time this weekend diving deeper into the topic and found it incredibly engaging. I had assumed that “SK” refers to Microsoft’s MCP offering, but it’s a different beast, and it appears that there are many active discussions surrounding this area. It’s clear that industry professionals are exploring its potential impact and future developments, which makes the conversation even more exciting. Looking forward to seeing how these discussions evolve!
It’s curious to me that GitHub, of all places, exposes this newfangled MCP business instead of just using an OpenAPI spec, which Azure AI Foundry Agents can already ingest and expose as function calls to the model.
OpenAPI is a viable choice when the existing endpoints are suitable for the LLM to reason over. However, one advantage of MCP at this time is that its tools are specifically designed for LLM use. As a result, the tools available in GitHub’s MCP server represent only a fraction of what is offered as REST endpoints.