This post describes how to use Model Context Protocol tools with Semantic Kernel. Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. MCP standardizes the connection between AI models and various data sources and tools.
The Model Context Protocol is significant because it enhances how AI models interface with data and tools, promoting interoperability, flexibility, and improved contextual understanding. Its potential applications span various domains, including data integration and knowledge management, making it a valuable component in the development of advanced AI solutions.
The sample described in this post focuses on connecting an AI model to an MCP tool via function calling.
For more information on Model Context Protocol (MCP) please refer to the documentation.
The sample described below uses mcpdotnet and is heavily influenced by the samples from that repository.
The sample shows how to:
- Connect to an MCP server using mcpdotnet
- Retrieve the list of tools the MCP server makes available
- Convert the MCP tools to Semantic Kernel functions so they can be added to a Kernel instance
- Invoke the MCP tools from Semantic Kernel in response to LLM function-calling requests
Full source code for the sample is available in the Semantic Kernel repository.
Build a Kernel with OpenAI Chat Completion
The sample uses OpenAI, so you must provide a valid API key. You can do this via user secrets or an environment variable.
```csharp
var config = new ConfigurationBuilder()
    .AddUserSecrets<Program>()
    .AddEnvironmentVariables()
    .Build();

// Prepare and build kernel
var builder = Kernel.CreateBuilder();
builder.Services.AddLogging(c => c.AddDebug().SetMinimumLevel(Microsoft.Extensions.Logging.LogLevel.Trace));

if (config["OpenAI:ApiKey"] is not null)
{
    builder.Services.AddOpenAIChatCompletion(
        serviceId: "openai",
        modelId: config["OpenAI:ChatModelId"] ?? "gpt-4o",
        apiKey: config["OpenAI:ApiKey"]!);
}
else
{
    Console.Error.WriteLine("Please provide a valid OpenAI:ApiKey to run this sample.");
    return;
}

Kernel kernel = builder.Build();
```
Create an MCP Client
MCP follows a client-server architecture where a host application can connect to an MCP server using an MCP client.
For this sample the architecture can be represented as follows:
- MCP Hosts: Programs like IDEs, or AI tools that want to access data through MCP
- MCP Clients: Protocol clients that maintain 1:1 connections with servers
- MCP Servers: Lightweight programs that each expose specific capabilities through the standardized Model Context Protocol
The following code creates an MCP client configured to connect to a local GitHub MCP server using the stdio transport. The code starts the GitHub server using the npx command.
What is npx?
npx is a command-line tool that comes with Node.js (starting from version 5.2.0) and is part of the npm (Node Package Manager) ecosystem. It is used to execute Node.js packages directly from the command line without needing to install them globally on your system.
```csharp
internal static async Task<IMcpClient> GetGitHubToolsAsync()
{
    McpClientOptions options = new()
    {
        ClientInfo = new() { Name = "GitHub", Version = "1.0.0" }
    };

    var config = new McpServerConfig
    {
        Id = "github",
        Name = "GitHub",
        TransportType = "stdio",
        TransportOptions = new Dictionary<string, string>
        {
            ["command"] = "npx",
            ["arguments"] = "-y @modelcontextprotocol/server-github",
        }
    };

    var factory = new McpClientFactory(
        [config],
        options,
        NullLoggerFactory.Instance
    );

    return await factory.GetClientAsync(config.Id).ConfigureAwait(false);
}
```
Retrieve the MCP Tools
A Model Context Protocol (MCP) server can expose executable functionality to clients as tools. Tools can be invoked by LLMs enabling them to interact with external systems, perform computations, and take actions in the real world.
The following code lists the tools exposed by the server and prints out each tool name and description.
```csharp
var tools = await mcpClient.ListToolsAsync().ConfigureAwait(false);
foreach (var tool in tools.Tools)
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}
```
The GitHub MCP server exposes the following tools:
create_or_update_file: Create or update a single file in a GitHub repository
search_repositories: Search for GitHub repositories
create_repository: Create a new GitHub repository in your account
get_file_contents: Get the contents of a file or directory from a GitHub repository
push_files: Push multiple files to a GitHub repository in a single commit
create_issue: Create a new issue in a GitHub repository
create_pull_request: Create a new pull request in a GitHub repository
fork_repository: Fork a GitHub repository to your account or specified organization
create_branch: Create a new branch in a GitHub repository
list_commits: Get list of commits of a branch in a GitHub repository
list_issues: List issues in a GitHub repository with filtering options
update_issue: Update an existing issue in a GitHub repository
add_issue_comment: Add a comment to an existing issue
search_code: Search for code across GitHub repositories
search_issues: Search for issues and pull requests across GitHub repositories
search_users: Search for users on GitHub
get_issue: Get details of a specific issue in a GitHub repository
For more information about the GitHub MCP server see the documentation.
Convert MCP Tools to Kernel Functions
Semantic Kernel provides a KernelFunction abstraction which is used to represent a tool which an LLM can invoke. To use MCP server tools with Semantic Kernel, it is first necessary to adapt each tool to a KernelFunction.
The code below shows how an MCP tool can be converted to a Semantic Kernel function:
- The InvokeToolAsync function is a delegate which wraps the MCP tool invocation and will be called when the associated function is called i.e., when the LLM sends a response which triggers a function call.
- When invoking an MCP tool, the arguments must be converted to the correct types or the request will fail. The sample below converts argument values to the correct type before invoking the MCP tool.
- The sample below also converts the tool parameter metadata into the format required by the Semantic Kernel.
What argument types are supported by the sample code?
The sample supports arguments of type: string, int, double, boolean, lists of strings, and dictionaries with a string key and object value.
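As an illustration, a conversion helper along these lines could map a JSON-schema type name to the corresponding CLR value. This is a sketch only; the sample's actual `ToArgumentValue` works from the function's parameter metadata and may differ in detail:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

internal static class ArgumentConverter
{
    // Hypothetical helper: converts an incoming argument value to the CLR type
    // implied by the tool's JSON-schema type name before the MCP call is made.
    public static object ToArgumentValue(string schemaType, object value) =>
        schemaType switch
        {
            "integer" => Convert.ToInt32(value),
            "number"  => Convert.ToDouble(value),
            "boolean" => Convert.ToBoolean(value),
            "array"   => value as List<string>
                         ?? JsonSerializer.Deserialize<List<string>>(value.ToString()!)!,
            "object"  => value as Dictionary<string, object>
                         ?? JsonSerializer.Deserialize<Dictionary<string, object>>(value.ToString()!)!,
            _         => value.ToString()!, // default: treat as string
        };
}
```

The important point is that the LLM may send argument values as strings, so each value is coerced to the type the tool's schema declares before the request is sent.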
```csharp
private static KernelFunction ToKernelFunction(this Tool tool, IMcpClient mcpClient)
{
    async Task<string> InvokeToolAsync(Kernel kernel, KernelFunction function, KernelArguments arguments, CancellationToken cancellationToken)
    {
        try
        {
            // Convert arguments to dictionary format expected by mcpdotnet
            Dictionary<string, object> mcpArguments = [];
            foreach (var arg in arguments)
            {
                if (arg.Value is not null)
                {
                    mcpArguments[arg.Key] = function.ToArgumentValue(arg.Key, arg.Value);
                }
            }

            // Call the tool through mcpdotnet
            var result = await mcpClient.CallToolAsync(
                tool.Name,
                mcpArguments,
                cancellationToken: cancellationToken
            ).ConfigureAwait(false);

            // Extract the text content from the result
            return string.Join("\n", result.Content
                .Where(c => c.Type == "text")
                .Select(c => c.Text));
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine($"Error invoking tool '{tool.Name}': {ex.Message}");

            // Rethrowing to allow the kernel to handle the exception
            throw;
        }
    }

    return KernelFunctionFactory.CreateFromMethod(
        method: InvokeToolAsync,
        functionName: tool.Name,
        description: tool.Description,
        parameters: tool.ToParameters(),
        returnParameter: ToReturnParameter()
    );
}
```
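The `ToParameters` extension called above is part of the sample rather than mcpdotnet; conceptually it walks the tool's JSON-schema input schema and builds a `KernelParameterMetadata` for each property. A simplified, self-contained sketch of that schema-walking step, using a plain record in place of Semantic Kernel's metadata type and assuming the standard JSON Schema `properties`/`required` shape, might look like:

```csharp
using System.Collections.Generic;
using System.Text.Json;

// Simplified stand-in for KernelParameterMetadata, for illustration only.
internal sealed record ToolParameter(string Name, string SchemaType, bool IsRequired);

internal static class SchemaParser
{
    // Parses a tool input schema (a JSON Schema object) into parameter
    // descriptions. The real sample maps these into KernelParameterMetadata.
    public static List<ToolParameter> Parse(string inputSchemaJson)
    {
        using var doc = JsonDocument.Parse(inputSchemaJson);
        var root = doc.RootElement;

        // Collect the names listed in the schema's "required" array, if any.
        var required = new HashSet<string>();
        if (root.TryGetProperty("required", out var req))
        {
            foreach (var r in req.EnumerateArray())
            {
                required.Add(r.GetString()!);
            }
        }

        // Each entry under "properties" becomes one parameter description.
        var parameters = new List<ToolParameter>();
        if (root.TryGetProperty("properties", out var props))
        {
            foreach (var prop in props.EnumerateObject())
            {
                var type = prop.Value.TryGetProperty("type", out var t) ? t.GetString()! : "string";
                parameters.Add(new ToolParameter(prop.Name, type, required.Contains(prop.Name)));
            }
        }
        return parameters;
    }
}
```

Capturing the type and required flag per parameter is what lets the LLM produce well-formed function-call arguments for each tool.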
Invoke the MCP Tools via Function Calling
Once the MCP tools have been converted to Semantic Kernel functions they can be added to Kernel.Plugins and are then available for use with function calling.
```csharp
// Add the MCP tools as Kernel functions
var functions = await mcpClient.MapToFunctionsAsync().ConfigureAwait(false);
kernel.Plugins.AddFromFunctions("GitHub", functions);

// Enable automatic function calling
var executionSettings = new OpenAIPromptExecutionSettings
{
    Temperature = 0,
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Test using GitHub tools
var prompt = "Summarize the last four commits to the microsoft/semantic-kernel repository?";
var result = await kernel.InvokePromptAsync(prompt, new(executionSettings)).ConfigureAwait(false);
Console.WriteLine($"\n\n{prompt}\n{result}");
```
Here are some sample results:
Summarize the last four commits to the microsoft/semantic-kernel repository?
Here are the summaries of the last four commits to the `microsoft/semantic-kernel` repository:
1. **Commit f8ee3ac** by Mark Wallace on 2025-03-04:
- **Title:** .Net: Demo showing how to integrate MCP tools with Semantic Kernel
- **Description:** This commit introduces a demo for integrating MCP tools with the Semantic Kernel. It includes necessary changes to ensure the code builds cleanly without errors or warnings and follows the contribution guidelines. [View Commit](https://github.com/microsoft/semantic-kernel/commit/f8ee3ac408532a805ae3e9d7cc912c1df5e2796a)
2. **Commit 7c8dccc** by Roger Barreto on 2025-03-04:
- **Title:** .Net: Add missing Ollama Connector Aspire Friendly Extensions
- **Description:** This commit addresses issue #10532 by adding missing extensions for the Ollama Connector to ensure compatibility and functionality. [View Commit](https://github.com/microsoft/semantic-kernel/commit/7c8dccc2c62d6641aa2cc1c976d6a681c8b2200b)
3. **Commit 7787725** by dependabot[bot] on 2025-03-04:
- **Title:** Python: Bump google-cloud-aiplatform from 1.80.0 to 1.82.0 in /python
- **Description:** This automated commit updates the `google-cloud-aiplatform` dependency from version 1.80.0 to 1.82.0, incorporating new features and bug fixes as detailed in the release notes. [View Commit](https://github.com/microsoft/semantic-kernel/commit/7787725a1ae0b91325bd7602358ed6173b610076)
4. **Commit 022f05e** by Ross Smith on 2025-03-04:
- **Title:** .Net: dotnet format issues with SDK 9.0
- **Description:** This commit resolves formatting issues related to BOM encoding in the .NET codebase, ensuring compatibility with SDK version 9.0. It addresses problems in the `ActivityExtensions.cs` file and removes access modifiers on interface members. [View Commit](https://github.com/microsoft/semantic-kernel/commit/022f05e795ba80c7c5d52af2b1c3271dcc7e80bd)
What’s Next?
We encourage you to try out this integration possibility and let us know how it goes. Please create issues to request enhancements to the sample and provide your feedback. Full source code for the sample is available in the Semantic Kernel repository.
Next up, the team is looking at building on this sample to provide Model Context Protocol as a standard feature within the Semantic Kernel.