Generative-AI agents only become useful when they can do things—query systems of record, trigger workflows, or look up specialized knowledge. Until now that meant hand-rolling Azure Functions, managing OpenAPI specs, or writing custom plug-ins for every backend you own.
MCP changes the economics: it is an open, JSON-RPC–based protocol—originally proposed by Anthropic—that lets a “server” publish tools (functions) and resources (context) once and have any compliant “client” (your agent runtime) discover and call them automatically. Think “USB-C for AI integrations.”
With today’s preview, Foundry Agent Service becomes a first-class MCP client. Bring any remote MCP server—self-hosted or SaaS—and Azure AI Foundry will import its capabilities in seconds, keep them updated, and route calls through the service’s enterprise envelope.
Model Context Protocol (MCP) allows developers, organizations, and service providers to host services and APIs on MCP servers and easily expose and connect tools to MCP-compatible clients, such as the Foundry Agent Service. MCP is an open standard that defines how services provide various functions and context to AI models and agents. With Foundry Agent Service’s support of MCP, users can bring an existing MCP server endpoint and add it as a tool to Foundry agents. When connecting to an MCP server, actions and knowledge are automatically added to the agent and updated as functionality evolves. This streamlines the process of building agents and reduces the time required for maintaining them.
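For context, the server side of this contract can be very small. The sketch below shows a minimal MCP server publishing a single tool with the open-source MCP Python SDK's FastMCP helper; the tool name, its logic, and the transport choice are illustrative assumptions, not part of the Foundry sample that follows. Any MCP server reachable at a URL works the same way from the agent's perspective.

# Minimal sketch of an MCP server exposing one tool (hypothetical example).
# Requires the open-source "mcp" Python SDK; run it, then point the agent's
# server_url at the resulting HTTP endpoint.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("inventory")

@mcp.tool()
def get_item_count(sku: str) -> int:
    """Return the stock count for a SKU (hypothetical backend lookup)."""
    return {"ABC-123": 42}.get(sku, 0)

if __name__ == "__main__":
    # Streamable HTTP (in recent SDK versions) makes the server reachable at a
    # URL that an MCP client such as Foundry Agent Service can call.
    mcp.run(transport="streamable-http")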
MCP support in Foundry Agent Service empowers you to:
- Easily integrate with services and APIs. Whether you want to connect with your internal services or APIs from external providers, MCP provides an easy way to integrate with Foundry Agent Service without writing and managing custom functions.
- Enhance your AI agent with enterprise features in Foundry Agent Service. With Foundry Agent Service, you can enable enterprise-ready features such as Bring Your Own thread storage.
Code Samples
Step 1: Import the needed packages. Make sure you are using the latest versions of the azure-ai-projects and azure-ai-agents packages, and that your Foundry project is in a supported region.
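Assuming a pip-based environment, upgrading both packages (plus azure-identity, which provides DefaultAzureCredential) looks like this:

pip install --upgrade azure-ai-projects azure-ai-agents azure-identity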
import time
import json
from azure.ai.agents.models import (
    ListSortOrder,
    McpTool,
    MessageTextContent,
    RequiredMcpToolCall,
    SubmitToolApprovalAction,
    ToolApproval,
)
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
Step 2: Create an AI Project Client and an Azure AI Foundry Agent
- server_label provides a unique name for the MCP server within the same Foundry agent.
- server_url is the URL of the MCP server.
- allowed_tools is optional; use it to specify which of the server's tools are enabled for the agent.
mcp_tool = McpTool(
    server_label=mcp_server_label,
    server_url=mcp_server_url,
    allowed_tools=[],  # Optional
)

project_client = AIProjectClient(
    endpoint=PROJECT_ENDPOINT,
    credential=DefaultAzureCredential()
)
with project_client:
    agent = project_client.agents.create_agent(
        model=MODEL_DEPLOYMENT_NAME,
        name="my-mcp-agent",
        instructions=(
            "You are a helpful assistant. Use the tools provided to answer the user's "
            "questions. Be sure to cite your sources."
        ),
        tools=mcp_tool.definitions,
    )
    print(f"Created agent, agent ID: {agent.id}")
Step 3: Create a Thread, Message and Run
Within the Run, you can pass custom headers and use the server_label you provided to map to the specific MCP server.
    thread = project_client.agents.threads.create()
    print(f"Created thread, thread ID: {thread.id}")

    message = project_client.agents.messages.create(
        thread_id=thread.id,
        role="user",
        content="<a question for your MCP server>"
    )
    print(f"Created message, message ID: {message.id}")
Note that headers are only valid for the current run. You can provide them like this:
    mcp_tool.update_headers("SuperSecret", "123456")

    run = project_client.agents.runs.create(
        thread_id=thread.id,
        agent_id=agent.id
    )
Step 4: Execute the Run and retrieve the Messages
You can use run steps to get more details on tool inputs and outputs. By default, every tool call to an MCP server requires approval before it is executed.
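If your scenario does not need manual approval of each call, the McpTool helper also exposes an approval mode. The line below is a hedged sketch (set_approval_mode may not be available in older azure-ai-agents versions) and would be called before the run is created in Step 3:

# Hedged sketch: opt out of the default per-call approval requirement.
# Verify that your azure-ai-agents version supports set_approval_mode; with
# "never", the approval loop below is not needed for this tool.
mcp_tool.set_approval_mode("never")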
    while run.status in ["queued", "in_progress", "requires_action"]:
        time.sleep(1)
        run = project_client.agents.runs.get(thread_id=thread.id, run_id=run.id)

        if run.status == "requires_action" and isinstance(run.required_action, SubmitToolApprovalAction):
            tool_calls = run.required_action.submit_tool_approval.tool_calls
            if not tool_calls:
                print("No tool calls provided - cancelling run")
                project_client.agents.runs.cancel(thread_id=thread.id, run_id=run.id)
                break

            tool_approvals = []
            for tool_call in tool_calls:
                if isinstance(tool_call, RequiredMcpToolCall):
                    try:
                        print(f"Approving tool call: {tool_call}")
                        tool_approvals.append(
                            ToolApproval(
                                tool_call_id=tool_call.id,
                                approve=True,
                                headers=mcp_tool.headers,
                            )
                        )
                    except Exception as e:
                        print(f"Error approving tool_call {tool_call.id}: {e}")

            print(f"tool_approvals: {tool_approvals}")
            if tool_approvals:
                project_client.agents.runs.submit_tool_outputs(
                    thread_id=thread.id,
                    run_id=run.id,
                    tool_approvals=tool_approvals
                )

        print(f"Current run status: {run.status}")
    # Retrieve the generated response:
    messages = project_client.agents.messages.list(
        thread_id=thread.id,
        order=ListSortOrder.ASCENDING
    )

    print("\nConversation:")
    print("-" * 50)
    for msg in messages:
        if msg.text_messages:
            last_text = msg.text_messages[-1]
            print(f"{msg.role.upper()}: {last_text.text.value}")
            print("-" * 50)
Step 5: Clean up
    project_client.agents.delete_agent(agent.id)
    print(f"Deleted agent, agent ID: {agent.id}")
At Microsoft Build 2025, Satya Nadella shared our vision for an open-by-design AI ecosystem and announced a partnership with Anthropic to make the Model Context Protocol (MCP) a first-class standard across Windows 11, GitHub, Copilot Studio, and Azure AI Foundry.
Today’s preview support for MCP in Azure AI Foundry Agent Service is the next step in that journey. It brings the same “connect once, integrate anywhere” promise to cloud-hosted agents—letting you plug any MCP server directly into Foundry with zero custom code.
Create with Azure AI Foundry
- Get started with Azure AI Foundry and jump directly into Visual Studio Code.
- Download the Azure AI Foundry SDK.
- Read the documentation to learn more about the feature.
- Take the Azure AI Foundry Learn courses.
- Keep the conversation going in GitHub and Discord.