June 27th, 2025

Announcing Model Context Protocol Support (preview) in Azure AI Foundry Agent Service

Linda Li
Product Manager

Generative-AI agents only become useful when they can do things—query systems of record, trigger workflows, or look up specialized knowledge. Until now that meant hand-rolling Azure Functions, managing OpenAPI specs, or writing custom plug-ins for every backend you own. 

MCP changes the economics: it is an open, JSON-RPC–based protocol—originally proposed by Anthropic—that lets a “server” publish tools (functions) and resources (context) once and have any compliant “client” (your agent runtime) discover and call them automatically. Think “USB-C for AI integrations.” 
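To make “publish once, discover anywhere” concrete, here is roughly what the discovery half of that handshake looks like on the wire. This is an illustrative sketch of the JSON-RPC message shapes defined by the MCP specification, written as Python dicts; the lookup_order tool is hypothetical and not part of any real server:

# Client -> server: ask the MCP server which tools it exposes
tools_list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Server -> client: a tool catalog the agent runtime can discover and call
# without any custom glue code; "lookup_order" is a made-up example tool
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "lookup_order",
                "description": "Fetch an order by ID from the order system.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"order_id": {"type": "string"}},
                    "required": ["order_id"],
                },
            }
        ]
    },
}

Because the server describes each tool with a name, description, and JSON schema, any compliant client can present it to a model as a callable function without bespoke integration work.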

With today’s preview, Foundry Agent Service becomes a first-class MCP client. Bring any remote MCP server, self-hosted or SaaS, and Azure AI Foundry will import its capabilities in seconds, keep them updated, and route calls through the service’s enterprise envelope.

Model Context Protocol (MCP) is an open standard that defines how services provide functions and context to AI models and agents. It lets developers, organizations, and service providers host services and APIs on MCP servers and expose them as tools to any MCP-compatible client, such as Foundry Agent Service. With MCP support in Foundry Agent Service, you can bring an existing MCP server endpoint and add it as a tool on a Foundry agent. When the agent connects to the server, its actions and knowledge are added automatically and kept up to date as the server’s functionality evolves, which streamlines building agents and cuts the time spent maintaining them.

MCP support in Foundry Agent Service empowers you to:

  • Easily integrate with services and APIs. Whether you want to connect with your internal services or APIs from external providers, MCP provides an easy way to integrate with Foundry Agent Service without writing and managing custom functions.
  • Enhance your AI agent with enterprise features in Foundry Agent Service. With Foundry Agent Service, you can enable enterprise-ready features such as Bring Your Own thread storage.

Code Samples

Step 1: Import the needed packages

import time
import json

from azure.ai.agents.models import MessageTextContent, ListSortOrder
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

Step 2: Create an AI Project Client and an Azure AI Foundry Agent

  • server_label is a unique name you provide for the MCP server within the same Foundry agent.
  • server_url is the URL of the remote MCP server.
  • require_approval currently supports never, meaning tool calls run without a human approval step.

project_client = AIProjectClient(
    endpoint=PROJECT_ENDPOINT,
    credential=DefaultAzureCredential()
)

with project_client:
    agent = project_client.agents.create_agent(
        model=MODEL_DEPLOYMENT_NAME, 
        name="my-mcp-agent", 
        instructions="You are a helpful assistant. Use the tools provided to answer the user's questions. Be sure to cite your sources.",
        tools=[
            {
                "type": "mcp",
                "server_label": "<name of your choice for the MCP server>",
                "server_url": "<url of the remote MCP server>",
                "require_approval": "never"
            }
        ],
        tool_resources=None
    )
    print(f"Created agent, agent ID: {agent.id}")

Step 3: Create a Thread, Message and Run

Within the run, you can pass custom headers and use the server_label you provided in Step 2 to map them to the specific MCP server (see the sketch after the code below).

    thread = project_client.agents.threads.create()
    print(f"Created thread, thread ID: {thread.id}")

    message = project_client.agents.messages.create(
        thread_id=thread.id, role="user", content="<a question for your MCP server>",
    )
    print(f"Created message, message ID: {message.id}")

    run = project_client.agents.runs.create(thread_id=thread.id, agent_id=agent.id)
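
    The call above creates the run without extra headers. If your MCP server requires authentication, the run request can carry per-server headers keyed by the server_label from Step 2. The snippet below is a sketch of that shape; the Authorization header name and token value are placeholders for whatever your server expects:

    # Variant of the runs.create call above: attach custom headers for one MCP server.
    # The "Authorization" header and token value are placeholders, not real credentials.
    run = project_client.agents.runs.create(
        thread_id=thread.id,
        agent_id=agent.id,
        tool_resources={
            "mcp": [
                {
                    "server_label": "<same label you set in Step 2>",
                    "headers": {"Authorization": "Bearer <token>"},
                    "require_approval": "never",
                }
            ]
        },
    )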

Step 4: Execute the Run and retrieve Messages

You can use the run steps to get more details on tool inputs and tool outputs. Once the run is complete, you can retrieve the messages back from the Foundry agent.


    # Poll while the run is in a non-terminal state (queued, in_progress, or requires_action)
    while run.status in ["queued", "in_progress", "requires_action"]:
        # Wait for a second
        time.sleep(1)
        run = project_client.agents.runs.get(thread_id=thread.id, run_id=run.id)
        print(f"Run status: {run.status}")

    if run.status == "failed":
        print(f"Run error: {run.last_error}")

    run_steps = project_client.agents.run_steps.list(thread_id=thread.id, run_id=run.id)
    for step in run_steps:
        print(f"Run step: {step.id}, status: {step.status}, type: {step.type}")
        if step.type == "tool_calls":
            print(f"Tool call details:")
            for tool_call in step.step_details.tool_calls:
                print(json.dumps(tool_call.as_dict(), indent=2))

    messages = project_client.agents.messages.list(thread_id=thread.id, order=ListSortOrder.ASCENDING)
    for data_point in messages:
        last_message_content = data_point.content[-1]
        if isinstance(last_message_content, MessageTextContent):
            print(f"{data_point.role}: {last_message_content.text.value}")

Step 5: Clean up

    project_client.agents.delete_agent(agent.id)
    print(f"Deleted agent, agent ID: {agent.id}")

At Microsoft Build 2025, Satya Nadella highlighted an “open-by-design” AI ecosystem and announced that Microsoft is partnering with Anthropic to make the Model Context Protocol (MCP) a first-class standard across Windows 11, GitHub, Copilot Studio, and Azure AI Foundry. Today’s preview support for MCP in Azure AI Foundry Agent Service is the next step in that journey. It brings the same “connect once, integrate anywhere” promise to cloud-hosted agents, letting you take any MCP server and plug it directly into Foundry with zero custom code. 

Create with Azure AI Foundry
