May 27th, 2025

Azure AI Foundry MCP Server May 2025 Update: Adding Models, Knowledge & Evaluation

At Microsoft Build 2025, Satya Nadella highlighted the transformative potential of the Model Context Protocol (MCP) in democratizing AI development. Today, we’re excited to add more capabilities to our MCP Server for Azure AI Foundry – a powerful integration that brings this vision to life for developers working with Azure AI services.

The Developer Challenge We’re Solving

Before MCP Server: Developers struggled with complex API integrations, inconsistent interfaces across Azure AI services, and time-consuming model exploration. Building AI applications required deep knowledge of multiple SDKs, authentication methods, and service-specific protocols.

With MCP Server: One unified protocol connects your favorite AI assistants directly to Azure AI Foundry’s full capabilities. Now you can explore models, manage knowledge bases, and run evaluations using natural language – just like having a conversation with your development environment.

What is the MCP Server for Azure AI Foundry?

The MCP Server for Azure AI Foundry (experimental) is a powerful integration layer that brings together Azure AI Foundry’s capabilities through the standardized Model Context Protocol. This server acts as a bridge between large language model clients (like GitHub Copilot and Claude Desktop) and Azure AI Foundry, providing a unified interface for AI model exploration, knowledge management, and comprehensive evaluation.

This MCP server is provided as an example to show how developers can leverage MCP to build their own integrations with Azure AI Foundry, and to invite the community to contribute to its development.

Why Use the MCP Server?

🚀 Faster Development Cycles – Natural language commands eliminate API documentation overhead

🔧 Simplified Architecture – Replace multiple SDK integrations with a single MCP protocol

💬 Intuitive Interaction – Query your AI infrastructure conversationally

🎯 Complete Toolkit – Agents, models, knowledge management, and evaluation in one interface

Prerequisites & Quick Setup

Before diving in, ensure you have:

  • uv installed (Installation Guide)
  • Azure subscription with appropriate permissions
  • Environment variables configured (we’ll cover this below)

🚀 Fastest Way to Start: GitHub Template

Use The Template

Create your own repo using this template and open it in GitHub Codespaces. Everything is pre-configured – just open GitHub Copilot in Agent mode and start chatting.

How to start the MCP Server for Azure AI Foundry

🔧 VS Code Integration

Install in VS Code

This automatically sets up the MCP server in your VS Code environment under user settings.

Core Capabilities: What You Can Do

🤖 Models: Discover, Build, Deploy

Explore the Model Catalog

  • “What models can I use from Azure AI Foundry?” – Discover available models in the catalog
  • “What are the most popular models in Azure AI Foundry? Pick 10 models for me.” – Get curated recommendations
  • “Show me models from Meta” – Filter by specific publishers
  • “What models support GitHub token for free testing?” – Find models for prototyping

Build Prototypes Rapidly

  • “How can you help me build a prototype using the model?” – Get step-by-step guidance
  • “I need to build an application that can analyze my web UX designs” – Receive tailored recommendations

Deploy with Confidence

  • “Can you help me deploy OpenAI models?” – Get deployment guidance
  • “What steps do I need to take to deploy OpenAI models on Azure AI Foundry?” – Detailed deployment workflows

Explore
  • list_models_from_model_catalog – Browse the entire Azure AI Foundry catalog with filters
  • list_azure_ai_foundry_labs_projects – Browse SOTA projects from Azure AI Foundry Labs
  • get_model_details_and_code_samples – Get implementation details and sample code

Build
  • get_prototyping_instructions_for_github_and_labs – Complete setup guidance for development

Deploy
  • deploy_model_on_ai_services – Production deployment automation
  • create_foundry_project – Project setup and configuration
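As a rough illustration of what the Explore tools do behind the scenes, the sketch below filters a tiny stand-in model catalog by publisher and free-testing support. The catalog entries and the `filter_models` helper are hypothetical – the real `list_models_from_model_catalog` tool runs inside the MCP server and queries the live Azure AI Foundry catalog.

```python
# Illustrative stand-in for catalog filtering; these entries are made up.
CATALOG = [
    {"name": "gpt-4o-mini", "publisher": "OpenAI", "github_token_supported": True},
    {"name": "Llama-3.1-8B-Instruct", "publisher": "Meta", "github_token_supported": True},
    {"name": "Phi-4", "publisher": "Microsoft", "github_token_supported": False},
]

def filter_models(catalog, publisher=None, free_testing=None):
    """Filter catalog entries by publisher and/or GitHub-token support."""
    results = catalog
    if publisher is not None:
        results = [m for m in results if m["publisher"] == publisher]
    if free_testing is not None:
        results = [m for m in results if m["github_token_supported"] == free_testing]
    return [m["name"] for m in results]

# "Show me models from Meta"
print(filter_models(CATALOG, publisher="Meta"))
# "What models support GitHub token for free testing?"
print(filter_models(CATALOG, free_testing=True))
```

In practice the MCP client never calls a function like this directly – your natural-language prompt is translated into a tool invocation with equivalent filter arguments.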

How to explore models using MCP Server

🧠 Knowledge: Search and Information Management

Transform how you work with enterprise knowledge using Azure AI Search integration.

Index Management Made Simple

  • “Show me all my search indexes” – Get an overview of existing indexes
  • “Create an index for customer data” – Set up new search capabilities

Document Operations

  • “Add customer records to the index” – Bulk document uploads
  • “Find all customers from France” – Natural language querying

Advanced Search Operations

  • “Show me documents where signup date is March 2025” – Complex filtering
  • “How many documents are in my customer index?” – Analytics and insights

Index Management
  • create_index, modify_index – Build and customize search indexes

Document Operations
  • add_document, query_index – Manage and search your data

Data Sources
  • create_indexer, list_data_sources – Automate data ingestion
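To make the Knowledge tools more concrete, here is a sketch of the kind of index schema and date-range filter a request like “Show me documents where signup date is March 2025” would resolve to. The field names are hypothetical; the filter string follows Azure AI Search’s OData `$filter` syntax, where a calendar month becomes a `ge`/`lt` range.

```python
# Hypothetical index schema for the "customer data" example above.
customer_index = {
    "name": "customer-data",
    "fields": [
        {"name": "id", "type": "Edm.String", "key": True},
        {"name": "content", "type": "Edm.String", "searchable": True},
        {"name": "country", "type": "Edm.String", "filterable": True},
        {"name": "signup_date", "type": "Edm.DateTimeOffset", "filterable": True},
    ],
}

def month_filter(field, year, month):
    """Build an OData range filter covering one calendar month."""
    start = f"{year:04d}-{month:02d}-01T00:00:00Z"
    end_month = 1 if month == 12 else month + 1
    end_year = year + 1 if month == 12 else year
    end = f"{end_year:04d}-{end_month:02d}-01T00:00:00Z"
    return f"{field} ge {start} and {field} lt {end}"

print(month_filter("signup_date", 2025, 3))
# signup_date ge 2025-03-01T00:00:00Z and signup_date lt 2025-04-01T00:00:00Z
```

The `query_index` tool would pass a filter like this (plus any search text) to Azure AI Search on your behalf.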

📊 Evaluation: Measure and Improve Performance

Comprehensive evaluation tools for both text responses and agent behaviors.

Text Quality Evaluation

  • “Evaluate my model responses for groundedness” – Check factual accuracy
  • “Run fluency evaluation on my chatbot outputs” – Assess response quality

Agent Performance Testing

  • “Test my agent's tool-calling accuracy” – Validate agent behaviors
  • “Evaluate intent resolution capabilities” – Measure understanding

Risk and Safety Assessment

  • “Check for potential harmful content” – Safety evaluations
  • “Evaluate for bias and fairness” – Responsible AI testing
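Conceptually, an evaluation run batches (context, response) cases through one or more evaluators and aggregates per-metric scores. The sketch below shows that shape with a stub scorer – in the real workflow, the MCP server’s evaluation tools call model-backed evaluators (groundedness, fluency, safety) against your Azure OpenAI deployment, so the scoring logic here is purely illustrative.

```python
def aggregate_scores(cases, evaluator):
    """Average each metric's score over all evaluation cases."""
    totals, counts = {}, {}
    for case in cases:
        for metric, score in evaluator(case).items():
            totals[metric] = totals.get(metric, 0.0) + score
            counts[metric] = counts.get(metric, 0) + 1
    return {m: totals[m] / counts[m] for m in totals}

def stub_evaluator(case):
    """Toy stand-in for a model-based evaluator (NOT a real metric)."""
    grounded = 5.0 if case["response"].startswith(case["context"][:5]) else 2.0
    return {"groundedness": grounded, "fluency": 4.0}

cases = [
    {"context": "Paris is the capital of France.", "response": "Paris is the capital."},
    {"context": "Water boils at 100 C.", "response": "It boils at 90 C."},
]
print(aggregate_scores(cases, stub_evaluator))
# {'groundedness': 3.5, 'fluency': 4.0}
```

The aggregation pattern is the same whichever evaluators run underneath; only the scorer changes.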

Environment Configuration

Set up your environment variables in a .env file:

# GitHub authentication (for free model testing)
GITHUB_TOKEN=your_github_token

# Azure AI Search (for Knowledge capabilities)
AZURE_AI_SEARCH_ENDPOINT=https://your-search-service.search.windows.net/
AZURE_AI_SEARCH_API_KEY=your_api_key
SEARCH_AUTHENTICATION_METHOD=api-search-key

# Azure OpenAI (for Evaluation)
AZURE_OPENAI_ENDPOINT=https://your-openai-endpoint.openai.azure.com/
AZURE_OPENAI_API_KEY=your_api_key
AZURE_OPENAI_DEPLOYMENT=gpt-4o

# Azure Project (for Agent Evaluation)
AZURE_SUBSCRIPTION_ID=your_subscription_id
AZURE_RESOURCE_GROUP=your_resource_group
AZURE_PROJECT_NAME=your_project_name
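Missing environment variables are a common source of startup failures, so a quick pre-flight check can save a debugging round trip. The helper below is a small sketch (not part of the MCP server) that reports which of the variables from the configuration above are unset or empty.

```python
# Sketch: verify the .env values the MCP server expects are present.
import os

# Variable names taken from the configuration section above.
REQUIRED_FOR_EVALUATION = [
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_DEPLOYMENT",
]

def missing_vars(required, env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in required if not env.get(name)]

gaps = missing_vars(REQUIRED_FOR_EVALUATION)
if gaps:
    print("Missing evaluation settings:", ", ".join(gaps))
```

You can run the same check against the Search and Project variable groups before enabling the Knowledge or agent-evaluation capabilities.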

Manual Setup for Custom Configurations

  1. Install uv by following the Installing uv guide
  2. Create .vscode/mcp.json in your workspace:
{
  "servers": {
    "mcp_foundry_server": {
      "type": "stdio",
      "command": "uvx",
      "args": [
        "--prerelease=allow",
        "--from",
        "git+https://github.com/azure-ai-foundry/mcp-foundry.git",
        "run-azure-ai-foundry-mcp",
        "--envFile",
        "${workspaceFolder}/.env"
      ]
    }
  }
}
  3. Configure Environment Variables – Create your .env file with the configuration above
  4. Start the Server – Click the Start button in VS Code or run the command manually
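Before starting the server, it can help to confirm your .vscode/mcp.json parses and matches the expected shape. The checker below is illustrative only; it validates the two properties the snippet above relies on (stdio transport and a launch command).

```python
# Illustrative sanity check for an MCP server config like the one above.
import json

def check_mcp_config(text):
    """Return the configured server names if the JSON config is well formed."""
    config = json.loads(text)
    servers = config.get("servers", {})
    for name, spec in servers.items():
        if spec.get("type") != "stdio":
            raise ValueError(f"{name}: expected stdio transport")
        if not spec.get("command"):
            raise ValueError(f"{name}: missing launch command")
    return sorted(servers)

example = '{"servers": {"mcp_foundry_server": {"type": "stdio", "command": "uvx", "args": []}}}'
print(check_mcp_config(example))  # ['mcp_foundry_server']
```

To use it on your workspace, read `.vscode/mcp.json` and pass its contents to `check_mcp_config`.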

Real-World Use Cases

🔬 Prototyping Phase: “I need to quickly test different models for my use case” → Compare models, get GitHub token access, and iterate rapidly

🔍 Development & Testing: “I want to build a RAG application with my company docs” → Create vector indexes with Azure AI Search, ingest documents, and test retrieval quality

📊 Performance Validation: “I need to evaluate my AI system before production” → Run comprehensive evaluations across multiple quality and safety metrics

🚀 Production Deployment: “I’m ready to deploy my model to serve real users” → Set up Azure AI Services, deploy models, and configure monitoring

Sample Conversation Flows

Exploring and Deploying Models

You: "What OpenAI models are available that I can test for free?"

MCP Server: "Here are OpenAI models supporting GitHub token for free testing:
- GPT-4o-mini (best for experimentation)
- GPT-3.5-turbo (cost-effective for prototypes)
..."

You: "Show me how to deploy GPT-4o-mini to production"

MCP Server: "I'll help you deploy GPT-4o-mini. First, let me check your quotas and guide you through the process..."

Building Knowledge Applications

You: "Create a search index for customer support tickets"

MCP Server: "I'll create an optimized index for support tickets. What fields do you need to search and filter on?"

You: "I need to search ticket content, filter by priority and date, and retrieve customer info"

MCP Server: "Perfect! I'll create an index with searchable content fields, filterable priority and date fields..."

Foundry Labs: Powering Prototyping with Microsoft Research Innovation

Foundry Labs models and projects put Microsoft Research innovations just a prompt away. With the MCP Server for Azure AI Foundry, agents can now:

  • Explore Foundry Labs projects and surface cutting-edge Microsoft Research models like OmniParser V2 (for screen parsing) and Magnetic One (for multi-agent planning).
  • Recommend the right model based on your goal – whether you’re extracting tables from PDFs or tackling multi-agent scheduling – and explain why it is a good fit.
  • Generate starter code on demand – for example, “Build a cognitive-load analyzer with OmniParser V2” – by pulling integration details from the MCP Server and inserting them directly into GitHub Codespaces or Visual Studio Code.
  • Produce a runnable prototype in minutes, complete with authentication and model calls already wired in.

The result: faster experiments with specialized Microsoft Research models, fewer repetitive steps, and a smoother path from idea to working prototype.

See the workflow end to end in our Microsoft Build session “Inside Azure AI Foundry Labs: Experimenting with the Future of AI”, available on demand. Watch the demo, clone the GitHub repository, and start experimenting with Foundry Labs models.

What’s Next: Building on This Foundation

The MCP Server for Azure AI Foundry represents just the beginning of our commitment to building MCP support into Microsoft products. As the MCP ecosystem grows, we expect to contribute more Azure AI Foundry capabilities to this server.

Ready to Transform Your AI Development Workflow?

The future of AI development is conversational, intuitive, and democratized. The Azure AI Foundry MCP Server makes that future available today.

🚀 Start Building Now:

💡 What Will You Build First?

Whether you’re prototyping the next breakthrough AI application, building enterprise knowledge systems, or ensuring your models meet the highest quality standards, the MCP Server for Azure AI Foundry puts the full power of Azure AI at your fingertips – through simple, natural conversation.


The MCP Server for Azure AI Foundry is currently in experimental release. We welcome your feedback and contributions to help shape its future development.

Author

Farzad Sunavala
Principal Product Manager

Building knowledge retrieval capabilities for AI Agents.

SeokJin Han
Product Manager

SeokJin is a Product Manager at AI Platform, Microsoft, focused on building a robust, developer-centric ecosystem for AI models and agents. He leads the inferencing stack that powers key Microsoft products such as Azure OpenAI, Microsoft 365 Copilot, GitHub Copilot, and more - ensuring scalable, efficient, and reliable model deployment.

Saumil Shrivastava
Principal PM

Saumil Shrivastava is a product leader on the Azure AI Foundry team at Microsoft. He leads efforts to build a thriving ecosystem of models and agents, and also heads the Foundry Labs initiative. Saumil is passionate about creating systems that turn cutting-edge innovation into real-world impact. Outside of work, he enjoys writing and mentoring PMs.
