{"id":1320,"date":"2025-09-04T22:09:45","date_gmt":"2025-09-04T22:09:45","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/all-things-azure\/?p=1320"},"modified":"2025-09-05T13:13:15","modified_gmt":"2025-09-05T13:13:15","slug":"build-your-own-microsoft-docs-ai-assistant-with-azure-container-apps-and-azure-openai","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/all-things-azure\/build-your-own-microsoft-docs-ai-assistant-with-azure-container-apps-and-azure-openai\/","title":{"rendered":"Build your own Microsoft Docs AI assistant with Azure Container Apps and Azure OpenAI"},"content":{"rendered":"<p>Learn how to deploy a self-hosted AI assistant that leverages Microsoft Learn content via the <strong data-start=\"706\" data-end=\"738\">Model Context Protocol (MCP)<\/strong> and Azure OpenAI. It\u2019s fast, secure, and ready for developer use in real-world apps.<\/p>\n<p><strong data-start=\"327\" data-end=\"343\">Prerequisite\n<\/strong><br data-start=\"343\" data-end=\"346\" \/>This guide assumes you already have an Azure OpenAI resource provisioned in your subscription, with a deployed model (e.g., <code data-start=\"476\" data-end=\"483\">gpt-4<\/code>, <code data-start=\"485\" data-end=\"494\">gpt-4.1<\/code>, or <code data-start=\"499\" data-end=\"513\">gpt-35-turbo<\/code>). 
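<\/p>\n<p>To confirm the prerequisite quickly, you can list the OpenAI resources and model deployments in your subscription with the Azure CLI (a minimal check, assuming you are signed in via <code>az login<\/code>; the angle-bracket placeholders in the second command are yours to fill in):<\/p>

```shell
# List Azure OpenAI accounts in the current subscription
az cognitiveservices account list \
  --query "[?kind=='OpenAI'].{name:name, resourceGroup:resourceGroup}" -o table

# List model deployments on one account (substitute your own names)
az cognitiveservices account deployment list \
  --name <your-openai-resource> \
  --resource-group <your-resource-group> \
  --query "[].{deployment:name, model:properties.model.name}" -o table
```

<p>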
If not, follow the <a class=\"decorated-link cursor-pointer\" href=\"https:\/\/learn.microsoft.com\/en-us\/azure\/ai-foundry\/openai\/how-to\/create-resource\" target=\"_new\" rel=\"noopener\" data-start=\"535\" data-end=\"686\">Azure OpenAI Quickstart<\/a> before proceeding.<\/p>\n<h3>The goal<\/h3>\n<p data-start=\"846\" data-end=\"1079\">Imagine being able to ask Microsoft Learn, &#8220;How do I monitor AKS workloads?&#8221; or &#8220;What are best practices for Azure Bicep deployments?&#8221; and instantly get a precise, summarized answer powered by GPT-4.1 and grounded in official Microsoft documentation.<\/p>\n<p data-start=\"1081\" data-end=\"1163\">This blog post walks through deploying a <strong data-start=\"1122\" data-end=\"1155\">production-ready AI assistant<\/strong> using:<\/p>\n<ul>\n<li>Microsoft\u2019s <strong data-start=\"1179\" data-end=\"1211\">Model Context Protocol (MCP)<\/strong><\/li>\n<li><strong data-start=\"1214\" data-end=\"1230\">Azure OpenAI<\/strong> (GPT-4.1 Mini deployment)<\/li>\n<li><strong data-start=\"1259\" data-end=\"1283\">Azure Container Apps<\/strong> for secure, serverless hosting<\/li>\n<li>Docker and Azure Container Registry (ACR) for packaging and image delivery<\/li>\n<\/ul>\n<h3 data-start=\"1371\" data-end=\"1398\">Architecture overview<\/h3>\n<p data-start=\"1400\" data-end=\"1585\">The flow is based on the <a class=\"decorated-link\" href=\"https:\/\/learn.microsoft.com\/api\/mcp\" target=\"_new\" rel=\"noopener\" data-start=\"1421\" data-end=\"1488\">Model Context Protocol (MCP)<\/a>, an emerging standard designed to provide trusted, contextual information for LLMs and copilots.<\/p>\n<p data-start=\"1400\" data-end=\"1585\"><a href=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp.png\"><img decoding=\"async\" class=\"alignnone wp-image-1330 \" src=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-596x1024.png\" alt=\"msdocs assistant architecture\" 
width=\"442\" height=\"759\" srcset=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-596x1024.png 596w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-175x300.png 175w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-768x1319.png 768w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-895x1536.png 895w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp-1193x2048.png 1193w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mcp.png 1453w\" sizes=\"(max-width: 442px) 100vw, 442px\" \/><\/a><\/p>\n<p data-start=\"1966\" data-end=\"1999\">The result: an agent that\u2019s grounded in official content and capable of serving precise answers to developers.<\/p>\n<h3 data-start=\"1966\" data-end=\"1999\">Deploying your own instance<\/h3>\n<p data-start=\"2001\" data-end=\"2172\">Let\u2019s walk through building and deploying your own instance of <a class=\"decorated-link\" href=\"https:\/\/github.com\/passadis\/mslearn-mcp-chat\" target=\"_new\" rel=\"noopener\" data-start=\"2064\" data-end=\"2128\">mslearn-mcp-chat<\/a>, a lightweight web app powered by Next.js.<\/p>\n<h4>Step 1: Clone the project<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">git clone https:\/\/github.com\/passadis\/mslearn-mcp-chat.git\r\ncd mslearn-mcp-chat\r\n<\/code><\/pre>\n<h4>Step 2: Auto-discover Azure OpenAI config<\/h4>\n<p>&nbsp;<\/p>\n<p>If you&#8217;ve already deployed an Azure OpenAI resource and a GPT-4 deployment, you can auto-discover everything and generate <code>.env.local<\/code> like this. The script picks the first OpenAI resource it finds in your subscription and the first deployment whose model name contains <code>gpt-4<\/code>, so adjust the queries if you have more than one:<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">#!\/bin\/bash\r\n\r\n# Get resource group and AOAI resource name\r\nRESOURCE_GROUP=$(az 
cognitiveservices account list \\\r\n  --query \"[?kind=='OpenAI'].resourceGroup\" -o tsv | head -n1)\r\n\r\nAOAI_RESOURCE_NAME=$(az cognitiveservices account list \\\r\n  --query \"[?kind=='OpenAI'].name\" -o tsv | head -n1)\r\n\r\n# Get endpoint\r\nAOAI_ENDPOINT=$(az cognitiveservices account show \\\r\n  --name \"$AOAI_RESOURCE_NAME\" \\\r\n  --resource-group \"$RESOURCE_GROUP\" \\\r\n  --query \"properties.endpoint\" -o tsv)\r\n\r\n# Get API key\r\nAOAI_KEY=$(az cognitiveservices account keys list \\\r\n  --name \"$AOAI_RESOURCE_NAME\" \\\r\n  --resource-group \"$RESOURCE_GROUP\" \\\r\n  --query \"key1\" -o tsv)\r\n\r\n# Get deployment name (adjust model name filter if needed)\r\nDEPLOYMENT_NAME=$(az cognitiveservices account deployment list \\\r\n  --name \"$AOAI_RESOURCE_NAME\" \\\r\n  --resource-group \"$RESOURCE_GROUP\" \\\r\n  --query \"[?contains(properties.model.name, 'gpt-4')].name\" -o tsv | head -n1)\r\n\r\n# Write to .env.local\r\ncat &lt;&lt;EOF &gt; .env.local\r\nAZURE_OPENAI_KEY=$AOAI_KEY\r\nAZURE_OPENAI_ENDPOINT=$AOAI_ENDPOINT\r\nAZURE_OPENAI_DEPLOYMENT_NAME=$DEPLOYMENT_NAME\r\nEOF\r\n\r\necho \".env.local created\"\r\ncat .env.local<\/code><\/pre>\n<h4>Step 3: Export for CLI usage<\/h4>\n<p>&nbsp;<\/p>\n<p>The shell (and therefore the Azure CLI) doesn\u2019t read <code>.env.local<\/code> automatically, so create a temporary exported copy:<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">sed 's\/^\/export \/' .env.local &gt; .env.exported\r\nsource .env.exported<\/code><\/pre>\n<p>Now you can use those variables in your <code>az<\/code> commands.<\/p>\n<h4>Step 4: Create Azure resources<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">az group create --name rg-mcp-chat --location eastus\r\n\r\naz acr create --name acrmcpchat \\\r\n  --resource-group rg-mcp-chat \\\r\n  --sku Basic --admin-enabled true<\/code><\/pre>\n<p>Container registry names are globally unique, so pick a different name if <code>acrmcpchat<\/code> is taken, and update it in the later commands.<\/p>\n<h4>Step 5: Dockerize and push the image<\/h4>\n<p>&nbsp;<\/p>\n<h5>Create a Dockerfile<\/h5>\n<p>&nbsp;<\/p>\n<pre 
class=\"prettyprint language-default\"><code class=\"language-default\">FROM node:20\r\nWORKDIR \/app\r\nCOPY . .\r\nRUN npm install &amp;&amp; npm run build\r\nEXPOSE 3000\r\nCMD [\"npm\", \"start\"]<\/code><\/pre>\n<h5>Build &amp; Push<\/h5>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">docker build -t acrmcpchat.azurecr.io\/mcp-chat:latest .\r\naz acr login --name acrmcpchat\r\ndocker push acrmcpchat.azurecr.io\/mcp-chat:latest\r\n<\/code><\/pre>\n<h4>Step 6: Create a Container App Environment<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">az containerapp env create \\\r\n  --name env-mcp-chat \\\r\n  --resource-group rg-mcp-chat \\\r\n  --location eastus<\/code><\/pre>\n<h4>Step 7: Deploy the Container App<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">az containerapp create \\\r\n  --name mcp-chat-app \\\r\n  --resource-group rg-mcp-chat \\\r\n  --environment env-mcp-chat \\\r\n  --image acrmcpchat.azurecr.io\/mcp-chat:latest \\\r\n  --registry-server acrmcpchat.azurecr.io \\\r\n  --cpu 1.0 --memory 2.0Gi \\\r\n  --target-port 3000 \\\r\n  --ingress external \\\r\n  --env-vars \\\r\n    AZURE_OPENAI_KEY=$AZURE_OPENAI_KEY \\\r\n    AZURE_OPENAI_ENDPOINT=$AZURE_OPENAI_ENDPOINT \\\r\n    AZURE_OPENAI_DEPLOYMENT_NAME=$AZURE_OPENAI_DEPLOYMENT_NAME<\/code><\/pre>\n<p>For production, store the key as a Container Apps secret with <code>--secrets<\/code> and reference it via a <code>secretref:<\/code> value, rather than passing it as a plain environment variable.<\/p>\n<h4>Step 8: Get the public URL<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">az containerapp show \\\r\n  --name mcp-chat-app \\\r\n  --resource-group rg-mcp-chat \\\r\n  --query properties.configuration.ingress.fqdn \\\r\n  --output tsv<\/code><\/pre>\n<p data-start=\"4580\" data-end=\"4633\">Open that URL in your browser and try questions like:<\/p>\n<ul>\n<li>\u201cWhat\u2019s the best way to deploy Azure Functions using Bicep?\u201d<\/li>\n<li>\u201cHow does Azure Policy work with management 
groups?\u201d<\/li>\n<li>\u201cWhat\u2019s the difference between vCore and DTU in Azure SQL?\u201d<\/li>\n<\/ul>\n<p>Here\u2019s how the app looks when running:<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1.png\"><img decoding=\"async\" class=\"alignnone size-large wp-image-1350\" src=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1-872x1024.png\" alt=\"mslearnassistant image\" width=\"872\" height=\"1024\" srcset=\"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1-872x1024.png 872w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1-256x300.png 256w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1-768x902.png 768w, https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-content\/uploads\/sites\/83\/2025\/09\/mslearnassistant-1.png 891w\" sizes=\"(max-width: 872px) 100vw, 872px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3 data-start=\"4821\" data-end=\"4867\">Why the Model Context Protocol (MCP) matters<\/h3>\n<p data-start=\"4869\" data-end=\"5084\">MCP is a structured protocol that helps AI assistants ground responses in official sources, such as Microsoft Learn and Docs, by returning text fragments (chunks) for use in RAG pipelines or summarization prompts.<\/p>\n<h4 data-start=\"5086\" data-end=\"5109\">Sample MCP payload<\/h4>\n<p>&nbsp;<\/p>\n<pre class=\"prettyprint language-json\"><code class=\"language-json\">{\r\n  \"jsonrpc\": \"2.0\",\r\n  \"id\": \"chat-123\",\r\n  \"method\": \"tools\/call\",\r\n  \"params\": {\r\n    \"name\": \"microsoft_docs_search\",\r\n    \"arguments\": {\r\n      \"question\": \"How do I deploy AKS with Bicep?\"\r\n    }\r\n  }\r\n}<\/code><\/pre>\n<p>The assistant receives the returned documentation chunks, crafts a system 
message, and sends the question and context to Azure OpenAI for synthesis.<\/p>\n<h3>What you get<\/h3>\n<table>\n<thead>\n<tr>\n<th>Feature<\/th>\n<th>Benefit<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>\ud83e\udde9 MCP Integration<\/td>\n<td>Answers grounded in Microsoft Learn docs<\/td>\n<\/tr>\n<tr>\n<td>\ud83d\udd10 AOAI Security<\/td>\n<td>Backend-only key usage, never exposed in client<\/td>\n<\/tr>\n<tr>\n<td>\ud83d\ude80 Container Apps<\/td>\n<td>Scalable, secure hosting with no infrastructure to manage<\/td>\n<\/tr>\n<tr>\n<td>\ud83d\udee0\ufe0f Dev-Focused Stack<\/td>\n<td>Next.js + Node + Azure CLI + ACR, fast to iterate<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h3 data-start=\"6296\" data-end=\"6316\">Final thoughts<\/h3>\n<p data-start=\"6318\" data-end=\"6586\">Whether you\u2019re building an internal dev assistant, an Azure learning tool, or testing out custom copilots, the MCP + Azure OpenAI combo is powerful, trustworthy, and fully customizable. And thanks to Azure Container Apps, deploying it is just a few CLI commands away.<\/p>\n<h3 data-start=\"6593\" data-end=\"6609\">References<\/h3>\n<ul>\n<li><a class=\"decorated-link\" href=\"https:\/\/learn.microsoft.com\/api\/mcp\" target=\"_new\" rel=\"noopener\" data-start=\"6613\" data-end=\"6690\">Microsoft Model Context Protocol (MCP)<\/a><\/li>\n<li><a class=\"decorated-link cursor-pointer\" href=\"https:\/\/learn.microsoft.com\/en-us\/azure\/ai-foundry\/what-is-azure-ai-foundry\" target=\"_new\" rel=\"noopener\" data-start=\"6693\" data-end=\"6777\">Azure OpenAI Overview<\/a><\/li>\n<li><a class=\"decorated-link cursor-pointer\" href=\"https:\/\/learn.microsoft.com\/en-us\/azure\/container-apps\/overview\" target=\"_new\" rel=\"noopener\" data-start=\"6780\" data-end=\"6859\">Azure Container Apps<\/a><\/li>\n<li><a class=\"decorated-link\" href=\"https:\/\/github.com\/passadis\/mslearn-mcp-chat\" target=\"_new\" rel=\"noopener\" data-start=\"6862\" data-end=\"6952\">Original GitHub Project \u2013 
mslearn-mcp-chat<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Learn how to deploy a self-hosted AI assistant that leverages Microsoft Learn content via the Model Context Protocol (MCP) and Azure OpenAI. It\u2019s fast, secure, and ready for developer use in real-world apps. Prerequisite This guide assumes you already have an Azure OpenAI resource provisioned in your subscription, with a deployed model (e.g., gpt-4, gpt-4.1, [&hellip;]<\/p>\n","protected":false},"author":197938,"featured_media":1322,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[36,37,1],"tags":[],"class_list":["post-1320","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-apps","category-ai-foundry","category-azure"],"acf":[],"blog_post_summary":"<p>Learn how to deploy a self-hosted AI assistant that leverages Microsoft Learn content via the Model Context Protocol (MCP) and Azure OpenAI. It\u2019s fast, secure, and ready for developer use in real-world apps. 
Prerequisite This guide assumes you already have an Azure OpenAI resource provisioned in your subscription, with a deployed model (e.g., gpt-4, gpt-4.1, [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/posts\/1320","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/users\/197938"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/comments?post=1320"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/posts\/1320\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/media\/1322"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/media?parent=1320"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/categories?post=1320"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/all-things-azure\/wp-json\/wp\/v2\/tags?post=1320"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}