{"id":60055,"date":"2026-05-06T10:00:00","date_gmt":"2026-05-06T17:00:00","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/dotnet\/?p=60055"},"modified":"2026-05-03T21:37:30","modified_gmt":"2026-05-04T04:37:30","slug":"durable-workflows-in-microsoft-agent-framework","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/dotnet\/durable-workflows-in-microsoft-agent-framework\/","title":{"rendered":"Durable Workflows in the Microsoft Agent Framework"},"content":{"rendered":"<p>The <a href=\"https:\/\/github.com\/microsoft\/agent-framework\">Microsoft Agent Framework (MAF)<\/a>\nis an open-source, multi-language framework for building, orchestrating, and\ndeploying AI agents. Since its\n<a href=\"https:\/\/devblogs.microsoft.com\/dotnet\/introducing-microsoft-agent-framework-preview\/\">preview announcement<\/a>,\nthe framework has added a <strong>workflow programming model<\/strong> that lets you compose\nmultiple agents and other units of work into multi-step pipelines. You define\nindividual steps called <em>executors<\/em>, wire them into a directed graph using a\n<em>workflow builder<\/em>, and the framework handles execution, data flow between\nsteps, and error propagation. Workflows can model sequential chains, parallel\nfan-out\/fan-in patterns, conditional branching, human-in-the-loop approvals,\nand more.<\/p>\n<p>The core workflow package includes a lightweight <strong>in-process runner<\/strong> that\nexecutes workflows entirely in memory. It&#8217;s perfect for getting started\nquickly and for local development. 
In this post, we&#8217;ll start by building a\nsimple workflow in a .NET console app, then progressively add durability,\nparallel AI agents, and Azure Functions hosting.<\/p>\n<h2>The Workflow Programming Model<\/h2>\n<p>To get started, create a new console app project and add the following NuGet packages:<\/p>\n<pre><code class=\"language-bash\">dotnet add package Microsoft.Agents.AI\r\ndotnet add package Microsoft.Agents.AI.Workflows<\/code><\/pre>\n<p>Let&#8217;s start with the core building blocks of a MAF workflow.<\/p>\n<h3>Executors<\/h3>\n<p>An <strong>Executor<\/strong> is the fundamental unit of work. It receives a typed input,\nprocesses it, and produces output. You create one by subclassing\n<code>Executor&lt;TInput, TOutput&gt;<\/code>:<\/p>\n<pre><code class=\"language-csharp\">using Microsoft.Agents.AI.Workflows;\r\n\r\ninternal sealed class OrderLookup()\r\n    : Executor&lt;OrderCancelRequest, Order&gt;(\"OrderLookup\")\r\n{\r\n    public override async ValueTask&lt;Order&gt; HandleAsync(\r\n        OrderCancelRequest message,\r\n        IWorkflowContext context,\r\n        CancellationToken cancellationToken = default)\r\n    {\r\n        \/\/ Simulate looking up an order by ID\r\n        await Task.Delay(TimeSpan.FromMilliseconds(100), cancellationToken);\r\n\r\n        return new Order(\r\n            Id: message.OrderId,\r\n            OrderDate: DateTime.UtcNow.AddDays(-1),\r\n            IsCancelled: false,\r\n            CancelReason: message.Reason,\r\n            Customer: new Customer(\r\n                Name: \"Jerry\", Email: \"jerry@example.com\"));\r\n    }\r\n}\r\n\r\ninternal sealed class OrderCancel()\r\n    : Executor&lt;Order, Order&gt;(\"OrderCancel\")\r\n{\r\n    public override async ValueTask&lt;Order&gt; HandleAsync(\r\n        Order message,\r\n        IWorkflowContext context,\r\n        CancellationToken cancellationToken = default)\r\n    {\r\n        await Task.Delay(TimeSpan.FromMilliseconds(200), 
cancellationToken);\r\n        return message with { IsCancelled = true };\r\n    }\r\n}\r\n\r\ninternal sealed class SendEmail()\r\n    : Executor&lt;Order, string&gt;(\"SendEmail\")\r\n{\r\n    public override ValueTask&lt;string&gt; HandleAsync(\r\n        Order message,\r\n        IWorkflowContext context,\r\n        CancellationToken cancellationToken = default)\r\n    {\r\n        return ValueTask.FromResult(\r\n            $\"Cancellation email sent for order {message.Id} \"\r\n            + $\"to {message.Customer.Email}.\");\r\n    }\r\n}\r\n\r\ninternal sealed record OrderCancelRequest(string OrderId, string Reason);\r\ninternal sealed record Order(\r\n    string Id, DateTime OrderDate, bool IsCancelled,\r\n    string? CancelReason, Customer Customer);\r\ninternal sealed record Customer(string Name, string Email);<\/code><\/pre>\n<p>The generic type parameters define each executor&#8217;s contract: <code>TInput<\/code> is\nwhat it receives from the previous step (or the workflow&#8217;s initial input),\nand <code>TOutput<\/code> is what it passes downstream. The string in the base\nconstructor (e.g., <code>\"OrderLookup\"<\/code>) is the executor&#8217;s unique ID within\nthe workflow.<\/p>\n<h3>WorkflowBuilder<\/h3>\n<p>The <strong>WorkflowBuilder<\/strong> wires executors into a directed graph. You define\nedges between executors to control the flow of data, and the builder produces\nan immutable <code>Workflow<\/code> object. 
Here is what the CancelOrder workflow graph\nlooks like:<\/p>\n<pre><code class=\"language-text\">OrderLookup \u2500\u2500\u25ba OrderCancel \u2500\u2500\u25ba SendEmail<\/code><\/pre>\n<pre><code class=\"language-csharp\">OrderLookup orderLookup = new();\r\nOrderCancel orderCancel = new();\r\nSendEmail sendEmail = new();\r\n\r\nWorkflow cancelOrder = new WorkflowBuilder(orderLookup)\r\n    .WithName(\"CancelOrder\")\r\n    .WithDescription(\"Cancel an order and notify the customer\")\r\n    .AddEdge(orderLookup, orderCancel)\r\n    .AddEdge(orderCancel, sendEmail)\r\n    .Build();<\/code><\/pre>\n<p>The <code>TOutput<\/code> of each executor must match the <code>TInput<\/code> of the executor\nit flows into, and the framework validates this contract when the workflow is built and run.<\/p>\n<h3>Running In-Process<\/h3>\n<p>The quickest way to run a workflow is in-process with\n<code>InProcessExecution.RunStreamingAsync<\/code>. This returns a <code>StreamingRun<\/code>\nthat emits events as each step completes:<\/p>\n<pre><code class=\"language-csharp\">var cancelRequest = new OrderCancelRequest(\r\n    OrderId: \"123\", Reason: \"Wrong color\");\r\n\r\nawait using StreamingRun run =\r\n    await InProcessExecution.RunStreamingAsync(\r\n        cancelOrder, input: cancelRequest);\r\n\r\nawait foreach (WorkflowEvent evt in run.WatchStreamAsync())\r\n{\r\n    if (evt is ExecutorCompletedEvent completed)\r\n    {\r\n        Console.WriteLine(\r\n            $\"{completed.ExecutorId}: {completed.Data}\");\r\n    }\r\n}<\/code><\/pre>\n<p>This is all it takes to run a workflow. No external dependencies, no\ninfrastructure setup, just a .NET console app. Run it with:<\/p>\n<pre><code class=\"language-bash\">dotnet run<\/code><\/pre>\n<h2>Making Workflows Durable<\/h2>\n<p>The in-process runner executes everything in memory, so if the process\nexits (whether from a crash, a restart, or a host recycle in the middle of a\nlong-running step), all workflow state is lost. 
Real-world AI agent\nworkflows often need to survive process restarts, run for extended\nperiods, and be observable from external tooling. The\n<code>Microsoft.Agents.AI.DurableTask<\/code> package adds all of this to any MAF\nworkflow without changing the workflow definition. It is powered by the\n<a href=\"https:\/\/learn.microsoft.com\/azure\/durable-task\/\">Durable Task tech stack<\/a>.<\/p>\n<p>Install the package:<\/p>\n<pre><code class=\"language-bash\">dotnet add package Microsoft.Agents.AI.DurableTask --prerelease<\/code><\/pre>\n<p>The durable runtime provides:<\/p>\n<ul>\n<li><strong>Stateful, durable execution<\/strong>: workflows survive process restarts\nand failures<\/li>\n<li><strong>Automatic checkpointing<\/strong>: progress is saved after each step<\/li>\n<li><strong>Distributed execution<\/strong>: executors in a workflow can run across\ndifferent machines. One executor might be running on one VM while\nanother executor in the same workflow runs on a completely different\none. The <a href=\"https:\/\/learn.microsoft.com\/azure\/durable-task\/scheduler\/durable-task-scheduler?toc=\/azure\/durable-task\/common\/toc.json\">Durable Task Scheduler<\/a> coordinates them.<\/li>\n<li><strong>Long-running orchestrations<\/strong>: workflows can run for minutes, hours,\nor even days<\/li>\n<li><strong>Observability<\/strong>: built-in dashboard for monitoring and managing\nworkflow executions<\/li>\n<\/ul>\n<h3>Running the DTS Emulator<\/h3>\n<p>The durable runtime needs a backend to store workflow state and coordinate\nexecution. The\nDurable Task Scheduler (DTS)\nserves this role. It persists checkpoints, manages orchestration history,\nand provides a dashboard for monitoring runs. 
For local development, you\ncan run the DTS emulator in Docker with a single command:<\/p>\n<pre><code class=\"language-bash\">docker run -d --name dts-emulator \\\r\n  -p 8080:8080 -p 8082:8082 \\\r\n  mcr.microsoft.com\/dts\/dts-emulator:latest<\/code><\/pre>\n<ul>\n<li><strong>Port 8080<\/strong>: Scheduler endpoint (used by the app)<\/li>\n<li><strong>Port 8082<\/strong>: Dashboard UI (open <code>http:\/\/localhost:8082<\/code> in your\nbrowser)<\/li>\n<\/ul>\n<h3>A Durable Console App<\/h3>\n<p>To enable durability, add the following NuGet packages to your project:<\/p>\n<pre><code class=\"language-bash\">dotnet add package Microsoft.Agents.AI.DurableTask --prerelease\r\ndotnet add package Microsoft.DurableTask.Client.AzureManaged\r\ndotnet add package Microsoft.DurableTask.Worker.AzureManaged\r\ndotnet add package Microsoft.Extensions.Hosting<\/code><\/pre>\n<p>Notice that <strong>the workflow definition doesn&#8217;t change<\/strong>. You use the same\n<code>WorkflowBuilder<\/code> code. The only difference is how you host the workflow.\nInstead of <code>InProcessExecution.RunStreamingAsync<\/code>, you configure a .NET\nGeneric Host with <code>ConfigureDurableWorkflows<\/code>:<\/p>\n<pre><code class=\"language-csharp\">using Microsoft.Agents.AI.DurableTask;\r\nusing Microsoft.Agents.AI.DurableTask.Workflows;\r\nusing Microsoft.Agents.AI.Workflows;\r\nusing Microsoft.DurableTask.Client.AzureManaged;\r\nusing Microsoft.DurableTask.Worker.AzureManaged;\r\nusing Microsoft.Extensions.DependencyInjection;\r\nusing Microsoft.Extensions.Hosting;\r\n\r\nstring dtsConnectionString =\r\n    Environment.GetEnvironmentVariable(\r\n        \"DURABLE_TASK_SCHEDULER_CONNECTION_STRING\")\r\n    ?? 
\"Endpoint=http:\/\/localhost:8080;TaskHub=default;Authentication=None\";\r\n\r\n\/\/ Same workflow definition as before\r\nOrderLookup orderLookup = new();\r\nOrderCancel orderCancel = new();\r\nSendEmail sendEmail = new();\r\n\r\nWorkflow cancelOrder = new WorkflowBuilder(orderLookup)\r\n    .WithName(\"CancelOrder\")\r\n    .WithDescription(\"Cancel an order and notify the customer\")\r\n    .AddEdge(orderLookup, orderCancel)\r\n    .AddEdge(orderCancel, sendEmail)\r\n    .Build();\r\n\r\n\/\/ Host it with the durable runtime\r\nIHost host = Host.CreateDefaultBuilder(args)\r\n    .ConfigureServices(services =&gt;\r\n    {\r\n        services.ConfigureDurableWorkflows(\r\n            workflowOptions =&gt;\r\n                workflowOptions.AddWorkflow(cancelOrder),\r\n            workerBuilder: builder =&gt;\r\n                builder.UseDurableTaskScheduler(dtsConnectionString),\r\n            clientBuilder: builder =&gt;\r\n                builder.UseDurableTaskScheduler(dtsConnectionString));\r\n    })\r\n    .Build();\r\n\r\nawait host.StartAsync();\r\n\r\ntry\r\n{\r\n    IWorkflowClient workflowClient =\r\n        host.Services.GetRequiredService&lt;IWorkflowClient&gt;();\r\n\r\n    OrderCancelRequest request = new(\r\n        OrderId: \"12345\", Reason: \"Wrong color\");\r\n\r\n    Console.WriteLine(\r\n        $\"Starting workflow for order '{request.OrderId}'...\");\r\n\r\n    IAwaitableWorkflowRun run =\r\n        (IAwaitableWorkflowRun)await workflowClient\r\n            .RunAsync(cancelOrder, request);\r\n\r\n    Console.WriteLine($\"Workflow started with run id: {run.RunId}\");\r\n    string? result = await run.WaitForCompletionAsync&lt;string&gt;();\r\n    Console.WriteLine($\"Workflow completed. {result}\");\r\n}\r\nfinally\r\n{\r\n    await host.StopAsync();\r\n}<\/code><\/pre>\n<p><code>ConfigureDurableWorkflows<\/code> registers the workflow with the Durable Task\nruntime, maps each executor to a durable activity, and wires up the\norchestration. 
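<\/p>\n<p>To make that mapping concrete, here&#8217;s an illustrative, hand-written Durable Task orchestrator that is <em>conceptually<\/em> equivalent to what the runtime generates for CancelOrder. This is a sketch only (you never write this code yourself); it assumes the <code>dafx-<\/code>-prefixed activity names described below:<\/p>\n<pre><code class=\"language-csharp\">using Microsoft.DurableTask;\r\n\r\n\/\/ Conceptual sketch only: the durable runtime generates the real\r\n\/\/ orchestration. Each activity call is checkpointed, so a replay\r\n\/\/ after a crash skips steps that already completed.\r\ninternal sealed class CancelOrderSketch\r\n    : TaskOrchestrator&lt;OrderCancelRequest, string&gt;\r\n{\r\n    public override async Task&lt;string&gt; RunAsync(\r\n        TaskOrchestrationContext context, OrderCancelRequest input)\r\n    {\r\n        Order order = await context.CallActivityAsync&lt;Order&gt;(\r\n            \"dafx-OrderLookup\", input);\r\n        Order cancelled = await context.CallActivityAsync&lt;Order&gt;(\r\n            \"dafx-OrderCancel\", order);\r\n        return await context.CallActivityAsync&lt;string&gt;(\r\n            \"dafx-SendEmail\", cancelled);\r\n    }\r\n}<\/code><\/pre>\n<p>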
The <code>IWorkflowClient<\/code> provides a clean API for starting\nruns and waiting for results.<\/p>\n<p>Once the workflow completes, open the DTS Dashboard at\n<code>http:\/\/localhost:8082<\/code> to inspect the run, see executor timelines, and\nview inputs\/outputs for each step. Under the hood, each executor in your\nworkflow becomes a durable activity, named with a <code>dafx-<\/code> prefix in the\ndashboard (e.g., <code>dafx-OrderLookup<\/code>, <code>dafx-OrderCancel<\/code>, <code>dafx-SendEmail<\/code>).<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2026\/05\/dts-dashboard-workflow-run.webp\" alt=\"DTS Dashboard showing the sequential workflow run\" \/><\/p>\n<p>To summarize: <strong>same workflow definition, different runtime<\/strong>. Swap the\nhost, and your workflow gains durability, checkpointing, observability,\nand distributed execution, with no changes to the executor code.<\/p>\n<h2>Fan-Out \/ Fan-In with AI Agents<\/h2>\n<p>When you need multiple agents to process the same input concurrently,\nuse the <strong>fan-out \/ fan-in<\/strong> pattern. <code>AddFanOutEdge<\/code> sends a message\nto multiple executors in parallel, and <code>AddFanInBarrierEdge<\/code> waits for\nall of them to complete before proceeding.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2026\/05\/fanout-workflow-diagram.svg\" alt=\"Fan-out\/fan-in workflow diagram with AI agents\" \/><\/p>\n<p>MAF supports using <strong>AI agents directly as executors<\/strong>. The\n<code>AsAIAgent<\/code> extension method creates an executor from a chat client and\nsystem prompt. 
Since this sample uses Azure OpenAI, set the following\nenvironment variables before running:<\/p>\n<table>\n<thead>\n<tr>\n<th>Variable<\/th>\n<th>Example value<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td><code>AZURE_OPENAI_ENDPOINT<\/code><\/td>\n<td><code>https:\/\/&lt;your-resource&gt;.cognitiveservices.azure.com<\/code><\/td>\n<\/tr>\n<tr>\n<td><code>AZURE_OPENAI_DEPLOYMENT<\/code><\/td>\n<td><code>gpt-4o<\/code><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>Note that this sample uses <code>ConfigureDurableOptions<\/code> instead of\n<code>ConfigureDurableWorkflows<\/code>. This is a more general API that gives you\naccess to both <code>options.Workflows<\/code> and <code>options.Agents<\/code>, making it\npossible to register standalone AI agents alongside workflows in the\nsame host.<\/p>\n<pre><code class=\"language-csharp\">using Azure.AI.OpenAI;\r\nusing Azure.Identity;\r\nusing Microsoft.Agents.AI;\r\nusing Microsoft.Agents.AI.DurableTask;\r\nusing Microsoft.Agents.AI.DurableTask.Workflows;\r\nusing Microsoft.Agents.AI.Workflows;\r\nusing Microsoft.DurableTask.Client.AzureManaged;\r\nusing Microsoft.DurableTask.Worker.AzureManaged;\r\nusing Microsoft.Extensions.DependencyInjection;\r\nusing Microsoft.Extensions.Hosting;\r\nusing OpenAI.Chat;\r\n\r\nstring dtsConnectionString =\r\n    Environment.GetEnvironmentVariable(\r\n        \"DURABLE_TASK_SCHEDULER_CONNECTION_STRING\")\r\n    ?? \"Endpoint=http:\/\/localhost:8080;TaskHub=default;Authentication=None\";\r\nstring endpoint =\r\n    Environment.GetEnvironmentVariable(\"AZURE_OPENAI_ENDPOINT\")\r\n    ?? throw new InvalidOperationException(\r\n        \"AZURE_OPENAI_ENDPOINT is not set.\");\r\nstring deploymentName =\r\n    Environment.GetEnvironmentVariable(\"AZURE_OPENAI_DEPLOYMENT\")\r\n    ?? 
throw new InvalidOperationException(\r\n        \"AZURE_OPENAI_DEPLOYMENT is not set.\");\r\n\r\nAzureOpenAIClient openAiClient = new(\r\n    new Uri(endpoint), new AzureCliCredential());\r\nChatClient chatClient = openAiClient.GetChatClient(deploymentName);\r\n\r\n\/\/ AI agents as executors\r\nAIAgent physicist = chatClient.AsAIAgent(\r\n    \"You are a physics expert. Be concise (2-3 sentences).\",\r\n    \"Physicist\");\r\nAIAgent chemist = chatClient.AsAIAgent(\r\n    \"You are a chemistry expert. Be concise (2-3 sentences).\",\r\n    \"Chemist\");\r\n\r\nParseQuestionExecutor parseQuestion = new();\r\nAggregatorExecutor aggregator = new();\r\n\r\n\/\/ Build workflow: ParseQuestion -&gt; [Physicist, Chemist] -&gt; Aggregator\r\nWorkflow workflow = new WorkflowBuilder(parseQuestion)\r\n    .WithName(\"ExpertReview\")\r\n    .AddFanOutEdge(parseQuestion, [physicist, chemist])\r\n    .AddFanInBarrierEdge([physicist, chemist], aggregator)\r\n    .Build();\r\n\r\nIHost host = Host.CreateDefaultBuilder(args)\r\n    .ConfigureServices(services =&gt;\r\n    {\r\n        \/\/ ConfigureDurableOptions is the more general sibling of\r\n        \/\/ ConfigureDurableWorkflows. 
It gives access to both\r\n        \/\/ options.Workflows and options.Agents, so you can register\r\n        \/\/ standalone AI agents alongside workflows in the same host.\r\n        services.ConfigureDurableOptions(\r\n            options =&gt; options.Workflows.AddWorkflow(workflow),\r\n            workerBuilder: builder =&gt;\r\n                builder.UseDurableTaskScheduler(dtsConnectionString),\r\n            clientBuilder: builder =&gt;\r\n                builder.UseDurableTaskScheduler(dtsConnectionString));\r\n    })\r\n    .Build();\r\n\r\nawait host.StartAsync();\r\n\r\ntry\r\n{\r\n    IWorkflowClient workflowClient =\r\n        host.Services.GetRequiredService&lt;IWorkflowClient&gt;();\r\n\r\n    IAwaitableWorkflowRun run =\r\n        (IAwaitableWorkflowRun)await workflowClient\r\n            .RunAsync(workflow, \"Why is the sky blue?\");\r\n\r\n    Console.WriteLine($\"Run ID: {run.RunId}\");\r\n    string? result = await run.WaitForCompletionAsync&lt;string&gt;();\r\n    Console.WriteLine($\"Workflow completed!\\n{result}\");\r\n}\r\nfinally\r\n{\r\n    await host.StopAsync();\r\n}<\/code><\/pre>\n<p>The <code>ParseQuestionExecutor<\/code> validates the input, both AI agents run\n<strong>in parallel<\/strong> against the Durable Task Scheduler, and the\n<code>AggregatorExecutor<\/code> combines their responses. Because this runs on\nthe distributed runtime, the Physicist agent could be executing on one\nVM while the Chemist agent runs on another. 
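<\/p>\n<p>The two custom executors referenced here aren&#8217;t shown in the sample, so here is a minimal sketch of what they might look like. This is illustrative only: in particular, the message type the fan-in barrier delivers to the aggregator depends on what the agent executors emit, and plain strings are assumed for simplicity.<\/p>\n<pre><code class=\"language-csharp\">using Microsoft.Agents.AI.Workflows;\r\n\r\ninternal sealed class ParseQuestionExecutor()\r\n    : Executor&lt;string, string&gt;(\"ParseQuestion\")\r\n{\r\n    public override ValueTask&lt;string&gt; HandleAsync(\r\n        string message,\r\n        IWorkflowContext context,\r\n        CancellationToken cancellationToken = default)\r\n    {\r\n        \/\/ Reject empty questions before fanning out to the agents.\r\n        if (string.IsNullOrWhiteSpace(message))\r\n        {\r\n            throw new ArgumentException(\"Question must not be empty.\");\r\n        }\r\n\r\n        return ValueTask.FromResult(message.Trim());\r\n    }\r\n}\r\n\r\ninternal sealed class AggregatorExecutor()\r\n    : Executor&lt;List&lt;string&gt;, string&gt;(\"Aggregator\")\r\n{\r\n    public override ValueTask&lt;string&gt; HandleAsync(\r\n        List&lt;string&gt; message,\r\n        IWorkflowContext context,\r\n        CancellationToken cancellationToken = default)\r\n    {\r\n        \/\/ Combine the expert answers into a single report.\r\n        return ValueTask.FromResult(string.Join(\"\\n\\n\", message));\r\n    }\r\n}<\/code><\/pre>\n<p>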
Each agent&#8217;s response is\ncheckpointed, so if the process restarts mid-flight, completed agents\ndon&#8217;t re-execute.<\/p>\n<p>You can see the parallel execution in the DTS Dashboard:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2026\/05\/dts-dashboard-parallel-run.webp\" alt=\"DTS Dashboard showing parallel AI agent runs\" \/><\/p>\n<h2>Hosting on Azure Functions<\/h2>\n<p>Now that you&#8217;ve seen durable workflows running in console apps, let&#8217;s\ntake them to the cloud. The\n<code>Microsoft.Agents.AI.Hosting.AzureFunctions<\/code> package bridges MAF\nworkflows with the Azure Functions runtime, giving you serverless\nscaling with zero infrastructure management.<\/p>\n<pre><code class=\"language-bash\">dotnet add package Microsoft.Agents.AI.Hosting.AzureFunctions<\/code><\/pre>\n<h3>Why Azure Functions?<\/h3>\n<ul>\n<li><strong>Serverless scaling<\/strong>: Azure Functions automatically scales out\nbased on workload. Workflow executions scale independently without\nmanaging infrastructure.<\/li>\n<li><strong>Built-in HTTP endpoints<\/strong>: Each registered workflow gets an HTTP\ntrigger automatically. No need to write controllers or routing logic.<\/li>\n<li><strong>MCP tool support<\/strong>: Workflows can be exposed as\n<a href=\"https:\/\/modelcontextprotocol.io\/\">MCP (Model Context Protocol)<\/a> tools\nwith a single flag, making them discoverable by AI agents and other\nMCP-compatible clients.<\/li>\n<li><strong>Durable Task Scheduler integration<\/strong>: Workflow state is persisted\nand managed by the Durable Task Scheduler, providing reliability,\nobservability, and cross-process coordination.<\/li>\n<li><strong>Zero boilerplate<\/strong>: The hosting package generates orchestrator\nfunctions, activity functions, and entity functions from the workflow\ndefinition. 
The only code you write is the executors and the workflow\ngraph.<\/li>\n<\/ul>\n<h3>Hosting a Workflow<\/h3>\n<p>Here&#8217;s a complete <code>Program.cs<\/code> for a Functions app that hosts the\nCancelOrder workflow:<\/p>\n<pre><code class=\"language-csharp\">using Microsoft.Agents.AI.Hosting.AzureFunctions;\r\nusing Microsoft.Agents.AI.Workflows;\r\nusing Microsoft.Azure.Functions.Worker.Builder;\r\nusing Microsoft.Extensions.Hosting;\r\n\r\nOrderLookup orderLookup = new();\r\nOrderCancel orderCancel = new();\r\nSendEmail sendEmail = new();\r\n\r\nWorkflow cancelOrder = new WorkflowBuilder(orderLookup)\r\n    .WithName(\"CancelOrder\")\r\n    .WithDescription(\"Cancel an order and notify the customer\")\r\n    .AddEdge(orderLookup, orderCancel)\r\n    .AddEdge(orderCancel, sendEmail)\r\n    .Build();\r\n\r\nusing IHost app = FunctionsApplication\r\n    .CreateBuilder(args)\r\n    .ConfigureFunctionsWebApplication()\r\n    .ConfigureDurableWorkflows(workflows =&gt; workflows.AddWorkflow(cancelOrder))\r\n    .Build();\r\n\r\napp.Run();<\/code><\/pre>\n<p>The <code>.ConfigureDurableWorkflows()<\/code> extension method is the single call\nthat bridges your workflow to the Azure Functions runtime. 
Behind the\nscenes, the hosting layer automatically maps your workflow concepts to\nDurable Functions primitives:<\/p>\n<ul>\n<li><strong>Your workflow becomes an orchestrator function<\/strong>: the workflow\ngraph is translated into a durable orchestration<\/li>\n<li><strong>Each executor becomes an activity function<\/strong>: executors are\nwrapped as durable activities with automatic retry, checkpointing,\nand fault tolerance<\/li>\n<li>An HTTP trigger is generated to start the workflow:\n<code>POST \/api\/workflows\/CancelOrder\/run<\/code><\/li>\n<\/ul>\n<p>Because these are standard Azure Functions under the hood, you\nautomatically get all the platform benefits: auto-scaling,\nscale-to-zero when idle (so you only pay for actual compute), built-in\nmonitoring via Application Insights, distributed tracing, and the full\nAzure Functions diagnostics tooling. No extra infrastructure code needed.<\/p>\n<p>You can register multiple workflows in a single Functions app, and\nexecutors can be shared across workflows:<\/p>\n<pre><code class=\"language-csharp\">.ConfigureDurableWorkflows(workflows =&gt;\r\n    workflows.AddWorkflows(cancelOrder, orderStatus, batchProcess))<\/code><\/pre>\n<h3>Invoking the Workflow<\/h3>\n<p>Once the Functions app is running, trigger the workflow with a simple\nHTTP request:<\/p>\n<pre><code class=\"language-http\">POST http:\/\/localhost:7071\/api\/workflows\/CancelOrder\/run\r\nContent-Type: text\/plain\r\n\r\n12345<\/code><\/pre>\n<p>This starts the orchestration asynchronously and returns a <code>202 Accepted<\/code>\nresponse with a run ID. The workflow then executes durably in the\nbackground. 
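<\/p>\n<p>The same request can be issued from any HTTP client. Here&#8217;s a small C# sketch, mirroring the raw-text body used above (port 7071 is the local Functions default):<\/p>\n<pre><code class=\"language-csharp\">using System.Net.Http;\r\nusing System.Text;\r\n\r\nusing HttpClient client = new();\r\n\r\n\/\/ Starts the orchestration; expect a 202 Accepted with a run ID.\r\nHttpResponseMessage response = await client.PostAsync(\r\n    \"http:\/\/localhost:7071\/api\/workflows\/CancelOrder\/run\",\r\n    new StringContent(\"12345\", Encoding.UTF8, \"text\/plain\"));\r\n\r\nConsole.WriteLine(response.StatusCode);\r\nConsole.WriteLine(await response.Content.ReadAsStringAsync());<\/code><\/pre>\n<p>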
If you want the request to wait and return the result\nsynchronously, add the <code>x-ms-wait-for-response: true<\/code> header:<\/p>\n<pre><code class=\"language-http\">POST http:\/\/localhost:7071\/api\/workflows\/CancelOrder\/run\r\nContent-Type: text\/plain\r\nx-ms-wait-for-response: true\r\n\r\n12345<\/code><\/pre>\n<h2>Human-in-the-Loop<\/h2>\n<p>The <strong>human-in-the-loop<\/strong> pattern pauses a workflow to wait for external\napproval or input before continuing. In MAF, this is modeled using\n<code>RequestPort<\/code>.<\/p>\n<p>A <code>RequestPort<\/code> acts like an executor in the graph, but instead of\nprocessing data automatically, it pauses the orchestration and waits for\nan external response. When hosted on Azure Functions, the framework\nauto-generates HTTP endpoints for checking pending requests and submitting\nresponses.<\/p>\n<p>Here&#8217;s an expense reimbursement workflow with a manager approval gate\nfollowed by two parallel finance approvals:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2026\/05\/hitl-workflow-diagram.svg\" alt=\"Human-in-the-Loop expense reimbursement workflow diagram\" \/><\/p>\n<pre><code class=\"language-csharp\">CreateApprovalRequest createRequest = new();\r\nRequestPort&lt;ApprovalRequest, ApprovalResponse&gt; managerApproval =\r\n    RequestPort.Create&lt;ApprovalRequest, ApprovalResponse&gt;(\r\n        \"ManagerApproval\");\r\nPrepareFinanceReview prepareFinanceReview = new();\r\nRequestPort&lt;ApprovalRequest, ApprovalResponse&gt; budgetApproval =\r\n    RequestPort.Create&lt;ApprovalRequest, ApprovalResponse&gt;(\r\n        \"BudgetApproval\");\r\nRequestPort&lt;ApprovalRequest, ApprovalResponse&gt; complianceApproval =\r\n    RequestPort.Create&lt;ApprovalRequest, ApprovalResponse&gt;(\r\n        \"ComplianceApproval\");\r\nExpenseReimburse reimburse = new();\r\n\r\nWorkflow expenseApproval = new WorkflowBuilder(createRequest)\r\n    
.WithName(\"ExpenseReimbursement\")\r\n    .WithDescription(\r\n        \"Expense reimbursement with manager and finance approvals\")\r\n    .AddEdge(createRequest, managerApproval)\r\n    .AddEdge(managerApproval, prepareFinanceReview)\r\n    .AddFanOutEdge(prepareFinanceReview,\r\n        [budgetApproval, complianceApproval])\r\n    .AddFanInBarrierEdge(\r\n        [budgetApproval, complianceApproval], reimburse)\r\n    .Build();<\/code><\/pre>\n<p>When hosted on Azure Functions, the runtime automatically generates the\nfollowing HTTP endpoints for this workflow:<\/p>\n<table>\n<thead>\n<tr>\n<th>Method<\/th>\n<th>Endpoint<\/th>\n<th>When generated<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>POST<\/td>\n<td><code>\/api\/workflows\/ExpenseReimbursement\/run<\/code><\/td>\n<td>Always, for every registered workflow<\/td>\n<\/tr>\n<tr>\n<td>POST<\/td>\n<td><code>\/api\/workflows\/ExpenseReimbursement\/respond\/{runId}<\/code><\/td>\n<td>Automatically, when the workflow contains <code>RequestPort<\/code> nodes<\/td>\n<\/tr>\n<tr>\n<td>GET<\/td>\n<td><code>\/api\/workflows\/ExpenseReimbursement\/status\/{runId}<\/code><\/td>\n<td>Opt-in via <code>exposeStatusEndpoint: true<\/code><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>External systems (or humans via a UI) can call the status endpoint to\nsee which approvals are pending, then POST a response to unblock the\nworkflow:<\/p>\n<pre><code class=\"language-http\">POST http:\/\/localhost:7071\/api\/workflows\/ExpenseReimbursement\/respond\/{runId}\r\nContent-Type: application\/json\r\n\r\n{\r\n  \"eventName\": \"ManagerApproval\",\r\n  \"response\": { \"approved\": true, \"comments\": \"Looks good\" }\r\n}<\/code><\/pre>\n<h2>Exposing Workflows as MCP Tools<\/h2>\n<p>With the <code>exposeMcpToolTrigger: true<\/code> option, your workflows become\ncallable as MCP tools. 
Other AI agents or MCP-compatible clients can\ndiscover and invoke your workflows:<\/p>\n<pre><code class=\"language-csharp\">.ConfigureDurableWorkflows(workflows =&gt;\r\n{\r\n    workflows.AddWorkflow(orderLookupWorkflow,\r\n        exposeStatusEndpoint: false,\r\n        exposeMcpToolTrigger: true);\r\n})<\/code><\/pre>\n<p>The Functions host generates a remote MCP endpoint at\n<code>\/runtime\/webhooks\/mcp<\/code> with a tool for each registered workflow. The\nworkflow&#8217;s <code>.WithName()<\/code> and <code>.WithDescription()<\/code> are used as the MCP\ntool name and description. Once exposed, any MCP client can connect to\nyour Functions app and use these workflows as tools. This includes AI\nagents built with other frameworks, IDE extensions like GitHub Copilot,\nand any other MCP-compatible client.<\/p>\n<h2>More Workflow Patterns<\/h2>\n<p>The workflow programming model supports several additional patterns that\nwork with both in-process and durable execution:<\/p>\n<h3>Conditional Routing<\/h3>\n<p>Use <code>AddSwitch<\/code> to route messages to different executors based on the\noutput of a previous step:<\/p>\n<pre><code class=\"language-csharp\">builder.AddSwitch(spamDetector, switchBuilder =&gt;\r\n    switchBuilder\r\n        .AddCase(\r\n            result =&gt; result is DetectionResult r\r\n                &amp;&amp; r.Decision == SpamDecision.NotSpam,\r\n            emailAssistant)\r\n        .AddCase(\r\n            result =&gt; result is DetectionResult r\r\n                &amp;&amp; r.Decision == SpamDecision.Spam,\r\n            handleSpam)\r\n        .WithDefault(handleUncertain));<\/code><\/pre>\n<h3>Shared State<\/h3>\n<p>Executors can share data through scoped key-value state, useful when\nparallel executors need access to the same source data:<\/p>\n<pre><code class=\"language-csharp\">\/\/ Write to shared state in one executor\r\nawait context.QueueStateUpdateAsync(\r\n    fileID, fileContent,\r\n    scopeName: 
\"FileContentState\", cancellationToken);\r\n\r\n\/\/ Read from shared state in another executor\r\nvar fileContent = await context.ReadStateAsync&lt;string&gt;(\r\n    message, scopeName: \"FileContentState\", cancellationToken);<\/code><\/pre>\n<h3>Sub-Workflows<\/h3>\n<p>Embed a workflow as an executor inside another workflow for modular,\nhierarchical architectures:<\/p>\n<pre><code class=\"language-csharp\">var subWorkflow = new WorkflowBuilder(uppercase)\r\n    .AddEdge(uppercase, reverse)\r\n    .AddEdge(reverse, append)\r\n    .WithOutputFrom(append)\r\n    .Build();\r\n\r\nExecutorBinding subWorkflowExecutor =\r\n    subWorkflow.BindAsExecutor(\"TextProcessing\");\r\n\r\nvar mainWorkflow = new WorkflowBuilder(prefix)\r\n    .AddEdge(prefix, subWorkflowExecutor)\r\n    .AddEdge(subWorkflowExecutor, postProcess)\r\n    .WithOutputFrom(postProcess)\r\n    .Build();<\/code><\/pre>\n<p>When running on the durable runtime, sub-workflows execute as\nsub-orchestrations with proper result propagation.<\/p>\n<h2>Wrapping Up<\/h2>\n<p>Durable workflows in the Microsoft Agent Framework bring together the\nflexibility of AI agent orchestration and the reliability of durable\nexecution. 
Starting from a simple console app with in-process execution,\nyou can progressively add durability, parallel AI agents, and cloud\nhosting, all while keeping the same workflow definition.<\/p>\n<p>Here are some useful links to get started:<\/p>\n<ul>\n<li><a href=\"https:\/\/github.com\/microsoft\/agent-framework\">Microsoft Agent Framework on GitHub<\/a><\/li>\n<li><a href=\"https:\/\/github.com\/microsoft\/agent-framework\/tree\/main\/dotnet\/samples\/03-workflows\">Workflow samples<\/a><\/li>\n<li><a href=\"https:\/\/github.com\/microsoft\/agent-framework\/tree\/main\/dotnet\/samples\/04-hosting\/DurableWorkflows\/AzureFunctions\">Azure Functions hosting samples<\/a><\/li>\n<li><a href=\"https:\/\/www.nuget.org\/packages\/Microsoft.Agents.AI.DurableTask\">Microsoft.Agents.AI.DurableTask on NuGet<\/a><\/li>\n<li><a href=\"https:\/\/www.nuget.org\/packages\/Microsoft.Agents.AI.Hosting.AzureFunctions\">Microsoft.Agents.AI.Hosting.AzureFunctions on NuGet<\/a><\/li>\n<\/ul>\n<p>We&#8217;d love to hear what you build! Share your feedback or file issues on\nthe <a href=\"https:\/\/github.com\/microsoft\/agent-framework\">GitHub repo<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Build durable AI agent workflows with the Microsoft Agent Framework. 
Start with in-process console apps, add durability with the Durable Task runtime, scale with parallel AI agents, and host on Azure Functions for serverless execution.<\/p>\n","protected":false},"author":211774,"featured_media":60056,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[685,7781,327],"tags":[4,8074,568,7537,8159,8158,8160],"class_list":["post-60055","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-dotnet","category-ai","category-azure","tag-net","tag-agents","tag-ai","tag-azure-functions","tag-durable-task","tag-microsoft-agent-framework","tag-workflows"],"acf":[],"blog_post_summary":"<p>Build durable AI agent workflows with the Microsoft Agent Framework. Start with in-process console apps, add durability with the Durable Task runtime, scale with parallel AI agents, and host on Azure Functions for serverless execution.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/60055","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/users\/211774"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/comments?post=60055"}],"version-history":[{"count":1,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/60055\/revisions"}],"predecessor-version":[{"id":60057,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/60055\/revisions\/60057"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/media\/60056"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\
/wp-json\/wp\/v2\/media?parent=60055"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/categories?post=60055"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/tags?post=60055"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}