{"id":55409,"date":"2025-01-31T12:51:45","date_gmt":"2025-01-31T20:51:45","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/dotnet\/?p=55409"},"modified":"2025-01-31T12:51:45","modified_gmt":"2025-01-31T20:51:45","slug":"start-building-an-intelligent-app-with-dotnet-and-deep-seek","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/dotnet\/start-building-an-intelligent-app-with-dotnet-and-deep-seek\/","title":{"rendered":"Build Intelligent Apps with .NET and DeepSeek R1 Today!"},"content":{"rendered":"<p>The DeepSeek R1 model has been gaining a ton of attention lately. And one of the questions we&#8217;ve been getting asked is: &#8220;Can I use DeepSeek in my .NET applications?&#8221; The answer is absolutely! I&#8217;m going to walk you through how to use the <strong><a href=\"https:\/\/learn.microsoft.com\/dotnet\/ai\/ai-extensions\">Microsoft.Extensions.AI<\/a><\/strong> (MEAI) library with DeepSeek R1 on GitHub Models so you can start experimenting with the R1 model today.<\/p>\n<h2>MEAI makes using AI services easy<\/h2>\n<p>The MEAI library provides a set of unified abstractions and middleware to simplify the integration of AI services into .NET applications.<\/p>\n<p>In other words, if you develop your application with MEAI, your code will use the same APIs no matter which model you decide to use &#8220;under the covers&#8221;. This lowers the friction of building a .NET AI application: you only have to learn a single library&#8217;s (MEAI&#8217;s) way of doing things, regardless of which AI service you use.<\/p>\n<p>And for MEAI, the main interface you&#8217;ll use is <code>IChatClient<\/code>.<\/p>\n<h3>Let&#8217;s chat with DeepSeek R1<\/h3>\n<p>GitHub Models allows you to experiment with a ton of different AI models without having to worry about hosting. It&#8217;s a great way to get started in your AI development journey for free. 
And GitHub Models gets updated with new models all the time, <a href=\"https:\/\/azure.microsoft.com\/blog\/deepseek-r1-is-now-available-on-azure-ai-foundry-and-github\/\">like DeepSeek&#8217;s R1<\/a>.<\/p>\n<p><a href=\"https:\/\/azure.microsoft.com\/blog\/deepseek-r1-is-now-available-on-azure-ai-foundry-and-github\/\"><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2025\/01\/gh-models.jpg\" alt=\"Announcement that DeepSeek R1 is available on GitHub Models\" \/><\/a><\/p>\n<p>The demo app we&#8217;re going to build is a simple console application and it&#8217;s available on <a href=\"https:\/\/github.com\/codemillmatt\/deepseek-dotnet\">GitHub at codemillmatt\/deepseek-dotnet<\/a>. You can clone or fork it to follow along, but we&#8217;ll talk through the important pieces below too.<\/p>\n<p>First let&#8217;s take care of some prerequisites:<\/p>\n<ol>\n<li>Head on over to GitHub and generate a personal access token (PAT). This will be your key for GitHub Models access. <a href=\"https:\/\/docs.github.com\/en\/authentication\/keeping-your-account-and-data-secure\/managing-your-personal-access-tokens#creating-a-personal-access-token-classic\">Follow these instructions to create the PAT<\/a>. You will want a <em>classic<\/em> token.<\/li>\n<li>Open the <strong>DeepSeek.Console.GHModels<\/strong> project. You can either open the full solution in Visual Studio or just the project folder if using VS Code.<\/li>\n<li>Create a new user secrets entry for the GitHub PAT. Name it <strong>GH_TOKEN<\/strong> and paste in the PAT you generated as the value.<\/li>\n<\/ol>\n<p>Now let&#8217;s explore the code a bit:<\/p>\n<ol>\n<li>Open the <strong>Program.cs<\/strong> file in the <strong>DeepSeek.Console.GHModels<\/strong> project.<\/li>\n<li>The first 2 things to notice are where we initialize the <code>modelEndpoint<\/code> and <code>modelName<\/code> variables. 
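<p>In the sample those two variables look something like this (the endpoint URL and model name here are assumptions based on GitHub Models&#8217; documentation &#8211; double-check them against the sample repository):<\/p>\n<pre><code class=\"language-csharp\">\/\/ The GitHub Models inference endpoint and the DeepSeek R1 model name\r\nvar modelEndpoint = new Uri(\"https:\/\/models.inference.ai.azure.com\");\r\nvar modelName = \"DeepSeek-R1\";<\/code><\/pre>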
These are standard for the GitHub Models service; they will always be the same.<\/li>\n<li>Now for the fun part! We&#8217;re going to initialize our chat client. This is where we&#8217;ll connect to the DeepSeek R1 model.\n<pre><code class=\"language-csharp\">IChatClient client = new ChatCompletionsClient(modelEndpoint, new AzureKeyCredential(Configuration[\"GH_TOKEN\"])).AsChatClient(modelName);<\/code><\/pre>\n<p>This uses the <strong>Microsoft.Extensions.AI.AzureAIInference<\/strong> package to connect to the GitHub Models service. But the <code>AsChatClient<\/code> function returns an <code>IChatClient<\/code> implementation. And that&#8217;s super cool. Because regardless of which model we chose from GitHub Models, we&#8217;d still write our application against the <code>IChatClient<\/code> interface!<\/p><\/li>\n<li>Next up we pass in our question, or prompt, to the model. And we&#8217;ll make sure we get a streaming response back; this way we can display it as it comes in.\n<pre><code class=\"language-csharp\">var response = client.CompleteStreamingAsync(question);\r\n\r\nawait foreach (var item in response)\r\n{\r\n    Console.Write(item);\r\n}<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>That&#8217;s it! Go ahead and run the project. It might take a few seconds to get the response back (lots of people are trying the model out!). You&#8217;ll notice the response isn&#8217;t like you&#8217;d see in a &#8220;normal&#8221; chat bot. DeepSeek R1 is a reasoning model, so it wants to figure out and reason through problems. The first part of the response will be its <em>reasoning<\/em>, delimited by <strong>&lt;think&gt;<\/strong> tags, and is quite interesting. The second part of the response will be the answer to the question you asked.<\/p>\n<p>Here&#8217;s a partial example of a response:<\/p>\n<pre><code class=\"language-md\">&lt;think&gt;\r\nOkay, let's try to figure this out. The problem says: If I have 3 apples and eat 2, how many bananas do I have? 
Hmm, at first glance, that seems a bit confusing. Let me break it down step by step.\r\n\r\nSo, the person starts with 3 apples. Then they eat 2 of them. That part is straightforward. If you eat 2 apples out of 3, you'd have 1 apple left, right? But then the question shifts to bananas. Wait, where did bananas come from? The original problem only mentions apples. There's no mention of bananas at all.\r\n\r\n...<\/code><\/pre>\n<h3>Do I have to use GitHub Models?<\/h3>\n<p>You&#8217;re not limited to running DeepSeek R1 on GitHub Models. You can run it on Azure or even locally (or on GitHub Codespaces) through Ollama. I&#8217;ve provided 2 additional console applications in the GitHub repository that show you how to do that.<\/p>\n<p>The biggest differences from the GitHub Models version are where the DeepSeek R1 model is deployed, the credentials you use to connect to it, and the specific model name.<\/p>\n<p>If you deploy on Azure AI Foundry, the code is exactly the same. Here are <a href=\"https:\/\/azure.microsoft.com\/blog\/deepseek-r1-is-now-available-on-azure-ai-foundry-and-github\/\">some instructions on how to deploy<\/a> the DeepSeek R1 model into Azure AI Foundry.<\/p>\n<p>If you want to run locally on Ollama, we&#8217;ve provided a devcontainer definition that you can use to run Ollama in Docker. It will automatically pull down a small parameter version of DeepSeek R1 and start it up for you. The only difference is you&#8217;ll use the <strong>Microsoft.Extensions.AI.Ollama<\/strong> NuGet package and initialize the <code>IChatClient<\/code> with <code>OllamaChatClient<\/code>. 
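Initialization looks something like this (the local endpoint and model tag are assumptions for a default Ollama install &#8211; adjust as needed):<\/p>\n<pre><code class=\"language-csharp\">\/\/ Point the client at the local Ollama server and the deepseek-r1 model\r\nIChatClient client = new OllamaChatClient(new Uri(\"http:\/\/localhost:11434\"), \"deepseek-r1\");<\/code><\/pre>\n<p>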
Interacting with DeepSeek R1 is the same.<\/p>\n<blockquote><p>Note: If you <a href=\"https:\/\/codespaces.new\/codemillmatt\/deepseek-dotnet\">run this in a GitHub Codespace<\/a>, it will take a couple of minutes to start up and you&#8217;ll use roughly 8GB of space &#8211; so be aware depending on your Codespace plan.<\/p><\/blockquote>\n<p>Of course, these are simple console applications. If you&#8217;re using .NET Aspire, it&#8217;s easy to use Ollama and DeepSeek R1. Thanks to the .NET Aspire Community Toolkit&#8217;s Ollama integration, all you need to do is add one line and you&#8217;re all set!<\/p>\n<pre><code class=\"language-csharp\">var chat = ollama.AddModel(\"chat\", \"deepseek-r1\");<\/code><\/pre>\n<p>Check out this <a href=\"https:\/\/devblogs.microsoft.com\/dotnet\/local-ai-models-with-dotnet-aspire\/\">blog post<\/a> with all the details on how to get going.<\/p>\n<h2>Summary<\/h2>\n<p>DeepSeek R1 is an exciting new reasoning model that&#8217;s drawing a lot of attention, and you can build .NET applications that make use of it today using the Microsoft.Extensions.AI library. GitHub Models lowers the friction of getting started and experimenting with it. 
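<\/p>\n<p>Putting the pieces together, the GitHub Models console app boils down to roughly this sketch (the package names, endpoint, and model name are assumptions based on the sample repository &#8211; treat it as a starting point, not the exact sample code):<\/p>\n<pre><code class=\"language-csharp\">\/\/ NuGet: Microsoft.Extensions.AI and Microsoft.Extensions.AI.AzureAIInference\r\nusing Azure;\r\nusing Azure.AI.Inference;\r\nusing Microsoft.Extensions.AI;\r\n\r\n\/\/ Standard GitHub Models endpoint plus the DeepSeek R1 model name\r\nvar modelEndpoint = new Uri(\"https:\/\/models.inference.ai.azure.com\");\r\nvar modelName = \"DeepSeek-R1\";\r\n\r\n\/\/ The GitHub PAT, read from an environment variable here (user secrets work too)\r\nvar ghToken = Environment.GetEnvironmentVariable(\"GH_TOKEN\");\r\n\r\nIChatClient client = new ChatCompletionsClient(modelEndpoint, new AzureKeyCredential(ghToken!)).AsChatClient(modelName);\r\n\r\n\/\/ Stream the reply so the &lt;think&gt; reasoning shows up as it arrives\r\nvar response = client.CompleteStreamingAsync(\"If I have 3 apples and eat 2, how many bananas do I have?\");\r\n\r\nawait foreach (var item in response)\r\n{\r\n    Console.Write(item);\r\n}<\/code><\/pre>\n<p>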
Go ahead and <a href=\"https:\/\/github.com\/codemillmatt\/deepseek-dotnet\">try out the samples<\/a> and check out our other <a href=\"https:\/\/github.com\/dotnet\/ai-samples\/tree\/main\/src\/microsoft-extensions-ai\">MEAI samples<\/a>!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Learn how to easily integrate DeepSeek R1 with .NET applications using the Microsoft.Extensions.AI library.<\/p>\n","protected":false},"author":569,"featured_media":55410,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[685,7781],"tags":[4,568,7897,7870],"class_list":["post-55409","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-dotnet","category-ai","tag-net","tag-ai","tag-deepseek","tag-meai"],"acf":[],"blog_post_summary":"<p>Learn how to easily integrate DeepSeek R1 with .NET applications using the Microsoft.Extensions.AI library.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/55409","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/users\/569"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/comments?post=55409"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/55409\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/media\/55410"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/media?parent=55409"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json
\/wp\/v2\/categories?post=55409"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/tags?post=55409"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}