{"id":52761,"date":"2024-07-22T10:05:00","date_gmt":"2024-07-22T17:05:00","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/dotnet\/?p=52761"},"modified":"2024-07-22T10:05:00","modified_gmt":"2024-07-22T17:05:00","slug":"add-ai-to-your-dotnet-apps-easily-with-prompty","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/dotnet\/add-ai-to-your-dotnet-apps-easily-with-prompty\/","title":{"rendered":"Add AI to Your .NET Apps Easily with Prompty"},"content":{"rendered":"<p>Adding AI features to .NET applications is a new and exciting experience. In this blog post, we will explore Prompty and how to use it to integrate a Large Language Model, like GPT-4o, into your development flow and .NET applications.<\/p>\n<h2>Introduction to Prompty<\/h2>\n<p>As AI enthusiasts and .NET developers, we constantly seek tools that simplify our workflows and enhance our productivity. One such powerful tool is Prompty, a Visual Studio Code extension designed to facilitate the integration of Large Language Models (LLMs) like GPT-4o into your applications. Prompty provides an intuitive interface to interact with LLMs directly from your development environment, making it easier than ever to add AI features to your projects.<\/p>\n<p>Prompty is developed by Microsoft and is available for free on the Visual Studio Code marketplace. Whether you are building chatbots, creating content generators, or experimenting with other AI-driven applications, Prompty can significantly streamline your development process.<\/p>\n<h2>A Typical Developer Flow with Prompty in Visual Studio Code<\/h2>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2024\/07\/11prompty-venn.png\" alt=\"What is prompty\" \/><\/p>\n<p>Let&#8217;s take a look at a sample flow for using Prompty. 
This process usually involves a few key steps:<\/p>\n<ol>\n<li>\n<p><strong>Installation<\/strong>: Begin by installing the Prompty extension from the <a href=\"https:\/\/marketplace.visualstudio.com\/items?itemName=ms-toolsai.prompty\">Visual Studio Code Marketplace<\/a>.<\/p>\n<\/li>\n<li>\n<p><strong>Setup<\/strong>: After installation, configure the extension by providing your API keys and setting up the necessary parameters to connect to the LLM of your choice, such as GPT-4o.<\/p>\n<\/li>\n<li>\n<p><strong>Integration<\/strong>: Prompty integrates seamlessly with your development workflow. Start by creating a new file or opening an existing one where you want to use the LLM. Prompty provides commands and snippets to easily insert prompts and handle responses.<\/p>\n<\/li>\n<li>\n<p><strong>Development<\/strong>: Write prompts directly in your codebase to interact with the LLM. Prompty supports various prompt formats and provides syntax highlighting to make your prompts readable and maintainable.<\/p>\n<p>Once your prompt is ready, you can use the extension to generate code snippets, create documentation, or even debug your applications by asking the LLM specific questions.<\/p>\n<\/li>\n<li>\n<p><strong>Testing<\/strong>: Test your prompts and adjust them as needed to get the desired responses from the LLM. 
Prompty allows you to iterate quickly, refining your prompts to improve the accuracy and relevance of the AI&#8217;s responses.<\/p>\n<\/li>\n<\/ol>\n<h2>Real Sample Using a WebAPI Application<\/h2>\n<p>Let&#8217;s walk through a practical example of using Prompty in a .NET WebAPI application.<\/p>\n<h3>Step 1: Set up the WebAPI Project<\/h3>\n<p>First, create a new WebAPI project, named <code>PromptyWebAPI<\/code>, using the .NET CLI:<\/p>\n<pre><code class=\"language-bash\">dotnet new webapi -n PromptyWebAPI\ncd PromptyWebAPI<\/code><\/pre>\n<p>Add the following dependencies:<\/p>\n<pre><code class=\"language-bash\">dotnet add package Microsoft.SemanticKernel --version 1.15.1\ndotnet add package Microsoft.SemanticKernel.Prompty --version 1.15.1-alpha\ndotnet add package Microsoft.Extensions.Configuration.UserSecrets --version 8.0.0<\/code><\/pre>\n<p>Run the project directly from Visual Studio Code or with the command:<\/p>\n<pre><code class=\"language-bash\">dotnet run<\/code><\/pre>\n<p>We will see the standard Weather Forecast API endpoint.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2024\/07\/20WebAPIRun.png\" alt=\"WebAPI Project running in the browser\" \/><\/p>\n<p><strong>Note:<\/strong> The WebAPI project uses User Secrets to access the GPT-4o Azure OpenAI Model. Set up the user secrets with the following commands:<\/p>\n<pre><code class=\"language-bash\">dotnet user-secrets init\ndotnet user-secrets set \"AZURE_OPENAI_MODEL\" \"&lt; model &gt;\"\ndotnet user-secrets set \"AZURE_OPENAI_ENDPOINT\" \"&lt; endpoint &gt;\"\ndotnet user-secrets set \"AZURE_OPENAI_APIKEY\" \"&lt; api key &gt;\"<\/code><\/pre>\n<h3>Step 2: Create a prompty file for a more descriptive forecast summary<\/h3>\n<p>The weather forecast returns a fictitious forecast including dates, temperatures (C and F), and a summary. 
Our goal is to create a prompt that generates a more detailed summary.<\/p>\n<p>Let&#8217;s add a new prompty file to the root of the <code>PromptyWebAPI<\/code> folder. This is as easy as a <code>right-click<\/code> and selecting <code>New Prompty<\/code>. Rename the created file to <code>weatherforecastdesc.prompty<\/code>.<\/p>\n<p>Our solution should look like this:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2024\/07\/25NewPrompty.png\" alt=\"Solution explorer view showing the prompty file added to the solution\" \/><\/p>\n<p>Now it&#8217;s time to complete the sections of our prompty file. Each section holds specific information related to the use of the LLM. For example, the <strong>model section<\/strong> defines the model to be used, the <strong>sample section<\/strong> provides a sample of the expected output, and finally we have the prompt itself.<\/p>\n<p>In the following sample, the prompt defines a <strong>System message<\/strong>, and in the <strong>Context<\/strong>, we provide the parameters for the weather.<\/p>\n<p>Replace the content of your prompty file with the following:<\/p>\n<pre><code class=\"language-yml\">---\nname: generate_weather_detailed_description\ndescription: A prompt that generates a detailed description for a weather forecast\nauthors:\n  - Bruno Capuano\nmodel:\n  api: chat\n  configuration:\n    type: azure_openai\n    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}\n    azure_deployment: ${env:AZURE_OPENAI_MODEL}\n  parameters:\n    max_tokens: 3000\nsample:\n  today: &gt; \n    2024-07-16\n\n  date: &gt; \n    2024-07-17\n\n  forecastTemperatureC: &gt;\n    25\u00b0C\n---\n\n# System:\nYou are an AI assistant who generates detailed weather forecast descriptions. 
The detailed description is a paragraph long.\nYou use the full description of the date, including the weekday.\nYou also give a reference to the forecast compared to the current date today.\nAs the assistant, you generate descriptions using a funny style and even add some personal flair with appropriate emojis.\n\n# Context\nUse the following context to generate a detailed weather forecast description:\n- Today: {{today}}\n- Date: {{date}}\n- TemperatureC: {{forecastTemperatureC}}<\/code><\/pre>\n<p>Once our prompty is ready, we can test the prompt by pressing <strong>F5<\/strong>. When we run our prompty, we should see the results in the output window:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2024\/07\/30TestPrompty.png\" alt=\"Testing prompty with the results directly in the Output window\" \/><\/p>\n<p>This is the moment to start refining our prompt!<\/p>\n<p><em><strong>Bonus:<\/strong> Right-clicking the prompty file also allows generating C# code that uses Semantic Kernel to run the current file.<\/em><\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/dotnet\/wp-content\/uploads\/sites\/10\/2024\/07\/27GenSKCode.png\" alt=\"Generate Semantic Kernel code from a prompty file\" \/><\/p>\n<p><em><strong>Note:<\/strong> Before doing this, we need to provide the necessary information about the LLM to be used by prompty. Follow the extension configuration to do this. The easiest way is to create a <code>.env<\/code> file with the LLM information.<\/em><\/p>\n<h3>Step 3: Update the endpoint using prompty for the forecast summary<\/h3>\n<p>Now we can use this prompty file with Semantic Kernel directly in our project. 
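As a reference for the earlier note about the extension configuration, a minimal `.env` sketch could look like the following. The endpoint and deployment variable names match the `${env:...}` placeholders used in the prompty file above; the values, and whether your setup needs the API key entry, are placeholder assumptions you must adapt to your own Azure OpenAI resource:

```ini
# Hypothetical .env file at the project root for the Prompty extension.
# All values are placeholders - replace them with your own deployment details.
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_MODEL=gpt-4o
AZURE_OPENAI_APIKEY=your-api-key
```

Remember to keep this file out of source control (for example, by adding `.env` to `.gitignore`), since it may contain secrets.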
Let&#8217;s edit the <code>program.cs<\/code> file and apply the following changes:<\/p>\n<ul>\n<li>Add the necessary <code>using<\/code> directives to the top of the file.<\/li>\n<li>Create a Semantic Kernel instance to generate the forecast summaries.<\/li>\n<li>\n<p>Add the new forecast summary to the forecast result.<\/p>\n<p>To generate the detailed summary, Semantic Kernel will use the prompty file and the weather information.<\/p>\n<\/li>\n<\/ul>\n<pre><code class=\"language-csharp\">using Microsoft.SemanticKernel;\n\nvar builder = WebApplication.CreateBuilder(args);\n\n\/\/ Add services to the container.\n\/\/ Learn more about configuring Swagger\/OpenAPI at https:\/\/aka.ms\/aspnetcore\/swashbuckle\nbuilder.Services.AddEndpointsApiExplorer();\nbuilder.Services.AddSwaggerGen();\n\n\/\/ Azure OpenAI keys\nvar config = new ConfigurationBuilder().AddUserSecrets&lt;Program&gt;().Build();\nvar deploymentName = config[\"AZURE_OPENAI_MODEL\"];\nvar endpoint = config[\"AZURE_OPENAI_ENDPOINT\"];\nvar apiKey = config[\"AZURE_OPENAI_APIKEY\"];\n\n\/\/ Create a chat completion service\nbuilder.Services.AddKernel();\nbuilder.Services.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);\n\nvar app = builder.Build();\n\n\/\/ Configure the HTTP request pipeline.\nif (app.Environment.IsDevelopment())\n{\n    app.UseSwagger();\n    app.UseSwaggerUI();\n}\n\napp.UseHttpsRedirection();\n\napp.MapGet(\"\/weatherforecast\", async (HttpContext context, Kernel kernel) =&gt;\n{\n    var forecast = new List&lt;WeatherForecast&gt;();\n    for (int i = 0; i &lt; 3; i++)\n    {\n        var forecastDate = DateOnly.FromDateTime(DateTime.Now.AddDays(i));\n        var forecastTemperature = Random.Shared.Next(-20, 55);\n\n        var weatherFunc = kernel.CreateFunctionFromPromptyFile(\"weatherforecastdesc.prompty\");\n        var forecastSummary = await weatherFunc.InvokeAsync&lt;string&gt;(kernel, new()\n        {\n            { \"today\", $\"{DateOnly.FromDateTime(DateTime.Now)}\" },\n            { \"date\", 
$\"{forecastDate}\" },\n            { \"forecastTemperatureC\", $\"{forecastTemperature}\" }\n        });\n\n        forecast.Add(new WeatherForecast(forecastDate, forecastTemperature, forecastSummary));\n    }\n    return forecast;\n})\n.WithName(\"GetWeatherForecast\")\n.WithOpenApi();\n\napp.Run();\n\nrecord WeatherForecast(DateOnly Date, int TemperatureC, string? Summary)\n{\n    public int TemperatureF =&gt; 32 + (int)(TemperatureC \/ 0.5556);\n}<\/code><\/pre>\n<p>When we test the <code>\/weatherforecast<\/code> endpoint again, the output should include more detailed summaries. The following example includes the current date (Jul-16) and two more days:<\/p>\n<p><em><strong>Note:<\/strong> These are randomly generated temperatures. I&#8217;m not sure about a temperature change from -4C\/25F to 45C\/112F in a single day.<\/em><\/p>\n<pre><code class=\"language-json\">[\n  {\n    \"date\": \"2024-07-16\",\n    \"temperatureC\": -4,\n    \"summary\": \"\ud83c\udf2c\ufe0f\u2744\ufe0f Happy Tuesday, July 16th, 2024, folks! Guess what? Today\u2019s weather forecast is brought to you by the Frozen Frappuccino Club, because it\u2019s a chilly one! With a temperature of -4\u00b0C, it\u2019s colder than a snowman\u2019s nose out there! \ud83e\udd76 So, get ready to channel your inner penguin and waddle through the frosty air. Remember to layer up with your snuggiest sweaters and warmest scarves, or you might just turn into an icicle! Compared to good old yesterday, well... there\u2019s not much change, because yesterday was just as brrrrr-tastic. Stay warm, my friends, and maybe keep a hot chocolate handy for emergencies! \u2615\u26c4\",\n    \"temperatureF\": 25\n  },\n  {\n    \"date\": \"2024-07-17\",\n    \"temperatureC\": 45,\n    \"summary\": \"\ud83c\udf1e\ud83d\udd25 Well, buckle up, buttercup, because *Wednesday, July 17, 2024*, is coming in hot! If you thought today was toasty, wait until you get a load of tomorrow. 
With a sizzling temperature of 45\u00b0C, it's like Mother Nature cranked the thermostat up to \\\"sauna mode.\\\" \ud83c\udf21\ufe0f Don't even think about wearing dark colors or stepping outside without some serious SPF and hydration on standby! Maybe it's a good day to try frying an egg on the sidewalk for breakfast\u2014just kidding, or am I? \ud83e\udd75 Anyway, stay cool, find a shady spot, and keep your ice cream close; you're gonna need it! \ud83c\udf66\",\n    \"temperatureF\": 112\n  },\n  {\n    \"date\": \"2024-07-18\",\n    \"temperatureC\": 35,\n    \"summary\": \"Ladies and gentlemen, fasten your seatbelts and hold onto your hats\u2014it\u2019s going to be a sizzling ride! \ud83d\udd76\ufe0f\ud83c\udf1e On Thursday, July 18, 2024, just two days from today, Mother Nature cranks up the heat like she\u2019s trying to turn the entire planet into a giant summer barbecue. \ud83c\udf21\ufe0f\ud83d\udd25 With the temperature shooting up to a toasty 35\u00b0C, it\u2019s the perfect day to channel your inner popsicle in front of the A\/C. Water fights, ice cream sundaes, and epic pool floats are all highly recommended survival strategies. And remember, folks, sunscreen is your best friend\u2014don't be caught out there lookin\u2019 like a lobster in a sauna! \ud83e\udd9e\u2600\ufe0f So get ready to sweat but with style, as we dive headfirst into the fantastic scorching adventure that Thursday promises to be! \ud83d\ude0e\ud83c\udf67\ud83c\udf34\",\n    \"temperatureF\": 94\n  }\n]<\/code><\/pre>\n<h2>Summary<\/h2>\n<p>Prompty offers .NET developers an efficient way to integrate AI capabilities into their applications. 
By using this Visual Studio Code extension, developers can effortlessly incorporate GPT-4o and other Large Language Models into their workflows.<\/p>\n<p>Prompty and Semantic Kernel simplify the process of generating code snippets, creating documentation, and debugging applications with an AI-driven approach.<\/p>\n<p>To learn more about Prompty and explore its features, visit the <strong><a href=\"https:\/\/prompty.ai\/\">Prompty main page<\/a><\/strong>, check out the <strong><a href=\"https:\/\/marketplace.visualstudio.com\/items?itemName=ms-toolsai.prompty\">Prompty Visual Studio Code extension<\/a><\/strong>, dive into the <strong><a href=\"https:\/\/github.com\/microsoft\/prompty\">Prompty source code on GitHub<\/a><\/strong>, or watch the Build session <strong><a href=\"https:\/\/www.youtube.com\/watch?v=HALMFU7o9Gc\">Practical End-to-End AI Development using Prompty and AI Studio | BRK114<\/a><\/strong>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Learn how to integrate AI into your .NET applications with Prompty, a powerful Visual Studio Code extension.<\/p>\n","protected":false},"author":120281,"featured_media":52762,"comment_status":"open","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[685,7781,756,688],"tags":[568,7726,7724],"class_list":["post-52761","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-dotnet","category-ai","category-csharp","category-machine-learning","tag-ai","tag-azure-openai","tag-openai"],"acf":[],"blog_post_summary":"<p>Learn how to integrate AI into your .NET applications with Prompty, a powerful Visual Studio Code 
extension.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/52761","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/users\/120281"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/comments?post=52761"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/posts\/52761\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/media\/52762"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/media?parent=52761"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/categories?post=52761"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/dotnet\/wp-json\/wp\/v2\/tags?post=52761"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}