{"id":189,"date":"2021-01-27T12:03:16","date_gmt":"2021-01-27T20:03:16","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/pax-windows\/?p=189"},"modified":"2021-02-02T09:33:51","modified_gmt":"2021-02-02T17:33:51","slug":"using-winml-in-net5","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/using-winml-in-net5\/","title":{"rendered":"Using WinML in .NET5"},"content":{"rendered":"<h2>WinML+.NET5<\/h2>\n<p><a href=\"https:\/\/docs.microsoft.com\/windows\/ai\/windows-ml\/\">WinML<\/a> is a high-performance, reliable API for deploying hardware-accelerated ML (Machine Learning) inferences on Windows devices. Since its introduction, many developers have used this technology to build UWP applications that leverage artificial intelligence. In this blog post, we&#8217;ll see how you can leverage WinML in a simple .NET5 console app.<\/p>\n<hr \/>\n<h2>.NET5 + WinRT<\/h2>\n<p>In my previous <a href=\"https:\/\/devblogs.microsoft.com\/pax-windows\/winui-3-preview-3\/#c-winrt\">blog post<\/a>, I briefly explained how C#\/WinRT works and how you can access WinRT APIs from a .NET5 app. If you need help understanding how C#\/WinRT helps you, I suggest you read this <a href=\"https:\/\/blogs.windows.com\/windowsdeveloper\/2020\/11\/10\/announcing-c-winrt-version-1-0-with-the-net-5-ga-release\/\">blog post<\/a>.<\/p>\n<h2>Show me the code!<\/h2>\n<p>Let&#8217;s begin with a simple .NET5 console project:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-NewProject.png\" alt=\"New Console App\" \/><\/p>\n<p>Give the project a name and a location. Since we&#8217;ll call WinRT APIs from this app, which are unique to Windows, let&#8217;s make this app work only on Windows. This can easily be achieved by changing the target framework of our project. 
Double click on the project in Solution Explorer, and you should see the source of our <code>csproj<\/code>:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-csproj.png\" alt=\"Existing csproj\" \/><\/p>\n<p>Depending on your Visual Studio version, the project template you just used might target <code>netcoreapp3.1<\/code>, so let&#8217;s update it to target .NET5 and to be Windows-specific. We also want this project to include the Windows SDK, which is now much simpler than it was with <code>netcore3.1<\/code>: we just use the right <a href=\"https:\/\/docs.microsoft.com\/dotnet\/standard\/frameworks\">target framework moniker (TFM)<\/a>. These are the supported TFMs:<\/p>\n<ul>\n<li><strong>net5.0-windows10.0.17763.0 (Windows 10, version 1809)<\/strong><\/li>\n<li><strong>net5.0-windows10.0.18362.0 (Windows 10, version 1903)<\/strong><\/li>\n<li><strong>net5.0-windows10.0.19041.0 (Windows 10, version 2004)<\/strong><\/li>\n<\/ul>\n<p>This will enable your .NET5 app to call WinRT APIs from that specific Windows 10 SDK. Since WinML is already supported on older versions of Windows, we don&#8217;t need to pick the latest version, so pick 1903 (18362):<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-UpdatedCsproj.png\" alt=\"Updated csproj targeting .NET5\" \/><\/p>\n<p>Doing that will block our .NET5 app from running on Linux, or even on Windows 7, since we are specifically stating that we support Windows 10 only.<\/p>\n<p>Now that we can call the WinML APIs, let&#8217;s add an ONNX model to our project. For this sample, let&#8217;s use SqueezeNet, which is a deep neural network for computer vision. 
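<\/p>\n<p>For reference, the updated project file from the screenshot above boils down to something like the following sketch (adjust the <code>TargetFramework<\/code> value if you picked a different TFM):<\/p>\n<pre><code class=\"xml\">&lt;Project Sdk=\"Microsoft.NET.Sdk\"&gt;\r\n\r\n  &lt;PropertyGroup&gt;\r\n    &lt;OutputType&gt;Exe&lt;\/OutputType&gt;\r\n    &lt;!-- Windows-specific TFM: exposes the WinRT APIs from the 1903 (18362) SDK --&gt;\r\n    &lt;TargetFramework&gt;net5.0-windows10.0.18362.0&lt;\/TargetFramework&gt;\r\n  &lt;\/PropertyGroup&gt;\r\n\r\n&lt;\/Project&gt;\r\n<\/code><\/pre>\n<p>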
You can download the model <a href=\"https:\/\/github.com\/onnx\/models\/blob\/master\/vision\/classification\/squeezenet\/model\/squeezenet1.0-9.onnx\">here<\/a>, and the labels <a href=\"https:\/\/github.com\/runwayml\/model-squeezenet\/blob\/master\/labels.json\">here<\/a>. Without the labels, there is no way to interpret the results, since the output of the model is just a bunch of numbers.<\/p>\n<p>Let&#8217;s add both files to our project and make sure to change their <code>Build Action<\/code> to <code>Content<\/code>, and set <code>Copy to Output Directory<\/code> to <code>Copy if newer<\/code>. This ensures that the model and the labels are deployed side-by-side with our <code>.exe<\/code>, so we can load them at runtime.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-ProjectExplorer.png\" alt=\"Added ONNX model and labels JSON to project and set their properties\" \/><\/p>\n<p>Now we can write our code to use WinML and load this model, as well as the JSON file with the labels. 
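<\/p>\n<p>If you prefer editing the <code>csproj<\/code> directly instead of using the Properties window, the equivalent entries look roughly like this (<code>PreserveNewest<\/code> corresponds to <code>Copy if newer<\/code>):<\/p>\n<pre><code class=\"xml\">&lt;ItemGroup&gt;\r\n  &lt;!-- Deploy the model and the labels next to the .exe --&gt;\r\n  &lt;Content Include=\"squeezenet1.0-9.onnx\" CopyToOutputDirectory=\"PreserveNewest\" \/&gt;\r\n  &lt;Content Include=\"Labels.json\" CopyToOutputDirectory=\"PreserveNewest\" \/&gt;\r\n&lt;\/ItemGroup&gt;\r\n<\/code><\/pre>\n<p>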
Let&#8217;s create a new file called <code>SqueezeNet.cs<\/code>:<\/p>\n<pre><code class=\"csharp\">using System;\r\nusing System.Threading.Tasks;\r\nusing Windows.AI.MachineLearning;\r\n\r\nnamespace ImageClassifier\r\n{\r\n    public sealed class SqueezeNetInput\r\n    {\r\n        public ImageFeatureValue data_0; \/\/ shape(1,3,224,224)\r\n    }\r\n\r\n    public sealed class SqueezeNetOutput\r\n    {\r\n        public TensorFloat softmaxout_1; \/\/ shape(1,1000,1,1)\r\n    }\r\n\r\n    public sealed class SqueezeNetModel\r\n    {\r\n        private LearningModel model;\r\n        private LearningModelSession session;\r\n        private LearningModelBinding binding;\r\n\r\n        public static SqueezeNetModel CreateFromFilePath(string filePath)\r\n        {\r\n            var learningModel = new SqueezeNetModel\r\n            {\r\n                model = LearningModel.LoadFromFilePath(filePath)\r\n            };\r\n            learningModel.session = new LearningModelSession(learningModel.model);\r\n            learningModel.binding = new LearningModelBinding(learningModel.session);\r\n            return learningModel;\r\n        }\r\n\r\n        public async Task&lt;SqueezeNetOutput&gt; EvaluateAsync(SqueezeNetInput input)\r\n        {\r\n            binding.Bind(\"data_0\", input.data_0);\r\n            var result = await session.EvaluateAsync(binding, \"0\");\r\n            var output = new SqueezeNetOutput\r\n            {\r\n                softmaxout_1 = result.Outputs[\"softmaxout_1\"] as TensorFloat\r\n            };\r\n            return output;\r\n        }\r\n    }\r\n}\r\n<\/code><\/pre>\n<p>This code defines three classes: one for the model&#8217;s input, one for its output, and one to load and run our model using the WinML APIs. 
This last class also has a handy <code>async<\/code> method, <code>EvaluateAsync<\/code>, that evaluates an input and returns its output.<\/p>\n<p>See how we are simply using the <code>Windows.AI.MachineLearning<\/code> namespace? That is provided by the specific TFM that we are using. It is also noteworthy that <code>data_0<\/code> and <code>softmaxout_1<\/code> are not random strings. They are specifically what the existing <code>ONNX<\/code> model is expecting. You can use the Visual Studio extension <a href=\"https:\/\/marketplace.visualstudio.com\/items?itemName=WinML.mlgenv2\">mlgen<\/a> or any <code>ONNX<\/code> model viewer (like <a href=\"https:\/\/github.com\/lutzroeder\/Netron\">Netron<\/a>) to see these values when you are binding the inputs and outputs of your models.<\/p>\n<p><code>MLGen<\/code> automatically creates a <code>.cs<\/code> file for each <code>.ONNX<\/code> file you add to any <code>UWP<\/code> project, but that capability is not yet enabled for <code>.NET5<\/code> projects. The <code>WinML<\/code> team is looking at improving this experience for <code>.NET5<\/code> projects, but since the tool is installed in your Visual Studio&#8217;s Extensions folder, you can already call it manually and add the generated <code>C#<\/code> file to your project. The path to it should be similar to <code>C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Enterprise\\Common7\\IDE\\Extensions\\hiofzu03.xur\\mlgen.exe<\/code>, replacing <code>Enterprise<\/code> with your Visual Studio edition. Calling it with no parameters will give you the common usage:<\/p>\n<pre class=\"prettyprint\">usage: mlgen.exe -i &lt;INPUT-FILE&gt; -l &lt;LANGUAGE&gt; -n &lt;NAMESPACE&gt; -p &lt;PREFIX&gt; [-o OUTPUT-FILE]\r\n&lt;INPUT-FILE&gt; : onnx model file\r\n&lt;LANGUAGE&gt; : cppwinrt or cppcx or cs\r\n&lt;NAMESPACE&gt; : code namespace\r\n&lt;PREFIX&gt; : generated class prefix\r\n&lt;OUTPUT-FILE&gt;: generated code output file. 
If not specified\r\nthe code will be written to std output.<\/pre>\n<p>I used the <code>mlgen.exe<\/code> output as a starting point for our <code>SqueezeNetModel<\/code> class, and tweaked it a little so it takes a plain <code>string<\/code> path when loading the model&#8217;s file. I&#8217;ve also changed the <code>data_0<\/code> data type from <a href=\"https:\/\/docs.microsoft.com\/uwp\/api\/windows.ai.machinelearning.tensorfloat\">TensorFloat<\/a> to <a href=\"https:\/\/docs.microsoft.com\/uwp\/api\/windows.ai.machinelearning.imagefeaturevalue\">ImageFeatureValue<\/a>, which makes it easier for us to use the image we will load.<\/p>\n<p>Now that this code is ready, we can load our model in our Main method:\n<img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-LoadModel.png\" alt=\"Load ONNX model in C# using WinML APIs\" \/><\/p>\n<p>For simplicity, let&#8217;s assume that our console app&#8217;s <code>args<\/code> always contains one parameter: the path of the image file we will load and evaluate.<\/p>\n<p>Since our input is expecting an <a href=\"https:\/\/docs.microsoft.com\/uwp\/api\/windows.ai.machinelearning.imagefeaturevalue\">ImageFeatureValue<\/a>, we will need to load the file and convert it properly, which is straightforward to achieve using the Windows 10 SDK. 
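<\/p>\n<p>Going back to <code>mlgen<\/code> for a moment: based on the usage text above, a hypothetical invocation for our model could look like this (the <code>SqueezeNet<\/code> prefix is just an example):<\/p>\n<pre class=\"prettyprint\">mlgen.exe -i squeezenet1.0-9.onnx -l cs -n ImageClassifier -p SqueezeNet -o SqueezeNet.cs<\/pre>\n<p>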
We also load the JSON file, so we have a human-readable label for each class the model can predict:<\/p>\n<pre><code class=\"csharp\">using System;\r\nusing System.Collections.Generic;\r\nusing System.IO;\r\nusing System.Linq;\r\nusing System.Reflection;\r\nusing System.Text.Json;\r\nusing System.Threading.Tasks;\r\nusing Windows.AI.MachineLearning;\r\nusing Windows.Graphics.Imaging;\r\nusing Windows.Media;\r\n\r\nnamespace ImageClassifier\r\n{\r\n    class Program\r\n    {\r\n        static async Task Main(string[] args)\r\n        {\r\n            var rootDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);\r\n            var squeezeNetModel = SqueezeNetModel.CreateFromFilePath(Path.Combine(rootDir, \"squeezenet1.0-9.onnx\"));\r\n\r\n            \/\/ Load labels from JSON\r\n            var labels = new List&lt;string&gt;();\r\n            foreach (var kvp in JsonSerializer.Deserialize&lt;Dictionary&lt;string, string&gt;&gt;(File.ReadAllText(Path.Combine(rootDir, \"Labels.json\"))))\r\n            {\r\n                labels.Add(kvp.Value);\r\n            }\r\n\r\n            if (args.Length &lt; 1)\r\n                return;\r\n\r\n            var filePath = args[0];\r\n\r\n            \/\/ Open image file\r\n            SqueezeNetOutput output;\r\n            using (var fileStream = File.OpenRead(filePath))\r\n            {\r\n                \/\/ Convert from FileStream to ImageFeatureValue\r\n                var decoder = await BitmapDecoder.CreateAsync(fileStream.AsRandomAccessStream());\r\n                using var softwareBitmap = await decoder.GetSoftwareBitmapAsync();\r\n                using var inputImage = VideoFrame.CreateWithSoftwareBitmap(softwareBitmap);\r\n                var imageTensor = ImageFeatureValue.CreateFromVideoFrame(inputImage);\r\n\r\n                output = await squeezeNetModel.EvaluateAsync(new SqueezeNetInput\r\n                {\r\n                    data_0 = imageTensor\r\n                });\r\n            }\r\n\r\n            \/\/ Get the result, a list of floats with the probabilities for all 1000 classes of SqueezeNet\r\n            var resultTensor = output.softmaxout_1;\r\n            var resultVector = resultTensor.GetAsVectorView();\r\n\r\n            \/\/ Pair each probability with its index, then sort descending to find the highest-ranked class\r\n            List&lt;(int index, float p)&gt; results = new List&lt;(int, float)&gt;();\r\n            for (int i = 0; i &lt; resultVector.Count; i++)\r\n            {\r\n                results.Add((index: i, p: resultVector.ElementAt(i)));\r\n            }\r\n            results.Sort((a, b) =&gt; b.p.CompareTo(a.p));\r\n\r\n            if (results[0].p &gt;= 0.9f)\r\n            {\r\n                Console.WriteLine($\"Image '{filePath}' is classified as '{labels[results[0].index]}'(p={(int)(results[0].p * 100)}%).\");\r\n            }\r\n            else\r\n            {\r\n                Console.WriteLine(\"Sorry, but I'm not sure what this is.\");\r\n            }\r\n        }\r\n    }\r\n}\r\n<\/code><\/pre>\n<p>You can see that the very last <code>if<\/code> clause compares the top result with <code>0.9f<\/code>, which represents a <code>90%<\/code> confidence threshold. 
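<\/p>\n<p>As a side note, if you want to see more than just the best match, a small variation (just a sketch, reusing the <code>results<\/code> and <code>labels<\/code> variables from the code above) prints the top five predictions:<\/p>\n<pre><code class=\"csharp\">\/\/ Print the five highest-ranked classes with their probabilities\r\nforeach (var (index, p) in results.Take(5))\r\n{\r\n    Console.WriteLine($\"{labels[index]}: {p:P1}\");\r\n}\r\n<\/code><\/pre>\n<p>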
I&#8217;ve empirically found that <code>90%<\/code> is a good enough value for this model, but you can tweak it depending on what you want to achieve.<\/p>\n<p>If we build our project, it will output a folder with our exe and its dependencies, as well as the Labels JSON file and our ONNX model.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-OutputFolder.png\" alt=\"Output folder\" \/><\/p>\n<p>Now we can call this exe from a command prompt and pass an image file as an argument; the app will evaluate the input and print the class it matches best. Let&#8217;s use this cat image as an example:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-Cat.jpg\" alt=\"Cat image\" \/><\/p>\n<p>Which will output this:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/pax-windows\/wp-content\/uploads\/sites\/61\/2021\/01\/Net5WinML-ConsoleOutput.png\" alt=\"Console Output\" \/><\/p>\n<p>As you can see, it did a pretty good job, classifying the image as an <code>Egyptian cat<\/code>.<\/p>\n<h2>Closing<\/h2>\n<p>Check out the source code of the sample <a href=\"https:\/\/github.com\/azchohfi\/ImageClassifierSample\/tree\/net5winml\">here<\/a>. Remember that this is only an example: there might be inputs that this model does not evaluate with confidence, or it might even make mistakes. <code>WinML<\/code> is a tool, and we need to understand its limitations, which is a complex subject that needs to be handled on a case-by-case basis. 
Yet, it&#8217;s a very powerful tool, so use it wisely!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Let&#8217;s get our hands dirty and create a simple .NET5 console app that uses WinML to analyze an image and classify it using a pre-built machine learning model.<\/p>\n","protected":false},"author":40006,"featured_media":208,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[17,10,18,8,19,20],"class_list":["post-189","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ifdef-windows","tag-ai","tag-desktop","tag-net5","tag-windows","tag-winml","tag-winrt"],"acf":[],"blog_post_summary":"<p>Let&#8217;s get our hands dirty and create a simple .NET5 console app that uses WinML to analyze an image and classify it using a pre-built machine learning model.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/posts\/189","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/users\/40006"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/comments?post=189"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/posts\/189\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/media\/208"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/media?parent=189"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/c
ategories?post=189"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ifdef-windows\/wp-json\/wp\/v2\/tags?post=189"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}