{"id":35666,"date":"2018-02-20T14:36:35","date_gmt":"2018-02-20T19:36:35","guid":{"rendered":"https:\/\/blog.xamarin.com\/?p=35666"},"modified":"2019-04-04T15:37:20","modified_gmt":"2019-04-04T22:37:20","slug":"coreml-azure-create-simple-xamarin-ios-apps","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/xamarin\/coreml-azure-create-simple-xamarin-ios-apps\/","title":{"rendered":"Use CoreML And Azure To Create Simple Xamarin.iOS Apps"},"content":{"rendered":"<p>Last year Apple released a device-optimized machine learning framework called Core ML to make it as easy as possible to integrate machine learning and artificial intelligence services into your iOS and Mac apps. CoreML is a blessing for developers who lack extensive knowledge of AI or machine learning, because getting started only requires referencing a pre-trained model in a project and adding a few lines of code.<\/p>\n<p>In this blog post, we&#8217;ll discuss how to create a simple iOS app to identify Links, Hylian Shield, and a Master Sword using CoreML and Azure.<\/p>\n<h2>Getting started<\/h2>\n<p>While consuming pre-trained models is easiest, we often don\u2019t have large data sets available to train the model. To solve this problem, Microsoft created the <a href=\"https:\/\/customvision.ai\/\">Custom Vision Service<\/a>, which generates custom machine learning models with just five images in only a few minutes. These models can be exported into CoreML models and consumed in iOS and Mac apps.<\/p>\n<p>Let\u2019s start by creating a project with Custom Vision Service. Make sure you select a <code>compact<\/code> model under Domain. 
Compact domains can be exported to Core ML- or TensorFlow-compatible models that run locally on the device:<\/p>\n<p><img decoding=\"async\" class=\"aligncenter wp-image-35670\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/44\/2019\/03\/CoreML_Pic1.png\" alt=\"\" width=\"800\" height=\"482\" \/><\/p>\n<p>Next, a trained model will be ready for use in three easy steps:<\/p>\n<ol>\n<li>Upload training images.<\/li>\n<li>Tag the images.<\/li>\n<li>Click Train.<\/li>\n<\/ol>\n<p>It only takes a few minutes to complete the training, and then you can download the model to use in your app.<\/p>\n<p><img decoding=\"async\" class=\"aligncenter wp-image-35671\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/44\/2019\/03\/CoreML_Pic2.png\" alt=\"\" width=\"800\" height=\"313\" \/><\/p>\n<h2>For Your iOS App<\/h2>\n<p>Now that we have our CoreML model set up, it\u2019s time to add it to your iOS app project.<\/p>\n<h3>Add Model to the project<\/h3>\n<p>Create a new single view iOS app by going to File \u2192 New Project \u2192 iOS \u2192 Single View in Visual Studio 2017. 
Add the downloaded model (file with .mlmodel extension) to the <code>Resources<\/code> folder, and make sure the <code>BuildAction<\/code> is set to <code>BundleResource<\/code>.<\/p>\n<h3>Initialize Vision Model<\/h3>\n<p>Before using the model, we need to load it in our app and initialize it with a request handler, where we can read the model\u2019s observations:<\/p>\n<pre><code>\r\n    var modelPath = NSBundle.MainBundle.GetUrlForResource(\"Link\", \"mlmodel\");\r\n    var compiledPath = MLModel.CompileModel(modelPath, out NSError compileError);\r\n    var mlModel = MLModel.Create(compiledPath, out NSError createError);\r\n\r\n    var model = VNCoreMLModel.FromMLModel(mlModel, out NSError mlError);\r\n\r\n    \/\/ Initialise the classification request with a handler for its results\r\n    var classificationRequest = new VNCoreMLRequest(model, HandleClassificationRequest);\r\n<\/code><\/pre>\n<p>This code compiles the downloaded <code>mlmodel<\/code> on the device, making it easy to update models without updating the app:<\/p>\n<p><img decoding=\"async\" class=\"aligncenter wp-image-35672\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/44\/2019\/03\/CoreML_Pic3.png\" alt=\"\" width=\"800\" height=\"343\" \/><\/p>\n<h3>Request classification<\/h3>\n<p>Now we need to identify the objects. 
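<\/p>\n<p>The snippet below assumes the captured photo is already available as <code>imageNSData<\/code>. A minimal, hedged sketch of producing it from a <code>UIImage<\/code> (here a hypothetical <code>photo<\/code> variable, e.g. returned by <code>UIImagePickerController<\/code>) might look like this:<\/p>\n<pre><code>\r\n    \/\/ Convert the captured UIImage into NSData for the Vision request handler.\r\n    \/\/ 'photo' is a hypothetical UIImage from the camera or photo library.\r\n    NSData imageNSData = photo.AsJPEG(0.8f);\r\n<\/code><\/pre>\n<p>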
The following code does this when we take a photo on the device:<\/p>\n<pre><code>\r\n    var requestHandler = new VNImageRequestHandler(imageNSData, new VNImageOptions());\r\n    requestHandler.Perform(new VNRequest[] { classificationRequest }, out NSError error);\r\n<\/code><\/pre>\n<h3>Read the observations<\/h3>\n<p>As soon as something is detected, the classification request will invoke the <code>HandleClassificationRequest<\/code> method, where we can read the observations from the <code>request<\/code> parameter using the <code>GetResults<\/code> method:<\/p>\n<pre><code>\r\n    void HandleClassificationRequest(VNRequest request, NSError error)\r\n    {\r\n        var observations = request.GetResults&lt;VNClassificationObservation&gt;();\r\n        if (observations == null || observations.Length == 0)\r\n            return;\r\n\r\n        \/\/ Results are sorted by confidence, so the best match comes first\r\n        var best = observations[0];\r\n\r\n        Debug.WriteLine($\"{best.Identifier}, Confidence: {best.Confidence:P0}\");\r\n    }\r\n<\/code><\/pre>\n<p>The best possible classification will be available at index <code>0<\/code>, with an identification label and a confidence value in the range <code>0..1<\/code>, where <code>0<\/code> means no confidence that the observation is accurate and <code>1<\/code> means certainty.\n&nbsp;\n<center><iframe width=\"560\" height=\"315\" src=\"https:\/\/www.youtube.com\/embed\/6MGDoXmNDZ0\" frameborder=\"0\" allow=\"autoplay; encrypted-media\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/center><\/p>\n<h2>Conclusion<\/h2>\n<p>It&#8217;s that easy to get started with ML and AI with Xamarin.iOS, Visual Studio, and Azure! You can find the final sample code <a href=\"https:\/\/github.com\/prashantvc\/CoreMLDemo\">on GitHub<\/a>. 
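<\/p>\n<p>As a small extension of the handler above, the confidence value can be used to ignore weak predictions before showing anything to the user. A hedged sketch, placed inside <code>HandleClassificationRequest<\/code> after the best observation is read; the <code>0.7<\/code> threshold and <code>resultLabel<\/code> (a <code>UILabel<\/code>) are illustrative choices, not part of the sample:<\/p>\n<pre><code>\r\n    \/\/ Only surface predictions the model is reasonably sure about.\r\n    \/\/ 0.7 is an arbitrary threshold; tune it for your own model.\r\n    if (best != null &amp;&amp; best.Confidence &gt;= 0.7f)\r\n        resultLabel.Text = $\"{best.Identifier} ({best.Confidence:P0})\";\r\n    else\r\n        resultLabel.Text = \"Not sure what that is\";\r\n<\/code><\/pre>\n<p>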
Be sure to read through <a href=\"https:\/\/developer.xamarin.com\/guides\/ios\/platform_features\/introduction-to-ios11\/coreml\/\">Introduction to CoreML<\/a> on our documentation website, and then check out the Azure documentation to build a classifier with the Custom Vision Service <a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/cognitive-services\/custom-vision-service\/getting-started-build-a-classifier\">here<\/a>.<\/p>\n<p><a href=\"https:\/\/forums.xamarin.com\/122818\/\">Discuss the post on the forums!<\/a>\t\t<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Last year Apple released a device-optimized machine learning framework called Core ML to make it as easy as possible to integrate machine learning and artificial intelligence services into your iOS and Mac apps. CoreML is a blessing for developers who lack extensive knowledge of AI or machine learning, because getting started only requires referencing a [&hellip;]<\/p>\n","protected":false},"author":558,"featured_media":39167,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2],"tags":[6,4],"class_list":["post-35666","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-developers","tag-ios","tag-xamarin-platform"],"acf":[],"blog_post_summary":"<p>Last year Apple released a device-optimized machine learning framework called Core ML to make it as easy as possible to integrate machine learning and artificial intelligence services into your iOS and Mac apps. 
CoreML is a blessing for developers who lack extensive knowledge of AI or machine learning, because getting started only requires referencing a [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts\/35666","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/users\/558"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/comments?post=35666"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts\/35666\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/media\/39167"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/media?parent=35666"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/categories?post=35666"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/tags?post=35666"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}