{"id":7688,"date":"2013-09-26T08:00:29","date_gmt":"2013-09-26T12:00:29","guid":{"rendered":"http:\/\/blog.xamarin.com\/?p=7688"},"modified":"2013-09-26T08:00:29","modified_gmt":"2013-09-26T12:00:29","slug":"make-your-ios-7-app-speak","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/xamarin\/make-your-ios-7-app-speak\/","title":{"rendered":"Make Your iOS 7 App Speak"},"content":{"rendered":"<p dir=\"ltr\">Have you ever wanted to add text to speech capability in an iOS application? Before iOS 7, this required using a third party library; however, with iOS 7 speech synthesis is built into the platform. What&#8217;s more, adding speech synthesis only requires a few lines of code.<\/p>\n<p dir=\"ltr\">The class that synthesizes text to speech is the AVSpeechSynthesizer. <a href=\"http:\/\/www.youtube.com\/watch?v=uCWKZWieMSY\"><img decoding=\"async\" class=\"size-medium wp-image-7724 alignright\" alt=\"shallweplayagame\" src=\"\/wp-content\/uploads\/sites\/44\/2019\/04\/shallweplayagame-300x214.png\" width=\"300\" height=\"214\" \/><\/a>This class works with an AVSpeechUtterance instance that encapsulates the text to synthesize. You simply pass an AVSpeechUtterance instance to the synthesizer&#8217;s SpeakUtterance method and the text is &#8220;spoken&#8221; by the iOS device.<\/p>\n<p dir=\"ltr\">The following example is all you need to have text to speech on iOS 7:<\/p>\n<pre class=\"lang:csharp decode:true\">\nvar speechSynthesizer = new AVSpeechSynthesizer ();\nvar speechUtterance =\n  new AVSpeechUtterance (&quot;Shall we play a game?&quot;);\nspeechSynthesizer.SpeakUtterance (speechUtterance);\n<\/pre>\n<p dir=\"ltr\">The AVSpeechUtterance also includes several properties that allow you to control the audio output of the synthesized text. 
These include:<\/p>\n<ul>\n<li>\n<p dir=\"ltr\">Rate &#8211; The speed at which the speech plays back.<\/p>\n<\/li>\n<li>\n<p dir=\"ltr\">Voice &#8211; An AVSpeechSynthesisVoice instance used to speak the text.<\/p>\n<\/li>\n<li>\n<p dir=\"ltr\">Volume &#8211; The volume level of the audio used to speak the text.<\/p>\n<\/li>\n<li>\n<p dir=\"ltr\">PitchMultiplier &#8211; A value between 0.5 and 2.0 to control the pitch of the spoken text.<\/p>\n<\/li>\n<\/ul>\n<p dir=\"ltr\">In particular, I found that the default rate speaks a bit too fast on an iPhone 5. Adjusting the rate to 1\/4 of the maximum rate, available via the AVSpeechUtterance.MaximumSpeechRate property (there&#8217;s also an AVSpeechUtterance.MinimumSpeechRate), produced a better-sounding result.<\/p>\n<p dir=\"ltr\">Even better, you can supply a variety of different voices to the synthesizer, ideally based upon the locale. There&#8217;s even a helper method, AVSpeechSynthesisVoice.GetSpeechVoices, that will return all the available voices.<\/p>\n<p dir=\"ltr\">I thought it would be fun to revisit the <a href=\"https:\/\/github.com\/mikebluestein\/FindTheMonkey\" target=\"_blank\">FindTheMonkey app<\/a> from the previous <a href=\"\/play-find-the-monkey-with-ios-7-ibeacons\/\" target=\"_blank\">iBeacon blog post<\/a> to speak the status message as the user looks for the monkey.<\/p>\n<p dir=\"ltr\">Doing this is incredibly easy. Simply add a Speak method to create the AVSpeechSynthesizer and AVSpeechUtterance instances, and call SpeakUtterance. To spruce it up a bit more, also add a couple of sliders to control the pitch multiplier and volume of the AVSpeechUtterance.<\/p>\n<p dir=\"ltr\">Generally, the voice should match the current device locale, but you can set this however you like. 
Let&#8217;s use Australian English here, for example:<\/p>\n<pre class=\"lang:csharp decode:true\">\n    void Speak (string text)\n    {\n        var speechSynthesizer = new AVSpeechSynthesizer ();\n\n        var speechUtterance = new AVSpeechUtterance (text) {\n            Rate = AVSpeechUtterance.MaximumSpeechRate\/4,\n            Voice = AVSpeechSynthesisVoice.FromLanguage (&quot;en-AU&quot;),\n            Volume = volume,\n            PitchMultiplier = pitch\n        };\n\n        speechSynthesizer.SpeakUtterance (speechUtterance);\n    }\n\n    void InitPitchAndVolume ()\n    {\n        volumeSlider.MinValue = 0;\n        volumeSlider.MaxValue = 1.0f;\n        volumeSlider.SetValue (volume, false);\n\n        pitchSlider.MinValue = 0.5f;\n        pitchSlider.MaxValue = 2.0f;\n        pitchSlider.SetValue (pitch, false);\n\n        volumeSlider.ValueChanged += (sender, e) =&gt; {\n            volume = volumeSlider.Value;\n        };\n\n        pitchSlider.ValueChanged += (sender, e) =&gt; {\n            pitch = pitchSlider.Value;\n        };\n    }\n<\/pre>\n<p dir=\"ltr\">Then just call Speak when the proximity changes, and voil\u00e0: FindTheMonkey now has text-to-speech capability!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Have you ever wanted to add text to speech capability in an iOS application? Before iOS 7, this required using a third party library; however, with iOS 7 speech synthesis is built into the platform. What&#8217;s more, adding speech synthesis only requires a few lines of code. 
The class that synthesizes text to speech is [&hellip;]<\/p>\n","protected":false},"author":1932,"featured_media":39167,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2],"tags":[6,4],"class_list":["post-7688","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-developers","tag-ios","tag-xamarin-platform"],"acf":[],"blog_post_summary":"<p>Have you ever wanted to add text to speech capability in an iOS application? Before iOS 7, this required using a third party library; however, with iOS 7 speech synthesis is built into the platform. What&#8217;s more, adding speech synthesis only requires a few lines of code. The class that synthesizes text to speech is [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts\/7688","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/users\/1932"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/comments?post=7688"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/posts\/7688\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/media\/39167"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/media?parent=7688"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/categories?post=7688"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/xamarin\/wp-json\/wp\/v2\/tags?post=
7688"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}