{"id":12035,"date":"2017-06-30T02:35:00","date_gmt":"2017-06-30T02:35:00","guid":{"rendered":"https:\/\/blogs.msdn.microsoft.com\/premier_developer\/?p=12035"},"modified":"2019-03-05T13:04:22","modified_gmt":"2019-03-05T20:04:22","slug":"hololens-animations","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/premier-developer\/hololens-animations\/","title":{"rendered":"HoloLens Animations"},"content":{"rendered":"<p>This post is provided by App Dev Manager <a href=\"https:\/\/www.linkedin.com\/in\/robertdschumann\/\">Robert Schumann<\/a>, who explores the process of creating and animating avatars with HoloLens.<\/p>\n<hr \/>\n<h2>HoloLens Ambitious Beginnings<\/h2>\n<p>My first application on HoloLens involved animating a Marine Corps squad to do PT. Using voice commands, avatars would do sit-ups, push-ups, burpees, jumping jacks, etc. That was a year ago and a brutally na\u00efve, overly ambitious learning curve to overcome. Since then, HoloLens has had its first \u201cbirthday\u201d, seen huge interest and growth, and inspired countless tutorials. But few tutorials provide straightforward guidance about building animated apps for HoloLens.<\/p>\n<p>This blog outlines the steps I took to create, animate, and incorporate avatars into HoloLens. It involves the use of free tools such as Adobe Fuse, Mixamo, Unity Editor, and Visual Studio. If you follow along to the end of this blog, you should have a working HoloLens application with an avatar performing animations controlled by voice commands.<\/p>\n<p>Here we go\u2026<\/p>\n<h2>Setup<\/h2>\n<p>My setup is based on a Windows 10 Creators Update PC, so pictures and navigation may vary if you\u2019re using a different platform or applications. 
Make sure your hardware supports the following software needed for this blog:<\/p>\n<ul>\n<li>Adobe Fuse (Beta)<\/li>\n<li>Unity Editor 5.6<\/li>\n<li>Visual Studio Community 2017<\/li>\n<li>HoloToolkit-Unity from GitHub<\/li>\n<\/ul>\n<p>You may also need online accounts with the following vendors:<\/p>\n<ul>\n<li>Adobe Creative Cloud (https:\/\/accounts.adobe.com)<\/li>\n<li>Unity (https:\/\/id.unity.com)<\/li>\n<li>Microsoft (<a href=\"https:\/\/signup.live.com\">https:\/\/signup.live.com<\/a>)<\/li>\n<\/ul>\n<p>If you haven\u2019t already, sign up for, download, install, and set up the above resources. Some setup notes are provided below.<\/p>\n<p>Additionally, the following links provide overviews of important concepts for completing this blog. Of note is how to correct Mixamo character eyelashes when importing into Unity.<\/p>\n<ul>\n<li><a href=\"https:\/\/developer.microsoft.com\/en-us\/windows\/mixed-reality\/academy\">https:\/\/developer.microsoft.com\/en-us\/windows\/mixed-reality\/academy<\/a><\/li>\n<li><a href=\"https:\/\/www.youtube.com\/watch?v=gpvNBiBv_sI\">https:\/\/www.youtube.com\/watch?v=gpvNBiBv_sI<\/a><\/li>\n<li><a href=\"https:\/\/www.youtube.com\/watch?v=4BX0oeCKf0w\">https:\/\/www.youtube.com\/watch?v=4BX0oeCKf0w<\/a><\/li>\n<li>(eyelash fix) <a href=\"https:\/\/www.youtube.com\/watch?v=YHlWcmmDNVg\">https:\/\/www.youtube.com\/watch?v=YHlWcmmDNVg<\/a><\/li>\n<li>(optional, webinar, 55 min.) 
<a href=\"https:\/\/www.youtube.com\/watch?v=wp9FB6NQu1U\">https:\/\/www.youtube.com\/watch?v=wp9FB6NQu1U<\/a><\/li>\n<\/ul>\n<h2>Unity Editor<\/h2>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/downloadunity1.png\"><img decoding=\"async\" style=\"margin: 0px 0px 0px 8px; float: right;\" title=\"downloadunity\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/downloadunity_thumb1.png\" alt=\"downloadunity\" width=\"324\" height=\"272\" align=\"right\" border=\"0\" \/><\/a>When installing Unity Editor, there\u2019s an option to also download and install Visual Studio Community 2017 (VSC2017). If you don\u2019t already have VSC2017, go ahead and check that box. Otherwise, default settings will suffice.<\/p>\n<h2>Spin up<\/h2>\n<p>If you\u2019re new to any of the tools used in this blog, take a few moments to get a basic grasp of navigating around and identifying key functional areas.<\/p>\n<h4>Adobe Fuse (Beta)<\/h4>\n<p>Help: <a href=\"https:\/\/helpx.adobe.com\/beta\/fuse\/topics.html\">https:\/\/helpx.adobe.com\/beta\/fuse\/topics.html<\/a><\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/creativecloud1.png\"><img decoding=\"async\" style=\"margin: 0px 0px 0px 8px; float: right;\" title=\"creativecloud\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/creativecloud_thumb1.png\" alt=\"creativecloud\" width=\"324\" height=\"341\" align=\"right\" border=\"0\" \/><\/a>Start Creative Cloud (CC), making sure you log on to your Adobe account. At the CC console, select the Apps tab. Make sure your Fuse installation is up-to-date.<\/p>\n<p>Once logged on and up-to-date, click the Open button on the CC console. When Fuse starts, you\u2019ll see a dark canvas in the center and body parts at right. Pretty weird, huh? 
Ok, so go ahead and drag-n-drop body parts onto the center canvas; choose a head, torso, leg, and arm.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/avatar.png\"><img decoding=\"async\" title=\"avatar\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/avatar_thumb.png\" alt=\"avatar\" width=\"1028\" height=\"573\" border=\"0\" \/><\/a><\/p>\n<p>Click the Customize link located above the 3D model. In the right window pane, feel free to randomize attributes of each body section, or modify the attributes individually.<\/p>\n<p><img decoding=\"async\" class=\"size-full wp-image-35745 alignright\" src=\"http:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla1.jpg\" alt=\"\" width=\"137\" height=\"484\" srcset=\"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla1.jpg 137w, https:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla1-85x300.jpg 85w\" sizes=\"(max-width: 137px) 100vw, 137px\" \/>Next click the Clothing link located above the 3D model, next to Customize. Again, in the right window pane, choose clothing for each body section of the 3D model.<\/p>\n<p>Next click the Texture link located above the 3D model, next to Clothing. Click anywhere on the 3D model <b>skin<\/b>. Options now appear in the right window pane to change specific attributes of the 3D model: things like eye and hair color, skin tone, smoothness and age, and teeth condition and coloring. To change other aspects of the model, like clothing color, make sure you\u2019re in Texture mode and click the part of the model you\u2019d like to modify; options appear in the right pane. 
Throughout all your changes, for now leave the model resolutions at their default values.<\/p>\n<p>If you\u2019re wondering how to change the model\u2019s pose to something other than a T-pose, don\u2019t worry about that yet. Just use Fuse to model your avatar for now. We\u2019ll use mixamo.com for rigging and animating the model next.<\/p>\n<p>In Fuse, save your work but don\u2019t close anything; leave Fuse running and the model ready.<\/p>\n<p>BTW &#8211; if you\u2019ve ever wanted a 3D avatar of yourself, check out the following link:\n<a href=\"http:\/\/blog.mixamo.com\/how-to-scan-yourself-into-a-game-ready-3d-model-using-kinect-and-fuse\/\">http:\/\/blog.mixamo.com\/how-to-scan-yourself-into-a-game-ready-3d-model-using-kinect-and-fuse\/<\/a><\/p>\n<h2>Mixamo<\/h2>\n<p>Help: <a href=\"https:\/\/community.mixamo.com\/hc\/en-us\">https:\/\/community.mixamo.com\/hc\/en-us<\/a><\/p>\n<p>Go to Mixamo.com and sign in with your Adobe account. Now go back to Fuse; at the top right corner of the screen you should see the Send to Mixamo button. Click it. Give your model a name. Click Save. Fuse will now send your model over to Mixamo as an asset. Here\u2019s where Fuse and Mixamo start to earn their keep.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/autorigger1.png\"><img decoding=\"async\" style=\"margin: 0px 8px 0px 0px; float: right;\" title=\"autorigger\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/autorigger_thumb1.png\" alt=\"autorigger\" width=\"644\" height=\"398\" align=\"right\" border=\"0\" \/><\/a>A new browser session should begin with your model loaded and ready for rigging. Rigging basically gives your model a skeleton by which movement can be applied, thus enabling animation. Fuse and Mixamo greatly simplify rigging, which can otherwise be very time consuming. On the Auto-Rigger screen, <i>enable<\/i> Facial Blendshapes, choose 65 joints for Skeleton LOD, then click Finish. 
(If you\u2019re wondering why your model appears to have black eyes, this is \u201cnormal\u201d and we\u2019ll correct it later.)<\/p>\n<p>A new window should load with your model in a T-pose. Again, hold off doing any animations. Click the Queue Download button. Then choose FBX for Unity (.Fbx) for Format and Original Pose (.Fbx) for Pose. Again, click the Queue Download button. Once ready, go ahead and download your new character. Ok. We just saved off our new character without any animations baked in, per se. Now let\u2019s find animations we want to use.<\/p>\n<p>(BEGIN REPEATABLE STEP)<\/p>\n<p>Click the Store link at top, then the Animations link under the Search bar. Go ahead and peruse the animations; for this blog I\u2019m going to use the Standing Idle, Samba, Macarena, YMCA, and Salsa animations. Click an animation you\u2019re interested in using. The screen should change and your character should be displayed performing the selected animation. Click the \u201cAdd to Pack\u201d button to the right of your character. Multiple animations can be added to a pack, but for demo purposes we\u2019re going to add and save each animation individually. Since we\u2019re going to use the same character, it won\u2019t necessarily matter that we\u2019ll later apply different animations. If you were to use a different model\/character, you\u2019d want to apply the animation to that character instead of trying to reuse the same animation across different models\/characters. This is because animations as applied through Mixamo are based on the rigging of the character used. So if, for example, you apply a YMCA dance rigged for a one-arm Zombie to a two-arm character, the animation may not replay correctly.<\/p>\n<p>Once an animation has been added to the pack, click the View\/Download button. Click the Queue Download button. Choose FBX for Unity (.Fbx), for Skin choose Without, and change Frames per second to 60. Click the Queue Download button and, when ready, download the file. 
Repeat for each animation.<\/p>\n<p>(END REPEATABLE STEP)<\/p>\n<p>Alrighty; that should be it for Fuse and Mixamo. Before we switch gears to Unity, let\u2019s do a quick assessment to make sure we\u2019re ready to move ahead. By now you should have the Unity Editor installed. You should have the HoloToolkit for Unity downloaded. You should have a Fuse export of your character textures. And finally, you should have at least two files downloaded from Mixamo: a character .fbx file and one or more animation .fbx files. Additionally, if you haven\u2019t done so already, you\u2019ll need to make a Unity package of the HoloToolkit-Unity downloaded from GitHub. For instructions, see <a href=\"https:\/\/github.com\/Microsoft\/HoloToolkit-Unity\/blob\/master\/GettingStarted.md\">https:\/\/github.com\/Microsoft\/HoloToolkit-Unity\/blob\/master\/GettingStarted.md<\/a>.<\/p>\n<h2>Unity<\/h2>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/clip_image00262.png\"><img decoding=\"async\" style=\"float: right;\" title=\"clip_image002[6]\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/clip_image0026_thumb2.png\" alt=\"clip_image002[6]\" width=\"324\" height=\"266\" align=\"right\" border=\"0\" \/><\/a>Help: <a href=\"https:\/\/docs.unity3d.com\/Manual\/index.html\">https:\/\/docs.unity3d.com\/Manual\/index.html<\/a><\/p>\n<p>The first time you start Unity, you may get a prompt for firewall settings. 
Check all boxes to avoid complications with this blog.<\/p>\n<p>At the start splash screen, feel free to log on to your Unity account by clicking \u201cMy Account\u201d; then provide a project name, turn off Unity Analytics, and make sure 3D is checked.<\/p>\n<table cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td><img decoding=\"async\" class=\"alignnone size-full wp-image-35746\" src=\"http:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla2.jpg\" alt=\"\" width=\"324\" height=\"158\" srcset=\"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla2.jpg 324w, https:\/\/devblogs.microsoft.com\/premier-developer\/wp-content\/uploads\/sites\/31\/2017\/06\/hla2-300x146.jpg 300w\" sizes=\"(max-width: 324px) 100vw, 324px\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>When the new project opens, delete the default camera from the Hierarchy tab. Import the HoloToolkit-Unity custom package; either choose the defaults or at least the standard assets, without any tests or examples. Now go ahead and replace the default camera with the HoloLensCamera prefab.<\/p>\n<p>Once the toolkit is imported, you\u2019ll notice a new menu option called HoloToolkit. Under it are three (3) configuration options. From top to bottom, choose and apply each one. 
For Capability Settings choose Microphone, Spatial Perception, and Internet Client.<\/p>\n<p>Within the Project tab create four (4) new folders &#8211; _Animations, _Models, _Prefabs, and _Scripts.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/maximochar1.png\"><img decoding=\"async\" style=\"margin: 0px 0px 0px 8px; float: right;\" title=\"maximochar\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/maximochar_thumb1.png\" alt=\"maximochar\" width=\"512\" height=\"484\" align=\"right\" border=\"0\" \/><\/a>Let\u2019s add our Mixamo character by dragging and dropping the downloaded character .fbx file to the _Models folder. You should get prompted to fix textures; choose Fix Now. Also drag-n-drop the Fuse textures you previously exported into the _Models folder. You should see one Material object named something like *_Packed0_Diffuse. Select that Material; in the Inspector window, change the Rendering Mode to Opaque and scale Smoothness down completely. In the same folder (still _Models) as this Material, create a new Material and name it Hair. Make sure this new Hair Material is selected in the Project tab. Then, using the exported Fuse textures, drag-n-drop the BaseColor texture next to Albedo, the MetallicAndSmoothness texture next to Metallic, and the Normal texture next to Normal Map. Change the Rendering Mode to Fade, and ensure Smoothness is scaled completely down.<\/p>\n<p>We now need to tweak the model as exported from Mixamo. Within the _Models folder should be a prefab of your model, represented by the blue cube icon. Select it, and in the Inspector tab select the Rig button. Change Animation Type to Humanoid. Ensure Avatar Definition is Create From This Model. Click the Configure button, saving first if prompted. What you should see now is the skin of your model with bones inside of it. If so, cool, we\u2019re good so far. 
Under the Inspector tab, the Mapping button should be selected; just above the Transform properties section, on the right side, is a Done button. Click Done. Now drag-n-drop your model prefab onto the Hierarchy window; it doesn\u2019t matter where. Then select it from the Hierarchy and expand it to expose all child objects. Click each child object and find the eyelashes and hair objects. As you find them, change the default Material to the new Hair Material you previously created. Simply select the object, then in the Inspector window drag-n-drop the new Hair Material over the existing one to swap it out. If all is correct, your model should look normal at this point. Once correct, drag-n-drop the root object from the Hierarchy window onto the _Prefabs folder in the Project window. You\u2019ve now created a prefab and can delete the copy in the Hierarchy window.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/BlogAvatar.png\"><img decoding=\"async\" style=\"margin: 0px 0px 0px 8px; float: right;\" title=\"BlogAvatar\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/BlogAvatar_thumb.png\" alt=\"BlogAvatar\" width=\"373\" height=\"484\" align=\"right\" border=\"0\" \/><\/a>Let\u2019s add the Mixamo animations by dragging and dropping each downloaded animation .fbx file to the _Animations folder. Again, you should get prompted to fix textures; choose Fix Now. As you do each one, before starting the next, select the animation prefab and in the Inspector window choose the Rig button. Change the Animation Type to Humanoid, but for Avatar Definition this time choose Copy from Other Avatar. Underneath, for Source, click the round icon to select an avatar. A new window should pop up with just one avatar to choose, which should be from the character model you previously imported. Select that avatar, close the select avatar window, and click the Apply button in the Inspector. 
While still in the Inspector window, click the Animations button next to the Rig button. Scroll down; check Loop Time, check Bake Into Pose for Root Transform Position (Y), and make sure Based Upon (at Start) is set to Original. Click Apply, bottom right. Repeat these steps for each animation .fbx file you import.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/animationstates.png\"><img decoding=\"async\" style=\"float: right;\" title=\"animationstates\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/animationstates_thumb.png\" alt=\"animationstates\" width=\"644\" height=\"197\" align=\"right\" border=\"0\" \/><\/a>Once your model and animations have been imported and staged, we need to create an Animator Controller. Select the _Animations folder in the Project window, create a new Animator Controller, and call it Dancing. If the Unity Editor is not displaying the Animator window, go to the Window menu and choose Animator. Once visible, dock it wherever you like. Then drag the newly created Dancing animator controller onto the Animator window. Now drag-n-drop each animation prefab from the _Animations folder onto the Animator window. As you do, you should see a block representing each animation, with a name on it. Pay attention to the name and text case. Once you have all animations on the controller, right-click the green Entry block and choose Make Transition. You\u2019ll now notice your cursor has an arrow string attached to it. Place your cursor over the standing idle animation block and click to complete the transition. You should now see a line from the Entry block connected to the idle animation block.<\/p>\n<p>We\u2019re \u00be done with the hard parts taken care of. 
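<\/p>\n<p>A note on those state names: the Manager script we create later starts each animation state by name with Animator.Play, and the string must match the state block\u2019s name in the Animator window exactly, including case. A minimal sketch (assuming a state named \u201csamba_dance\u201d, as in this walkthrough):<\/p>\n<pre class=\"lang:default decode:true \">\/\/ Play the named Animator state on the default layer, from its start\r\nAnimator anim = GetComponent&lt;Animator&gt;();\r\nanim.Play(\"samba_dance\", -1, 0f);\r\n<\/pre>\n<p>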
Now we\u2019re at the point of assembling everything.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/heirarchy.png\"><img decoding=\"async\" style=\"margin: 0px 0px 0px 8px; float: right;\" title=\"heirarchy\" src=\"https:\/\/devblogs.microsoft.com\/wp-content\/uploads\/sites\/31\/2019\/04\/heirarchy_thumb.png\" alt=\"heirarchy\" width=\"450\" height=\"772\" align=\"right\" border=\"0\" \/><\/a>Go back to the _Prefabs folder and select your character prefab. In the Inspector window, drag-n-drop the Dancing animator controller you just created from the _Animations folder onto the Animator Controller property. For the Animator Avatar, again click the round select icon next to the property field and in the pop-up choose your character avatar. Make sure the Apply Root Motion checkbox is checked.<\/p>\n<p>In the Hierarchy window, add two HoloToolkit prefabs \u2013 BasicCursor and InputManager. While still in the Hierarchy window, select the HoloLensCamera and add 3 components to this object \u2013 Spatial Mapping Collider, Spatial Mapping Renderer, and New C Sharp script. Name the new C# script Manager. By default, the new script is created in the root Assets folder. Drag and drop the Manager script into the _Scripts folder, then double-click it to open it in the code editor. Replace the existing code with the sample code provided below. Once saved, go ahead and Build &amp; Run the application from the Unity Editor. When Unity finishes building the application and creating a Visual Studio solution, deploy the application to your HoloLens device. When running the application, you should see the usual Made with Unity logo, then a brief 3-5 second period of nothingness. At this point the spatial mapping functions of the application are beginning to scan the room. As orientation is established, you should start to see a wire mesh of room surfaces. You should also see a gaze cursor. 
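<\/p>\n<p>Placement works via a gaze raycast: on an air tap, the script casts a ray from the camera along its forward vector and, where it hits the spatial mesh, instantiates the avatar prefab rotated 180 degrees to face you. In essence:<\/p>\n<pre class=\"lang:default decode:true \">\/\/ Spawn the dancer where the gaze ray hits the spatial mapping mesh\r\nRaycastHit hitInfo;\r\nif (Physics.Raycast(Camera.main.transform.position, Camera.main.transform.forward, out hitInfo, Mathf.Infinity))\r\n{\r\n    Instantiate(DancerPrefab, hitInfo.point, Quaternion.Euler(0, transform.eulerAngles.y + 180f, 0));\r\n}\r\n<\/pre>\n<p>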
Find an open space on the floor and air tap. The avatar character should now be standing idle in that spot on the floor, facing you. The avatar responds to five voice commands. Say \u201cMacarena Dance\u201d, \u201cSamba Dance\u201d, \u201cSalsa Dance\u201d, or \u201cYMCA Dance\u201d to make the avatar dance. Say \u201cIdle Dance\u201d to make the avatar stand idle again.<\/p>\n<p>Enjoy!<\/p>\n<pre class=\"lang:default decode:true \">using System;\r\nusing System.Collections;\r\nusing System.Collections.Generic;\r\nusing UnityEngine;\r\nusing UnityEngine.UI;\r\nusing System.Linq;\r\nusing UnityEngine.VR.WSA.Input;\r\nusing UnityEngine.Windows.Speech;\r\nusing HoloToolkit.Unity;\r\nusing HoloToolkit.Unity.InputModule;\r\n\r\npublic class Manager : MonoBehaviour {\r\n\r\n    public GameObject DancerPrefab;\r\n    private Animator anim;\r\n    private Boolean dancerExist = false;\r\n    private GestureRecognizer gestureRecognizer;\r\n    private KeywordRecognizer keywordRecognizer;\r\n\r\n    delegate void KeywordAction(PhraseRecognizedEventArgs args);\r\n    private Dictionary&lt;string, KeywordAction&gt; keywordCollection;\r\n\r\n    private void Start()\r\n    {\r\n        gestureRecognizer = new GestureRecognizer();\r\n        gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap);\r\n        gestureRecognizer.TappedEvent += Recognizer_TappedEvent;\r\n        gestureRecognizer.StartCapturingGestures();\r\n\r\n        keywordCollection = new Dictionary&lt;string, KeywordAction&gt;();\r\n\r\n        keywordCollection.Add(\"Samba Dance\", SambaDanceCommand);\r\n        keywordCollection.Add(\"Salsa Dance\", SalsaDanceCommand);\r\n        keywordCollection.Add(\"YMCA Dance\", YMCADanceCommand);\r\n        keywordCollection.Add(\"Macarena Dance\", MacarenaDanceCommand);\r\n        keywordCollection.Add(\"Idle Dance\", IdleDanceCommand);\r\n\r\n        keywordRecognizer = new KeywordRecognizer(keywordCollection.Keys.ToArray());\r\n        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;\r\n        keywordRecognizer.Start();\r\n    }\r\n\r\n    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)\r\n    {\r\n        KeywordAction keywordAction;\r\n\r\n        if (keywordCollection.TryGetValue(args.text, out keywordAction))\r\n        {\r\n            keywordAction.Invoke(args);\r\n        }\r\n    }\r\n\r\n    private void Recognizer_TappedEvent(InteractionSourceKind source, int tapCount, Ray headRay)\r\n    {\r\n        RaycastHit hitInfo;\r\n\r\n        if (!dancerExist &amp;&amp; Physics.Raycast(Camera.main.transform.position, Camera.main.transform.forward, out hitInfo, Mathf.Infinity))\r\n        {\r\n            GameObject dancer = Instantiate(DancerPrefab, hitInfo.point, Quaternion.Euler(0, transform.eulerAngles.y + 180f, 0));\r\n            anim = dancer.GetComponent&lt;Animator&gt;();\r\n            dancerExist = true;\r\n\r\n            Camera.main.gameObject.GetComponent&lt;UnityEngine.VR.WSA.SpatialMappingRenderer&gt;().enabled = false;\r\n        }\r\n    }\r\n\r\n    private void MacarenaDanceCommand(PhraseRecognizedEventArgs args)\r\n    {\r\n        anim.Play(\"macarena_dance\", -1, 0f);\r\n    }\r\n\r\n    private void YMCADanceCommand(PhraseRecognizedEventArgs args)\r\n    {\r\n        anim.Play(\"ymca_dance\", -1, 0f);\r\n    }\r\n\r\n    private void SalsaDanceCommand(PhraseRecognizedEventArgs args)\r\n    {\r\n        anim.Play(\"salsa_dance\", -1, 0f);\r\n    }\r\n\r\n    private void SambaDanceCommand(PhraseRecognizedEventArgs args)\r\n    {\r\n        anim.Play(\"samba_dance\", -1, 0f);\r\n    }\r\n\r\n    private void IdleDanceCommand(PhraseRecognizedEventArgs args)\r\n    {\r\n        anim.Play(\"standing_idle\", -1, 0f);\r\n    }\r\n\r\n    private void OnDestroy()\r\n    {\r\n        gestureRecognizer.TappedEvent -= Recognizer_TappedEvent;\r\n        keywordRecognizer.OnPhraseRecognized -= KeywordRecognizer_OnPhraseRecognized;\r\n    }\r\n\r\n}\r\n<\/pre>\n<p>&nbsp;<\/p>\n<hr \/>\n<p><a href=\"https:\/\/blogs.msdn.com\/b\/premier_developer\/archive\/2014\/09\/15\/welcome.aspx\"><strong>Premier Support for Developers<\/strong><\/a> provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.\u00a0 Contact your Application Development Manager (ADM) or <a href=\"https:\/\/blogs.msdn.microsoft.com\/premier_developer\/contact-us\/\">email us<\/a> to learn more about what we can do for you.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This post is provided by App Dev Manager, Robert Schumann who explores the process of creating and animating avatars with HoloLens. HoloLens Ambitious Beginnings My first application on HoloLens involved animating a Marine Corps squad to do PT. Using voice commands, avatars would do sit-ups, push-ups, burpees, jumping jacks, etc. That was a year ago [&hellip;]<\/p>\n","protected":false},"author":582,"featured_media":37840,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[73,3,75],"class_list":["post-12035","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-permierdev","tag-hololens","tag-team","tag-unity"],"acf":[],"blog_post_summary":"<p>This post is provided by App Dev Manager, Robert Schumann who explores the process of creating and animating avatars with HoloLens. HoloLens Ambitious Beginnings My first application on HoloLens involved animating a Marine Corps squad to do PT. Using voice commands, avatars would do sit-ups, push-ups, burpees, jumping jacks, etc. 
That was a year ago [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/posts\/12035","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/users\/582"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/comments?post=12035"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/posts\/12035\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/media\/37840"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/media?parent=12035"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/categories?post=12035"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/premier-developer\/wp-json\/wp\/v2\/tags?post=12035"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}