{"id":3394,"date":"2017-06-29T15:49:57","date_gmt":"2017-06-29T22:49:57","guid":{"rendered":"https:\/\/www.microsoft.com\/reallifecode\/?p=3394"},"modified":"2020-03-14T20:40:54","modified_gmt":"2020-03-15T03:40:54","slug":"iot-sports-sensor-machine-learning-helps-amateurs-up-their-game","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/ise\/iot-sports-sensor-machine-learning-helps-amateurs-up-their-game\/","title":{"rendered":"IoT Sports Sensor Machine Learning Helps Amateurs Up Their Game"},"content":{"rendered":"<p>Recently, Microsoft partnered with the <a href=\"http:\/\/www.thesnowpros.org\/\">Professional Ski Instructors of America and the American Association of Snowboard Instructors, (PSIA-AASI)<\/a>\u00a0to use wearable IoT sensors in order to develop a machine learning model of skiing skills and expertise levels.\u00a0 PSIA-AASI continually evaluates new technology and methods for nurturing learning experiences for developing skiers and snowboarders.<\/p>\n<h2>IoT Human Sensor Data Enables Skill Measurement<\/h2>\n<p>Watching a professional skier practicing drills, it\u2019s easy to recognize their expertise level.\u00a0 Our challenge was to create data to help characterize this expertise level.\u00a0 With wearable IoT sensors, we can collect positional and motion data that allow us to measure this expertise level distinction between professionals and amateurs with high precision and accuracy.\u00a0 In our analysis, we discovered the sensor data from just nine body positions provides ample signal to generate distinct activity signatures for the professional skiers\u00a0when compared with the amateurs.<\/p>\n<p><figure id=\"attachment_3907\" aria-labelledby=\"figcaption_attachment_3907\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/skiers2-1024x682-1.jpg\" alt=\"Image skiers2 1024 215 682\" width=\"1024\" height=\"682\" 
class=\"aligncenter size-full wp-image-10920\" srcset=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/skiers2-1024x682-1.jpg 1024w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/skiers2-1024x682-1-300x200.jpg 300w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/skiers2-1024x682-1-768x512.jpg 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"figcaption_attachment_3907\" class=\"wp-caption-text\">PSIA-AASI Members Generate Skiing Sensor Data at Snowbird, Utah<\/figcaption><\/figure><\/p>\n<p>Examining the sensor data, one can see the pros\u2019 tight adherence to proper form throughout the skill, lack of erratic movement, and precision in execution. These distinct differences in absolute and relative measures of these nine body positions allow us to construct a powerful and simple classification model to categorize skiers into different expertise levels.<\/p>\n<p>We think this type of classification model can be used by the amateur to help them understand differences between their performance and that of the pro, and allow them to improve overall form and skill execution. 
\u00a0Ski and snowboard instructors can customize training strategies for each trainee, based on the insights provided by the model and the quantitative data analysis.\u00a0\u00a0As a result,\u00a0training can become more efficient.\u00a0 Over time, and with more data, more models can be created to differentiate finer-grained expertise and skill execution levels.<\/p>\n<h2>Data Gathering to Sports Activity Machine Learning<\/h2>\n<p>With PSIA-AASI, we wanted to allow amateurs to compare their own skiing data to the pros&#8217; and classify their skill level, as well as to examine specific positional and gestural differences in their skill performance.\u00a0\u00a0The Microsoft and PSIA-AASI teams worked\u00a0together at the\u00a0Snowbird ski resort to gather the field data and build the concrete data model that would give aspiring amateurs guidance on how to improve.\u00a0\u00a0In this code story, we&#8217;ll describe the steps we took to gather the data and develop the model.\u00a0 We&#8217;ll also provide links to the R script and data set that you can use to recreate this solution.<\/p>\n<p><figure id=\"attachment_3898\" aria-labelledby=\"figcaption_attachment_3898\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/guggs-1024x768-1.jpg\" alt=\"Image guggs 1024 215 768\" width=\"1024\" height=\"768\" class=\"aligncenter size-full wp-image-10915\" srcset=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/guggs-1024x768-1.jpg 1024w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/guggs-1024x768-1-300x225.jpg 300w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/guggs-1024x768-1-768x576.jpg 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"figcaption_attachment_3898\" class=\"wp-caption-text\">Nick Herrin of PSIA-AASI with Steve Guggenheimer of 
Microsoft<\/figcaption><\/figure><\/p>\n<p><!--more--><\/p>\n<h3>IoT Device and Raw Sensor Data<\/h3>\n<p>Each of the wearable IoT sensors measures position, acceleration, and rotation individually, and records them with a time stamp.\u00a0 The variables emitted include positional variables x, y and z that represent position in <a href=\"https:\/\/en.wikipedia.org\/wiki\/Three-dimensional_space\">three-dimensional space<\/a>; quaternion variables qW, qX, qY, qZ that represent <a href=\"http:\/\/web.cse.ohio-state.edu\/~shen.94\/781\/Site\/Slides_files\/quaternion.pdf\">3D rotation<\/a>; and aX, aY and aZ that represent coordinate\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Acceleration_(special_relativity)#Three-acceleration\">acceleration<\/a>.<\/p>\n<p>While the sample rate of sensors varies, we found 100 Hz to be a minimum rate for modeling.\u00a0 There are a number of IoT sensors that capture these data at this sample rate (<a href=\"https:\/\/github.com\/CatalystCode\/sportssensor\/blob\/master\/Hardware\/IoTSensorReference.md\">more details\u00a0on hardware\u00a0options<\/a>).<\/p>\n<p>Below is an example of the raw data variables emitted by each one of the sensors.<\/p>\n<p><!-- TODO - missing image \n[caption id=\"attachment_4042\" align=\"alignnone\" width=\"709\"]<img decoding=\"async\" class=\"wp-image-4042\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/long-data-schema.png\" alt=\"Raw Sensor Data Example\" width=\"709\" height=\"384\" \/> Raw Sensor Data Example[\/caption]\n--><\/p>\n<h3>Human Sensor Placement<\/h3>\n<p>We placed the sensors according to the diagram included below.\u00a0 The data from the bright green sensors\u00a0were used in the final classification model, as\u00a0features generated from these sensors&#8217; data were highly predictive of skier expertise.\u00a0 The\u00a0features derived from the light green colored sensors&#8217; data\u00a0were not as predictive and were discarded 
from the final model.<\/p>\n<p>We calibrated the sensors to align the timestamp for each sampling before gathering data.\u00a0\u00a0If you cannot calibrate sensors prior to\u00a0data gathering, this manual timestamp alignment can be\u00a0done as a post-processing step, presuming the sampling rate was high enough, and there is minimal missing signal.\u00a0 For reference, here is a <a href=\"https:\/\/github.com\/CatalystCode\/sportssensor\/blob\/master\/RScripts\/StepOptional_TimeAlignRawData\">time align R script example<\/a>.<\/p>\n<p><img decoding=\"async\" class=\"aligncenter wp-image-4023\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/sensorpositions2.png\" alt=\"Sensor Positions on the Body. \" width=\"235\" height=\"414\" \/><\/p>\n<h3>Experimental Design<\/h3>\n<p>We\u00a0set out to capture\u00a0data for several PSIA-AASI professional-level skiers and several intermediate skiers as they performed a number of skill drills.\u00a0 Our skill drills included short, medium and large radius turns.\u00a0 We noted the start of the actual drill in order to exclude non-drill data from the data model.\u00a0 We also noted the name, drill and skill level of the skier in order to annotate the data file later.<\/p>\n<p><figure id=\"attachment_3897\" aria-labelledby=\"figcaption_attachment_3897\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/recording-768x729-1.jpg\" alt=\"Image recording 768 215 729\" width=\"768\" height=\"729\" class=\"aligncenter size-full wp-image-10918\" srcset=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/recording-768x729-1.jpg 768w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/recording-768x729-1-300x285.jpg 300w\" sizes=\"(max-width: 768px) 100vw, 768px\" \/><figcaption id=\"figcaption_attachment_3897\" class=\"wp-caption-text\">Making 
Field Notes on Experiment<\/figcaption><\/figure><\/p>\n<p>We stored our experimental data locally, then batch uploaded it to an Azure SQL database\u00a0after the experiments were complete.\u00a0 We also tested devices using Bluetooth to stream the data to a mobile device carried by the skier, though in modeling we only used the locally stored data as we didn&#8217;t have a time constraint requiring real-time streaming, and the\u00a0locally stored data had no data loss.<\/p>\n<h3>Clean and Transform the Data<\/h3>\n<p>We wrote our data to Azure SQL, and applied experiment and athlete label data, then imported it into the <a href=\"https:\/\/azure.microsoft.com\/en-us\/services\/machine-learning\/\">Azure Machine Learning<\/a> workspace for the rest of the data transformation and modeling.\u00a0 Options for AML data import include Azure SQL Database, Azure Blob Storage, Azure Table, Azure Document DB and others.<\/p>\n<p><figure id=\"attachment_3400\" aria-labelledby=\"figcaption_attachment_3400\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" class=\"wp-image-3400 size-full\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/skisensor7.png\" alt=\"Time Series Visualization of x, y, and z Positional Variables\" width=\"529\" height=\"315\" \/><figcaption id=\"figcaption_attachment_3400\" class=\"wp-caption-text\">Time Series Visualization of x, y, and z Positional Variables<\/figcaption><\/figure><\/p>\n<p>Originally, the data was saved in a &#8220;long&#8221; format, meaning that every row in the dataset represents readings from a sensor at one sampling time stamp.\u00a0 You can see the time series\u00a0graph of each of these rows above. 
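The long format shown above (one row per sensor per sampling time stamp) is what the linked R script reshapes into wide form. As a rough illustration of the same pivot idea in Python with pandas (all column names invented; the real R pipeline also goes further and flattens each athlete/experiment to a single row of features):

```python
import pandas as pd

# Toy "long" data: one row per sensor per time stamp (names are illustrative).
long_df = pd.DataFrame({
    "athlete":    ["a1", "a1", "a1", "a1"],
    "experiment": ["e1", "e1", "e1", "e1"],
    "timestamp":  [0, 0, 1, 1],
    "sensor":     ["pelvis", "left_foot", "pelvis", "left_foot"],
    "x":          [0.10, 0.20, 0.15, 0.25],
})

# Pivot to "wide": one row per (athlete, experiment, time stamp) and one
# column per sensor reading, so all sensors line up on a single row.
wide_df = (
    long_df.pivot_table(index=["athlete", "experiment", "timestamp"],
                        columns="sensor", values="x")
    .add_prefix("x_")
    .reset_index()
)
print(wide_df.shape)  # → (2, 5)
```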
\u00a0If we had five sensors, there would be five rows of records for each sampling time stamp.\u00a0 This data format presents some challenges for data analysis and modeling.\u00a0 We transformed this raw &#8220;long&#8221; data into the &#8220;wide&#8221; data format, with one row for each athlete and experiment. Refer to the <a href=\"https:\/\/aka.ms\/sports-sensor-transform-data\">R script<\/a> for the specific code we used for this transformation, as well as the transformations and modeling that follow.<\/p>\n<h3>Generate Virtual Sensory Data from Physical Sensors<\/h3>\n<p>With guidance from the PSIA-AASI professionals, we characterized what hallmark differences might be captured between the pros and amateurs in the sensor data. These differences included the relative position of their upper body vs. lower body, the relative position of each of their legs and feet, and how they took a turn. With this domain expert insight,\u00a0we generated virtual\u00a0sensory data from the raw physical sensors\u00a0that characterizes limb position relative to one another, and the upper body relative to the lower body.<\/p>\n<p>Later on, at the data analysis and modeling stage,\u00a0we learned\u00a0that the features that best illustrate these differences in skiing include the normalized difference between upper body and lower body, the relational positions and rotation of upper body and lower body, and the relational position and rotation of the limbs.\u00a0 The screenshot below shows a few of these engineered features.\u00a0 You can find all the\u00a0extra virtual sensory data we created\u00a0in the <a href=\"https:\/\/aka.ms\/sports-sensor-generated-sensors\">R script<\/a>.<\/p>\n<p><figure id=\"attachment_3398\" aria-labelledby=\"figcaption_attachment_3398\" class=\"wp-caption alignnone\" ><img decoding=\"async\" class=\"wp-image-3398 size-full\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/skisensor9.png\" alt=\"R Script Module to 
Generate Virtual Sensory Data Features in Azure ML\" width=\"900\" height=\"504\" \/><figcaption id=\"figcaption_attachment_3398\" class=\"wp-caption-text\">R Script Module to Generate Virtual Sensory Data Features in Azure ML<\/figcaption><\/figure><\/p>\n<h3>Engineer Features by Generating Periodic Descriptive Measures<\/h3>\n<p>Next, we broke our ski activity sample into small activity interval slices; in our case, time slices of two seconds apiece. We generated summary statistics as well as frequency and frequency covariance statistics over these time windows. We then ran a <a href=\"https:\/\/stat.ethz.ch\/R-manual\/R-devel\/library\/stats\/html\/fft.html\">fast discrete Fourier transform (FFT) function<\/a> on each of the raw sensory signals to\u00a0convert the data from the time domain to the frequency domain. The result is the magnitude of each frequency component, from 0 Hz up to half of the 240 Hz sample frequency (120 Hz), for a given time interval slice. These generated frequency components of our sensor signal include constant power (0 Hz), low-band average power (1-40Hz), mid-band average power (41-80Hz), and high-band average power (81-120Hz). 
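The band-power features described here can be sketched outside the R pipeline as well. The snippet below is a minimal Python/NumPy illustration, assuming the band edges from the text (constant, 1-40 Hz, 41-80 Hz, 81-120 Hz) and the 240 Hz sample rate implied by the 0-120 Hz range; the function name and exact aggregation are our own:

```python
import numpy as np

def band_powers(signal, fs=240.0):
    """Per-band average spectral magnitude for one window of sensor signal.

    Band edges mirror the post (constant, 1-40 Hz, 41-80 Hz, 81-120 Hz);
    120 Hz is the Nyquist frequency of the assumed 240 Hz sample rate.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mags = np.abs(np.fft.rfft(signal)) / len(signal)
    const = mags[0]  # constant (0 Hz / DC) power
    bands = {
        "low":  mags[(freqs >= 1) & (freqs <= 40)].mean(),
        "mid":  mags[(freqs > 40) & (freqs <= 80)].mean(),
        "high": mags[(freqs > 80) & (freqs <= 120)].mean(),
    }
    return const, bands

# A 2-second window containing a pure 10 Hz oscillation: its energy
# should land in the low band and nowhere else.
t = np.arange(0, 2.0, 1.0 / 240.0)
const, bands = band_powers(np.sin(2 * np.pi * 10.0 * t))
```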
Finally, we generated cross-correlation measures on select variables to measure the similarity of various two-series combinations (for more details, see the <a href=\"https:\/\/aka.ms\/sports-sensor-feature-engineering\">R script<\/a> of this feature engineering).<\/p>\n<h3>Create a Classification ML Model<\/h3>\n<p>After feature engineering, we had 589 observations and 2144 features.\u00a0 Training a machine learning model on such a wide data set with so few observations would have introduced over-fitting problems, even with the simplest logistic regression model.\u00a0 So, we first used &#8216;Filter Based Feature Selection&#8217; in Azure ML to select the top thirty features, based on the strength of their relationship with the target variable, &#8216;SkillLevel&#8217;, as measured by Mutual Information. \u00a0 We did the feature selection only on the training data, which was a randomly selected 70% of the original dataset.\u00a0 The remaining 30% of records were held out from the feature selection and model training to be used as validation data.<\/p>\n<p><figure id=\"attachment_3930\" aria-labelledby=\"figcaption_attachment_3930\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" class=\"wp-image-3930 size-large\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/modelpic3.png\" alt=\"Overview of Classification Model in Azure ML\" width=\"780\" height=\"594\" \/><figcaption id=\"figcaption_attachment_3930\" class=\"wp-caption-text\">Overview of Classification Model in Azure ML<\/figcaption><\/figure><\/p>\n<p>When we tested the model on the validation data, we found the model correctly predicted classification of professional vs. 
amateur expertise levels across all skills 99% of the time.<\/p>\n<p><figure id=\"attachment_3928\" aria-labelledby=\"figcaption_attachment_3928\" class=\"wp-caption alignnone\" ><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/modelpic-1024x655-1.png\" alt=\"Image modelpic 1024 215 655\" width=\"1024\" height=\"655\" class=\"aligncenter size-full wp-image-10917\" srcset=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/modelpic-1024x655-1.png 1024w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/modelpic-1024x655-1-300x192.png 300w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/modelpic-1024x655-1-768x491.png 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"figcaption_attachment_3928\" class=\"wp-caption-text\">Model Evaluation Results with ROC Curve and Confusion Matrix in Azure ML<\/figcaption><\/figure><\/p>\n<h3>Access the R Script and the Full Experiment<\/h3>\n<p>The entire R script can be <a href=\"https:\/\/aka.ms\/sensorkitongithub\">accessed on GitHub<\/a>. 
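The selection-then-train step described above used Azure ML's 'Filter Based Feature Selection' module; the same pattern (top-k mutual-information selection fitted on the 70% training split, then logistic regression) can be sketched with scikit-learn. Everything below is synthetic and illustrative, not the post's data or results, and we use fewer than the original 2144 columns for speed:

```python
from functools import partial

import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the 589-observation feature matrix; only the
# first five columns carry signal, the rest are noise.  y plays the role
# of the 'SkillLevel' label (0 = intermediate, 1 = professional).
n_obs, n_feats = 589, 200
X = rng.normal(size=(n_obs, n_feats))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# Hold out 30% of the records before any feature selection, as in the post.
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# Keep the top 30 features ranked by mutual information with the label,
# computed on training data only so the validation set stays untouched.
selector = SelectKBest(partial(mutual_info_classif, random_state=0), k=30)
selector.fit(X_tr, y_tr)

model = LogisticRegression(max_iter=1000).fit(selector.transform(X_tr), y_tr)
accuracy = model.score(selector.transform(X_va), y_va)
print(f"validation accuracy: {accuracy:.2f}")
```

Fitting the selector on the training split only, as the post emphasizes, is what keeps the reported validation accuracy honest: selecting features on all 589 rows first would leak label information into the held-out 30%.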
\u00a0The end-to-end pipeline, including the data transformations, feature engineering, feature selection and machine learning model training and validation has been implemented in an Azure Machine Learning experiment and <a href=\"https:\/\/aka.ms\/sports-sensor-aml-exp\">published in the Cortana Intelligence Suite Gallery<\/a>.\u00a0 With this gallery experiment, you can reproduce the results presented here.<\/p>\n<h3>Animate Sensor Data with Avatars<\/h3>\n<p>For even more insight, you can turn your sensor data into your own action avatar and compare it with avatars from the pros&#8217; sensor data.<\/p>\n<p>We processed the data into a 3D skeletal animation format and imported the data into the <a href=\"https:\/\/unity3d.com\/\">Unity development environment<\/a> to create the video you see below.\u00a0 We relied on the Humanoid Animation Retarget system to create a visual representation of a skier drill.\u00a0 In addition to this video, the 3D animation can be visualized from different perspectives and angles to better understand differences of the amateur on the left\u00a0and one of the\u00a0professionals on the right\u00a0in terms of body positioning.<\/p>\n<p><iframe title=\"Sports Avatar with Unity\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/aBu8NNi8Ack?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>&nbsp;<\/p>\n<h2 id=\"try-it-out\">Try It Out Yourself with our IoT Sensor Data<\/h2>\n<p>We invite\u00a0sports enthusiasts and those interested\u00a0in IoT\u00a0Machine Learning\u00a0to analyze the<a href=\"https:\/\/aka.ms\/sportssensordata\">\u00a0full raw data set<\/a> (~1.4 GB) and build the classification model for themselves.\u00a0 Since data transformations and feature engineering on the full raw sensor data set take some time (more than a 
couple of hours), we also\u00a0<a href=\"https:\/\/aka.ms\/sports-sensor-feature-set\">share the smaller feature engineered dataset<\/a>\u00a0(~22MB) for your convenience.<\/p>\n<h3>Drill into Specific Comparisons<\/h3>\n<p>In addition to the expertise classification information, the amateur can receive guidance\u00a0on areas for improvement by investigating their own skiing differences with the professionals&#8217; performance against specific measures.<\/p>\n<p>Here is an example of insights you can get from the data analysis.\u00a0 The variable depicted here, &#8216;trunk_twist_max&#8217; is a feature we developed to measure the angle between the upper trunk (a plane spanned by both shoulders and pelvis), and the lower trunk (a plane spanned by both lower legs and pelvis).\u00a0 In this case, a positive value means the angle is smaller than ninety degrees.\u00a0 From this visualization, you can instantly tell that the professional skiers (SkillLevel\u00a0of 1) hold more of a crouching pose, and the amateur skiers (SkillLevel of 0) have more of a back-sitting pose. Does this insight resonate with your experience?<\/p>\n<p><figure id=\"attachment_3890\" aria-labelledby=\"figcaption_attachment_3890\" class=\"wp-caption alignleft\" ><img decoding=\"async\" class=\"wp-image-3890 size-large\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/Drilldown.png\" alt=\"Box Plot of Trunk Twist Max Variable - Comparison of Intermediate (0) vs. Professional (1)\" width=\"780\" height=\"412\" \/><figcaption id=\"figcaption_attachment_3890\" class=\"wp-caption-text\">Box Plot of Trunk Twist Max Variable &#8211; Comparison of Intermediate (0) vs. Professional (1)<\/figcaption><\/figure><\/p>\n<h2>Help Us Expand this Model to Other Sports<\/h2>\n<p>We also invite you to help us expand this sports activity model by adding sports activity models for additional sports and activities. 
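The 'trunk_twist_max' feature above is defined as an angle between two body planes. As a hedged sketch of one way such a plane-to-plane angle could be computed from sensor positions (all coordinates and function names here are invented, and we compute an unsigned angle rather than the post's signed variant):

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D sensor positions."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

def trunk_twist(shoulder_l, shoulder_r, pelvis, leg_l, leg_r):
    """Unsigned angle in degrees between the upper-trunk plane (both
    shoulders + pelvis) and the lower-trunk plane (both lower legs +
    pelvis); a simplification of the signed feature in the post."""
    upper = plane_normal(shoulder_l, shoulder_r, pelvis)
    lower = plane_normal(leg_l, leg_r, pelvis)
    cos_a = np.clip(abs(np.dot(upper, lower)), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_a)))

# Toy positions: the lower body tilted 30 degrees relative to the upper body.
pelvis     = np.array([0.0, 0.0, 1.0])
shoulder_l = np.array([-0.2, 0.0, 1.5])
shoulder_r = np.array([0.2, 0.0, 1.5])
tilt = np.radians(30)
down = 0.5 * np.array([0.0, np.sin(tilt), -np.cos(tilt)])
leg_l = pelvis + np.array([-0.2, 0.0, 0.0]) + down
leg_r = pelvis + np.array([0.2, 0.0, 0.0]) + down

angle = trunk_twist(shoulder_l, shoulder_r, pelvis, leg_l, leg_r)
print(round(angle, 1))  # → 30.0
```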
Read about our latest Sports Sensor, IoT,\u00a0<a href=\"\/developerblog\/category\/machine-learning\/\">Machine Learning <\/a>and Unity3D\u00a0work\u00a0at <a href=\"\/developerblog\/\">Real Life Code<\/a>.<\/p>\n<p>Contribute to the sensor kit, sports activity data, data analysis and feature engineering scripts found in <a href=\"https:\/\/aka.ms\/sensorkitongithub\">the project&#8217;s GitHub repo<\/a>. \u00a0You can also reach out to<a href=\"https:\/\/twitter.com\/kashleytwit\"> Kevin Ashley<\/a> or <a href=\"https:\/\/www.linkedin.com\/in\/maxzilberman\/\">Max Zilberman<\/a> at Microsoft, or leave us comments below.<\/p>\n<p><figure id=\"attachment_3894\" aria-labelledby=\"figcaption_attachment_3894\" class=\"wp-caption aligncenter\" ><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/cse\/wp-content\/uploads\/sites\/55\/2017\/06\/snowbird-1024x768-1.jpg\" alt=\"Image snowbird 1024 215 768\" width=\"1024\" height=\"768\" class=\"aligncenter size-full wp-image-10923\" srcset=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/snowbird-1024x768-1.jpg 1024w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/snowbird-1024x768-1-300x225.jpg 300w, https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2017\/06\/snowbird-1024x768-1-768x576.jpg 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"figcaption_attachment_3894\" class=\"wp-caption-text\">Microsoft Team Members Max Zilberman, Sebastien Vandenberghe and Kevin Ashley<\/figcaption><\/figure><\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We use IoT sensors to collect positional and motion data from professional and amateur skiers to classify expertise and skill level through machine 
learning.<\/p>\n","protected":false},"author":21374,"featured_media":10914,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[10,11,18,19],"tags":[81,141,216,239,321,338,369],"class_list":["post-3394","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-azure-app-services","category-big-data","category-iot","category-machine-learning","tag-azure-machine-learning-ml-studio","tag-data-preparation","tag-iot","tag-machine-learning-ml","tag-sensors","tag-sports","tag-unity3d"],"acf":[],"blog_post_summary":"<p>We use IoT sensors to collect positional and motion data from professional and amateur skiers to classify expertise and skill level through machine learning.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts\/3394","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/users\/21374"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/comments?post=3394"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts\/3394\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/media\/10914"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/media?parent=3394"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/categories?post=3394"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/tags?post=3394"}],"curies":[{"name":"wp","href":"https:\/\/ap
i.w.org\/{rel}","templated":true}]}}