{"id":797,"date":"2021-10-31T10:57:46","date_gmt":"2021-10-31T17:57:46","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/?p=797"},"modified":"2021-10-31T10:57:46","modified_gmt":"2021-10-31T17:57:46","slug":"time-of-flight-camera-characterization-with-functional-modeling-for-synthetic-scene-generation","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/time-of-flight-camera-characterization-with-functional-modeling-for-synthetic-scene-generation\/","title":{"rendered":"Time-of-flight camera characterization with functional modeling for synthetic scene generation"},"content":{"rendered":"<p><span class=\"break-words\"><span dir=\"ltr\">Staying at the forefront of 3D sensing technology development requires continuous investment and innovation.\nMicrosoft&#8217;s paper : \u201c<strong>Time-of-flight camera characterization with functional modeling for synthetic scene generation<\/strong>\u201d,\nwhich was recently published in &#8220;Optics express&#8221; is a clear demonstration of the depth (pun intended) we go to at Microsoft,\nto make sure partners and customers get the best 3D sensing capabilities possible &#8211; <a href=\"https:\/\/www.osapublishing.org\/oe\/fulltext.cfm?uri=oe-29-23-37661&amp;id=462704\">click here<\/a> to read.<\/p>\n<p>In this manuscript, we design, describe, and present a functional model of Time-of-Flight (ToF) cameras.\nThe model can be used to generate randomized scenes that incorporate depth scenarios with various objects\nat various depths with varied orientations and illumination intensity. It also includes ToF artifacts such as Signal Noise,\nCrosstalk and Multipath. 
Our work can be used to generate as many images as needed for neural network (NN) training and testing.<\/p>\n<p>For the tech geeks out there &#8211; enjoy the read!<\/p>\n<p>For those who are not experts in this technology but know exactly what machine automation can do for your operational cost, efficiency, and quality: when you choose solutions based on Microsoft depth sensing technology, you get not only market-leading technology but also the brain power of some of the most talented optics physicists and engineers in the world, constantly working to improve your machine automation using 3D sensing &amp; AI.<\/span><\/span><\/p>\n<p>Kudos to Sergio Ortiz Egea, Mukhil Azhagan, and Augustine Cha for the publication!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Staying at the forefront of 3D sensing technology development requires continuous investment and innovation. Microsoft&#8217;s paper, \u201cTime-of-flight camera characterization with functional modeling for synthetic scene generation\u201d, recently published in &#8220;Optics Express&#8221;, is a clear demonstration of the depth (pun intended) we go to at Microsoft to make sure partners and customers get [&hellip;]<\/p>\n","protected":false},"author":63233,"featured_media":798,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-797","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-azure-depth-platform"],"acf":[],"blog_post_summary":"<p>Staying at the forefront of 3D sensing technology development requires continuous investment and innovation. 
Microsoft&#8217;s paper, \u201cTime-of-flight camera characterization with functional modeling for synthetic scene generation\u201d, recently published in &#8220;Optics Express&#8221;, is a clear demonstration of the depth (pun intended) we go to at Microsoft to make sure partners and customers get [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/posts\/797","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/users\/63233"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/comments?post=797"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/posts\/797\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/media\/798"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/media?parent=797"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/categories?post=797"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-depth-platform\/wp-json\/wp\/v2\/tags?post=797"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}