{"id":975,"date":"2018-03-19T10:00:44","date_gmt":"2018-03-19T18:00:44","guid":{"rendered":"https:\/\/blogs.msdn.microsoft.com\/directx\/?p=975"},"modified":"2019-03-08T20:58:08","modified_gmt":"2019-03-09T04:58:08","slug":"announcing-microsoft-directx-raytracing","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/directx\/announcing-microsoft-directx-raytracing\/","title":{"rendered":"Announcing Microsoft DirectX Raytracing!"},"content":{"rendered":"<p>If you just want to see what DirectX Raytracing can do for gaming, check out the videos from <a href=\"https:\/\/youtu.be\/J3ue35ago3Y\">Epic<\/a>, <a href=\"https:\/\/youtu.be\/81E9yVU-KB8\">Futuremark<\/a> and <a href=\"https:\/\/www.youtube.com\/watch?v=LXo0WdlELJk\">EA, SEED<\/a>.\u00a0 To learn about the magic behind the curtain, keep reading.<\/p>\n<h3><strong>3D Graphics is a Lie<\/strong><\/h3>\n<p>For the last thirty years, almost all games have used the same general technique\u2014rasterization\u2014to render images on screen.\u00a0 While the internal representation of the game world is maintained as three dimensions, rasterization ultimately operates in two dimensions (the plane of the screen), with 3D primitives mapped onto it through transformation matrices.\u00a0 Through approaches like z-buffering and occlusion culling, games have historically strived to minimize the number of spurious pixels rendered, as normally they do not contribute to the final frame.\u00a0 And in a perfect world, the pixels rendered would be exactly those that are directly visible from the camera:<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1a.png\"><img decoding=\"async\" width=\"362\" height=\"636\" class=\"size-full wp-image-1065 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1a.png\" 
srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1a.png 362w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1a-171x300.png 171w\" sizes=\"(max-width: 362px) 100vw, 362px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p><em>Figure 1a: a top-down illustration of various pixel reduction techniques. Top to bottom: no culling, view frustum culling, viewport clipping<\/em><\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1b.png\"><img decoding=\"async\" width=\"382\" height=\"452\" class=\"size-full wp-image-1115 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1b.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1b.png 382w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1b-254x300.png 254w\" sizes=\"(max-width: 382px) 100vw, 382px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p><em>Figure 1b: back-face culling, z-buffering<\/em><\/p>\n<p>&nbsp;<\/p>\n<p>Through the first few years of the new millennium, this approach was sufficient.\u00a0 Normal and parallax mapping continued to add layers of realism to 3D games, and GPUs provided the ongoing improvements to bandwidth and processing power needed to deliver them.\u00a0 It wasn\u2019t long, however, until games began using techniques that were incompatible with these optimizations.\u00a0 Shadow mapping allowed off-screen objects to contribute to on-screen pixels, and environment mapping required a complete spherical representation of the world.\u00a0 Today, techniques such as screen-space reflection and global illumination are pushing rasterization to its limits, with SSR, for example, being solved with level design tricks, and GI being solved in some cases by 
processing a full 3D representation of the world using async compute.\u00a0 In the future, the utilization of full-world 3D data for rendering techniques will only increase.<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-2.png\"><img decoding=\"async\" width=\"422\" height=\"358\" class=\"size-full wp-image-1095 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-2.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-2.png 422w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-2-300x255.png 300w\" sizes=\"(max-width: 422px) 100vw, 422px\" \/><\/a><\/p>\n<p><em>Figure 2: a top-down view showing how shadow mapping can allow even culled geometry to contribute to on-screen shadows in a scene<\/em><\/p>\n<p>Today, we are introducing a feature to DirectX 12 that will bridge the gap between the rasterization techniques employed by games today, and the full 3D effects of tomorrow.\u00a0 This feature is DirectX Raytracing.\u00a0 By allowing traversal of a full 3D representation of the game world, DirectX Raytracing allows current rendering techniques such as SSR to naturally and efficiently fill the gaps left by rasterization, and opens the door to an entirely new class of techniques that have never been achieved in a real-time game. 
Readers unfamiliar with rasterization and raytracing will find more information about the basics of these concepts in the appendix below.<\/p>\n<p>&nbsp;<\/p>\n<h4><strong>What is DirectX Raytracing?<\/strong><\/h4>\n<p>At the highest level, DirectX Raytracing (DXR) introduces four new concepts to the DirectX 12 API:<\/p>\n<ol>\n<li>The <em>acceleration structure<\/em> is an object that represents a full 3D environment in a format optimal for traversal by the GPU.\u00a0 Represented as a two-level hierarchy, the structure affords both optimized ray traversal by the GPU and efficient modification by the application for dynamic objects.<\/li>\n<li>A new command list method, <em>DispatchRays<\/em>, which is the starting point for tracing rays into the scene.\u00a0 This is how the game actually submits DXR workloads to the GPU.<\/li>\n<li>A set of new HLSL shader types including <em>ray-generation<\/em>, <em>closest-hit<\/em>, <em>any-hit<\/em>, and <em>miss<\/em> shaders.\u00a0 These specify what the DXR workload actually does computationally.\u00a0 When DispatchRays is called, the ray-generation shader runs.\u00a0 Using the new <em>TraceRay<\/em> intrinsic function in HLSL, the ray-generation shader causes rays to be traced into the scene.\u00a0 Depending on where the ray goes in the scene, one of several hit or miss shaders may be invoked at the point of intersection.\u00a0 This allows a game to assign each object its own set of shaders and textures, resulting in a unique material.<\/li>\n<li>The <em>raytracing pipeline state<\/em>, a companion in spirit to today\u2019s Graphics and Compute pipeline state objects, encapsulates the raytracing shaders and other state relevant to raytracing workloads.<\/li>\n<\/ol>\n<p>&nbsp;<\/p>\n<p>You may have noticed that DXR does not introduce a new GPU engine to go alongside DX12\u2019s existing Graphics and Compute engines.\u00a0 This is intentional \u2013 DXR workloads can be run on either of DX12\u2019s existing 
engines.\u00a0 The primary reason for this is that, fundamentally, DXR is a compute-like workload. It does not require complex state such as output merger blend modes or input assembler vertex layouts.\u00a0 A secondary reason, however, is that representing DXR as a compute-like workload is aligned with what we see as the future of graphics, namely that hardware will be increasingly general-purpose, and eventually most fixed-function units will be replaced by HLSL code.\u00a0 The raytracing pipeline state exemplifies this shift in the design of the API. With DX12, the traditional approach would have been to create a new CreateRaytracingPipelineState method.\u00a0 Instead, we decided to go with a much more generic and flexible <em>CreateStateObject<\/em> method.\u00a0 It is designed to be adaptable so that in addition to Raytracing, it can eventually be used to create Graphics and Compute pipeline states, as well as any future pipeline designs.<\/p>\n<h4><strong>Anatomy of a DXR Frame<\/strong><\/h4>\n<p>The first step in rendering any content using DXR is to build the acceleration structures, which operate in a two-level hierarchy.\u00a0 At the bottom level of the structure, the application specifies a set of <em>geometries<\/em>, essentially vertex and index buffers representing distinct objects in the world.\u00a0 At the top level of the structure, the application specifies a list of instance descriptions, each containing a reference to a particular geometry and some additional per-instance data, such as a transformation matrix, that can be updated from frame to frame in ways similar to how games perform dynamic object updates today.\u00a0 Together, these allow for efficient traversal of multiple complex geometries.<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig3.png\"><img decoding=\"async\" width=\"404\" height=\"276\" class=\"size-full wp-image-1125 aligncenter\" alt=\"\" 
src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig3.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig3.png 404w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig3-300x205.png 300w\" sizes=\"(max-width: 404px) 100vw, 404px\" \/><\/a><\/p>\n<p><em>Figure 3: Instances of 2 geometries, each with its own transformation matrix<\/em><\/p>\n<p>The second step in using DXR is to create the raytracing pipeline state.\u00a0 Today, most games batch their draw calls together for efficiency, for example rendering all metallic objects first, and all plastic objects second.\u00a0 But because it\u2019s impossible to predict exactly what material a particular ray will hit, batching like this isn\u2019t possible with raytracing.\u00a0 Instead, the raytracing pipeline state allows specification of multiple sets of raytracing shaders and texture resources.\u00a0 Ultimately, this allows an application to specify, for example, that any ray intersections with object A should use shader P and texture X, while intersections with object B should use shader Q and texture Y.\u00a0 This allows applications to have ray intersections run the correct shader code with the correct textures for the materials they hit.<\/p>\n<p>The third and final step in using DXR is to call DispatchRays, which invokes the ray generation shader.\u00a0 Within this shader, the application makes calls to the TraceRay intrinsic, which triggers traversal of the acceleration structure, and eventual execution of the appropriate hit or miss shader.\u00a0 In addition, TraceRay can also be called from within hit and miss shaders, allowing for ray recursion or \u201cmulti-bounce\u201d effects.<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-4.png\">\n<img decoding=\"async\" 
width=\"434\" height=\"298\" class=\"size-full wp-image-1105 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-4.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-4.png 434w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig-4-300x206.png 300w\" sizes=\"(max-width: 434px) 100vw, 434px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p><em>Figure 4: an illustration of ray recursion in a scene<\/em><\/p>\n<p>Note that because the raytracing pipeline omits many of the fixed-function units of the graphics pipeline such as the input assembler and output merger, it is up to the application to specify how geometry is interpreted.\u00a0 Shaders are given the minimum set of attributes required to do this, namely the intersection point\u2019s <a href=\"https:\/\/en.wikipedia.org\/wiki\/Barycentric_coordinate_system\">barycentric coordinates<\/a> within the primitive.\u00a0 Ultimately, this flexibility is a significant benefit of DXR; the design allows for a huge variety of techniques without the overhead of mandating particular formats or constructs.<\/p>\n<h4><strong>PIX for Windows Support Available on Day 1<\/strong><\/h4>\n<p>As new graphics features put an increasing array of options at the disposal of game developers, the need for great tools becomes increasingly important.\u00a0 The great news is that PIX for Windows will support the DirectX Raytracing API from day 1 of the API\u2019s release.\u00a0 PIX on Windows supports capturing and analyzing frames built using DXR to help developers understand how DXR interacts with the hardware. Developers can inspect API calls, view pipeline resources that contribute to the raytracing work, see contents of state objects, and visualize acceleration structures. 
This provides the information developers need to build great experiences using DXR.<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading.png\"><\/a><\/p>\n<p><img decoding=\"async\" width=\"1024\" height=\"771\" class=\"size-large wp-image-1235 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading-1024x771.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading-1024x771.png 1024w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading-300x226.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading-768x578.png 768w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Multithreading.png 1218w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/p>\n<h4><strong>What Does This Mean for Games?<\/strong><\/h4>\n<p>DXR will initially be used to supplement current rendering techniques such as screen space reflections, for example, to fill in data from geometry that\u2019s either occluded or off-screen.\u00a0 This will lead to a material increase in visual quality for these effects in the near future.\u00a0 Over the next several years, however, we expect an increase in utilization of DXR for techniques that are simply impractical for rasterization, such as true global illumination.\u00a0 Eventually, raytracing may completely replace rasterization as the standard algorithm for rendering 3D scenes.\u00a0 That said, until everyone has a <a href=\"https:\/\/en.wikipedia.org\/wiki\/Light_field\">light-field<\/a> display on their desk, rasterization will continue to be an excellent match for the common case of rendering content to a flat grid of square pixels, supplemented by raytracing for true 3D effects.<\/p>\n<p>Thanks to our friends at 
<a href=\"https:\/\/www.ea.com\/seed\">SEED, Electronic Arts<\/a>, we can show you a <a href=\"https:\/\/www.youtube.com\/watch?v=LXo0WdlELJk\">glimpse of what future gaming scenes could look like<\/a>.<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/SEED-screenshot.png\"><img decoding=\"async\" width=\"1024\" height=\"576\" class=\"size-large wp-image-1045 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/SEED-screenshot-1024x576.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/SEED-screenshot-1024x576.png 1024w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/SEED-screenshot-300x169.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/SEED-screenshot-768x432.png 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p><em>Project PICA PICA from SEED, Electronic Arts<\/em><\/p>\n<p>And our friends at <a href=\"https:\/\/www.unrealengine.com\/en-US\/blog\/epic-games-demonstrates-real-time-ray-tracing-in-unreal-engine-4-with-ilmxlab-and-nvidia\">Epic<\/a>, in collaboration with ILMxLAB and NVIDIA, have also put together a <a href=\"https:\/\/youtu.be\/J3ue35ago3Y\">stunning technology demo<\/a> with some characters you may recognize.<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1ab.png\"><img decoding=\"async\" width=\"1024\" height=\"560\" class=\"size-large wp-image-1365 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1ab-1024x560.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1ab-1024x560.png 1024w, 
https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1ab-300x164.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/raytracing-fig1ab-768x420.png 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p>Of course, what new PC technology would be complete without support from a Futuremark benchmark?\u00a0 Fortunately, <a href=\"https:\/\/www.futuremark.com\/pressreleases\/watch-our-new-directx-raytracing-tech-demo\">Futuremark<\/a> has us covered <a href=\"https:\/\/youtu.be\/81E9yVU-KB8\">with their own incredible visuals<\/a>.<\/p>\n<p>&nbsp;<\/p>\n<p>In addition, while today marks the first public announcement of DirectX Raytracing, we have been working closely with hardware vendors and industry developers for nearly a year to design and tune the API.\u00a0 In fact, a significant number of studios and engines are already planning to integrate DXR support into their games and engines, including:<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/frostbite.png\"><img decoding=\"async\" width=\"652\" height=\"367\" class=\"size-full wp-image-1245 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/frostbite.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/frostbite.png 652w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/frostbite-300x169.png 300w\" sizes=\"(max-width: 652px) 100vw, 652px\" \/><\/a><\/p>\n<p style=\"text-align: center\"><em>Electronic Arts, Frostbite<\/em><\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/seed-working.jpg\"><img decoding=\"async\" width=\"381\" height=\"381\" class=\" wp-image-1285 aligncenter\" alt=\"\" 
src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/seed-working.jpg\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/seed-working.jpg 380w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/seed-working-150x150.jpg 150w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/seed-working-300x300.jpg 300w\" sizes=\"(max-width: 381px) 100vw, 381px\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: center\"><em>Electronic Arts,\u00a0 SEED<\/em><\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black.png\"><img decoding=\"async\" width=\"341\" height=\"388\" class=\"wp-image-1265 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black-900x1024.png\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black-900x1024.png 900w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black-264x300.png 264w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black-768x874.png 768w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/Unreal_Engine_Black.png 1125w\" sizes=\"(max-width: 341px) 100vw, 341px\" \/><\/a><\/p>\n<p style=\"text-align: center\"><em>Epic Games, Unreal Engine<\/em><\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/3dmarkworking.png\"><img decoding=\"async\" width=\"696\" height=\"145\" class=\"wp-image-1275 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/3dmarkworking.png\" 
srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/3dmarkworking.png 850w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/3dmarkworking-300x62.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/3dmarkworking-768x160.png 768w\" sizes=\"(max-width: 696px) 100vw, 696px\" \/><\/a><\/p>\n<p style=\"text-align: center\"><em>Futuremark, 3DMark<\/em><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/unitylogo.jpg\"><img decoding=\"async\" width=\"448\" height=\"182\" class=\" wp-image-1155 aligncenter\" alt=\"\" src=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/unitylogo.jpg\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/unitylogo.jpg 537w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/unitylogo-300x122.jpg 300w\" sizes=\"(max-width: 448px) 100vw, 448px\" \/><\/a><\/p>\n<p style=\"text-align: center\"><em>Unity Technologies, Unity Engine<\/em><\/p>\n<p>And more will be coming soon.<\/p>\n<p>&nbsp;<\/p>\n<h4><strong>What Hardware Will DXR Run On?<\/strong><\/h4>\n<p>Developers can use currently in-market hardware to get started on DirectX Raytracing.\u00a0 There is also a fallback layer, which does not require any specific hardware support, that will allow developers to start experimenting with DirectX Raytracing.\u00a0 For details on hardware roadmap support for DirectX Raytracing, please contact hardware vendors directly.<\/p>\n<h4><strong>Available now for experimentation!<\/strong><\/h4>\n<p>Want to be one of the first to bring real-time raytracing to your game?\u00a0 Start by attending our <a href=\"http:\/\/schedule.gdconf.com\/session\/directx-evolving-microsofts-graphics-platform-presented-by-microsoft\/856594\">Game Developer 
Conference Session on DirectX Raytracing\u00a0<\/a>for all the technical details you need to begin, then download the <a href=\"http:\/\/aka.ms\/dxrsdk\">Experimental DXR SDK<\/a> and start coding!\u00a0 Not attending GDC?\u00a0 No problem! \u00a0Click <a href=\"http:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2018\/03\/GDC_DXR_deck.pdf\">here <\/a>to see our GDC slides.<\/p>\n<p>&nbsp;<\/p>\n<h4><strong>Appendix \u2013 Primers on rasterization, raytracing and DirectX Raytracing<\/strong><\/h4>\n<p><strong>\u00a0<\/strong><\/p>\n<p><strong>Intro to Rasterization<\/strong><\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>Of all the rendering algorithms out there, by far the most widely used is <strong>rasterization<\/strong>. Rasterization has been around since the 90s and has since become the dominant rendering technique in video games. This is with good reason: it\u2019s incredibly efficient and can produce high levels of visual realism.<\/p>\n<p>&nbsp;<\/p>\n<p>Rasterization is an algorithm that in a sense doesn\u2019t do all its work in 3D. This is because rasterization has a step where 3D objects get projected onto your 2D monitor, before they are colored in. This work can be done efficiently by GPUs because it\u2019s work that can be done in parallel: the work needed to color in one pixel on the 2D screen can be done independently of the work needed to color the pixel next to it.<\/p>\n<p>&nbsp;<\/p>\n<p>There\u2019s a problem with this: in the real world the color of one object will have an impact on the objects around it, because of the complicated interplay of light.\u00a0 This means that developers must resort to a wide variety of clever techniques to simulate the visual effects that are normally caused by light scattering, reflecting and refracting off objects in the real world. 
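As a toy illustration of that projection step (plain Python, not Direct3D code; the helper name and focal length here are made up for this sketch), each 3D point can be mapped onto the 2D screen independently of every other point:

```python
# Toy perspective projection: map a 3D point onto the 2D image plane
# by dividing by depth.  Real rasterizers use 4x4 transformation
# matrices and homogeneous coordinates, but the idea is the same.
def project(point, focal_length=1.0):
    x, y, z = point
    # Farther objects (larger z) land closer to the center of the screen.
    return (focal_length * x / z, focal_length * y / z)

# Each point (and, later, each pixel of each triangle) can be processed
# independently, which is why GPUs rasterize so efficiently in parallel.
print(project((2.0, 1.0, 4.0)))  # (0.5, 0.25)
```

Note that once the division by depth has happened, the third dimension is gone; this is the information loss the paragraph above alludes to.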
The shadows, reflections and indirect lighting in games are made with these techniques.<\/p>\n<p>&nbsp;<\/p>\n<p>Games rendered with rasterization can look and feel incredibly lifelike, because developers have gotten extremely good at making it look as if their worlds have light that acts in a convincing way. Having said that, it takes a great deal of technical expertise to do this well and there\u2019s also an upper limit to how realistic a rasterized game can get, since information about 3D objects gets lost every time they get projected onto your 2D screen.<\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p><strong>Intro to Raytracing<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>Raytracing calculates the color of each pixel by tracing the path of the light that would have created it, simulating this ray of light\u2019s interactions with objects in the virtual world. Raytracing therefore calculates what a pixel would look like if a virtual world had real light. The beauty of raytracing is that it preserves the 3D world and visual effects like shadows, reflections and indirect lighting are a natural consequence of the raytracing algorithm, not special effects.<\/p>\n<p>&nbsp;<\/p>\n<p>Raytracing can be used to calculate the color of every single pixel on your screen, or it can be used for only some pixels, such as those on reflective surfaces.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>How does it work?<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>A ray gets sent out for each pixel in question. The algorithm works out which object gets hit first by the ray and the exact point at which the ray hits the object. 
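As a toy sketch of that first-hit query (plain Python, not DXR code; the scene is just a list of spheres and every name here is made up for illustration):

```python
import math

# Toy first-hit query: given a ray, find the nearest sphere it hits
# and the exact point of intersection.
def first_hit(origin, direction, spheres):
    best_t, best_sphere = math.inf, None
    ox, oy, oz = origin
    dx, dy, dz = direction  # assumed to be unit length
    for center, radius in spheres:
        cx, cy, cz = center
        # Solve |origin + t*direction - center|^2 = radius^2 for t.
        lx, ly, lz = ox - cx, oy - cy, oz - cz
        b = 2.0 * (dx * lx + dy * ly + dz * lz)
        c = lx * lx + ly * ly + lz * lz - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue  # this ray misses this sphere entirely
        t = (-b - math.sqrt(disc)) / 2.0
        if 0.0 < t < best_t:  # keep only the nearest hit in front of us
            best_t, best_sphere = t, (center, radius)
    if best_sphere is None:
        return None
    hit_point = (ox + best_t * dx, oy + best_t * dy, oz + best_t * dz)
    return best_sphere, hit_point
```

Real implementations avoid this linear loop over every object in the scene; organizing the scene so that most objects never need to be tested is exactly the job of the acceleration structures described earlier.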
This point is called the first point of intersection and the algorithm does two things here: 1) it estimates the incoming light at the point of intersection and 2) combines this information about the incoming light with information about the object that was hit.<\/p>\n<p>&nbsp;<\/p>\n<ol>\n<li>To estimate what the incoming light looked like at the first point of intersection, the algorithm needs to consider where this light was reflected or refracted from.<\/li>\n<li>Specific information about each object is important because objects don\u2019t all have the same properties: they absorb, reflect and refract light in different ways:\n<ul>\n<li>different ways of absorption are what cause objects to have different colors (for example, a leaf is green because it absorbs all but green light)<\/li>\n<li>different rates of reflection are what cause some objects to give off mirror-like reflections and other objects to scatter rays in all directions<\/li>\n<li>different rates of refraction are what cause some objects (like water) to distort light more than other objects.<\/li>\n<\/ul>\n<\/li>\n<\/ol>\n<p>Often, to estimate the incoming light at the first point of intersection, the algorithm must trace that light to a second point of intersection (because the light hitting an object might have been reflected off another object), or even further back.<\/p>\n<p>&nbsp;<\/p>\n<p>Savvy readers with some programming knowledge might notice some edge cases here.<\/p>\n<p>&nbsp;<\/p>\n<p>Sometimes light rays that get sent out never hit anything. 
Don\u2019t worry, this is an edge case we can cover easily by measuring how far a ray has travelled, so that we can handle rays that have travelled too far separately.<\/p>\n<p>&nbsp;<\/p>\n<p>The second edge case covers the opposite situation: light might bounce around so many times that it slows down the algorithm, or even an infinite number of times, causing an infinite loop. To handle this, the algorithm keeps track of how many times a ray has been traced and terminates the ray after a certain number of reflections. We can justify doing this because every object in the real world absorbs some light, even mirrors. This means that a light ray loses energy (becomes fainter) every time it\u2019s reflected, until it becomes too faint to notice. So even if we could, tracing a ray an arbitrary number of times doesn\u2019t make sense.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>What is the state of raytracing today?<\/strong><\/p>\n<p><strong>\u00a0<\/strong><\/p>\n<p>Raytracing is a technique that\u2019s been around for decades. It\u2019s used quite often to do CGI in films and several games already use forms of raytracing. 
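Both edge cases above can be seen in a toy recursive tracer. This is a plain-Python sketch, not DXR code; the scene_hit callback, the colors and the constants are all invented for illustration:

```python
MAX_DEPTH = 4     # cut recursion off after a fixed number of bounces
MAX_DIST = 100.0  # rays that travel this far are treated as misses
SKY = (0.1, 0.1, 0.1)

# scene_hit is a hypothetical callback returning None on a miss, or a
# (distance, surface_color, bounced_ray) tuple at the intersection point.
def trace(ray, scene_hit, depth=0):
    if depth >= MAX_DEPTH:
        # Each bounce absorbs energy, so deep rays are too faint to matter.
        return (0.0, 0.0, 0.0)
    hit = scene_hit(ray)
    if hit is None or hit[0] > MAX_DIST:
        return SKY  # edge case 1: the ray never hit anything
    dist, color, bounced_ray = hit
    # Edge case 2: recursion is bounded because depth grows on every bounce.
    bounced = trace(bounced_ray, scene_hit, depth + 1)
    # Blend the surface color with the reflected contribution, half each.
    return tuple(0.5 * (c + b) for c, b in zip(color, bounced))
```

Even in a hall-of-mirrors scene where every ray hits something, the depth cap guarantees termination, and each extra bounce contributes half as much, mirroring the physical energy loss described above.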
For example, developers might use offline raytracing to do things like pre-calculating the brightness of virtual objects before shipping their games.<\/p>\n<p>&nbsp;<\/p>\n<p>No games currently use real-time raytracing, but we think that this will change soon. Over the past few years, computer hardware has become more and more flexible: even with the same TFLOPs, a GPU can do more.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>How does this fit into DirectX?<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p>We believe that DirectX Raytracing will bring raytracing within reach of real-time use cases, since it comes with dedicated hardware acceleration and can be integrated seamlessly with existing DirectX 12 content.<\/p>\n<p>&nbsp;<\/p>\n<p>This means that it\u2019s now possible for developers to build games that use rasterization for some of their rendering and raytracing for the rest. For example, developers can build a game where much of the content is generated with rasterization, but DirectX Raytracing calculates the shadows or reflections, helping out in areas where rasterization is lacking.<\/p>\n<p>&nbsp;<\/p>\n<p>This is the power of DirectX Raytracing: it lets developers have their cake and eat it.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>If you just want to see what DirectX Raytracing can do for gaming, check out the videos from Epic, Futuremark and EA, SEED.\u00a0 To learn about the magic behind the curtain, keep reading. 
3D Graphics is a Lie For the last thirty years, almost all games have used the same general technique\u2014rasterization\u2014to render images on [&hellip;]<\/p>\n","protected":false},"author":1916,"featured_media":1335,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[17,21,25,31,44,45],"class_list":["post-975","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-directx","tag-directx","tag-directx12","tag-dxr","tag-graphics","tag-ray-tracing","tag-raytracing"],"acf":[],"blog_post_summary":"<p>If you just want to see what DirectX Raytracing can do for gaming, check out the videos from Epic, Futuremark and EA, SEED.\u00a0 To learn about the magic behind the curtain, keep reading. 3D Graphics is a Lie For the last thirty years, almost all games have used the same general technique\u2014rasterization\u2014to render images on [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts\/975","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/users\/1916"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/comments?post=975"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts\/975\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/media\/1335"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/media?parent=975"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.
com\/directx\/wp-json\/wp\/v2\/categories?post=975"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/tags?post=975"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}