{"id":7479,"date":"2023-02-15T11:06:04","date_gmt":"2023-02-15T19:06:04","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/directx\/?p=7479"},"modified":"2023-02-15T11:06:04","modified_gmt":"2023-02-15T19:06:04","slug":"video-acceleration-api-va-api-now-available-on-windows","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/directx\/video-acceleration-api-va-api-now-available-on-windows\/","title":{"rendered":"Video acceleration API (VA-API) now available on Windows!"},"content":{"rendered":"<h2><span style=\"font-size: 36pt;\">Introduction<\/span><\/h2>\n<h3>What&#8217;s VA-API?<\/h3>\n<p>Originally developed by Intel, VA-API (Video Acceleration API) is an open-source library and API specification that provides access to graphics hardware acceleration capabilities for video processing. It consists of a main library and driver-specific acceleration backends for each supported hardware vendor. Apps can use its interface specification to access video hardware acceleration for workloads such as decode, encode and processing. The <a href=\"https:\/\/github.com\/intel\/libva\">libva library<\/a>, an Intel implementation of VA-API, is used together with a <strong>VA-API driver<\/strong>: libva delegates the video operations to the driver, which implements them for the specific hardware platform the app is running on. This API is primarily used in Linux-based environments today, and several well-known media applications use it to access GPU acceleration for diverse video operations.<\/p>\n<h3>What&#8217;s the benefit of VA-API on Windows?<\/h3>\n<p>Having VA-API on Windows enables developers to build video apps that work across platforms. This work continues our pattern of supporting a wide range of other APIs by layering them on top of the Direct3D 12 driver model (e.g. 
<a href=\"https:\/\/devblogs.microsoft.com\/directx\/announcing-the-opencl-and-opengl-compatibility-pack-for-windows-10-on-arm\/\">OpenCL\u2122 and OpenGL\u00ae Compatibility Pack<\/a>), and this layering makes it easier to port existing media applications such as <a href=\"https:\/\/ffmpeg.org\/\" target=\"_blank\" rel=\"noopener\">FFmpeg<\/a>\u00a0or\u00a0<a href=\"https:\/\/gstreamer.freedesktop.org\/\" target=\"_blank\" rel=\"noopener\">GStreamer<\/a>, which can have their existing VA-API backends target <strong>libva-win32<\/strong> and work across platforms and hardware vendors.<\/p>\n<h3>How does it work on Windows?<\/h3>\n<h4>VA-API library and driver implementations<\/h4>\n<p>As described in more detail in <a href=\"https:\/\/devblogs.microsoft.com\/commandline\/d3d12-gpu-video-acceleration-in-the-windows-subsystem-for-linux-now-available\/\">D3D12 GPU Video acceleration in the Windows Subsystem for Linux now available! &#8211; Windows Command Line<\/a>, we recently added support for video hardware acceleration via VA-API in the Windows Subsystem for Linux with a VA-API driver implemented using the D3D12 Video APIs.<\/p>\n<p>With the objective of layering a wide range of software on top of the Direct3D 12 driver model, to give developers and end users the full benefit of hardware acceleration, we worked on supporting this stack on Windows as well.<\/p>\n<ul>\n<li><span style=\"font-size: 12pt;\">Since <a href=\"https:\/\/github.com\/intel\/libva\/releases\/tag\/2.17.0\">libva 2.17<\/a>, a new <strong>libva-win32<\/strong> node was added to enable support on Windows platforms and provide VA-API acceleration from any chosen GPU adapter.<\/span><\/li>\n<li><span style=\"font-size: 12pt;\">Since <a href=\"https:\/\/docs.mesa3d.org\/relnotes\/22.3.0.html\">Mesa 22.3<\/a>, the same VA-API driver used in WSL can be compiled for Windows; we named it <strong>VAOn12<\/strong>. 
Implemented on top of the D3D12 Video APIs, it provides cross-hardware-vendor VA-API acceleration.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-size: 12pt;\">Combined, these two new components expose the same VA entry points and profiles as in WSL to applications, which can now target both Windows and other existing VA-API platforms (e.g. DRM, X11).<\/span><\/p>\n<h4><span style=\"font-size: 18pt;\">VADisplay creation<\/span><\/h4>\n<p>The new <a href=\"https:\/\/github.com\/intel\/libva\/blob\/master\/va\/win32\/va_win32.h#L61\">vaGetDisplayWin32<\/a> function creates a VADisplay from an <a href=\"https:\/\/learn.microsoft.com\/en-us\/windows\/win32\/api\/winnt\/ns-winnt-luid\"><strong>Adapter LUID<\/strong><\/a> pointer, which can be NULL for automatic adapter selection, or can be used together with the DXGI or DXCore APIs to enumerate and select the adapter that will perform the video acceleration.<\/p>\n<h4><span style=\"font-size: 18pt;\">D3D12 Interoperability<\/span><\/h4>\n<p>We added texture interoperability support for both <a href=\"https:\/\/learn.microsoft.com\/en-us\/windows\/win32\/api\/d3d12\/nn-d3d12-id3d12resource\">ID3D12Resource<\/a> and <a href=\"https:\/\/learn.microsoft.com\/en-us\/windows\/win32\/api\/d3d12\/nf-d3d12-id3d12device-opensharedhandle\">HANDLE<\/a> types when using libva in D3D12 projects.<\/p>\n<ul>\n<li><a href=\"https:\/\/github.com\/intel\/libva\/blob\/6e86b4fb4dafa123b1e31821f61da88f10cfbe91\/va\/va.h#L1775\">vaCreateSurfaces<\/a> can import existing D3D12 resources into VASurface objects.<\/li>\n<li><a href=\"https:\/\/github.com\/intel\/libva\/blob\/6e86b4fb4dafa123b1e31821f61da88f10cfbe91\/va\/va.h#L4001\">vaExportSurfaceHandle<\/a> can export an existing VASurface into a D3D12 resource.<\/li>\n<li><a href=\"https:\/\/github.com\/intel\/libva\/blob\/6e86b4fb4dafa123b1e31821f61da88f10cfbe91\/va\/va.h#L3909\">vaAcquireBufferHandle<\/a> and <a 
href=\"https:\/\/github.com\/intel\/libva\/blob\/6e86b4fb4dafa123b1e31821f61da88f10cfbe91\/va\/va.h#L3942\">vaReleaseBufferHandle<\/a> have similar support.<\/li>\n<\/ul>\n<p>Using the functions above in their D3D12 apps, developers can now interoperate between D3D12 and libva workloads in both directions. For example, a libva video decode\/processing workload can feed exported surfaces to a D3D12 pipeline for further rendering to screen using a swapchain. Conversely, a D3D12 rendering pipeline can feed textures to be imported as VASurfaces and used for libva workloads such as processing\/encode for video streaming.<\/p>\n<h2><span style=\"font-size: 36pt;\">How do I get it?<\/span><\/h2>\n<p>Both libva-win32 and the VAOn12 driver can be downloaded from <a href=\"https:\/\/www.nuget.org\/packages\/Microsoft.Direct3D.VideoAccelerationCompatibilityPack\/\">NuGet Gallery | Microsoft.Direct3D.VideoAccelerationCompatibilityPack<\/a> and can be used by configuring the following environment variables:<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">LIBVA_DRIVER_NAME=vaon12\r\nLIBVA_DRIVERS_PATH=&lt;path to folder containing vaon12_drv_video.dll&gt;<\/code><\/pre>\n<h2><span style=\"font-size: 36pt;\">Code samples<\/span><\/h2>\n<p><figure id=\"attachment_7560\" aria-labelledby=\"figcaption_attachment_7560\" class=\"wp-caption alignright\" ><a href=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVADecode.png\"><img decoding=\"async\" class=\"size-medium wp-image-7560\" src=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVADecode-300x277.png\" alt=\"HelloVADecode Sample\" width=\"300\" height=\"277\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVADecode-300x277.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVADecode.png 502w\" sizes=\"(max-width: 
300px) 100vw, 300px\" \/><\/a><figcaption id=\"figcaption_attachment_7560\" class=\"wp-caption-text\">HelloVADecode sample performing video decoding with VA-API and rendering the result to screen with a Direct3D swapchain.<\/figcaption><\/figure><\/p>\n<p>Three new DirectX-Graphics-Samples, <a href=\"https:\/\/github.com\/microsoft\/DirectX-Graphics-Samples\/tree\/master\/Samples\/Desktop\/D3D12HelloWorld\/src\/HelloVAEncode\"><em>D3D12HelloVAEncode<\/em><\/a>, <a href=\"https:\/\/github.com\/microsoft\/DirectX-Graphics-Samples\/tree\/master\/Samples\/Desktop\/D3D12HelloWorld\/src\/HelloVADecode\"><em>D3D12HelloVADecode<\/em><\/a> and <a href=\"https:\/\/github.com\/microsoft\/DirectX-Graphics-Samples\/tree\/master\/Samples\/Desktop\/D3D12HelloWorld\/src\/HelloVAResourceInterop\"><em>D3D12HelloVAResourceInterop<\/em><\/a>, were written with the new NuGet package, showing how D3D12-based applications can now interoperate with libva for video decode, encode and processing workloads.<\/p>\n<p>These samples first check for support on your underlying hardware and then demonstrate several scenarios for video operations between D3D12 and libva on platforms that support the underlying GPU video capabilities, including how to present the contents of the rendered surfaces to screen. The samples also show how to enumerate and select adapters to use with <em>vaGetDisplayWin32<\/em>, and how to import existing D3D12 resources into VASurfaces that are then video decoded or processed and finally encoded into a compressed video bitstream.<\/p>\n<p>Let us know what you think! 
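<\/p>\n<p>As a quick orientation, the snippet below is a minimal sketch (not taken from the samples) of the first step the samples perform: creating and initializing a VADisplay on Windows. It assumes the headers from the Video Acceleration Compatibility Pack are on the include path and that LIBVA_DRIVER_NAME and LIBVA_DRIVERS_PATH are set as described above.<\/p>\n<pre class=\"prettyprint language-default\"><code class=\"language-default\">#include &lt;stdio.h&gt;\r\n#include &lt;va\/va.h&gt;\r\n#include &lt;va\/va_win32.h&gt;\r\n\r\nint main(void)\r\n{\r\n    \/* A NULL LUID lets libva auto-select an adapter; pass a LUID\r\n       enumerated via DXGI or DXCore to pick a specific GPU. *\/\r\n    VADisplay display = vaGetDisplayWin32(NULL);\r\n    if (!display)\r\n        return 1;\r\n\r\n    \/* Loads the driver named by LIBVA_DRIVER_NAME (vaon12). *\/\r\n    int major = 0, minor = 0;\r\n    if (vaInitialize(display, &amp;major, &amp;minor) != VA_STATUS_SUCCESS)\r\n        return 1;\r\n\r\n    printf(\"VA-API initialized, version %d.%d\\n\", major, minor);\r\n    vaTerminate(display);\r\n    return 0;\r\n}<\/code><\/pre>\n<p>From this point on, the usual libva flow (vaCreateConfig, vaCreateContext, vaCreateSurfaces and so on) applies just as on other VA-API platforms.<\/p>\n<p>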
Feel free to get in touch with us on our Discord server at\u00a0<a href=\"https:\/\/discord.gg\/directx\" target=\"_blank\" rel=\"noopener\">discord.gg\/directx<\/a>.<\/p>\n<p><figure id=\"attachment_7563\" aria-labelledby=\"figcaption_attachment_7563\" class=\"wp-caption alignleft\" ><a href=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1.png\"><img decoding=\"async\" class=\"wp-image-7563 size-large\" src=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-1024x489.png\" alt=\"HelloVAEncode Sample\" width=\"640\" height=\"306\" srcset=\"https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-1024x489.png 1024w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-300x143.png 300w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-768x367.png 768w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-1536x734.png 1536w, https:\/\/devblogs.microsoft.com\/directx\/wp-content\/uploads\/sites\/42\/2023\/02\/HelloVAEncode-1-2048x979.png 2048w\" sizes=\"(max-width: 640px) 100vw, 640px\" \/><\/a><figcaption id=\"figcaption_attachment_7563\" class=\"wp-caption-text\">HelloVAEncode Sample performing Video processing with VA-API, and rendering the result with a Direct3D swapchain to screen. 
The rendered content is also encoded with VA-API into an H.264 video file.<\/figcaption><\/figure><\/p>\n<h2><span style=\"font-size: 36pt;\">Platforms supported<\/span><\/h2>\n<h2><span style=\"font-size: 24pt;\">OS\/D3D12 requirements for using the VAOn12 driver<\/span><\/h2>\n<p>For video decoding and processing, at least Windows 10 November 2019 Update or Windows 11 is required.<\/p>\n<p>For video encoding, at least Windows 10 November 2019 Update together with the <a href=\"https:\/\/devblogs.microsoft.com\/directx\/directx12agility\/\">DirectX 12 Agility SDK 1.608.2+<\/a>, or Windows 11, is required.<\/p>\n<h2><span style=\"font-size: 24pt;\">GPU hardware and driver requirements for using the VAOn12 driver<\/span><\/h2>\n<p>The availability of the VA entrypoints, profiles, formats and other specific features exposed by the VAOn12 driver (vaon12_drv_video.dll) is directly tied to the capabilities reported by the underlying hardware and drivers via the D3D12 Video APIs. No additional requirements are introduced by using VAOn12 with respect to using the D3D12 Video APIs directly.<\/p>\n<h2><span style=\"font-size: 24pt;\">Libva requirements<\/span><\/h2>\n<p>There are no additional OS, hardware, or driver requirements for using the libva runtime in your applications (va.dll and va_win32.dll).<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction What&#8217;s VA-API? Originally developed by Intel, VAAPI (Video Acceleration API) is an open-source library and API specification, which provides access to graphics hardware acceleration capabilities for video processing. It consists of a main library and driver-specific acceleration backends for each supported hardware vendor. 
Apps can use it to access video hardware acceleration capabilities via [&hellip;]<\/p>\n","protected":false},"author":70024,"featured_media":12651,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-7479","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-directx"],"acf":[],"blog_post_summary":"<p>Introduction What&#8217;s VA-API? Originally developed by Intel, VAAPI (Video Acceleration API) is an open-source library and API specification, which provides access to graphics hardware acceleration capabilities for video processing. It consists of a main library and driver-specific acceleration backends for each supported hardware vendor. Apps can use it to access video hardware acceleration capabilities via [&hellip;]<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts\/7479","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/users\/70024"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/comments?post=7479"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/posts\/7479\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/media\/12651"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/media?parent=7479"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/categories?post=7479"},{"taxonomy":"post_tag","embeddable":tru
e,"href":"https:\/\/devblogs.microsoft.com\/directx\/wp-json\/wp\/v2\/tags?post=7479"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}