MRTK’s configuration dialog will attempt to set depth buffer settings for both XR SDK and legacy WSA, but it’s good to check those tabs and verify the settings in Unity.
To set whether your Unity app will provide a depth buffer to Windows, enable or disable depth buffer sharing in your XR settings: under XR Plug-in Management for XR SDK, or via the “Enable Depth Buffer Sharing” checkbox in the legacy XR Settings section of Player settings for WSA.
[!NOTE] It is generally recommended to use 16-bit depth buffers for improved performance. However, with a 16-bit depth format, effects that require a stencil buffer (like some Unity UI scroll panels) will not work, because Unity does not create a stencil buffer in this configuration. Conversely, selecting a 24-bit depth format will generally create an 8-bit stencil buffer, if applicable on the endpoint graphics platform.
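If you configure the project from scripts (for example, in a build pipeline), the same settings can be applied from an editor script. This is a minimal sketch assuming Unity's legacy XR editor API (`PlayerSettings.VRWindowsMixedReality`); for XR SDK, set the equivalent options in your XR plug-in's settings instead.

```csharp
#if UNITY_EDITOR
using UnityEditor;

// A minimal sketch, assuming Unity's legacy XR (WSA) editor API.
// Verify the result against the XR SDK and legacy WSA tabs as noted above.
public static class DepthBufferSettings
{
    [MenuItem("Tools/Apply Recommended Depth Buffer Settings")]
    public static void Apply()
    {
        // Let Windows receive the app's depth buffer for reprojection.
        PlayerSettings.VRWindowsMixedReality.depthBufferSharingEnabled = true;

        // 16-bit depth is generally recommended for performance; note that
        // Unity creates no stencil buffer with this format (see the note above).
        PlayerSettings.VRWindowsMixedReality.depthBufferFormat =
            PlayerSettings.VRWindowsMixedReality.DepthBufferFormat.DepthBufferFormat16Bit;
    }
}
#endif
```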
A depth buffer can improve visual quality so long as Windows can accurately map the normalized per-pixel depth values in your depth buffer back to distances in meters, using the near and far planes you’ve set in Unity on the main camera. If your render passes handle depth values in typical ways, you should generally be fine here, though translucent render passes that write to the depth buffer while showing through to existing color pixels can confuse the reprojection. If you know that your render passes will leave many of your final depth pixels with inaccurate depth values, you are likely to get better visual quality by unchecking “Enable Depth Buffer Sharing”.
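Because the mapping from normalized depth values back to meters depends on the main camera's near and far planes, keeping that range tight around your actual content preserves depth precision. A minimal sketch; the specific clip values below are illustrative assumptions, not values prescribed by this doc:

```csharp
using UnityEngine;

// A minimal sketch: keep the main camera's near/far planes tight around
// your content so Windows can accurately map normalized depth values
// back to distances in meters.
public class DepthRangeSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // A tighter depth range preserves depth-buffer precision, which
        // helps the accuracy of depth-based reprojection.
        cam.nearClipPlane = 0.1f; // illustrative: ~10 cm in front of the user
        cam.farClipPlane = 20f;   // illustrative: content within ~20 m
    }
}
```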