This is why displaying SDR in HDR causes quantization artifacts in grayscale gradients

Video link: https://www.youtube.com/watch?v=Omj98bdyjqU

Download the images from the video as PNG files:
https://drive.google.com/drive/folders/1u64DI-VMlZvq830MGpgbBe_hNQiuUjFZ?usp=sharing

This video explores the visual artifacts that appear in grayscale gradients when 8-bit SDR content is displayed in a 10-bit HDR environment. The focus is on 10-bit HDR because 12-bit is nearly artifact-free. These are not the typical banding issues; they are quantization artifacts, a peculiarity of how SDR content is translated into HDR.

We compare ideal SDR (sRGB) and HDR (PQ) monitors to understand how numerical code values convert to brightness units, or nits. In doing so, we identify the rounding errors (quantization) that occur when SDR values are re-encoded for an HDR signal. The discrepancies are minor, but they can degrade color-critical work involving grayscale gradients.
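
To make the conversion concrete, here is a minimal Python sketch (not from the video) that maps every 8-bit SDR gray level to nits with the sRGB transfer function and then re-quantizes those nits to PQ code values. The SDR white level of 203 nits and full-range code values are assumptions for illustration; the exact mapping depends on the OS and display settings.

```python
# Sketch: 8-bit SDR gray levels -> nits (sRGB EOTF) -> nearest PQ code value.
# Assumptions (illustrative, not from the video): full-range codes,
# SDR reference white mapped to 203 nits.

SDR_WHITE_NITS = 203.0  # assumed SDR-in-HDR white level; 80 or 100 nits are also common


def srgb_eotf(v: float) -> float:
    """sRGB electro-optical transfer function, normalized 0..1 in and out."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4


def pq_inverse_eotf(nits: float) -> float:
    """ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> normalized signal 0..1."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y_m1 = (max(nits, 0.0) / 10000.0) ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2


def sdr_gray_to_pq_codes(bits: int, white_nits: float = SDR_WHITE_NITS) -> list[int]:
    """For each of the 256 SDR gray levels, return the nearest PQ code at the given bit depth."""
    max_code = (1 << bits) - 1
    return [
        round(pq_inverse_eotf(srgb_eotf(level / 255.0) * white_nits) * max_code)
        for level in range(256)
    ]


if __name__ == "__main__":
    codes = sdr_gray_to_pq_codes(10)
    # Adjacent SDR levels that collapse onto the same 10-bit PQ code are the
    # rounding errors that show up as uneven steps in a grayscale gradient.
    collisions = sum(1 for a, b in zip(codes, codes[1:]) if a == b)
    print(f"10-bit PQ: {collisions} of 255 gray steps lose their distinct code")
```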

The issue can be mitigated somewhat with higher brightness settings or specialized software options. Interestingly, a 12-bit panel isn't required to fix these artifacts; a 10-bit panel accepting a 12-bit input signal suffices, because the rounding happens when the signal is quantized, not on the panel itself. In short, 12-bit HDR input is preferable if you want to do SDR color work without toggling settings.
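
Continuing the sketch above (and reusing its sdr_gray_to_pq_codes function and assumptions), re-running the same quantization at 12-bit signal depth illustrates why a 12-bit input is enough even on a 10-bit panel:

```python
# Compare how many of the 255 steps in an 8-bit grayscale ramp collapse
# when re-quantized to a 10-bit versus a 12-bit PQ signal.
for bits in (10, 12):
    codes = sdr_gray_to_pq_codes(bits)
    lost = sum(1 for a, b in zip(codes, codes[1:]) if a == b)
    print(f"{bits}-bit PQ signal: {lost} of 255 gray steps share a code with their neighbor")

# Different SDR white levels can be tried via the white_nits argument,
# e.g. sdr_gray_to_pq_codes(10, white_nits=300.0), to see how brightness
# settings shift the collision count.
```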

So, if you run your display in HDR all the time (as HDR is, in theory, meant to be used) and care about precise SDR color reproduction, use a 12-bit input signal if your display accepts it.