What's up with HDR?

I never owned an HDR screen, but I have owned multiple SDR screens and I have a good understanding of color and a very good understanding of visual media encoding and compression (ffmpeg is my obsession too). I just can’t get the purpose of HDR.

You can have an SDR screen with a high bit depth, a wide color gamut and a high contrast ratio, like a 10-bit OLED screen. All SDR media (desktops, games, video, images, etc.) are supported and they look amazing. Color-managed applications map the color space according to the screen and fix potential issues too.

So why does HDR exist? HDR’s marketing focus is a better contrast ratio, color gamut and bit depth than SDR, but SDR is equally capable of those, and SDR is universally supported and straightforward, while HDR is still a niche thing that needs support to be implemented manually for everything visual.

What am I missing about HDR?

1 Like

I guess “equally” would be the word to call into question here.

HDR and SDR are both methods of displaying video content, but HDR offers a wider range of colors and brightness levels, resulting in a more vivid and lifelike picture.

Compared to SDR, HDR video generally shows more detail in both bright and dark areas, and provides a more dynamic range of color. This can result in a more immersive viewing experience, particularly when watching high-quality video content such as movies.

I am looking at this comparison image, which is itself SDR and being displayed on an SDR screen; these monitor comparisons are always visually exaggerated, like the 60 Hz vs 144 Hz ones.

SDR is still capable of a very high color gamut and contrast ratio, so this seems to me more like a digital mapping protocol, because the monitor’s capabilities are there regardless of the display protocol.

First of all, HDR displays do not necessarily have more contrast, a wider color gamut or a higher bit depth than SDR displays.

The difference is that HDR displays are required (in theory) to have those things. This is especially true for TVs, since most TVs didn’t have them a few years ago.

Further, those displays need to be able to understand the HDR video formats, which add metadata providing information about brightness levels and color gamut, in some cases on a per-scene or per-frame basis.
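
For anyone curious what that metadata looks like in practice, here’s a rough Python sketch (just an illustration, assuming ffprobe is on PATH and using `input.mkv` as a placeholder file name) that dumps a stream’s color signalling plus the first frame’s side data, which is where HDR10 mastering-display and content-light-level info typically shows up:

```python
# Sketch: inspect what makes a stream "HDR" using ffprobe's JSON output.
# Assumes ffprobe is available on PATH; "input.mkv" is a placeholder name.
import json
import subprocess

def probe_color_info(path: str) -> dict:
    """Return the color signalling of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries",
         "stream=pix_fmt,color_space,color_primaries,color_transfer",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

def probe_hdr_side_data(path: str) -> list:
    """Return side data of the first frame (mastering display metadata,
    content light level metadata, ...) if the stream carries any."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-read_intervals", "%+#1", "-show_frames",
         "-show_entries", "frame=side_data_list",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    frames = json.loads(out).get("frames", [])
    return frames[0].get("side_data_list", []) if frames else []

if __name__ == "__main__":
    print(probe_color_info("input.mkv"))
    print(probe_hdr_side_data("input.mkv"))
```

An HDR10 stream will typically report bt2020 primaries and the smpte2084 (PQ) transfer function there, while plain SDR content reports bt709 — that signalling is what the format adds on top of whatever the panel can physically do.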

4 Likes

So in the end, it’s just a protocol specification that goes beyond color space mapping, for more granular control of the image.
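
For illustration, here’s a minimal sketch of the part that goes beyond gamut mapping: the PQ (SMPTE ST 2084) transfer function used by HDR10/BT.2100. Unlike SDR gamma, which is relative to whatever peak the display happens to have, PQ maps a code value to an absolute luminance of up to 10,000 cd/m² (the constants below are the published ST 2084 ones; the script itself is just plain Python):

```python
# PQ constants from SMPTE ST 2084 / ITU-R BT.2100
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a non-linear PQ signal in [0, 1] to luminance in cd/m^2."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

if __name__ == "__main__":
    # Roughly: 0.5 -> ~92 cd/m^2 (about SDR reference white),
    # 0.75 -> ~1000 cd/m^2, 1.0 -> 10000 cd/m^2.
    # Everything above mid-signal is highlight headroom that the metadata
    # mentioned above tells the display how to handle.
    for signal in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"PQ signal {signal:4.2f} -> {pq_eotf(signal):8.1f} cd/m^2")
```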

Nah it’s even better…
Its PR and marketing departments go BRRRRRRRRRRRRRrrrrrr!!!1111111

:rofl:

See, in today’s day and age you can’t make idiots buy something unless you have a term and its number.
666 G…Wi-Fi 500…USB 300…you get the idea.

As someone who has a professional wide-gamut SDR display, I can assure you that this article is a bunch of :horse: :poop:, especially those “comparison” pictures.

@spacebanana is very correct on that:

Only things that really matter for quality picture are monitor’s:

  • Resolution
  • DPI
  • Refresh rate
  • Gamut

And GPU / driver ability to back that up :upside_down_face:

2 Likes

Yeah, it’s what drives me crazy.

You can have an SDR screen with a wide gamut, high contrast ratio and high bit depth, and it just works, because displaying media is extremely straightforward and universal; the only quirk is color space mapping and applications having to be color-managed.
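
For reference, the color-managed part boils down to a small matrix dance. A minimal sketch, assuming BT.709 source content and a BT.2020-like wide-gamut panel as a stand-in for the display (a real application would take the display side from an ICC profile or a measurement; the matrices below are the commonly published ones, rounded):

```python
import numpy as np

# Linear RGB -> CIE XYZ (D65), BT.709 / sRGB primaries
BT709_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

# Linear RGB -> CIE XYZ (D65), BT.2020 primaries (stand-in for a wide-gamut panel)
BT2020_TO_XYZ = np.array([
    [0.6370, 0.1446, 0.1689],
    [0.2627, 0.6780, 0.0593],
    [0.0000, 0.0281, 1.0610],
])

def bt709_to_display(rgb_linear: np.ndarray) -> np.ndarray:
    """Map linear BT.709 RGB into the display's native (BT.2020-like) primaries."""
    xyz = BT709_TO_XYZ @ rgb_linear
    return np.linalg.inv(BT2020_TO_XYZ) @ xyz

if __name__ == "__main__":
    # Pure BT.709 red lands well inside the wider gamut, so the panel's own
    # red primary is driven at only ~0.63 instead of 1.0; skipping this step
    # is why un-managed content looks oversaturated on a wide-gamut screen.
    print(bt709_to_display(np.array([1.0, 0.0, 0.0])))
```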

Now we have HDR, a protocol specification that needs to be implemented for every piece of visual media individually and that apparently also makes SDR content look bad.

People who work in video and cinema production encode HDR videos while having absolutely no idea what the hell is happening in the encoding or how it differs from SDR, and HDR is marketed on things that SDR can achieve with less hassle.

In the marketing, you then have the side-by-side comparison of a low-contrast and a high-contrast image and you go “wow”, but both are part of an SDR image being displayed on an SDR screen, in the name of HDR.

Then you have the average TV which claims to support HDR but just adjusts the backlight globally according to how bright or dark the content gets.

But there’s no way this is just useless marketing; there has to be an advantage that HDR gives, but I just can’t see it.

I’d say that contrast ratio also really matters, though I believe a wider gamut usually implies a higher contrast ratio too. Response time is also important, but nowadays any IPS panel, previously known for bad ghosting, will have a good enough response time.

But why am I going so crazy trying to find a purpose for HDR? Because Wayland is focusing so much on it. I want Wayland to implement manual RGB balance configuration for the screen, for those who don’t have a colorimeter to generate a proper ICC profile, but instead HDR is being implemented as a higher priority, when I believe not even Windows has gotten HDR right.

2 Likes

SDR content:

Think twice!


Tell you what, as long as it brings better support for 10 bpc and higher displays (especially on the freakin’ Nvidia driver), I’m in! They can market all they want, as long as it’s finally properly implemented… :rofl:

Coz now it’s a bit of a mess, known only to professionals.

2 Likes

SDR: “I am vengeance”