Yeah, it’s what drives me crazy.
You can have an SDR screen with a wide gamut, high contrast ratio, and high bit depth, and it just works, because displaying SDR media is straightforward and universal; the only quirks are color-space mapping and applications having to be color-managed.
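Just to be concrete about what I mean by "color space mapping": here's a rough sketch of what a color-managed app does to put an sRGB pixel onto a wide-gamut (Display P3) screen. The primaries matrices are the standard published values; the function names and the rest of the structure are just my illustration.

```python
import numpy as np

# Linear-light RGB -> CIE XYZ (D65), sRGB primaries (standard values).
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

# Linear-light RGB -> CIE XYZ (D65), Display P3 primaries (standard values).
P3_TO_XYZ = np.array([
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
])

def srgb_decode(v):
    """sRGB transfer function, decode direction (code value -> linear light)."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):
    """sRGB transfer function, encode direction; Display P3 reuses the same curve."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def srgb_pixel_to_p3(rgb):
    """Map one sRGB pixel to Display P3: decode, convert primaries via XYZ, re-encode."""
    linear = srgb_decode(rgb)
    xyz = SRGB_TO_XYZ @ linear
    p3_linear = np.linalg.inv(P3_TO_XYZ) @ xyz
    return srgb_encode(np.clip(p3_linear, 0.0, 1.0))

# Fully saturated sRGB red sits inside the P3 gamut, so the mapped
# code values come out less saturated than (1, 0, 0) on the P3 screen.
print(srgb_pixel_to_p3([1.0, 0.0, 0.0]))
```

That's the whole "quirk": decode, one matrix in, one matrix out, re-encode.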
Now we have HDR, a specification that needs to be implemented for every piece of the visual pipeline individually, and that apparently also makes SDR content look bad.
People who work in video and cinema production encode HDR videos with absolutely no idea of what is actually happening in the encoding or how it differs from SDR, and the whole thing is marketed on the back of results that SDR can achieve with less hassle.
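For what it's worth, the core of "what's different in the encoding" is the transfer function: SDR-style encoding (sRGB / BT.709-ish gamma) describes brightness relative to whatever the display's white happens to be, while HDR10's PQ curve (SMPTE ST 2084) encodes absolute luminance up to 10,000 nits. A minimal sketch of the two encode curves; the PQ constants are the published ST 2084 ones, everything else is just my illustration:

```python
def srgb_encode(linear):
    """SDR-style encode: 'linear' is relative to the display's
    reference white (1.0 = as bright as that screen goes)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def pq_encode(nits):
    """HDR10 encode (SMPTE ST 2084 inverse EOTF): 'nits' is an
    absolute luminance, normalized against a fixed 10,000 cd/m2."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

# In SDR, "full brightness" is simply code value 1.0, whatever the
# panel's actual peak luminance is.
print(srgb_encode(1.0))           # 1.0

# In PQ, a 100-nit white (a typical SDR reference level) only uses
# roughly half the code range; the rest is reserved for highlights.
print(round(pq_encode(100), 3))   # ~0.508
print(round(pq_encode(1000), 3))  # ~0.752
```

So an HDR signal only makes sense if the whole chain, from encoder to compositor to panel, agrees on those absolute levels, which is exactly where everything seems to fall apart.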
Then in the marketing you get the side-by-side comparison of a low-contrast and a high-contrast image and you go "wow", but both halves are part of one SDR image being displayed on an SDR screen, all in the name of HDR.
Then you have the average TV that claims to support HDR but just adjusts the backlight globally according to how bright or dark the content gets.
But there's no way this is all just useless marketing; there has to be some advantage HDR gives, I just can't see it.
I'd say contrast ratio is also a really good metric, though I believe a wider gamut usually comes with a higher contrast ratio anyway. Response time matters too, but nowadays any IPS panel, once known for bad ghosting, has a good enough response time.
But why am I going so crazy trying to find a purpose for HDR? Because Wayland is focusing so much on it. I want Wayland to implement manual RGB balance configuration for the screen, for those of us who don't have a colorimeter to generate a proper ICC profile, but instead HDR is getting the higher priority, when I don't believe even Windows has gotten HDR straight.
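And to be clear about how little I'm asking for: nothing fancier than per-channel gain/gamma ramps, the kind of table a compositor could already hand to the display's gamma LUT. A toy sketch; the function name, parameters, and 256-entry table size are just my assumptions, not any real Wayland API:

```python
def rgb_balance_luts(gains=(1.0, 1.0, 1.0), gammas=(1.0, 1.0, 1.0), size=256):
    """Build one lookup table per channel, the way a gamma-ramp /
    VCGT-style table is filled in: output = (input ** gamma) * gain,
    clamped to the displayable range, stored as 16-bit entries."""
    luts = []
    for gain, gamma in zip(gains, gammas):
        lut = []
        for i in range(size):
            x = i / (size - 1)                    # normalized input level
            y = min(max((x ** gamma) * gain, 0.0), 1.0)
            lut.append(round(y * 65535))          # 16-bit ramp entry
        luts.append(lut)
    return luts

# Example: a screen that looks too blue -> pull blue down a bit and
# warm its gamma slightly, no colorimeter involved.
red, green, blue = rgb_balance_luts(gains=(1.0, 0.97, 0.90),
                                    gammas=(1.0, 1.0, 1.05))
print(red[128], green[128], blue[128])
```

That's a weekend's worth of plumbing compared to getting HDR right across the whole stack, yet it's the thing that keeps getting deprioritized.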