Originally Posted by KidHorn
OK, but DV and HDR10+ map brightness as a percentage of the display's brightness capabilities, while regular HDR uses absolute values. The advantages of DV and HDR10+ only manifest when the brightness of HDR for a given pixel exceeds the brightness capabilities of the TV. If you had a scene where the maximum pixel brightness was, say, 2000 nits and your TV could only manage 1500 nits, regular HDR would display all pixels over 1500 nits at 1500 nits, while DV and HDR10+ would adjust the brightness of every pixel to 75% of what they were supposed to show. HDR10+ and DV would maintain the relative brightness of each pixel.
So the advantage of HDR10+ and DV diminishes as the maximum brightness capabilities go up. And when watching relatively dark content, there may be no difference.
This is absolutely not the way HDR10 works, or should work. Most displays do not hard clip at their max brightness; that would produce a horrible picture, especially on OLEDs, and even more so on projectors. They do some tone mapping to fit the whole desired range within the available range of the display.
For example, I can display 4000-nit titles and resolve detail up to 4000 nits in the content on my JVC projector with a peak brightness of 100-200 nits max, simply by designing a custom gamma curve with gradual soft clipping above 50 nits of display output (which corresponds to around 400 nits in the content) so that everything fits into the available dynamic range.
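To make the hard-clip vs. soft-clip distinction concrete, here is a minimal Python sketch. The 50-nit knee, the roll-off shape and the 120-nit peak are illustrative assumptions only, not my actual JVC gamma curve:

```python
# Minimal sketch: soft-knee tone mapping vs. hard clipping.
# All numbers (50-nit knee, 120-nit peak) are illustrative assumptions.

def tone_map(content_nits, display_peak=120.0, knee=50.0):
    """Map content luminance (nits) to display luminance (nits).

    Below the knee the curve tracks the content 1:1; above it, highlights
    roll off smoothly so even 4000-nit content stays inside the display's
    range while keeping distinct values distinct.
    """
    if content_nits <= knee:
        return content_nits
    headroom = display_peak - knee
    excess = content_nits - knee
    return knee + headroom * (excess / (excess + headroom))

def hard_clip(content_nits, display_peak=120.0):
    """What the quoted post assumes: everything above the peak collapses."""
    return min(content_nits, display_peak)

for nits in (10, 50, 400, 1000, 4000):
    print(f"{nits:>5} nits -> soft {tone_map(nits):6.1f}, clip {hard_clip(nits):6.1f}")
```

The point is not the exact shape (real curves are designed against the PQ encoding of the content), just that the roll-off preserves highlight separation that a hard clip throws away.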
HDR10+ and Dolby Vision simply make sure that the available dynamic range is best used for each scene. For example, if you have a dark scene, you can use the whole of the dynamic range of the display to represent it, which is again comparable to a dynamic iris on projectors.
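A rough sketch of what that per-scene adaptation means in practice (the scene peaks, knee and curve shape are again assumptions for illustration, not the actual HDR10+/Dolby Vision math):

```python
# Why dynamic metadata helps: the tone curve is chosen per scene, so a dark
# scene that already fits the display is left alone, while only genuinely
# bright scenes get their highlights compressed. Numbers are illustrative.

def scene_adaptive_map(content_nits, scene_peak, display_peak=120.0, knee=50.0):
    if scene_peak <= display_peak:
        # Dark scene: the whole scene fits, show it 1:1 with no compression.
        return min(content_nits, display_peak)
    if content_nits <= knee:
        return content_nits
    # Bright scene: roll highlights off above the knee.
    headroom = display_peak - knee
    excess = content_nits - knee
    return knee + headroom * (excess / (excess + headroom))

# With static HDR10 metadata, only the title-wide peak (e.g. 4000 nits) is
# known, so a player tone-mapping from it compresses every scene the same way,
# even scenes that would have fit the display untouched.
print(scene_adaptive_map(100, scene_peak=110))    # dark scene:   100 -> 100.0
print(scene_adaptive_map(100, scene_peak=4000))   # bright scene: 100 -> ~79
```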
This is why displays with the least native contrast are bound to benefit more from HDR10+ or Dolby Vision than displays with the best native contrast. Peak brightness is one element of that, but only when the black floor is also taken into account.
For example, a projector with great native contrast of say 120,000:1, such as my JVC with a peak brightness of 120 nits, will benefit a lot less from HDR10+/Dolby Vision than a crappy LED TV with a peak brightness of 1000 nits but a grey black floor (hence poor native on/off contrast). This is really what these advanced HDR formats are designed for.
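To put rough numbers on that comparison (the LED TV's 3,000:1 native contrast is an assumed figure), the usable dynamic range in stops is just log2 of the native on/off contrast, whatever the peak brightness:

```python
import math

def dynamic_range_stops(peak_nits, native_contrast):
    black_floor = peak_nits / native_contrast      # raised blacks eat range
    return math.log2(peak_nits / black_floor)      # equals log2(native_contrast)

print(dynamic_range_stops(120, 120_000))   # projector: ~16.9 stops, 0.001-nit black
print(dynamic_range_stops(1000, 3_000))    # LED TV:    ~11.6 stops, 0.33-nit black
```

Despite the much higher peak, the TV has far fewer stops to place the HDR container into, so smarter per-scene allocation of that limited range pays off more.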
If you have a good OLED or a good projector with great native on/off contrast, I wouldn't fret too much about the lack of HDR10+ or Dolby Vision. If you have a crappy projector or an LED TV with limited native on/off contrast, then the improvements brought by dynamic metadata should be a lot more significant.