Quote:
Originally Posted by Geoff D
It's not any different from standard HDR in terms of the visuals at source, the only difference is that it comes with content-derived metadata that allows the TV to tone map it more accurately.
Do you actually have a HDR10+ compatible TV and UHD player?
I think there is more to this. For one, dynamic metadata doesn't do a lot for "tone mapping" per se. The display has to know both what it is capable of and exactly what the content contains. With Dolby, a file upload tells the Dolby processor in the display exactly what the panel can do, so the metadata helps. An HDR10+ display has to have something similar that outlines the display's capabilities and then what to do with content above them. If the tone mapping inside the display is already aware of its peak limitation, the tone map may not be that different. It doesn't do much for the bulk of the content anyway, because PQ is absolute, so nothing should change all the way from 0 up to the point where the tone map starts to roll off. This is why there isn't a lot of difference between the formats on most content.
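To make the "PQ is absolute" point concrete, here's a minimal sketch of a PQ decode plus a knee-based tone map. The PQ constants are from SMPTE ST 2084; the `tone_map` roll-off itself is purely illustrative (not BT.2390 or any vendor's actual curve), and the `display_peak`/`knee` parameters are assumptions for the example. The key behavior it shows: everything below the knee passes through untouched, so only the top of the signal ever differs between approaches.

```python
# PQ (SMPTE ST 2084) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a normalized PQ code value [0,1] to absolute luminance in nits."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def tone_map(nits, display_peak=750, knee=0.75):
    """Pass through below the knee; compress above it toward display_peak.

    Illustrative roll-off only -- real displays use curves like BT.2390.
    """
    start = knee * display_peak
    if nits <= start:
        return nits  # PQ is absolute: untouched up to the roll-off point
    span = display_peak - start
    excess = nits - start
    # simple asymptotic compression that never exceeds display_peak
    return start + span * (excess / (excess + span))
```

A 300-nit midtone comes out of `tone_map` unchanged regardless of metadata; only content mastered above the knee (here 562.5 nits) gets remapped, which is why most of the picture looks identical across formats.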
But what HDR10+ and DV do allow is a FAR more intelligent dynamic contrast system. I've mentioned this is why Samsung was hot to trot on this in the first place. By including scene-by-scene data, the TV knows EXACTLY where the dynamic backlight and any gamma manipulation should be set, instead of guessing on the fly, which leads to pumping artifacts and other gamma issues.
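The backlight point can be sketched in a few lines. Both functions, their names, and the smoothing constant are hypothetical, not any TV's actual algorithm: the contrast is between setting the backlight once from known per-scene metadata versus chasing the signal frame by frame with smoothing (the smoothing that, too slow, lags scene cuts and, too fast, pumps).

```python
def backlight_from_metadata(scene_max_nits, panel_peak=1000):
    """With per-scene metadata, the backlight level is known exactly, up front."""
    return min(scene_max_nits / panel_peak, 1.0)

def backlight_on_the_fly(frame_max_nits, prev_level, panel_peak=1000, alpha=0.1):
    """Without metadata, the TV estimates per frame and smooths toward the
    target to hide pumping -- which means it lags real scene changes."""
    target = min(frame_max_nits / panel_peak, 1.0)
    return prev_level + alpha * (target - prev_level)
```

Cut from a dark scene to a bright one and the metadata-driven level is correct on frame one, while the estimating version is still climbing several frames later; raise `alpha` to catch up faster and small frame-to-frame fluctuations start visibly pumping instead.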
We also don't know how accurate the numbers actually are for HDR10+, because they were ANYTHING but accurate for HDR10. Without a way to inspect them, for all we know there aren't any at all, or not on a frame-by-frame basis.
I think people expect bigger differences between formats than there really are or should be. Only the most difficult cases are likely to show differences, and even then you may need a side-by-side comparison to see them clearly. Plus, most content on the market is still relatively low in peak output, so on a lot of displays tone mapping isn't even required, or the portion of the image that actually IS tone mapped is VERY small. You'd probably see color luminance differences before you'd see tone map differences, and even then it would most likely require side-by-side comparisons.
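The "very small portion" claim is easy to quantify with a toy frame. The pixel values and the knee/peak numbers below are made up for illustration; the point is just that a scene with a handful of specular highlights leaves the tone map with almost nothing to do.

```python
def fraction_tone_mapped(pixels_nits, display_peak=750, knee=0.75):
    """Share of pixels a knee-based tone map would actually alter
    (assumed knee/peak values, illustrative only)."""
    start = knee * display_peak  # roll-off begins here; below it, no change
    hot = sum(1 for n in pixels_nits if n > start)
    return hot / len(pixels_nits)

# Hypothetical frame: mostly ~100-nit image content, a few 1200-nit highlights
frame = [100] * 990 + [1200] * 10
print(fraction_tone_mapped(frame))  # only 1% of pixels are remapped at all
```

On a frame like this, two formats could tone map the highlights quite differently and still look near-identical outside a side-by-side comparison.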