Quote:
Originally Posted by Geoff D
I don't think "industry standard" means what you think it means. By definition Dolby Vision on UHD disc is an optional HDR format whereas HDR10 - the open source system that requires no royalties - is mandatory for all UHD discs and TVs that contain HDR, now and into the future. You won't find DV on Samsung or Panasonic TVs either.
But you're missing the point with all my tech jargon in the post you quoted, as well as the follow-up: if the HDR10 grade has been supervised by the filmmaker and your TV is capable of realising that image in its entirety (or near enough) then you ARE still seeing it as intended. Even Dolby Vision will reduce mapping to a minimum if the TV can actually 'do' the image that it's been presented with. Don't just take my word for it:

Name one film where the HDR10 grade was actually overseen by the filmmaker.
Because I can certainly name more than a handful of filmmakers and colorists who support Dolby Vision.
Again, that's because the latter is the industry standard.
No matter how much you want to dress it up in technical terms and buzzwords, HDR10 is neither as good nor as accurate as Dolby Vision.
Quote:
"Dolby has been working on this stuff longer than anyone. They’ve been developing this technology since long before anyone even heard the term HDR, around 2003. They practically invented the HDR LED local dimming display for crying out loud! The result: the Dolby Vision system. Dolby Vision is a complete and comprehensive system encompassing tools and standards for monitoring and grading of content, through delivery and transmission, and finally to the end user’s display in their home.
On the content creation side, on the surface at least, it’s pretty straightforward: The capabilities of the monitor are known, the artist is free to work their magic, and a ton of metadata is tagged to the content. The real genius comes at the other end.
Here I am with my Dolby Vision TV with, let’s say, a 500 nit capability. I play content which was mastered to 4,000 nits. What happens to the picture data between 500 and 4,000 nits? There is a Dolby Vision processor chip in the TV which, using all that juicy metadata generated during encoding, “remaps” the luminance portion which is above the display’s peak. This is not a canned “4,000-to-500 nits” tone map: it is dynamic and content aware, employed on a per-frame basis.
So, for example, if one scene has data up to 4,000 nits and the next peaks at 1,500, the two will be tone mapped differently. As cliché as this sounds, it’s going to look as good as it possibly can while being faithful to the source content within the display’s capabilities. Dolby Vision displays are calibrated by referencing something somewhat humorously called the model’s “Golden Reference”."
Source.
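To put that quoted example into something concrete, here's a toy sketch of what per-scene tone mapping does. The curve, the knee point, and the numbers are entirely my own illustration, not Dolby's actual algorithm or metadata format; it just shows how the same highlight gets mapped differently depending on each scene's peak, and how the mapping drops out when a scene already fits the display, which is the "minimal mapping" case mentioned above.
Code:
# Toy sketch only: not Dolby's real processing, just the general idea of
# per-scene tone mapping driven by metadata vs. one fixed curve for a film.

def tone_map(nits, scene_peak, display_peak):
    """Map a pixel's luminance (in nits) into the display's range."""
    if scene_peak <= display_peak:
        return nits                      # scene fits: show it as graded, no mapping
    knee = 0.75 * display_peak           # start compressing above this level
    if nits <= knee:
        return nits                      # shadows and midtones left untouched
    # Roll the range [knee, scene_peak] off into [knee, display_peak]
    t = (nits - knee) / (scene_peak - knee)
    return knee + t * (display_peak - knee)

display_peak = 500.0                     # the article's example: a 500-nit TV

# Consecutive scenes with different peaks get different curves, because the
# per-scene metadata tells the TV what each scene's real peak is.
for scene_peak in (4000.0, 1500.0):
    out = tone_map(1000.0, scene_peak, display_peak)
    print(f"1,000-nit highlight in a {scene_peak:,.0f}-nit scene -> {out:.0f} nits")

# And when a scene fits the display entirely, the mapping drops out:
print(tone_map(400.0, 450.0, display_peak))   # 400.0, shown exactly as graded

Same 1,000-nit highlight, two different results, purely because the metadata describes each scene rather than one number for the whole film.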
Also, keep in mind that even though HDR10+ adds dynamic metadata, it's still the TV manufacturer that decides how to process it.
With Dolby, you're getting the image you're supposed to see: scene by scene, or frame by frame.
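To make that point concrete: two sets can read identical dynamic metadata and still draw the picture differently, because the mapping itself is left to whoever builds the TV. A contrived example (these "vendors" and their curves are invented purely for illustration, not anyone's real processing):
Code:
# Hypothetical vendor curves, invented for illustration: with HDR10+ the disc
# carries per-scene metadata, but each TV maker implements its own mapping.

def vendor_a_map(nits, scene_peak, display_peak):
    # Made-up "Vendor A": hard-clip anything above the panel's peak.
    return min(nits, display_peak)

def vendor_b_map(nits, scene_peak, display_peak):
    # Made-up "Vendor B": scale the whole scene down to fit the panel.
    return nits * min(1.0, display_peak / scene_peak)

scene_peak, display_peak, highlight = 4000.0, 500.0, 1000.0
print(vendor_a_map(highlight, scene_peak, display_peak))   # 500.0
print(vendor_b_map(highlight, scene_peak, display_peak))   # 125.0

Same scene, same metadata, two very different pictures; with Dolby Vision the display-side mapping is Dolby's own licensed processing rather than each manufacturer's, which is the point.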
EDIT: Also, Samsung is trash. And I couldn't tell you what Panasonic is doing since they don't sell sets in the States anymore.
I will, however, point out that Sony, Universal, Lionsgate, Warner Brothers, LG, Vizio, and many others are backing Dolby Vision.
Not that it really matters, though. The HDR format "war" isn't going to play out like HD DVD vs. Blu-ray. Multiple HDR formats will be around for the foreseeable future, Dolby Vision included.