01-24-2018, 12:53 AM   #261
dlb99 (Active Member, Nov 2012)

Quote:
Originally Posted by TheSweetieMan View Post
I'm too busy laughing at HDR10+ being one, single universal algorithm, that isn't going to yield the same consistent results all across the board, the same way Dolby Vision does for its compatible panels.
Having Dolby Vision is better, sure, but it will never be ubiquitous.

So cheering for team Dolby Vision will mean disappointment when some of your favourite content ships without it. Warners giving up on Dolby Vision seems like a pretty big deal.
01-24-2018, 12:54 AM   #262
TheSweetieMan (Blu-ray Samurai, Nov 2009)

You're not telling me anything I don't already know about dynamic metadata.

What you're failing to realize here is that Dolby uses multiple profiles, one for each piece of hardware, to ensure that every compatible device it's implemented in gets the best results out of that hardware.

HDR10+, on the other hand, is a single algorithm that incorporates its own version of dynamic metadata. Meaning it may work well for two panels, but could be completely BORKED for another two.

That's why it's a joke. In the end, it's not really different from static HDR10, especially on a panel like the ZD9, where dynamic metadata isn't really needed to resolve detail.

You could have a 500-nit panel, yet still end up with terrible end-to-end playback, because the HDR10+ algorithm isn't well integrated into that panel's hardware/software.

At least with Dolby, you know you're getting the best HDR experience your panel has to offer.

EDIT: Warner didn't give up on Dolby. They're literally releasing Justice League in March, IN DOLBY VISION.
01-24-2018, 01:56 AM   #263
EbonDragon (Expert Member, Apr 2015)

Quote:
Originally Posted by dlb99 View Post
Having Dolby Vision is better, sure, but it will never be ubiquitous.

So cheering for team Dolby Vision will mean disappointment when some of your favourite content ships without it. Warners giving up on Dolby Vision seems like a pretty big deal.
Seriously, the "WB is giving up DV" falsehood needs to be put to rest. If that were the case, then, as Sweetie pointed out, JL wouldn't have it.
Thanks given by:
Aidenag (01-24-2018)
01-24-2018, 02:01 AM   #264
dlb99 (Active Member, Nov 2012)

Quote:
Originally Posted by TheSweetieMan View Post
EDIT: Warner didn't give up on Dolby. They're literally releasing Justice League in March, IN DOLBY VISION.
Been in the can for a while.

There may still be a few Warners DV discs to come, the last holdovers prior to their switch in camps.
01-24-2018, 02:07 AM   #265
dlb99 (Active Member, Nov 2012)

Quote:
Originally Posted by TheSweetieMan View Post
That's why it's a joke. In the end, it's not really different from static HDR10, especially on a panel like the ZD9, where dynamic metadata isn't really needed to resolve detail.
It is different in that it won't require the TV to do heavy processing to tone-map appropriately.

Mid-range TVs will have weaker processors, hence they won't have the same capability for the kind of on-the-fly analysis done by the ZD9 or LG OLEDs.

HDR10+ (and DV to an extent) are technologies primarily benefiting the mid-range TV world.

So HDR10+ is not a joke for that mid-range TV segment; it will be greatly beneficial. The issue will be the same as with DV: not all content will be HDR10+.
01-24-2018, 02:59 AM   #266
Dubstar (Blu-ray Prince, Jun 2008, down at Fraggle Rock)

Part of me wouldn't give a rat's ass if this up and died - in fact, I'd applaud it; the whole idea and implementation of HDR10+ reeks of attention grabbing and upsetting the apple cart
Thanks given by:
Aidenag (01-24-2018), EbonDragon (01-24-2018), gkolb (01-24-2018), PeterTHX (01-24-2018)
01-24-2018, 03:25 AM   #267
BrownianMotion (Power Member, Nov 2013)

Quote:
Originally Posted by Dubstar View Post
Part of me wouldn't give a rat's ass if this up and died - in fact, I'd applaud it; the whole idea and implementation of HDR10+ reeks of attention grabbing and upsetting the apple cart
Insider 2themax revealed that they're likely going to get HDR10+ and Dolby Vision on the same encode on UHD Blu-Ray. So depending on which studios decide to support both formats, we could be seeing these two formats on the same discs.

This is the farthest thing from a format war.
Thanks given by:
EbonDragon (01-24-2018)
01-24-2018, 03:25 AM   #268
nugget2016 (Active Member, Feb 2016)

So if I bought the 2018 Samsung flagship, which hasn't been revealed yet, and it had around 2,500 nits, how much of a benefit would HDR10+ give on that set over standard HDR10? I'm assuming not a lot at all, considering it's a universal algorithm. I probably won't even bother buying into HDR10+ with the TV.
01-24-2018, 11:21 AM   #269
Geoff D (Blu-ray Emperor, Feb 2009, Swanage, Engerland)

Quote:
Originally Posted by TheSweetieMan View Post
Pretty much.

And since all sets/players will be using the same algorithm, what works for one set might not work for another.

So, it's just like the whole tone-mapping thing all over again.

There's no real benefit here. In fact, you can basically say that LG beat Samsung to the punch with this concept.
I could be wrong here Sweetie but I thought the mentions of it being an automated algorithm related to the actual generation of the metadata at source? Heck, even Dolby's first metadata pass is automated, but it can be adjusted manually from there unlike HDR10+.

The thing about HDR10+ in playback terms is that without a similar kind of Dolby 'engine' which knows how best to process the data for that TV then we're back to square one as to how X manufacturer will actually use the HDR10+ data, e.g. will it prioritise maximum content luminance, maximum mastering luminance, frame average luminance and so on. Just like current HDR10 TVs, the point is that they're NOT all using the same methods to process the dynamic HDR10+ information.

BUT a key thing to consider when everyone's waving their dynamic metadata about is that Dolby themselves have quietly dropped the 'golden reference' system for each specific line of TVs. So while it may still have the superior metadata in terms of how that data was physically created and can also go frame by frame if necessary, I'm not sure that Dolby's processing 'engine' is now any less random than X implementation of HDR10+.
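A back-of-the-envelope sketch of the prioritisation point above, with entirely made-up field names and numbers (this is not any real TV's processing, just an illustration): the same dynamic metadata can produce two different pictures depending on which value a given set's tone-mapping decides to anchor on.

```python
# Hypothetical illustration: the same per-scene metadata fed to two different
# (invented) processing strategies yields two different highlight gains.
scene = {
    "max_content_nits": 3200,     # brightest pixel in the scene
    "mastering_peak_nits": 4000,  # peak of the mastering display
    "frame_avg_nits": 180,        # average light level of the scene
}

def anchor_on_content(md, display_peak=750):
    # Vendor A: scale the curve so the scene's brightest pixel hits display peak.
    return display_peak / md["max_content_nits"]

def anchor_on_mastering(md, display_peak=750):
    # Vendor B: scale against the mastering peak, leaving highlights dimmer.
    return display_peak / md["mastering_peak_nits"]

print(anchor_on_content(scene))    # ~0.234 gain applied to highlights
print(anchor_on_mastering(scene))  # 0.1875 -- same disc, different picture
```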
Thanks given by:
legends of beyond (01-24-2018)
01-24-2018, 11:30 AM   #270
Sabre95 (Power Member, Nov 2017, New York)

I have yet to see a big difference between Dolby Vision and HDR when I use my Apple TV 4K. What am I missing?
01-24-2018, 01:41 PM   #271
BrownianMotion (Power Member, Nov 2013)

Quote:
Originally Posted by Geoff D View Post
I could be wrong here Sweetie but I thought the mentions of it being an automated algorithm related to the actual generation of the metadata at source? Heck, even Dolby's first metadata pass is automated, but it can be adjusted manually from there unlike HDR10+.

The thing about HDR10+ in playback terms is that without a similar kind of Dolby 'engine' which knows how best to process the data for that TV then we're back to square one as to how X manufacturer will actually use the HDR10+ data, e.g. will it prioritise maximum content luminance, maximum mastering luminance, frame average luminance and so on. Just like current HDR10 TVs, the point is that they're NOT all using the same methods to process the dynamic HDR10+ information.

BUT a key thing to consider when everyone's waving their dynamic metadata about is that Dolby themselves have quietly dropped the 'golden reference' system for each specific line of TVs. So while it may still have the superior metadata in terms of how that data was physically created and can also go frame by frame if necessary, I'm not sure that Dolby's processing 'engine' is now any less random than X implementation of HDR10+.
That's my understanding as well. Might be helpful to revisit Penton-Man's post on the matter.

Quote:
Originally Posted by Penton-Man View Post
To generate HDR10plus metadata the user needs to load the HDR master encoded with the PQ curve and add to the timeline. The statistics generation is implemented as a metadata encoder, called: HDR10 Plus. After adding the encoder, the user needs to render the file, which will be a human readable JSON file containing the mathematical coefficients defining the tone map curve for each scene. By default the encoder automatically detects cuts between scenes based on the light level statistics.



[Image: Encoder to generate the HDR10plus metadata]

The metadata in the generated file targets a specific brightness level, defined in nits in the TargetLevel parameter of the encoder. The maximum brightness of the source master should be set in the MasteringLevel setting. To QC or visualize the result of the dynamic remap, a HDR10plus node is added to the node page.

Then the user can import the generated JSON file through the Timeline menu by clicking on Cut Detection with HDR10plus JSON. This operation adds the cuts defined in the JSON file as well as populates the HDR10plus node with the appropriate metadata. As soon as the metadata is loaded into the nodes, the remap is processed. To validate, park it on a frame with bright pixels exceeding the nit level defined in the TargetLevel of the metadata, and toggle Bypass on the HDR10plus node.



[Image: HDR10plus node]
And as he says later on, in theory, the metadata found on the JSON file can be edited by hand: https://forum.blu-ray.com/showpost.p...postcount=3977
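For anyone curious what that hand-editable output might look like, here's a rough Python sketch of loading and nudging such a per-scene JSON file. The field names and values are purely illustrative guesses based on Penton-Man's description (a target level, a mastering level, and per-scene curve coefficients); the real file the encoder renders will use its own schema.

```python
import json

# Illustrative only: a made-up per-scene layout loosely following the
# description above (scene cuts detected from light-level statistics, plus
# the coefficients of a tone-map curve for each scene).
sample = """
{
  "TargetLevel": 1000,
  "MasteringLevel": 4000,
  "Scenes": [
    {"FirstFrame": 0,   "LastFrame": 119, "CurveCoefficients": [0.12, 0.88, 0.45]},
    {"FirstFrame": 120, "LastFrame": 311, "CurveCoefficients": [0.08, 0.91, 0.52]}
  ]
}
"""

metadata = json.loads(sample)

# "Editing by hand" would just mean nudging these numbers -- here we tweak
# one coefficient of the second scene's curve and print the result.
metadata["Scenes"][1]["CurveCoefficients"][2] = 0.48
print(json.dumps(metadata, indent=2))
```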
Thanks given by:
Geoff D (01-24-2018)
01-24-2018, 03:42 PM   #272
Geoff D (Blu-ray Emperor, Feb 2009, Swanage, Engerland)

Still, even though Dolby has dumped the 'golden reference' a DV display still has a Dolby decoder inside that, on paper, will interpret the metadata using the preferred characteristics of the display. But because HDR10+ has no such thing I wonder if that's what's causing the lag in the metadata that was reported here: https://www.flatpanelshd.com/news.ph...&id=1516011308

What I mean is, of course there's some sort of HDR10+ decoding inside a relevant TV, otherwise it wouldn't be able to recognise the dynamic metadata at all. But if it's there solely to read the data off the disc and pass it to the TV's own processing, which then decides what it's going to do with it, instead of those processes all being carried out inside the same decoder as per Dolby's implementation, then I can see why such a lag would occur. I'm sure that this will be fixed, but clearly it's not just the big D who's having dynamic droopage right about now...
Thanks given by:
BrownianMotion (01-24-2018), gkolb (01-25-2018)
01-24-2018, 04:03 PM   #273
BrownianMotion (Power Member, Nov 2013)

Quote:
Originally Posted by Geoff D View Post
Still, even though Dolby has dumped the 'golden reference' a DV display still has a Dolby decoder inside that, on paper, will interpret the metadata using the preferred characteristics of the display. But because HDR10+ has no such thing I wonder if that's what's causing the lag in the metadata that was reported here: https://www.flatpanelshd.com/news.ph...&id=1516011308

What I mean is, of course there's some sort of HDR10+ decoding inside a relevant TV, otherwise it wouldn't be able to recognise the dynamic metadata at all. But if it's there solely to read the data off the disc and pass it to the TV's own processing, which then decides what it's going to do with it, instead of those processes all being carried out inside the same decoder as per Dolby's implementation, then I can see why such a lag would occur. I'm sure that this will be fixed, but clearly it's not just the big D who's having dynamic droopage right about now...
Yup, I brought this up a while ago. Looks like both of the main dynamic HDR formats are experiencing issues over HDMI.

Scott Wilkinson on AVS brought up something interesting. He claims that the "infoframe" solution that Samsung is using to get HDR10+ to work over HDMI 2.0 is only a partial implementation and so only part of the metadata is being transmitted. He says that HDMI 2.1 is required for the full implementation.

You have to wonder what other consequences there are to this. The lag may be partially a result of this, but perhaps the partial implementation would produce inferior picture quality compared to the full version?

Not sure how accurate his claim is, though.
Thanks given by:
Geoff D (01-24-2018)
01-24-2018, 04:20 PM   #274
Geoff D (Blu-ray Emperor, Feb 2009, Swanage, Engerland)

Dang, we've got partial HDR10+ vs partial Dolby Vision, it's like a race to the bottom at the moment.
Thanks given by:
BrownianMotion (01-24-2018), Doctorossi (01-24-2018), Fendergopher (01-24-2018), FilmFreakosaurus (01-25-2018), gkolb (01-25-2018), legends of beyond (01-25-2018), ray0414 (01-24-2018), Staying Salty (01-24-2018), Wing Wang17 (01-25-2018)
01-24-2018, 04:32 PM   #275
Doctorossi (Blu-ray Knight, Feb 2009)

Quote:
Originally Posted by Geoff D View Post
Dang, we've got partial HDR10+ vs partial Dolby Vision, it's like a race to the bottom at the moment.
As long as the CE manufacturers send us all new equipment as soon as this is sorted out and standardized.

They're gonna... right?
Thanks given by:
gkolb (01-25-2018)
01-24-2018, 05:19 PM   #276
gates70 (Blu-ray Champion, Jul 2009, Canada)

Quote:
Originally Posted by Sabre95 View Post
I have yet to see a big difference between Dolby Vision and HDR when I use my Apple TV 4K. What am I missing?
Nothing.

And everyone is talking out of their ass about HDR10+ until it's actually seen by the masses.
01-24-2018, 06:14 PM   #277
Geoff D (Blu-ray Emperor, Feb 2009, Swanage, Engerland)

Quote:
Originally Posted by Doctorossi View Post
As long as the CE manufacturers send us all new equipment as soon as this is sorted out and standardized.

They're gonna... right?
Yes sir, the check is in the mail.
Thanks given by:
Doctorossi (01-24-2018)
01-24-2018, 10:48 PM   #278
dlb99 (Active Member, Nov 2012)

Quote:
Originally Posted by BrownianMotion View Post
Yup, I brought this up a while ago. Looks like both of the main dynamic HDR formats are experiencing issues over HDMI.

Scott Wilkinson on AVS brought up something interesting. He claims that the "infoframe" solution that Samsung is using to get HDR10+ to work over HDMI 2.0 is only a partial implementation and so only part of the metadata is being transmitted. He says that HDMI 2.1 is required for the full implementation.

You have to wonder what other consequences there are to this. The lag may be partially a result of this, but perhaps the partial implementation would produce inferior picture quality compared to the full version?

Not sure how accurate his claim is, though.
Meta-data handling and tone-mapping inside the 4K player would solve a lot of these issues.

The upcoming Panasonic UB820 could be such a player, with its HDR10+ and Dolby Vision support. It will have the same top-of-the-line HCX processor as in their excellent OLED TVs, and the player will have a feature called HDR Optimizer, which is basically an in-player tone-mapper. You set your desired peak luminance (either 500, 1,000 or 1,500 nits) and the player will emit a tone-mapped signal.

Surely the UB820 could read the disc meta-data (HDR10+ or DV) and then emit an appropriately tone-mapped HDR10 signal? Why not?

It would avoid all these HDMI issues.

It would be the equivalent of DTS-MA or TrueHD decoding to PCM inside the player.

Does the Dolby Vision license prohibit a source player from doing the above?

If this were possible, it would not matter whether your TV had HDR10+ or DV; let the source do the tone-mapping.
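To make the in-player idea concrete, here's a minimal tone-mapping sketch in Python. It is not Panasonic's actual HDR Optimizer algorithm, just the general shape of what "tone-map to a chosen peak inside the player" means: luminance below a knee passes through untouched, and everything above it is rolled off so the source peak lands on the display peak. The knee position and curve are arbitrary choices for the example.

```python
# Minimal illustrative tone-mapper: roll off highlights above a knee so that
# a source graded up to `source_peak` nits fits a `target_peak`-nit display.

def tone_map_nits(l_in, source_peak=4000.0, target_peak=1000.0, knee=0.75):
    """Map a linear-light luminance value (in nits) into 0..target_peak."""
    knee_nits = knee * target_peak
    if l_in <= knee_nits:
        return l_in                       # shadows/midtones preserved 1:1
    # Normalise the highlight range above the knee to 0..1...
    x = (l_in - knee_nits) / (source_peak - knee_nits)
    # ...and compress it into the remaining headroom with a simple curve.
    headroom = target_peak - knee_nits
    return knee_nits + headroom * (2.0 * x / (1.0 + x))  # reaches target_peak at x = 1

# Example: a 4,000-nit specular highlight on a 1,000-nit display.
print(tone_map_nits(4000.0))   # -> 1000.0
print(tone_map_nits(500.0))    # -> 500.0 (below the knee, unchanged)
```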
01-25-2018, 12:39 AM   #279
Pyoko (Blu-ray Guru, Apr 2008)

Quote:
Originally Posted by dlb99 View Post
Meta-data handling and tone-mapping inside the 4K player would solve a lot of these issues.
I'm not sure it would work that well at this point. How would you prevent the TV from doing a double whammy and tone-mapping the already tone-mapped input? And if the tone-mapping on the TV were done in conjunction with the ABL, for example, you wouldn't be able to take that into account, and likely would not have a choice (I don't know if it actually is, but it seems possible).
01-25-2018, 01:31 AM   #280
dlb99 (Active Member, Nov 2012)

Quote:
Originally Posted by Pyoko View Post
I'm not sure it would work that well at this point. How would you prevent the TV from doing a double whammy and tone-mapping the already tone-mapped input? And if the tone-mapping on the TV were done in conjunction with the ABL, for example, you wouldn't be able to take that into account, and likely would not have a choice (I don't know if it actually is, but it seems possible).
I don't see the double whammy being that big an issue.

Let's say you have a TV with 1,000 nits of brightness. You tell the 4K player to tone-map to 1,000 nits.

You feed the player a 4,000-nit title (e.g. BvS); the player then spits out a tone-mapped signal up to 1,000 nits, with meta-data signalling that the MAX values are 1,000 nits (as if it were a 1,000-nit title to begin with). The TV should then do no tone-mapping at all.

My speculation is: why can't the upcoming Panasonic player (UB820) take HDR10+ (or Dolby Vision) meta-data into account when doing this tone-mapping (the HDR Optimizer feature)? If it can, then it won't matter whether your TV supports HDR10+ or DV, since the decoding/tone-mapping will be done inside the player.

Note: I don't know if the Panasonic will do this, but I hope it will.
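A quick sketch of the metadata rewrite described above, assuming the player re-signals the HDR10 static metadata (MaxCLL, MaxFALL, mastering display peak) after tone-mapping so the TV sees a "1,000-nit title" and ideally leaves the signal alone. The dict layout is purely illustrative, not an actual player API.

```python
# Re-signal HDR10 static metadata after in-player tone-mapping to a new peak.

def rewrite_static_metadata(metadata: dict, target_peak_nits: int) -> dict:
    remapped = dict(metadata)
    # MaxCLL (maximum content light level) can't exceed the new peak...
    remapped["max_cll"] = min(metadata["max_cll"], target_peak_nits)
    # ...and neither can the frame-average level or the signalled mastering peak.
    remapped["max_fall"] = min(metadata["max_fall"], target_peak_nits)
    remapped["mastering_display_max_nits"] = target_peak_nits
    return remapped

# A 4,000-nit grade (a BvS-style master) re-signalled for a 1,000-nit display:
source = {"max_cll": 4000, "max_fall": 400, "mastering_display_max_nits": 4000}
print(rewrite_static_metadata(source, 1000))
# {'max_cll': 1000, 'max_fall': 400, 'mastering_display_max_nits': 1000}
```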