#261
Active Member
Nov 2012
Quote:
So cheering for team Dolby Vision will mean disappointment when some of your favourite content ships without it. Warner giving up on Dolby Vision seems like a pretty big deal.
#262
Blu-ray Samurai
You're not telling me anything I don't already know about dynamic metadata.
What you're failing to realize here is that Dolby tailors its multiple profiles to each piece of hardware, to ensure every compatible device it's implemented in gets the best results that hardware can deliver. HDR10+, on the other hand, is a single algorithm that incorporates its own version of dynamic metadata. Meaning it may work well on two panels but could be completely BORKED on another two. That's why it's a joke. In the end it's not really different from static HDR10, especially on a panel like the ZD9 where dynamic metadata isn't really needed in terms of resolving detail. You could have a 500-nit panel and still end up with terrible end-to-end playback if the HDR10+ algorithm isn't well integrated into that panel's hardware/software. At least with Dolby you know you're getting the best HDR experience your panel has to offer. EDIT: Warner didn't give up on Dolby. They're literally releasing Justice League in March, IN DOLBY VISION.
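To put the per-panel point in toy code (an illustration I'm making up, not Dolby's or anyone else's actual processing, and the numbers are invented): the right place to put the "knee" of a tone curve depends on the panel's peak, which is exactly the kind of per-hardware knowledge a one-size-fits-all algorithm doesn't have.

```python
def tone_map_nit(x: float, scene_peak: float, panel_peak: float) -> float:
    """Toy knee curve: pass through below the knee, then compress the
    rest of the scene's range into what the panel can actually show."""
    knee = 0.75 * panel_peak                # per-panel tuning lives here
    if scene_peak <= panel_peak or x <= knee:
        return min(x, panel_peak)
    # compress [knee, scene_peak] into [knee, panel_peak]
    t = (x - knee) / (scene_peak - knee)
    return knee + t * (panel_peak - knee)

# The same 1,000-nit highlight from a 4,000-nit scene lands very
# differently on a 500-nit panel vs a 1,500-nit one:
for panel in (500.0, 1500.0):
    print(panel, round(tone_map_nit(1000.0, 4000.0, panel), 1))
```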
#265
Active Member
Nov 2012
Quote:
TVs in the mid-range will have weak processors, so they won't have the same capability for on-the-fly analysis that the ZD9 or LG OLEDs have. HDR10+ (and DV, to an extent) is a technology that primarily benefits the mid-range TV world. So HDR10+ is not a joke for that mid-range segment; it will be greatly beneficial. The issue will be the same as with DV: not all content will be HDR10+.
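Here's the difference in toy form (invented numbers and field name, nothing from either actual spec): a flagship can afford to measure every frame itself, while a cheap SoC just reads a number that was computed at mastering time.

```python
import numpy as np

# Hypothetical 4K frame as linear luminance in nits.
frame = np.random.default_rng(0).uniform(0, 4000, size=(2160, 3840))

# High-end approach: the TV analyses each frame itself (costly per frame).
measured_peak = frame.max()

# Dynamic-metadata approach: the scene peak was computed at mastering
# time and travels with the stream, so a weak SoC just reads a number.
scene_metadata = {"scene_max_nits": 3950.0}   # assumed field name
signalled_peak = scene_metadata["scene_max_nits"]

# Either value then drives the tone curve; only one required the TV to
# scan roughly 8 million pixels every frame.
print(measured_peak, signalled_peak)
```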
#266
Blu-ray Prince
Part of me wouldn't give a rat's ass if this up and died; in fact, I'd applaud it. The whole idea and implementation of HDR10+ reeks of attention-grabbing and upsetting the apple cart.
#267
Power Member
Nov 2013
Quote:
This is the farthest thing from a format war.
#268
Active Member
Feb 2016
So if I bought the 2018 Samsung flagship, which hasn't been revealed yet, and it had around 2,500 nits, how much of a benefit would HDR10+ give on that set over standard HDR10? I'm assuming not a lot at all, considering it's a universal algorithm. I probably won't even bother buying into HDR10+ with the TV.
#269
Blu-ray Emperor
Quote:
The thing about HDR10+ in playback terms is that without a similar kind of Dolby 'engine' that knows how best to process the data for that TV, we're back to square one as to how manufacturer X will actually use the HDR10+ data, e.g. will it prioritise maximum content luminance, maximum mastering luminance, frame-average luminance and so on. Just like current HDR10 TVs, the point is that they're NOT all using the same methods to process the dynamic HDR10+ information. BUT a key thing to consider when everyone's waving their dynamic metadata about is that Dolby itself has quietly dropped the 'golden reference' system for each specific line of TVs. So while it may still have the superior metadata in terms of how that data was physically created, and can also go frame by frame if necessary, I'm not sure that Dolby's processing 'engine' is now any less random than X's implementation of HDR10+.
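To illustrate the square-one problem (toy code, invented field names and weights, not any real vendor's logic): the same scene metadata can drive three different tone-mapping targets depending on what a manufacturer decides to prioritise.

```python
# Same hypothetical HDR10+-style scene metadata, three plausible ways a
# vendor might pick the value that drives its tone curve.
scene = {
    "max_content_nits": 3800.0,     # brightest pixel in the scene
    "mastering_peak_nits": 4000.0,  # mastering display peak
    "frame_avg_nits": 180.0,        # frame-average light level
}

def target_vendor_a(m):  # trusts the measured content peak
    return m["max_content_nits"]

def target_vendor_b(m):  # trusts the mastering display instead
    return m["mastering_peak_nits"]

def target_vendor_c(m):  # blends peak with average brightness
    # arbitrary blend, purely illustrative
    return 0.5 * m["max_content_nits"] + 0.5 * 10 * m["frame_avg_nits"]

for f in (target_vendor_a, target_vendor_b, target_vendor_c):
    print(f.__name__, f(scene))  # three different curves from one disc
```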
#272
Blu-ray Emperor
Still, even though Dolby has dumped the 'golden reference', a DV display still has a Dolby decoder inside that, on paper, will interpret the metadata using the preferred characteristics of the display. But because HDR10+ has no such thing, I wonder if that's what's causing the lag in the metadata that was reported here: https://www.flatpanelshd.com/news.ph...&id=1516011308

What I mean is, of course there's some sort of HDR10+ decoding inside a relevant TV, otherwise it wouldn't be able to recognise the dynamic metadata at all. But if it's there solely to read the data off the disc and pass it to the TV's own processing, which then decides what it's going to do with it, instead of those processes all being carried out inside the same decoder as per Dolby's implementation, then I can see why such a lag would occur. I'm sure that this will be fixed but clearly it's not just the big D who's having dynamic droopage right about now...
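In toy form, the difference between the two architectures I'm describing might look like this (pure guesswork about the plumbing; all names and the hop counts are invented for the example):

```python
def build_curve(metadata, panel_peak):
    # stand-in for "pick a tone curve": here just a clamp target
    return min(metadata["scene_max_nits"], panel_peak)

def integrated_decoder(metadata, panel_peak):
    """DV-style: one block parses the metadata AND applies it with
    knowledge of the display, in a single step."""
    return build_curve(metadata, panel_peak), 1   # 1 processing hop

def split_pipeline(metadata, panel_peak):
    """Possible HDR10+ style: stage 1 only parses and forwards; stage 2
    (the TV's generic video processor) decides what to do with it. The
    hand-off is one more place where frames of delay can pile up."""
    parsed = dict(metadata)                       # stage 1: parse/forward
    return build_curve(parsed, panel_peak), 2     # 2 processing hops

meta = {"scene_max_nits": 3800.0}
print(integrated_decoder(meta, 650.0))  # (650.0, 1)
print(split_pipeline(meta, 650.0))      # (650.0, 2)
```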
#273
Power Member
Nov 2013
Quote:
Scott Wilkinson on AVS brought up something interesting. He claims that the "infoframe" solution Samsung is using to get HDR10+ to work over HDMI 2.0 is only a partial implementation, so only part of the metadata is being transmitted. He says HDMI 2.1 is required for the full implementation. You have to wonder what other consequences this has. The lag may be partially a result of it, but perhaps the partial implementation would also produce inferior picture quality compared to the full version? Not sure how accurate his claim is, though.
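For context, HDR10+ dynamic metadata is defined by SMPTE ST 2094-40 and carried over HDMI in the HDR Dynamic Metadata Extended InfoFrame from CTA-861-G. Below is a sketch of the kind of per-scene payload involved; the field names are loosely modelled on ST 2094-40 but should be read as illustrative, as should the guess at which fields a partial implementation might drop.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HDR10PlusSceneMetadata:
    # Loosely modelled on SMPTE ST 2094-40; names/values illustrative.
    targeted_system_display_max_luminance: int = 400  # nits
    maxscl: List[int] = field(default_factory=lambda: [3800, 3600, 3500])
    average_maxrgb: int = 180
    knee_point_x: int = 512   # on a 0..4095 normalised axis
    knee_point_y: int = 384
    bezier_curve_anchors: List[int] = field(
        default_factory=lambda: [102, 205, 307, 410, 512, 614, 717, 819, 922]
    )

    def partial_payload(self):
        """Hypothetical 'partial' transmission: keep the scene statistics
        but drop the curve data. If a sink only ever received something
        like this, it would have to synthesise its own roll-off, which is
        one way picture quality could differ from the full version."""
        return {
            "maxscl": self.maxscl,
            "average_maxrgb": self.average_maxrgb,
        }
```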
#274
Blu-ray Emperor
Dang, we've got partial HDR10+ vs partial Dolby Vision; it's like a race to the bottom at the moment.
#278
Active Member
Nov 2012
Quote:
The upcoming Panasonic UB820 could be such a player, with its HDR10+ and Dolby Vision support. It will have the same top-of-the-line HCX processor as their excellent OLED TVs, and the player will have a feature called HDR Optimizer, which is basically an in-player tone-mapper: you set your desired peak luminance (500, 1,000 or 1,500 nits) and the player emits a tone-mapped signal. Surely the UB820 could read the disc metadata (HDR10+ or DV) and then emit an appropriately tone-mapped HDR10 signal? Why not? It would avoid all these HDMI issues. It would be the equivalent of DTS-HD MA or TrueHD decoding to PCM inside the player. Does the Dolby Vision license prohibit a source player from doing this? If it were possible, then it wouldn't matter whether your TV had HDR10+ or DV; let the source do the tone-mapping.
#279
Blu-ray Guru
I'm not sure it would work that well at this point: how would you prevent the TV from doing a double-whammy and tone-mapping the already tone-mapped input? And if the tone-mapping on the TV were done in conjunction with the ABL, for example, you wouldn't be able to take that into account and likely wouldn't have a choice (I don't know if it actually is, but it seems possible).
#280
Active Member
Nov 2012
Quote:
Let's say you have a TV with 1,000 nits of brightness. You tell the 4K player to tone-map to 1,000 nits. You feed the player a 4,000-nit title (e.g. BvS); the player then spits out a tone-mapped signal up to 1,000 nits, with metadata signalling that the MAX values are 1,000 nits (as if it were a 1,000-nit title to begin with). The TV should then do no tone-mapping at all. My speculation is: why can't the upcoming Panasonic player (UB820) take the HDR10+ (or Dolby Vision) metadata into account when doing this tone-mapping (the HDR Optimizer feature)? If it can, then it won't matter whether your TV supports HDR10+ or DV, since the decoding/tone-mapping will be done inside the player. Note, I don't know if the Panasonic will do this, but I hope it will. A rough sketch of the idea is below.
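A minimal sketch of that idea, assuming a simple knee-style curve (this is not Panasonic's actual HDR Optimizer algorithm, and the metadata fields are just MaxCLL/mastering-peak style stand-ins):

```python
import numpy as np

def tone_map(frame_nits: np.ndarray, source_peak: float, target_peak: float):
    """Map linear-light luminance (nits) from source_peak down to
    target_peak: pass-through below the knee, compress above it."""
    knee = 0.5 * target_peak
    out = frame_nits.copy()
    hi = out > knee
    # linearly compress [knee, source_peak] into [knee, target_peak]
    out[hi] = knee + (out[hi] - knee) * (target_peak - knee) / (source_peak - knee)
    return np.minimum(out, target_peak)

# A 4,000-nit title, player set to a 1,000-nit target (the BvS example):
frame = np.array([100.0, 800.0, 2500.0, 4000.0])
mapped = tone_map(frame, source_peak=4000.0, target_peak=1000.0)

# Re-signal the stream as if it were a 1,000-nit title, so a well-behaved
# TV applies no further tone-mapping of its own:
output_metadata = {"MaxCLL": 1000, "mastering_display_peak_nits": 1000}
print(mapped, output_metadata)
```

If the re-signalled peak matches what the panel can do, a well-behaved TV should leave the signal alone, which is also the in-principle answer to the double tone-mapping worry above (ABL behaviour aside); the open question is whether every TV actually behaves that way.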