Go Back   Blu-ray Forum > 4K Ultra HD > 4K Ultra HD Players, Hardware and News
Old 11-06-2017, 04:03 PM   #3201
DanBa DanBa is offline
Senior Member
 
Sep 2010
Default Request to the BDA to mandate support of all specified HDR formats

The BDA could stop the HDR confusion in consumers' minds.

Quote:
Originally Posted by mrtickleuk View Post
Dolby Digital vs. DTS or Dolby Atmos vs. DTS:X could be audio format wars.
But consumers don’t have to endure the hardship of an audio format war, thanks to consumer-driven audio/video receiver (AVR) makers: they support all audio formats. It is not their business to interfere in consumer choice; a playback device should be able to play any existing content consumers want.


Different audio or video formats are just different software running on playback devices (AVR, media player, TV).
http://www.avsforum.com/forum/465-hi...l#post50302361
http://www.avsforum.com/forum/465-hi...l#post50480993

Furthermore, software development is much easier and faster on the modern operating systems of HDR TVs than on the rigid, complex DSP platforms of audio/video receivers.


We, consumers, can’t simply accept "whatever happens in the industry".
After all, we, consumers, pay!

Consumers can simply ask Dolby Vision TV makers or HDR10+ TV makers to add a piece of HDR software to make their sets compatible with the other HDR format. The upgraded TV remains the same: same panel, same electronic parts, same mechanical parts!

For example, the Sony Z9D didn’t support Dolby Vision at launch.
Quote:
Sony already made the announcement - no Dolby Vision support.
Someone mentioned that Samsung's dynamic metadata HDR10 proposal passed with the competent authorities and perhaps we will see something like a dynamic HDR update on the Z in the future, but this is pure speculation.
Nobody should buy this set expecting a DV upgrade in the future - it simply ain't happening.
http://www.avsforum.com/forum/166-lc...l#post46372345

Later, the Sony Z9D got Dolby Vision via a free software update.
http://www.avsforum.com/forum/40-ole...l#post54095697
http://www.avsforum.com/forum/166-lc...l#post49661985


Dolby Vision is currently the most complex HDR format due to its dynamic metadata color volume mapping and its 12-bit dual layer architecture. Therefore, if a TV’s SoC like the Sony Z9D TV's SoC is powerful enough to support Dolby Vision, this SoC is able to support any other HEVC HDR format like HDR10+.

In the same way, if a UHD Blu-ray player’s SoC is powerful enough to support Dolby Vision, this SoC is able to support HDR10+.


The BDA wants to avoid an HDR format war, because a format war is detrimental to consumers, and therefore to its members (manufacturers, movie studios …).
In exchange for the addition of HDR10+, the BDA should require the TV makers to support both HDR10+ and Dolby Vision.


Last edited by DanBa; 11-06-2017 at 04:34 PM.
 
Thanks given by:
Staying Salty (11-06-2017)
Old 11-06-2017, 04:06 PM   #3202
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by ray0414 View Post
Europe only. USA stays with Dolby Vision ( and no Philips oled in USA)
Philips' strongest TV market is in Europe, so no surprise they went with HDR10+ in their biggest market.
 
Old 11-06-2017, 04:41 PM   #3203
PeterTHX PeterTHX is offline
Banned
 
PeterTHX's Avatar
 
Sep 2006
563
14
Default

Quote:
Originally Posted by HeatEquation View Post
Philips' strongest TV market is in Europe, so no surprise they went with HDR10+ in their biggest market.
In my experience Philips seriously underperforms in comparison to its rivals in the market, so I agree - no surprise there.
 
Old 11-06-2017, 04:42 PM   #3204
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by PeterTHX View Post
In my experience Philips seriously underperforms in comparison to its rivals in the market, so I agree - no surprise there.
Your only experience with Philips products comes from Philips Funai/USA sets, which support DV.

Their higher quality products in Europe, which is their biggest market, are the ones that will be supporting HDR10+.

Last edited by HeatEquation; 11-06-2017 at 04:59 PM.
 
Old 11-06-2017, 04:49 PM   #3205
DanBa DanBa is offline
Senior Member
 
Sep 2010
Default

Philips Funai and Philips TP Vision are not the same company.
https://forum.blu-ray.com/showthread...1#post13353761
 
Old 11-06-2017, 05:28 PM   #3206
JustinLH1125 JustinLH1125 is offline
Active Member
 
JustinLH1125's Avatar
 
Apr 2011
44
Default

Never really got an answer to my question, so I figured it wouldn't hurt to ask again. This may be a bit of a dumb, noobish question (and I'm not sure this is even the right place to ask), but: I have an Xbox One S that I use to play my 4K Blu-rays, and I never had any issues until I tried to play Spider-Man (2002). Its HDR did the same thing that Netflix seems to do when I try to play something like The Defenders: it causes my screen to dim to near darkness on my 55-inch Hisense 55H8C.

None of my other HDR-enhanced 4K BDs had this problem, so I tried to research the answer and came up with next to nothing short of people saying to turn off HDR or try High Contrast Mode (which turns the menus to garbage, especially in OneGuide). So I was wondering if there is any sort of workaround so that I can really experience the difference in my movies and Netflix content. Is it really worth it if my screen darkens, none of the colors pop, and I can't truly experience HDR on any movie? Sorry for such a noobish question.
 
Old 11-06-2017, 06:09 PM   #3207
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by TheSweetieMan View Post
Honestly, the recent remarks from artists like Roger Deakins, has kind of soured my taste on wanting a panel that has "THE HIGHEST NITS EVAH!"
Sounds like you don't understand the technology very well, then. If you want to truly experience what directors intended, you're going to need a TV that can display the appropriate peak brightness levels. Right now, movies are mastered at either 1000 or 4000 nits. Some TVs (not yours) can do 1000+ nits. None can do 4000 nits.

Tone mapping and dynamic metadata can get you closer to the director's intent, but you're still not going to experience precisely what was intended. Also, the higher the peak brightness, the less tone mapping your TV will have to do, and therefore the higher the APL (average picture level). Conversely, the lower the peak brightness, the more tone mapping your display will require, and the lower the APL - and when you're dealing with an OLED, you also have to consider the ABL (automatic brightness limiter).

Nits are very important to anyone who comprehends the format.
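For anyone wondering what "mastered at 1000 or 4000 nits" actually means: HDR10 and Dolby Vision both store luminance using the SMPTE ST 2084 "PQ" transfer function, which maps a normalized code value to an absolute brightness in nits. A minimal sketch (the constants are the published ST 2084 values; the function names are my own):

```python
# SMPTE ST 2084 (PQ) transfer function: code value <-> absolute nits.
# Constants below are the published ST 2084 values.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(e):
    """EOTF: map a PQ code value in [0, 1] to luminance in nits (cd/m^2)."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def nits_to_pq(y):
    """Inverse EOTF: luminance in nits -> PQ code value in [0, 1]."""
    p = (y / 10000.0) ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2
```

A code value of 1.0 corresponds to 10,000 nits, and roughly 75% of the code range is already used up by 0-1000 nits, which is why the difference between a 1000-nit and a 4000-nit master lives entirely in the top quarter of the signal.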
 
Thanks given by:
bruceames (11-06-2017)
Old 11-06-2017, 07:19 PM   #3208
Geoff D Geoff D is offline
Blu-ray Emperor
 
Geoff D's Avatar
 
Feb 2009
Swanage, Engerland
1348
2525
6
33
Default

Quote:
Originally Posted by TheSweetieMan View Post
This is why you've become one of my favorite members on here, despite a couple 'heated' exchanges in the past.

You're a bit more level-headed when it comes to the HDR debate.
Nice of you to say so, I try to calls it as I sees it. And you've lightened up yourself, hence the lack of recent heated debate action between us.
 
Old 11-06-2017, 07:29 PM   #3209
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by HeatEquation View Post
Sounds like you don't understand the technology very well, then. If you want to truly experience what the directors intended, you're going to need a TV that can display the appropriate peak brightness levels. Right now, movies are mastered in either 1000 or 4000 nits. Some TVs (not yours) can do 1000+ nits. None can do 4000 nits.

Tone mapping and dynamic metadata can get you closer to the director's intent, but you're still not going to experience precisely what was intended. Also, the higher the peak brightness, the less tone mapping your TV will have to do, and therefore the higher the APL. Conversely, the lower the peak brightness, the more tone mapping your display will require, and lower the APL - and when you're dealing with an OLED, you also have to consider the ABL.

Nits are very important to anyone who comprehends the format.
No, I understand the format just fine.

There's a reason why Deakins is on record saying his preferred viewing of 'Sicario' is still the ol' SDR/REC.709 version. On his forum, he discussed the complexity of grading a specific scene in the HDR version, where Emily Blunt's character stands in front of a curtained window, making it hard to balance the lighting and contrast between where she was positioned and the lighting of the room. He was very specific that neither TV brands nor consumers will dictate how bright his lighting needs to be--or how dark his shadows need to be. He was also specific that the HDR grade for 'Blade Runner 2049' was made to replicate the REC.709 version as closely as possible.

FYI, it doesn't matter if HDR content is currently graded at 1,000 nits or at 4,000 nits. Based on the scientific analysis that's been provided on this forum before, most titles currently mastered in HDR don't even hit those numbers--not consistently, at least. For instance, a catalogue release like 'Goodfellas' never exceeds 350 nits at any point during its run time. Yet the HDR is still beneficial in being able to utilize wider gradations of color. And that's also the benefit of a format like Dolby Vision: it helps panels that can't hit those higher numbers, while you still retain detail, color, and overall image accuracy. Which is more of a shortcoming for HDR10 than it is for OLED.

Also, OLED's peak brightness shortcomings notwithstanding, Vincent Teoh has brought up how OLED, as a technology in itself, still provides extended dynamic range due to its infinite contrast ratio, the fact that it hits 'true black', and--from the combination of those two--effectively infinite f-stops.

Oh, and as for OLEDs not being the be-all, end-all for ya: it doesn't seem to bother Steve Yedlin, the DP for 'The Last Jedi', who is currently using a 55-inch LG B7 OLED as his reference monitor.

You're the type of lowest common denominator that brands like Samsung market to. You'll be getting caught up in the penis measuring contest of 'nit-wars', while those of us that prefer content accuracy--as provided with context from a legend like Roger Deakins--won't be bothered with that nonsense.

The whole "HDR gets you closer to the director's intent!" thing isn't universally shared, now, is it?
 
Old 11-06-2017, 08:04 PM   #3210
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by TheSweetieMan View Post
No, I understand the format just fine.

There's a reason why Deakins is still on record saying his preferred viewing of 'Sicario' is still the ol' SDR/REC.709 version. On his forum, he discussed the complexity of having to grade a specific scene in the HDR version, where Emily Blunt's character stands in front of curtained window, making it hard to adjust the lighting and contrast between where she was positioned, and the lighting of the room. He was very specific to mention that TV brands, nor consumers, will dictate how bright his lighting needs to be--or how dark his shadows need to be. He, too, was also specific, about how the HDR grade for 'Blade Runner 2049', was graded to be as closely replicated to the REC.709 version as well.

FYI, it doesn't matter if HDR content is currently graded at 1,000-nits, or at 4,000-nits. Based on the scientific analysis that's been provided on this forum before, most titles currently mastered in the HDR format don't even hit those numbers. Not consistently, at least. For instance, a catalogue release like 'Goodfellas' never even exceeds 350-nits at any point during its run time. Yet, the HDR is still beneficial in the sense of also being able to utilize wider gradations of color. And better yet, that's also the benefit of a format like Dolby Vision; it's beneficial to panels that can't hit those higher numbers. But you still retain detail, color, and overall image accuracy. Which is more of a shortcoming for HDR10 than it is OLED.

Also, with the OLED's peak brightness shortcomings notwithstanding, Vincent Teoh, has also brought up how OLED, as a technology in itself, still provides extended dynamic range due to its infinite contrast ratio; the fact it hits 'true black; and the fact that due to the combination of those two features, also provides infinite F-stops.

Oh, and for OLEDs not being able to be the be-all, end-all for ya, it doesn't seem to affect Steve Yedlin, the DOP for 'The Last Jedi', who is currently using a 55-inch B7 LG OLED to double as his reference monitor.

You're the type of lowest common denominator that brands like Samsung market to. You'll be getting caught up in the penis measuring contest of 'nit-wars', while those of us that prefer content accuracy--as provided with context from a legend like Roger Deakins--won't be bothered with that nonsense.

The whole "HDR gets you closer to the director's intent!" thing isn't universally shared, now is it.
It doesn't matter if the content doesn't actually hit 4000 nits, as many tone mapping algorithms (including LG's, I believe) will compress the entire 4000-nit range into whatever the TV's peak brightness is. Dynamic metadata can help with this, but so can a display that can reach 4000+ nits. A display that can do 4000+ nits would be a much better solution, because it would also be beneficial when viewing content that gets much brighter than what today's displays can support. Indeed, as TVs begin to support higher peak brightness, dynamic metadata will become more and more irrelevant.

You can keep pretending that this is part of some "nit-war" (I believe you're the first person to ever mention the concept of a "nit war") but these are just facts. If you want accuracy, then peak brightness definitely plays a part in that.
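The compression described above can be illustrated with a toy static tone-mapping curve. To be clear, this is not LG's (or any vendor's) actual algorithm, and the knee fraction and peak values are arbitrary choices of mine; it's only a sketch of the general "track 1:1, then roll off" shape:

```python
def tone_map(nits, display_peak=700.0, master_peak=4000.0, knee_fraction=0.75):
    """Toy static tone map: pass scene luminance through unchanged up to a
    knee point, then compress the rest of the mastering range into the
    display's remaining headroom. A sketch, not any vendor's algorithm."""
    knee = knee_fraction * display_peak
    if nits <= knee:
        return nits                      # dark and mid tones tracked 1:1
    # Position within the mastering range above the knee, normalized 0..1
    t = (nits - knee) / (master_peak - knee)
    # Saturating roll-off that reaches display_peak exactly at master_peak
    return knee + (display_peak - knee) * (2 * t / (1 + t))
```

Note the trade-off the post describes: the lower `display_peak` is, the earlier the knee sits and the more of the image falls into the compressed region, dragging down the average picture level. Dynamic metadata attacks this by reshaping the curve per scene instead of assuming a worst-case 4000-nit master throughout.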
 
Old 11-06-2017, 08:10 PM   #3211
Geoff D Geoff D is offline
Blu-ray Emperor
 
Geoff D's Avatar
 
Feb 2009
Swanage, Engerland
1348
2525
6
33
Default

Sweetie: it's worth reiterating that Deakins' preference for SDR (over a studio-mandated HDR pass that he had nothing to do with) isn't taking place in the 709 domain but in the theatrical one, so we're still talking about P3 gamut and 10-bit dynamic range at minimum. Yes, the home DVD and Blu-ray versions will be SDR, so in terms of those three letters they may well hew closer to his intent, but they still require a trim pass to fold that master down into basic 8-bit 709 consumer video, and compromises are usually made there too.
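The cost of that 8-bit trim pass is easy to quantify: the same smooth gradient simply has fewer representable steps at 8 bits than at 10, which is where banding in skies and shadows comes from. A quick illustration (the ramp range and sample count are arbitrary):

```python
import numpy as np

# A smooth ramp over the darkest quarter of the signal range,
# where banding is most visible to the eye.
ramp = np.linspace(0.0, 0.25, 4096)

# Quantize the same ramp to 10-bit (0..1023) and 8-bit (0..255) code values
# and count how many distinct steps survive.
levels_10bit = np.unique(np.round(ramp * 1023)).size   # 257 steps
levels_8bit = np.unique(np.round(ramp * 255)).size     # 65 steps
```

Four times as many steps over the same tonal range is the difference between a staircase you can see and one you mostly can't, before dithering even enters the picture.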
 
Thanks given by:
TheSweetieMan (11-06-2017)
Old 11-06-2017, 08:15 PM   #3212
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by HeatEquation View Post
It doesn't matter if the content doesn't actually hit 4000 nits, as many tone mapping algorithms (including LG's, I believe) will compress the entire 4000 nits into whatever the TV's peak brightness is. Dynamic metadata can help with this, but so can a display that can reach 4000+ nits. A display that can do 4000+ nits would be a much better solution, because it would also be beneficial when viewing content that gets much brighter than what today's displays can support. Indeed, as TVs begin to support higher peak brightness, dynamic metadata will become more and more irrelevant.

You can keep pretending that this is part of some "nit-war" (I believe you're the first person to ever mention the concept of a "nit war") but these are just facts. If you want accuracy, then peak brightness definitely plays a part in that.
If I want accuracy, I'll go by what the content creator says, not by an industry trying to peddle a gimmick to sell more TVs.

Jordan Vogt-Roberts is another, who stated that 'Kong: Skull Island' is not meant for HDR viewing.

Neill Blomkamp recently laughed at the idea of Samsung's LCD panels on Twitter.

Rian Johnson recently stood by Steve Yedlin's backing of the LG B7 OLED on Twitter.

These are actual content creators--NOT BRANDS--that don't seem to either take issue with the shortcomings of brightness with LG's OLEDs, or just don't outright buy into the notion that HDR somehow automatically means you are getting closer to what the director intended.

Also, you get solely caught up in nits as it is. The idea of HDR isn't just peak brightness; there are other features that come along with it, hence Deakins grading BR2049 to be as close to the SDR version as possible. Taking advantage of the wider gradations of color is one of them, and that has nothing to do with peak brightness, especially with a more subdued approach.

Hell, even in the Westworld thread, if you look at the interviews with the creators of the show, they used HDR as a complementary feature. They didn't use it to provide blinding, hot highlights, or to create something in stark contrast.

Also, I'll gladly take credit for being the first person to coin the term 'nit wars'. Because based on the info I keep gathering from actual filmmakers, colorists, and cinematographers, they don't seem to be as gung-ho about peak brightness, or HDR altogether, as you think Samsung tells you they are.
 
Old 11-06-2017, 08:18 PM   #3213
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by Geoff D View Post
Sweetie: it's worth reiterating that Deakins' preference for SDR (over a studio mandated HDR pass that he had nothing to do with) isn't taking place in the 709 domain but in the theatrical one so we're still talking about P3 gamut and 10-bit dynamic range at the minimum. Yes, the home DVD and Blu-ray versions will be SDR so in terms of those three letters they may well hew closer to his intent, but they will still require a trim pass to adjust the aforementioned master into basic 8-bit 709 consumer video and compromises are usually made there too.
That's fine, and maybe I slightly misinterpreted what he said about the SDR version. So, if I'm understanding correctly: if I force my OPPO to turn off the UHD's HDR, but it still outputs the Rec.2020 version, then that would be his preferred way of viewing the film, correct?

I think I went to REC.709 because that's how he worded it for BR2049. Perhaps he was referencing the Blu-ray and not the UHD.

It's the same reason I also brought up Jordan Vogt-Roberts' thoughts on the matter, and how he didn't approve the HDR grade for 'Kong: Skull Island.'
 
Old 11-06-2017, 08:26 PM   #3214
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by TheSweetieMan View Post
If I want accuracy, I'll go by what the content creator says; not by what some industry trying to peddle a gimmick to sell more TVs.

Jordan Vogt-Roberts is another, who stated that 'Kong: Skull Island' is not meant for HDR viewing.

Neill Blomkamp recently laughed at the idea of Samsung's LCD panels on Twitter.

Rian Johnson recently stood by Steve Yedlin's backing of the LG B7 OLED on Twitter.

These are actual content creators--NOT BRANDS--that don't seem to either take issue with the shortcomings of brightness with LG's OLEDs, or just don't outright buy into the notion that HDR somehow automatically means you are getting closer to what the director intended.

Also, you get solely caught up in nits as it is. The idea of HDR isn't just based on peak brightness. There are other features that come along with it, hence why Deakins graded BR2049 to be as close to the SDR version as possible. Taking advantage of the wider gradients of color is one of them. Which has no significance when it comes to peak brightness, especially if he's using a more subdued approach.

Hell, even in the Westworld thread, when you look at the interviews with the creators of the show, they used HDR as a complimentary feature. They didn't use it to provide blinding, hot highlights--or to create something to be in stark contrast.

Also, I'll gladly take credit for being the first person to coin the term 'nit wars'. Because based on more info I gather and research through actual filmmakers, colorists, and cinematographers, they don't seem to be as gung-ho about peak brightness, and HDR altogether, as you think Samsung tells you they are.
You are just being irrational and emotional here. Now, all of a sudden, Samsung has a monopoly on high nit TVs? The highest nit TV on the market is a Sony TV, owned by your "favorite poster."

Actually, peak brightness is indeed one of the key aspects of HDR. What you're referring to is how HDR interacts with other features of UHD, such as wide color gamut, and how it allows for higher color volume. That's also an important aspect of UHD. Looking solely at HDR in a vacuum, it's silly to suggest that peak brightness doesn't matter.

You also seem to be suggesting that OLED and high peak brightness are intrinsically mutually exclusive. That need not be the case, and I imagine that OLEDs will become much brighter in years to come, as manufacturers realize the benefits of peak brightness for HDR.
 
Old 11-06-2017, 08:31 PM   #3215
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by HeatEquation View Post
You are just being irrational and emotional here. Now, all of a sudden, Samsung has a monopoly on high nit TVs? The highest nit TV on the market is a Sony TV, owned by your "favorite poster."

Actually, peak brightness is indeed one of the key aspects of HDR. What you're referring to is how HDR interacts with other features of UHD, such as wide color gamut, and how it allows for higher color volume. That's also an important aspect of UHD. Looking solely at HDR in a vacuum, it's silly to suggest that peak brightness doesn't matter.

You also seem to be suggesting that OLED and high peak brightness are intrinsically mutually exclusive. That need not be the case, and I imagine that OLEDs will become much brighter in years to come, as manufacturers realize the benefits of peak brightness for HDR.
So providing the input of actual content creators makes me emotional now? I'm not the one parading around this forum about the alleged encoding issues of Dolby Vision, or the benefits of HDR10+, without having seen either first hand.

Also, I never once implied that peak brightness doesn't matter when it comes to HDR. Learn reading comprehension while you're at it.

I said that peak brightness isn't the only thing that matters.

A couple months ago in the GITS thread, when people were crying about that film's more subdued HDR grade, I even provided an article where the colorist for 'Doctor Strange' said that his favorite thing about HDR wasn't so much peak brightness but the wider gradations of color--particularly black--where he can choose to create more contrast in the darker/lighter portions of an image, or elevate the blacks to his liking, revealing detail you otherwise may not get in the SDR grade.

I'm only providing the benefits of HDR here, since according to you, I didn't understand the format.

But hey, in all of my supposed obliviousness to the format, I will still gladly take the words of Deakins, Johnson, Yedlin, Vogt-Roberts, and Blomkamp, over whatever my or your understanding of the format is. ;-)
 
Old 11-06-2017, 08:32 PM   #3216
StingingVelvet StingingVelvet is offline
Blu-ray Grand Duke
 
StingingVelvet's Avatar
 
Jan 2014
Philadelphia, PA
851
2331
111
12
69
Default

Quote:
Originally Posted by JustinLH1125 View Post
Never really got an answer to my question, so I figured it wouldn't hurt to ask again. This may be a bit of a dumb, noobish question to ask (and I'm not sure if this is even the right place to ask) but, I have an Xbox One S and I use it to play my 4K Blu-rays and I've never had any issues until I tried to play Spider-Man (2002). It did the same thing with it's HDR that Netflix seems to do when I try to play something like The Defenders. It causes my screen to dim to near darkness on my Hisense 55 inch 55H8C.

None of my other HDR enhanced 4K BRs had this problem so I tried to research the answer and came up with next to nothing short than having people say to turn off HDR or try High Contrast Mode (which turns menus to garbage; especially in OneGuide). So I was wondering if there was any sort of work-around to help me with my issue so that I can really experience the difference in my movies and Netflix content. Is it really worth it for Netflix if my screen darkens and none of the colors really pop and I can't truly experience HDR on any movie? Sorry for such a noobish question.
I googled your TV, and Rtings says it has around 200 nits peak brightness, poor local dimming and no wide color gamut. With those specs you're not seeing HDR discs anywhere near how they're intended to be seen, so you're going to get a lot of dim, faded, washed-out images. You can try to compensate by turning up color or turning on dynamic contrast, but if you care about accuracy and want a consistent picture, it would be best to stick to normal BDs until you upgrade.
 
Old 11-06-2017, 08:37 PM   #3217
Geoff D Geoff D is offline
Blu-ray Emperor
 
Geoff D's Avatar
 
Feb 2009
Swanage, Engerland
1348
2525
6
33
Default

Quote:
Originally Posted by TheSweetieMan View Post
That's fine. And maybe I slightly misinterpreted what he said by the SDR version. So, if what I'm understanding is, if I force my OPPO to turn the UHD's HDR off, but it still displays the rec.2020 version, then that would be his preferred method of viewing the film then, correct?

I think why I went to the REC.709 is because that's how he worded it for BR2049. Perhaps he was referencing the blu-ray and not the UHD.

It's the same reason why I also brought up Jordan Vogt-Roberts' thoughts on the manner, and how he didn't approve the HDR grade for 'Kong: Skull Island.'
But turning off HDR means you're at the mercy of whatever tone mapping the player is doing, and (as the paper posted by Penton a few pages back shows) you can get unintended results from forcing one transfer function onto content that's mastered for another (and the OPPO's SDR conversion isn't the best). So while you'll get an SDR 2020 output, you may still end up with something that Deakins wouldn't piss on if it was on fire.

Deakins has said on his site that he doesn't grade in 709; he works in "Log C colour space with a proprietary LUT" (basically the raw camera image with an initial logarithmic transform applied and a fine-tuned look-up table on top), and the 709 trim only comes when he's specifically grading for home video--he even said it's a space "more suitable for television".

Skull Island is interesting, though: I thought the HDR was so damned close to the SDR that I wondered if my settings were wrong (they weren't), so I then wonder if matey had seen the HDR on a properly calibrated set. It certainly sounds like he wasn't involved with the HDR pass, given the way he dismissed it in that tweet.
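For anyone curious what "a proprietary LUT on top" mechanically means: a look-up table is just a sampled curve applied to each code value, interpolating between the stored samples. A generic 1D sketch (real grading LUTs are usually 3D, RGB-to-RGB, and the identity table below is only a placeholder, obviously not Deakins' actual LUT):

```python
import numpy as np

def apply_1d_lut(values, lut):
    """Apply a 1D LUT to normalized [0, 1] values with linear interpolation.
    `lut` holds output samples at evenly spaced input positions."""
    idx = np.clip(values, 0.0, 1.0) * (len(lut) - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, len(lut) - 1)
    frac = idx - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# Placeholder: a 17-entry identity LUT. A real grading LUT would encode the
# colorist's tone and colour adjustments instead of a straight line.
identity = np.linspace(0.0, 1.0, 17)
```

The whole "look" lives in the table's contents; the application step is the same cheap lookup whether it's a camera log transform, a film-print emulation, or a trim pass.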
 
Thanks given by:
mrtickleuk (11-07-2017)
Old 11-06-2017, 08:42 PM   #3218
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by Geoff D View Post
But turning off HDR means you're at the mercy of whatever tone mapping the player is doing, and (as the paper posted by Penton a few pages back proves) you can get unintended results from forcing another transfer function onto content that's mastered for a different one (and the OPPO's SDR conversion isn't the best). So while you'll get an SDR 2020 output you may still end up with something that Deakins wouldn't piss on if it was on fire.

Deakins has said on his site that he doesn't grade in 709, he works in "Log C colour space with a proprietary LUT" (basically that's the raw camera image with an initial logarithmic transform applied and a fine-tuned Look-Up-Table on top) and the 709 trim only comes when he's specifically grading for home video, he even said it's a space "more suitable for television".

Skull Island is interesting though, I thought the HDR was so damned close to the SDR that I wondered if my settings were wrong (they weren't) so I then wonder if matey had seen the HDR on a properly calibrated set? It certainly sounds like he wasn't involved with the HDR pass, the way that he dismissed it in that tweet.
Thanks for providing more insight into Deakins' process. Sometimes he's rather short with his responses on his forum--which is understandable--so it's hard to get the full intent and context of what he's saying. Either way, he obviously signed off on the HDR pass for 'Sicario'. But man, the way he was adamant about no one being able to tell him how bright or dark his content needs to look was awfully telling to me.

And it's not like Deakins is some old-timer who is against new technologies; he was one of the first adopters of digital, after all. But he did provide some perspective for me on HDR, and how even he finds it silly that these brands are attempting to push panels upwards of 10,000 nits.

IMO, once you need a bias light with your home theatre setup, that's when you know you don't need any more brightness.

I would also be curious to see what Vogt-Roberts viewed the HDR grade on. He later shared his excitement about Dolby Vision (don't know if you saw that or not), but still stated that 'Kong' wasn't meant to be viewed in HDR10 or Dolby Vision. That's why I'm content owning the regular Blu-ray for that film.
 
Old 11-06-2017, 08:47 PM   #3219
HeatEquation HeatEquation is offline
Banned
 
Jan 2017
Default

Quote:
Originally Posted by TheSweetieMan View Post
So, providing the input and voices of actual content creators makes me emotional now? I'm not the one parading on this forum about the alleged encoding issues of Dolby Vision, or the benefits of HDR10+, without even seeing either first hand.

Also, I never once implied that peak brightness didn't matter when it comes to HDR. Learn reading comprehension, while you are at it.

I said that peak brightness isn't the only thing that matters.

A couple months ago, in the GITS thread, when people were crying about that film taking a more subdued approach to the HDR grade, I even provided an article where the colorist for 'Doctor Strange', provided that his favorite thing about HDR wasn't so much peak brightness, but the wider gradations of color--particularly black--where he can choose to either create more contrast in darker/lighter portions of an image--or choose to elevate the blacks to his liking, revealing more detail that you otherwise may not get in the SDR-graded version.

I'm only providing the benefits of HDR here, since according to you, I didn't understand the format.

But hey, in all of my supposed obliviousness to the format, I will still gladly take the words of Deakins, Johnson, Yedlin, Vogt-Roberts, and Blomkamp, over whatever my or your understanding of the format is. ;-)
But their opinion doesn't contradict anything I said. It doesn't even apply. They are simply telling us their favorite part about HDR. I don't even disagree with them. Most of my favorite HDR content doesn't have highlights that burn your retinas. That's irrelevant though. Since these aren't the only content creators on the planet, if you want accuracy across all content, then your display will need to support the right level of peak brightness, the proper color coverage, etc. Dynamic metadata will get you close, but not quite there.
 
Old 11-06-2017, 08:51 PM   #3220
TheSweetieMan TheSweetieMan is offline
Banned
 
Nov 2009
515
515
Default

Quote:
Originally Posted by HeatEquation View Post
But their opinion doesn't contradict anything I said. It doesn't even apply. They are simply telling us their favorite part about HDR. I don't even disagree with them. Most of my favorite HDR content doesn't have highlights that burn your retinas. That's irrelevant though. Since these aren't the only content creators on the planet, if you want accuracy across all content, then your display will need to support the right level of peak brightness, the proper color coverage, etc. Dynamic metadata will get you close, but not quite there.
It does contradict what you're saying, though. They're either 1) not all that in favor of HDR as a format as is, or 2) not preoccupied with how bright or dark their images need to look.

The consumer side of the industry can say things need to be graded at whatever nits it chooses; if the content creators aren't signing off on it, then I'm going out of my way to view the film the way they intended.

Again, the fact that Steve Yedlin is using a lowly B7 as a reference monitor is good enough for me, regardless of what nit level HDR is graded to.
 