Quote:
Originally Posted by Geoff D
Looking at one of my 100-nit calibrations I can get on/off contrast of 4515:1 (12.1 f/stops) and ANSI contrast of 2559:1 (11.3 f/stops). Does that meet with your approval?
[edit] Just had another look. On my current 100-nit settings I'm getting 8164:1 (13 f/stops) on/off and 4248:1 ANSI (12.1 f/stops) with black at 0.013 nits and peak white at 106. With my UHD settings I'm getting blacks at 0.043 and peak white of 351 nits, which - as you rightly say Deci - results in much the same ratios of 8221:1 (13 f/stops) on/off and 4311:1 (12.1 f/stops) ANSI contrast. But - and I'm waiting for the inevitable correction - Sony appear to have over-engineered the dynamic range of the TV in the first place (it was marketed as having something called "X-Tended Dynamic Range") so it's still comfortable with reproducing this small semblance of the higher range when fed the converted SDR 2020 output from the Panasonic player.
Sure, one could argue that because my blacks are being lifted (although I watch with a small bias light anyway so I haven't noticed a huge difference in outright black levels) I'm not getting the benefit at the lower end, but then that's the nature of the beast with LCD tech. I've seen plenty of remarks from people who DO have the proper HDR EOTF that when the TV maxes out the backlight their blacks go to shit anyway, so I'm not losing any sleep over that aspect of it. Fact is, I've still seen some HUGE - and some not so huge - differences vs regular Blu-ray on the UHD discs that I've watched so far.
|
OK, first a question: why is your on/off ratio one f/stop more than the ANSI on that model? Is it because it has some dynamic dimming, because the 50% pattern produces some kind of "flare" (right now it escapes me which TV it is, tho I've probably read it in one of your posts), or because with the 50% pattern there's some light output limiting? I'm just curious, thinking about your real CR. (I've inserted the f/stop equivalents in your post in bold.)
In any case, assuming it's ~8000:1 (tho ~4000:1 is pretty respectable too), you have much more range than cinema (and SDR). As the chart provided by Penton shows (I've extended its curves somewhat, to infinity and beyond, I mean, a bit towards the Visionary future), you exceed them by ~2 f/stops, and that's also just 2 f/stops shy of the specs posted around for HDR mastering requirements (15 f/stops), so your range sits smack in the middle of those two.
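If anyone wants to check those bold f/stop figures, it's just log base 2 of the contrast ratio. A quick Python sketch using the numbers quoted above (the 15-stop entry is simply 2^15):
Code:
import math

def f_stops(contrast_ratio):
    """Dynamic range in f/stops: each stop is a doubling of light."""
    return math.log2(contrast_ratio)

# Ratios quoted in the posts above, plus the ~15-stop mastering figure
for label, ratio in [("100-nit ANSI", 2559),
                     ("100-nit on/off", 4515),
                     ("UHD ANSI", 4311),
                     ("UHD on/off", 8221),
                     ("HDR mastering spec", 2 ** 15)]:
    print(f"{label:>20}: {ratio}:1 = {f_stops(ratio):.1f} f/stops")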
So even tho your TV is technically not HDR, since it has extended contrast it effectively is, as it has higher dynamic range. Apart from the P3-in-2020 colours and the 10 bits, the UHD/HDR curve (the PQ) is just an alternate contrast curve that, reaping the benefits of the extra bits and the perceptual coding of the levels, redistributes a larger contrast range within the 10-bit signal. If you have the display contrast capability and the range of adjustment in the controls, you can basically get the HDR from that. What would be missing is the exact PQ curve shape, it not being a straight-line slope the way a gamma curve is. But in simplified terms, HDR vs SDR is the signal being coded to be viewed at a high gamma like ~4 with a twist, vs at a standard gamma like ~2 with its straight line.
Those with displays with a narrow dynamic range (i.e. 1000:1) will struggle with it a lot more than a display with 4000:1 or more: bright highlights with milkier blacks, or dark shadows with limited highlights, because it's "difficult" to fit the high dynamic range where there is much less.
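To put rough numbers on that "alternate contrast curve" idea, here's a minimal sketch comparing what the same signal fraction means through the standard ST 2084 (PQ) EOTF versus a plain power-law gamma scaled to the usual 100-nit SDR reference (the 2.4 gamma and 100-nit peak are just common reference assumptions, not anything measured off a particular set):
Code:
# ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """ST 2084 EOTF: normalised signal (0-1) -> absolute luminance in nits."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def gamma_to_nits(e, peak=100, gamma=2.4):
    """Simple power-law SDR decode, relative signal scaled to a 100-nit peak."""
    return peak * e ** gamma

# Compare what the same signal fraction means in each system
for e in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"signal {e:4.2f}: PQ -> {pq_to_nits(e):9.3f} nits, "
          f"gamma 2.4 -> {gamma_to_nits(e):7.2f} nits")
With the same half-signal input you get about 19 nits from the 2.4 gamma but about 92 nits absolute from PQ, and the top quarter of the PQ signal on its own spans roughly 1,000 to 10,000 nits. That's the perceptual redistribution of the levels I mean.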
Quote:
Originally Posted by Kris Deering
HDR10 is not mastered at 1000 nits. HDR10 supports up to 10,000 nits
|
Quote:
Originally Posted by Geoff D
The test patterns on the Sony discs go all the way up to 10,000
|
Quote:
Originally Posted by bruceames
So this statement is wrong?
Quote:
Originally Posted by avforums
HDR10 uses the ST 2084 EOTF and is mastered at 1,000 nits using 10-bit video depth and a colour space that can go up to Rec.2020.
|
|
The legal signal goes all the way to 10,000 nits (level 940, 100%). I think it would be more accurate to say that the HDR10 on the current discs is being mastered with the signal maximums reaching around the 1000-nit level (~levels 720-725, ~75% of the PQ signal).
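Those level numbers fall straight out of the inverse PQ curve plus the 10-bit legal (64-940) mapping. A quick sketch of that arithmetic (standard ST 2084 constants, nothing specific to any disc):
Code:
# ST 2084 (PQ) constants, same as the EOTF sketch above
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    """Inverse EOTF: absolute luminance in nits -> normalised PQ signal (0-1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_to_10bit_legal(e):
    """Map a normalised signal onto the 10-bit legal/video range (64-940)."""
    return round(64 + e * (940 - 64))

for nits in (100, 1000, 4000, 10000):
    e = nits_to_pq(nits)
    print(f"{nits:5d} nits -> PQ {e:.3f} ({e*100:.0f}%) -> 10-bit level {pq_to_10bit_legal(e)}")
So 1000 nits lands at roughly 75% of the signal, around level 723, while the full 10,000 nits sits at level 940.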
Quote:
Originally Posted by Crimsoncleaver
Bear in mind though that I have not calibrated with UHD. I can only do by eye so Geoff's in a better position. Are there any test patterns on the UHD discs though? Ideally need something running in the HDR to SDR mode to get colours correct.
|
Geoff has a colorimeter and a wide gamut display, no? He could try to adjust the colour on his TV so that the P3 RGBCMY windows fall at around their corresponding x,y points on the TV, thereby calibrating to the signal. But then of course the 2020 colours would clip. And Geoff don't like no clipping. It might explain why some people get bright orange instead of a bright saturated red: the deep orange is within the smaller gamut of the TV while the new bright saturated red might not be? I think in the Spider-Man clown hair HDR -> SDR pic there seems to be some bright red to orange clip/mapping too.
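To illustrate the clipping point, here's a rough sketch using the standard published xy chromaticities of the primaries; the point-in-triangle test is just my way of showing that a fully saturated 2020 red has no home inside the P3 (or 709) triangle:
Code:
# CIE xy chromaticities of the primaries (standard published values)
P3    = {"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)}
R2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)}
R709  = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}

def inside_gamut(point, gamut):
    """True if an xy point lies inside the triangle formed by a gamut's primaries."""
    x, y = point
    corners = [gamut["R"], gamut["G"], gamut["B"]]
    signs = []
    for i in range(3):
        (x1, y1), (x2, y2) = corners[i], corners[(i + 1) % 3]
        # Cross product sign says which side of edge (x1,y1)->(x2,y2) the point is on
        signs.append((x2 - x1) * (y - y1) - (y2 - y1) * (x - x1))
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(inside_gamut(R2020["R"], P3))  # False: the fully saturated 2020 red is outside P3
print(inside_gamut(R709["R"], P3))   # True: the regular 709 red fits inside P3
A display limited to the smaller triangle has to clip that red or map it inwards, and depending on the mapping it can land on a less saturated, more orange-looking red.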
Quote:
Originally Posted by Crimsoncleaver
Quote:
Originally Posted by Deciazulado
Checking quickly on a few shots
|
How did you measure this, sorry am new to luminance measurements etc. For example, how would you go from a lux measurement to the nits, if possible?
|
I just measured the digital values in the images and solved for them using the PQ formula. That gives you the nits.
And, tho this is not what I measured: 1 nit = 1 candela/square metre = 1 lumen/steradian/square metre, whereas 1 lux = 1 lumen/square metre (illuminance falling on a surface rather than luminance coming off it), so going from a lux reading to nits depends on the measurement geometry.
(1 lumen/steradian/square foot = 1 candela/square foot = 10.76 nits.)
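As a concrete example of that PQ step, reusing the ST 2084 EOTF from the earlier sketch (this assumes the value is read off an 8-bit full-range grab; a 10-bit or video-range grab would need a different normalisation):
Code:
# ST 2084 (PQ) EOTF constants, as in the sketches above
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Normalised PQ signal (0-1) -> absolute luminance in nits."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

# e.g. a pixel value of 192 read off an 8-bit full-range HDR grab
code = 192
print(f"{code}/255 -> {pq_to_nits(code / 255):.0f} nits")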