Old 07-19-2018, 10:15 PM   #1061
Pyoko (Blu-ray Guru)

Of course there would be losses, my point was just that the mapping of data to higher luminance output values on the display would be roughly the same, which is what HDR is to me, but I guess we're talking different definitions here. You could have a limited-range, gamma-encoded image in 10-bit, but I wouldn't call that HDR just because it's 10-bit.

Old 07-20-2018, 01:49 AM   #1062
5150z (Senior Member)

Quote:
Originally Posted by Geoff D View Post
The funny thing is, someone did actually complain upthread of seeing rampant compression artefacts in the skies in Ghost Protocol, and there were several folks who also got blocking/banding in the skies when watching Dunkirk, which just ties back in to what I said about having the proper EOTF to view this content in, i.e. even having a poorly executed HDR tone map (never mind an SDR conversion) can make these artefacts stand out a lot more.
That was me Geoff. You replied to say you didn't see it on the Z9. The blocking was on Dolby Vision, with the default DV settings on my 65B6 OLED. When forcing HDR10, there is no banding. Gone! I've yet to see any blocking on HDR10 discs on my OLED. BR2049 is perfect, as an example. Saw none on Dunkirk either.

Thanks given by: Geoff D (07-20-2018)

Old 07-20-2018, 05:26 AM   #1063
andreasy969 (Blu-ray Knight)

Quote:
Originally Posted by Pyoko View Post
Of course there would be losses, my point was just that the mapping of data to higher luminance output values on the display would be roughly the same, which is what HDR is to me, but I guess we're talking different definitions here. You could have a limited-range, gamma-encoded image in 10-bit, but I wouldn't call that HDR just because it's 10-bit.
Yes, I think we are. To me HDR is not (only) about higher luminance (and I agree that the output will be roughly the same, but it will also be less accurate, or outright wrong, in other places the higher you go); to me HDR is about "proper" higher dynamic range (incl. higher luminance) without losing/sacrificing anything.

EDIT: Regarding the 10 bits ... The way I understand the higher-nits SDR conversions, this is what they do: if you use 300 nits, the conversion basically maps 300 nits of luminance (instead of 100) to the SDR maximum of 100 nits. You then (from my understanding) boost this SDR image to 300 nits, so that the 300 nits luminance will indeed be displayed roughly correctly (everything else being a best effort as well, with 0 being correct at any rate - I always cringe when people talk about better blacks with HDR). But since you're mapping luminance levels of 0-300 nits into a "container" that is only supposed to display a range of 0-100 nits, lower values will have to be mapped to wrong (for example too low) luminance levels. Something that is only "almost black" in the HDR signal may (and imo will) therefore just become black, i.e. 0.

I do agree that 10 bit SDR will only give you more precision in the 0-100 nits range, but not higher luminance levels.

Last edited by andreasy969; 07-20-2018 at 03:17 PM.

Old 07-20-2018, 06:07 AM   #1064
Markgway (Blu-ray Prince)

Old 07-20-2018, 09:50 PM   #1065
Pyoko (Blu-ray Guru)

Quote:
Originally Posted by andreasy969 View Post
The way I understand the higher-nits SDR conversions, this is what they do: if you use 300 nits, the conversion basically maps 300 nits of luminance (instead of 100) to the SDR maximum of 100 nits. You then (from my understanding) boost this SDR image to 300 nits, so that the 300 nits luminance will indeed be displayed roughly correctly
Yep, that's correct - so while the image itself is gamma-encoded, the 300-nit setting used when initially decoding the ST 2084 EOTF means it should look like a 300-nit HDR display when viewed on an SDR display set to 300 nits.

In such a screenshot scenario the entire chain would be something like:

linear light in (light as it exists in the real world, what goes into the camera) -> camera/post stuff, eventually leading to the light being encoded in st2084 on disc -> st2084 decoded by player to linear, clipped and mapped so 300 nits becomes the max data value -> linear light encoded using gamma, saved as screenshot -> screenshot decoded using gamma by display -> linear light out (with max data value displaying as 300 nits)

It's worth remembering that the 8/10/whatever-bit files/streams are just sets of data values that could mean anything, and as long as the sender and receiver are in agreement you could map any range you wanted, using any transfer function, in any bit-depth container (limited only by precision, with some combinations making more sense than others, which is what ST 2084 is all about).
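
To make that chain concrete, here is a rough sketch of the decode/re-encode step in Python for a single luminance value (illustrative only: it assumes a plain 2.2 gamma for the screenshot, ignores color, and is not the actual madVR implementation):

Code:
# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    # decode a normalized (0..1) PQ signal value to absolute luminance in nits
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def sdr_screenshot_value(pq_code, target_nits=300, gamma=2.2):
    # clip/map so target_nits becomes the maximum, then re-encode with gamma
    nits = min(pq_to_nits(pq_code), target_nits)
    return round((nits / target_nits) ** (1 / gamma) * 255)

print(sdr_screenshot_value(0.508))  # ~100 nits on disc -> 8-bit code ~155
print(sdr_screenshot_value(0.75))   # roughly 980 nits on disc -> clipped to 255

Displayed on a monitor pushed to about 300 nits, code 255 then lands back at roughly 300 nits, which is the "boost" step described above.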

Old 07-20-2018, 10:00 PM   #1066
Geoff D (Blu-ray Emperor)

"as long as the sender and receiver are in agreement" is HDR's #1 problem right there, lol

Old 07-20-2018, 10:40 PM   #1067
andreasy969 (Blu-ray Knight)

Quote:
Originally Posted by Pyoko View Post
Yep that's correct, so while the image itself is gamma-encoded, the 300-nit setting when initially decoding the st2084 EOTF means it should look like a 300 nit HDR display when viewing on an SDR display set to 300 nits.

In such a screenshot scenario the entire chain would be something like:

linear light in (light as it exists in the real world, what goes into the camera) -> camera/post stuff, eventually leading to the light being encoded in st2084 on disc -> st2084 decoded by player to linear, clipped and mapped so 300 nits becomes the max data value -> linear light encoded using gamma, saved as screenshot -> screenshot decoded using gamma by display -> linear light out (with max data value displaying as 300 nits)

It's worth remembering that the 8/10/whatever bits files/streams are just sets of data values that could mean anything, and as long as the sender and receiver are in agreement you could map any range you wanted, using any transfer function, in any bitdepth container (being limited only by precision, with some combinations making more sense than others, which is what st2084 is all about.)
Thx. And maybe I'm really not giving those higher nits SDR conversions enough credit. I also only use it for the screenshots since my projector can't do the higher nits SDR stuff anyway ...

But I can tell that it does look very good on the highest-nits display that I own, which happens to be my phone (Note 4): its AMOLED display goes up to 750 nits in its "super bright" mode, and the corresponding SDR caps really do look very good on it (even the HDR caps look decent) - but unfortunately I don't watch my movies on my phone.

Quote:
Originally Posted by Geoff D View Post
"as long as the sender and receiver are in agreement" is HDR's #1 problem right there, lol
But yes, I still see this problem as well.

Old 07-20-2018, 11:25 PM   #1068
Pyoko (Blu-ray Guru)

I've also tried it with my OLED, playing the HDR stream converted to SDR at 300 nits in madVR while boosting the SDR display settings to max, and then compared to the HDR. It's roughly close and you definitely get a proper HDR effect. Of course the higher the nit setting the less efficient gamma is going to be for encoding the image (highlights stealing too many bits), and you will probably want an OLED or a display with good local dimming, because on a regular old monitor pushing 300 nits, the black level will be so poor it's likely going to ruin the HDR contrast from the other end.

Old 07-20-2018, 11:46 PM   #1069
Geoff D (Blu-ray Emperor)

I was basically doing this two years ago when watching converted SDR UHDs on my non-HDR Sony 4K, which could hit about 450 nits peak brightness with everything cranked up; this was using my Panny UB900 to output SDR 2020. Black levels weren't a problem (dat dimming) but I did see a shit-ton of artefacts and banding on certain titles because, as you rightly say, it simply runs out of bit depth in converted gamma, and so the artefacts buried in those highlights were brutally exposed, along with heavy banding in stuff like the Peanuts Movie.

If the SDR conversion were first overdriving the HDR signal to at least 11 bits then gamma artefacts would be less of a problem: you'd be giving the highlights in the 10-bit HDR signal a roughly similar amount of bit depth in gamma, so when it gets mapped across it's a better fit (this is also why the processing is so shit on certain HDR TVs - they still operate in gamma, and when the HDR is converted across it's not given enough headroom in the bit depth to map across properly).

Old 07-20-2018, 11:47 PM   #1070
andreasy969 (Blu-ray Knight)

Quote:
Originally Posted by Pyoko View Post
I've also tried it with my OLED, playing the HDR stream converted to SDR at 300 nits in madVR while boosting the SDR display settings to max, and then compared to the HDR. It's roughly close and you definitely get a proper HDR effect.
I tend to re-visit my own 1000 nits caps on my phone, and yes, those conversions look great and indeed like HDR with corresponding brightness. But I always thought they would still "crush" things. It's not that I perceive it that way, it's just what I think "should" happen. When doing the screenshots, the main purpose is to show the increased highlight detail though. That it actually also looks great on a suitable display is something I noticed only later. (And my projector is annoying me even more since then.)

Old 07-21-2018, 12:07 PM   #1071
Deciazulado (Site Manager)
screenshot celebrity death match

Quote:
Originally Posted by andreasy969 View Post
Yes, 300 nits HDR=>SDR caps will only be displayed "properly" if one pushes the brightness of the monitor to 300 nits
Well, the spec of the two-year-old computer monitor I have is 300 nits (when new it reached 400); on the other hand, the one that's more than half a dozen years old is 100. The newest iPhone specs I saw were 500.

So I wouldn't call it pushing.

(btw 500 is +0.66 f/stop brighter than 320, and 100 is -1.66 f/stops.) I'm waiting for 1000 nits ones*.
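
(An f/stop is a doubling of luminance, so those figures are just log2 ratios; a quick sanity check, for illustration only:)

Code:
import math

def stops(a, b):
    # difference between two luminances, expressed in f/stops
    return math.log2(a / b)

print(round(stops(500, 320), 2))   # 0.64  (the "+0.66" quoted, give or take rounding)
print(round(stops(100, 320), 2))   # -1.68 (the "-1.66")
print(round(stops(320, 100), 2))   # 1.68  (the "+1.67 f/stop" of extra highlight range)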


Quote:
Originally Posted by andreasy969 View Post
Yes, 300 nits HDR=>SDR caps will only be displayed "properly" if one pushes the brightness of the monitor to 300 nits (and will usually also look much better than a 100 nits conversion on calibrated 100 nits), BUT, and that's important, you still have an SDR image then.
No, it has +1.67 f/stops higher dynamic range than the SDR image. (For now*)

Quote:
Originally Posted by andreasy969 View Post
The colors and brightness are therefore still wrong
No, the colors are correct here. The colors within the gamut of your monitor are displayed correctly. If you have a P3 computer monitor (like the previously mentioned iMac; even iPhones and tablets are getting on the P3 bandwagon), the colors within the P3 triangle will be correct, so you will see all the P3 colors. If you have an Adobe 1998 monitor (a wide-gamut monitor with less saturated red than P3, but a green between P3 and 2020 - the green the Sony OLED Pro has), the colors within the A1998 triangle will be correct, and if the disc has any A1998 greens you'll see those. If you have a rec709/sRGB monitor, the colors within that triangle will be correct; anything outside it will be clipped to the triangle boundary.

As for brightness: if you display them at 320 nits (and probably within an f/stop of that), the brightness and contrast of the image you see will be correct (unless the gamma of your monitor is out of whack. Indeed, if you had two same-model monitors at the same peak brightness for 100% RGB, one calibrated correctly for 2.2 and another correctly for 2.4, these 320s should look the same on both).

Quote:
Originally Posted by andreasy969 View Post
most importantly, the dynamic range is also still limited to the dynamic range of SDR. By giving more room to the highlights (which is what higher nits HDR=>SDR conversions basically do), you will lose sth. elsewhere (resp. get crushed blacks for ex). (If you have a HDR signal that peaks at only 100 nits (or even lower), 100 nits is therefore also the way to go with the SDR conversion.)
Well, the screenshot is PC-levels 8-bit instead of video-levels 10-bit, that's true, but that's the only difference. The screenshot (signal) still follows the HDR/PQ curve, so it is PC 8-bit HDR with 256 levels, instead of video 10-bit HDR with 877 levels (~30%).

Now the question actually is whether changing that level of quantization (10-bit video-levels HDR to 8-bit PC-levels HDR) makes the image and tones more visibly crude compared to the original HDR quantization, which is one thing, but that doesn't change an HDR image into an SDR one per se. The image still has the rec2020 colors and the HDR tone encoding on a properly done screenshot, so it is HDR, just 8-bit HDR instead of 10-bit or 12-bit HDR. Bits don't make HDR, as there is 10-bit SDR, 12-bit SDR, even 16-bit SDR. In Photoshop you can change your settings to 16 bits; a 16-bit setting doesn't make an image HDR. What makes it HDR is the tone curve and the display of higher range. Highlight display (on the 320) is +1.67 f/stops more range.

The only issue with going from 10-bit HDR to 8-bit HDR screenshots is that the tone discrimination (the number of levels) is reduced. As I said, this could make the images visibly cruder, but that of itself doesn't directly affect the range from highlights to shadows.

Let's address this anyway. On HDR10, for 50-100 nits (video levels 449-509), one f/stop, we actually have 60 video levels, so we have gradations of 1/60 of an f/stop. On 8-bit PC-levels HDR, from 50-100 nits (about PC levels 112-129) there are about 18 levels, so we have gradations of 1/18 of an f/stop.
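
A small sketch that reproduces those counts from the ST 2084 inverse EOTF (illustrative only; it assumes the 10-bit video range is codes 64-940 and the 8-bit PC range is 0-255):

Code:
# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    # inverse EOTF: absolute luminance in nits -> normalized (0..1) PQ signal
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (50, 100):
    code = nits_to_pq(nits)
    print(nits, round(64 + code * 876), round(code * 255))
# prints roughly: 50  -> video level 450, PC level 112
#                 100 -> video level 509, PC level 130
# i.e. ~60 ten-bit video steps vs ~18 eight-bit steps across that one stop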

1/18 of an f/stop is the difference between 100 nits and 104 nits, if you want to check that. Your camera normally changes exposure in thirds of an f/stop (100 nits <-> 126 nits). On a photographic print, diffuse whites were usually printed about 1/6 of an f/stop (0.16 f/stop) darker than specular highlights so they could be told apart, as that was considered the minimum variation needed to distinguish specular highlights from non-specular whites (1/6 is the difference between 100 nits and 113 nits). (So you see, prints have extremely low SDR highlights.)

1/18 is three times finer gradation than 1/6. I'm not saying it won't be noticeable, but presently that's the limit we have to abide by for screenshots and jpgs. (I'm waiting for 10-bit jpgs*.)
Some posterization and banding might show up. But that by itself doesn't change HDR to SDR. What it would change is smoothness of gradation.

Now what about crushed blacks? That is always a concern. I pixel-peeped.
This is a screenshot of the lowest blacks, brightened to see down to what level a value was still distinguishable from the bottom end in a familiar pattern.

[attached image: screenshot 2018-07-21 at 3.52.20 AM, brightened to show the shadow detail of the HDR jpg]

As you can see, on the 8-bit HDR shot 0.005 nits is distinct from black. Not 0.001 though. But science has its limits. Be aware this is a brightened screenshot of a screenshot of the actual display of the 4K screenshot by the browser. So it's not the jpg being brightened and screenshotted; it's what you see on the screen, screenshotted by the PC, and then that digital screenshot of the computer display was brightened and then screenshotted again by the PC. (This is done to check whether the actual values of the 320 screenshot, as seen in the web browser, are getting through - so it's the digital values/brightness that your computer is sending to your screen, and what you would see if you had Superman's telescopic brightener vision (Kryptonian Carrot Vitamin A) and the display/system were capable of displaying it.)

If you have a 320-nit display, dividing 320/0.005 gives a 64,000:1 range above the black shadow detail. I think that's ok.
Now, if you have strict gamma 2.2 but the monitor has a 1000:1 contrast ratio, probably everything below 0.5 nit is black. (If you have 4000:1, everything below 0.1, etc.) Most people have a curve (like sRGB 2.2, which is not really gamma 2.2) elevating blacks, or they actually increase the brightness (black level) somewhere to get around this low-level limit and see shadow tones. But in the strict sense of the word that would not really show the exact contrast the image has.

Quote:
Originally Posted by andreasy969 View Post
I just want to make it crystal clear that even a 4,000 nits HDR=>SDR converted cap is still SDR with the according limited dynamic range and is therefore still far from being close to the original. (but still very useful of course!)
Yes to that for some, but I maintain the proper ones are not, and are indeed very close (save for the limits of highlights explained elsewhere) - the exact opposite of "still far from being close to the original", as I lay out here. Mainly somewhat cruder gradations, if perceptible at all.

(I asked Geoff what he was seeing in the hopes it was something I see. The ones here are not perfect by any means but no one has yet pointed out 2 things on them that I've caught. The poor engineer is always having to retest things to check why. )

In any case tho there's always been the disclaimer below
Quote:
A still-image screenshot of a moving image can show more - or less - artifacts due to the non-moving nature of the screenshot.
The screenshots should not be used to discuss whether a Blu-ray title features good or bad video quality, screenshots are included for descriptive
and entertainment purposes and to give the viewer a general idea of what a title will look like.
Screenshots are not representative of the true quality that Blu-ray offers.
, which is true (hint: I found 4Ks are much, much sharper than screenshots when they are moving around, because of eye integration, aggregation and circles of confusion, motion blur and all that jazz), so apart from that, I think these are very close within the limitations.

Quote:
Originally Posted by andreasy969 View Post
I guess you know that, but I'm afraid others might misread your comment into "300 nits SDR converted HDR cap equals HDR with a peak brightness of 300 nits", which isn't the case. (The way I see it, the higher than 100 nits caps are actually just kind of a "gamma trick".)
But that is what I'm saying. It is the case.
I'm not talking about others I've seen, whose aim seems to be to make them more or less 100-nit/709 SDR like you say.

These have 4K, HDR contrast, +1.67 f/stops of increased highlights (for now*) and rec2020, in PC 8 bits (for now*).

Thanks given by: andreasy969 (07-21-2018)

Old 07-21-2018, 12:57 PM   #1072
andreasy969 (Blu-ray Knight)

@Deciazulado

Thx for your detailed response. We are basically saying the same thing though. I'm just making a big(ger) deal out of having only 8 bits instead of 10 for luminance values, and may be more nitpicky regarding calling the colors "accurate". (I don't watch movies that way, so I cannot say how much of an issue it really is.)

Quote:
Originally Posted by Deciazulado View Post
So I wouldn't call it pushing.
I have my PC monitor calibrated (not professionally, mind you) for 100 nits content (i.e. for BDs), which results in a brightness setting of only 32/100, so I for one have to "push" its brightness even for my 200 nits caps - that's why I used the term "push". (With its brightness "pushed" to 100 percent, my LED monitor should reach around 350 nits according to tests, but I barely use that, so it's kind of wasted potential.)

EDIT: One more thing ... Another problem with calling, for example, 320 nits SDR caps "accurate" is that many people will be unaware that they will (most likely) have to "push" the brightness of their monitor (I know that and am still nitpicky regarding "accurate"; others don't even know the former and will just see a "dark HDR image"). The fact that I was/still am getting the "Why is the image so dark?" question even with my 200 nits caps only confirms this.

PS: Zero Dark Thirty coming today.

Last edited by andreasy969; 07-21-2018 at 02:07 PM. Reason: see above and typo

Old 07-21-2018, 02:18 PM   #1073
Deciazulado (Site Manager)

Quote:
Originally Posted by andreasy969 View Post
@Deciazulado

Thx for your detailed response. We are basically saying the same thing though. I'm just making a big(ger) deal out of having only 8 bits instead of 10 for luminance values, and may be more nitpicky regarding calling the colors "accurate". (I don't watch movies that way, so I cannot say how much of an issue it really is.)
Yes, I was about to mention that somewhere in the post, but it got so long-winded: I can see that, for you, for an image to be considered HDR it has to have 10 bits, which is one of the components of this whole improved 4K UHD package.

4K x PQ x high specular highlight nits x rec2020 x 10 bits = 4K UHD HDR

We aim to reach as much of that as technically feasible.


Quote:
Originally Posted by andreasy969 View Post
I have my PC monitor calibrated (not professionally, mind you) for 100 nits content (i.e. for BDs), which results in a brightness setting of only 32/100, so I for one have to "push" its brightness even for my 200 nits caps - that's why I used the term "push". (With its brightness "pushed" to 100 percent, my LED monitor should reach around 350 nits according to tests, but I barely use that, so it's kind of wasted potential.)

PS: Zero Dark Thirty coming today.
Well if it looks grey when it's overcast, and you set the black level brightness with the PLUGE pattern it should be ok and close. A light meter and grey tones with aim values would get you closer.

As you haven't said that the site's review 4K images look grey, flat and muddy (as in most colors being unsaturated tans or brownish), you must be color managed, and if you input the correct color primaries of your monitor (these days most monitors come with a color profile), be they near sRGB or Adobe or P3, etc., that should be working too. You could "push" to 100 percent to see the 320 "8-bit HDRs" and then reset back to 32% for BD screenshots. (That's one reason I keep a 300-nit P3 monitor and a 100-nit 709 one. Less hassle.) Or you can, mmm, try lowering the levels of BD SDR screenshots to about 60% (255/255/255 becomes something like 153/153/153, for gamma 2.2) to look at them all at 300 nits (but then SDR shots become about 7.3 bits, so I don't think you'd like that either).
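
(The "~60% / 153 / 7.3 bits" figures follow from simple gamma arithmetic; a quick check, assuming an idealized gamma-2.2 display with a 300-nit peak:)

Code:
import math

# code value that lands at 100 nits on a gamma-2.2 display whose peak is 300 nits
scale = (100 / 300) ** (1 / 2.2)
print(round(scale * 255))                # ~155, i.e. white ends up around 153-155
print(round(math.log2(scale * 255), 1))  # ~7.3 "effective" bits left for the SDR shot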

When BDs came out, monitor and web browser technology were up to date and capable, but with UHD and all the improved components making it up, it has been more a case of growth in spurts. As I said, we try to reach as much of that as technically feasible. And when the majority of computer monitors get brighter, and jpgs or bandwidth make 10-bit possible, that will be updated. I just checked the specs of the last few iPhones as an example:

4: 390 nits, 709
5: 560 nits, 709
6: 550 nits, 709
7: 600 nits, P3
8: 530 nits, P3
X: 570 nits, P3
(with system-wide colour management incorporated into iOS since version 9.3)
High-end UHDTVs are P3 and about 700-1400 nits (which is 1000 nits +/- 0.5 f/stop), but desktop computer monitors are not there yet. 500 is too small a change to justify going from 320 to 500 (+0.66 f/stop brighter). So when the overall average gets to ~1000, which is the same jump from 320 as 320 is from 100, there will be another jump in the 8-bit HDR pics. I think a bigger obstacle is getting 10-bit jpgs, and getting them to work in all browsers. (pngs might work but take 8x or more the bandwidth!)

Anyway, the function of shots in reviews and such is to give you a good idea of what you're getting instead of flying blind.

Old 07-21-2018, 02:41 PM   #1074
Deciazulado (Site Manager)

Quote:
Originally Posted by andreasy969 View Post
EDIT: One more thing ... Another problem with calling for ex 320 nits SDR caps "accurate" is that many people will be ignorant about the fact that they will (most lilkely) have to "push" the brightness of their monitor (I know that and am still nitpicky reg. "accurate", others don't even know the former though and will just see a "dark HDR image"). The fact that I was/still am getting the "Why is the image so dark?" even with my 200 nits caps only confirms this.
Well that's why it says


Most people who complain must have their monitors optimized for 100 nits (or they compare the 4K screenshot directly to the 100-nit BD one, and the BD one will look to them twice as bright as a "200" one, or 3 times as bright as a "300" one, etc. - 32 times as bright compared to a 10000 one I posted in the MI5 thread!)

This even happens to people with UHDTVs: they put the BDs, with 100-nit 100% whites, at the maximum nits of the TV, let's say 1000 nits, then switch to the UHD, which will have a darker image, with the 100-nit 51% whites at 100 nits, and go: "UHD is a scam, it looks 10 times darker than the BD!"

(Which is of course a lack of consumer education or orientation about all these extra highlights and rendering intent, so they don't know that on their 1000-nit room warmer they had to adjust the settings' contrast slider down for the BD input until it's about 100 nits. It's like when we got BDs on our 1080 HDTVs and people had the sharpness maxed out because they were watching 480 DVDs and NTSC broadcasts: "But but the Blu-ray looks horrible, look at all the grain and noise! The DVD looks best".)

Last edited by Deciazulado; 07-21-2018 at 03:04 PM.

Old 07-21-2018, 02:50 PM   #1075
andreasy969 (Blu-ray Knight)

Quote:
Originally Posted by Deciazulado View Post
Well that's why it says
Well, my disclaimer actually even contains the words "too dim" (in bold) by now. But (some) people still won't get it.

EDIT: I also appreciate your screenshots. Forgot to mention that as well.

Last edited by andreasy969; 07-21-2018 at 02:56 PM.

Thanks given by: Deciazulado (11-05-2018)

Old 07-21-2018, 04:53 PM   #1076
Deciazulado (Site Manager)

Geoff et al.

Yes, that's posterization/clipping/extreme compression/out-of-bounds tones etc. Mostly it's in bright skies, where the sky values (which are composed of RGB triplets) and cloud values reach the ceiling (of the screenshot parameters).

Just as the above-X-nits tones (let's visualize them first as grey tones) are extremely compressed at the "ceiling" so that the tones below them can be seen correctly, the colors of the triplets and the individual triplet components bunch up at the ceiling, and they may do so at different points and rates.

So the sky, which is bright clouds and blues and cyans (and gave trouble on BDs too, back in pre-UHD history) - well, cyan (the addition of green + blue) happens to be the second-brightest color (yellow is first) and also the one whose full saturation is most troublesome to reproduce in RGB color spaces (rec709's cyan is kind of grayish). (Dunkirk's sky is yellow-greenish, so if the high-brightness R or G components hit the ceiling it would veer too, while the blue part sits at a lower value.) Etc. etc.

So you go from 10-bit levels to 8-bit levels, which posterizes (decimates) the tone/color gradations 3-4x, and, by the way, the way the eye works it is more discriminating of tones (small density variations) in brighter values. (That's one reason people are always complaining about sky reproduction: even in film, be it b/w or color, people tend to find sky areas "too grainy" - they can perceive the grain there more.)

Add to that 8-bit tone decimation the expansion to make the image look correct in brightness again, and then the highest tone values hitting the ceiling. Let's say a part of the sky area has one of its RGB values go much higher than the others, R 50 G 125 B 95; it may end up at the ceiling as R 50 G 100 B 90, so it veers off-color as it falls on the highlight compression curve, while an adjacent part of the area has different, lower color values, so it isn't squeezed as much, or its RGB components don't get the severe modification the other part does, and it doesn't veer as much, or not at all. (Don't know if you are following; it's difficult to describe color with words.) So you get splotchiness. Banding.
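
A toy illustration of that veering, using the hypothetical triplet from the paragraph above rather than values from any actual disc: squeezing one channel near the ceiling shifts the hue, not just the brightness.

Code:
import colorsys

before = (50, 125, 95)   # the example sky value above
after = (50, 100, 90)    # same pixel after its green channel hits the ceiling hardest

for r, g, b in (before, after):
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    print(round(h * 360), round(s, 2), round(v, 2))
# roughly 156 deg (green-cyan) before vs 168 deg (closer to cyan) after:
# the tone doesn't just get compressed, it changes colour - hence the splotches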

So, in an example, one sky tone might veer to blue or green or grayer, greens to cyan or yellow, etc.; an area ending up least saturated might be seen as the complementary color next to the saturated color, etc. The vertical alternating lines in the bright sky in Dunkirk are interesting - they might be an artifact of the signal interacting with the tech, or they might just be there as part of the compression (some kind of compression dithering?) but imperceptible if seen at 10 bits, without tone compression/decimation, with the full high-nit highlights displayed and breathing. The first thing is to look at the raw HDRs, but as they are not 10-bit one can't be 100% certain, if you see it there (however faint it might be), that it's not from the 10->8 bit reduction alone rather than in the actual signal.

I mentioned I've caught a couple of things when checking test patterns. Sometimes there are small variations (1 level) in some greyscale patterns, but as there are not many pattern variations on discs you can't double-check in the colors (whether they affect the colors, affecting one level of the triplet). I made the engineer get and check another unit, but the same small level variations were found, so to further confirm one would have to get different brands of the stuff.

You might think 1 level out of place out of 256 x 3 might not do much, but as I said the HDR expansion is actually 100-fold, times the 3x color expansion, so any variation, however minute, will be exacerbated in any HDR system/viewing. In some levels the HDR curve basically expands contrast ("gamma") a lot, like viewing things at gamma 4 instead of 2, and rec2020 expands the colors like saturating things double or more. (There are some shots where, when you compare the expanded signal to the grayish HDR, it looks like it goes from full beautiful blues/cyan to a black-and-white image that you'd think has no blue in it. I'll look for it.) Tolerances have to be tightened and reduced. That's why I wish we had 10-bit jpgs and a 10-bit chain and all that jazz. And andreasy969 demands it!
12-bit internal monitors might also be good. (And DV is 12-bit.)

And there's also the precision of the computer system you're viewing this on. I see them on a 10-bit OS with a 10-bit browser and image editor and a 10-bit monitor. But on another system I see increased banding in the same image. So to me, somewhere in that one app the code is choking on the color management/precision: purple and yellow splotches. An example I can even see on my system and point out: look at the Rock's right cheek in BitRate's cropped UHD screenshot where I applied the formula. Look closely (I was wondering if someone would notice): it has splotches in the skin tones. But the Br.com ones don't have these obvious splotches, or they are very minimized (on my system). Now I'm not saying BitRate's image is bad or wrong; I was pleasantly surprised it looked so similar to the others, so it was correctly recorded. But why do I see these splotches there then, if it's the same 8 bits? It makes me think that somewhere along the chain from BitRate's digitizing to here, the precision of that particular jpg decreased by a certain amount or level. Maybe it was the jpg encoder or the host site. And if I see some banding/splotches in images on my 10/10/10 chain, what about people who have 8-bit systems, 8-bit browsers or image editors, 8-bit monitors, or any combination of the three? Maybe they see horrific oil-diffracted skies that I don't see on my system, instead of the amount of "posterization" I do see? That's why levels of precision are important, and 10 bits are better than 8, and 12 > 10, etc.: better precision, less divergence, tighter tolerance, even when things veer off tolerances.

Last edited by Deciazulado; 07-21-2018 at 05:29 PM. Reason: Ahh the days one inserted a VHS and hit play...

Old 07-21-2018, 04:58 PM   #1077
andreasy969 (Blu-ray Knight)

Zero Dark Thirty

When I first looked into the disc back then, I thought that it looked great already (even in 1080p/SDR) and didn't remember the BD looking THAT good (and also said so). But I never compared. Doing the caps confirmed what I thought: Another 2K DI UHD that is (relatively) "destroying" the BD throughout imho (source noise intact, real HDR, colors).

I also have some comments below.

(screenshotcomparison.com just gave me an error when I uploaded the URLs and I won't be doing this twice - sorry)

BD (upscaled) left, UHD-BD (madVR/SDR/200 nits) right

Disclaimer as to why the UHD-BD images may appear to be too dim:
Please note that the UHD-BD shots have been converted from HDR to SDR using special techniques, which drastically compresses the dynamic range of the original image (the color bit depth has been compressed as well). The UHD-BD shots are therefore not an accurate representation of the original HDR image - dynamic range, colors (tone and intensity) and contrast should be taken with a big pinch of salt and the main focus should be on comparing details. Typically, the image will appear too dark (which is by design when the caps are done at 200 nits; on their own they should be viewed with monitor brightness set to 200 nits), may lack a certain "pop" and may at times also appear "boosted" when compared to the BD shots. The SDR conversion should still give you a good idea of the actual image of the UHD-BD though, and one should also be able to at least catch a glimpse of the increased dynamic range. The BD shots have been upscaled for comparison purposes, but other than that should be accurate. You might also want to check out this post of mine (incl. the further link there) where I tried to show/explain this:
https://forum.blu-ray.com/showpost.p...&postcount=589

[45 BD vs UHD-BD screenshot comparisons were attached here; the images themselves are not reproduced. Caps 5, 6, 10, 13, 14, 15, 20, 29, 32, 35, 37, 39 and 40 include a third 1000 nits version, and cap 45 a third 700 nits version. Notes on individual caps:]

13. different framing - my best guess is they corrected it on the UHD
15. just look at the wall, especially with the 1000 nits cap
29. included because of the red digital display (did the same with 2 other caps)
33. the UHD looks somewhat filtered here, but the BD is even worse, so I guess it's source related
39. just green on the UHD resp. yellow "gone"
45. I found the title to be sharper (incl. aliasing) on the BD and I also think the BD should better be compared with the 700 nits version

Thanks given by: aetherhole (07-23-2018), chip75 (07-29-2018), HD Goofnut (07-25-2018), juanbauty@yahoo.es (07-23-2018), NDcowboy (07-21-2018), Pieter V (07-21-2018)

Old 07-21-2018, 06:19 PM   #1078
Deciazulado (Site Manager)
maybe an exercise in futility but

Ok here's what I see on my screen. This is a capture off my screen of the jpgs


I see some posterization/banding in the clouds, but minimal (at 2PH, equivalent to 130" at 10 feet), and up close (4PH, equivalent to 260" at 10 feet) it looks more like noise/grain (I'd call it posterization noise, but it's splotchy and irregular, unlike grain which is small and regular). Knowing this is from an 8-bit jpg conversion of a 30MB HDR screenshot that's being transformed to look normal on a normal monitor, I don't think it's perfectly smooth, but considering the subject matter (blurry, feathery clouds with smooth gradations of tones) it's not very bothersome. But maybe this is all in vain, and you might be seeing posterization bands and rainbows galore in the clouds above the planes in the png screenshot I took of the jpg screenshot.



On this one I see more splotchiness and posterization in the kind of smooth, overcast cloudy sky - kind of what I would see in certain CGI animated Blu-rays' smooth gradations because of the compression of tones. But again it's bright skies getting a "grey ramp" from bright to dark. What's more troublesome is that if I blow up the image to 4PH (again equivalent to 260" at 10 feet; 2x on my computer screen, to 4320 x 7680) I see these vertical lines, like a faint vertical dither pattern, while at 2PH (equivalent to 130" at 10 feet, 1x on my computer screen) they are fainter but sharper. Without knowing what I'm looking at, they look like dirty digital noise. Oh, and I see the pink splotches. Well, you can't make an HDRette without breaking some levels.

But Again I wonder if the mild artifacting I see is what you see.

I'm already getting blind by looking at bright skies for a long time

ok here's another one done from the same HDR (the grey one) on the site, but with the 10K-nit transform applied straight, with no highlight curves, which makes it look almost black, then brightened up in Photoshop - much less irregular color splotchiness, and the vertical line pattern is almost gone, but then it's brighter and more contrasty than the others.

guess I can't wait till we move to 1000-nit ones with a less compressed highlight curve.
10 bits and lots of 1000-nit computer monitors

Last edited by Deciazulado; 07-22-2018 at 04:33 AM.

Old 07-22-2018, 08:54 AM   #1079
andreasy969 (Blu-ray Knight)

Quote:
Originally Posted by Deciazulado View Post
That's why I wish we had 10-bit jpgs and a 10-bit chain and all that jazz. And andreasy969 demands it!
I don't "demand" it. I'm just pointing out that 8 bit instead of 10 is a problem (and therefore not accurate at all). See, with HDR10 the range of 0-519 (out of the 10 bit) is used for the SDR part resp. for luminance levels up to 100 nits alone. So with only 8 bit, even 100 nits HDR10=>SDR converted caps are not accurate since your losing approx. half the precision of HDR10 already. So it's obvious what happens, if you go even higher with the nits. With 400 nits, HDR10 uses the range of 0-668, with 1,000 nits 0-769, with 2,000 nits 0-847, with 5000 nits 0-948, with 10,000 nits the full range of 0-1023. So the higher you go with the nits and put this range into 8 bit (resp. values 0-255), the more you will lose. And I think banding, clipped and crushed stuff will be the result. The result won't be and cannot be accurate at any rate.

HDR10 doesn't use 10 bits for no reason. And Dolby Vision even uses 12, because even 10 bits is not perfect, which becomes more than obvious if you consider that HDR10 basically uses half of the available range (0-519, i.e. 520 values) for just the SDR range of 0-100 nits already (and 520-1023, i.e. only 504 values, for the rest from 100+ up to 10,000 nits). I think the reasoning for this is that precision isn't as important in the higher-nits regions (?), but I still think that this is the one (and only) advantage DV actually has over HDR10 - not the dynamic metadata, which I consider rather redundant - just give me proper image processing instead, please.

Thanks given by: Geoff D (07-24-2018)

Old 07-22-2018, 03:03 PM   #1080
andreasy969 (Blu-ray Knight)

A Quiet Place

Watched this yesterday and thought it looked wonderfully filmic.

http://screenshotcomparison.com/comparison/117287

BD (upscaled) left, UHD-BD (madVR/SDR/200 nits) right

Disclaimer as to why the UHD-BD images may appear to be too dim:
Please note that the UHD-BD shots have been converted from HDR to SDR using special techniques, which drastically compresses the dynamic range of the original image (the color bit depth has been compressed as well). The UHD-BD shots are therefore not an accurate representation of the original HDR image - dynamic range, colors (tone and intensity) and contrast should be taken with a big pinch of salt and the main focus should be on comparing details. Typically, the image will appear too dark (which is by design when the caps are done at 200 nits; on their own they should be viewed with monitor brightness set to 200 nits), may lack a certain "pop" and may at times also appear "boosted" when compared to the BD shots. The SDR conversion should still give you a good idea of the actual image of the UHD-BD though, and one should also be able to at least catch a glimpse of the increased dynamic range. The BD shots have been upscaled for comparison purposes, but other than that should be accurate. You might also want to check out this post of mine (incl. the further link there) where I tried to show/explain this:
https://forum.blu-ray.com/showpost.p...&postcount=589

[25 BD vs UHD-BD screenshot comparisons were attached here; the images themselves are not reproduced. Caps 3, 4, 11, 16 and 23 include a third 1000 nits version.]

Thanks given by: aetherhole (07-23-2018), chip75 (07-29-2018), HD Goofnut (07-25-2018), juanbauty@yahoo.es (07-23-2018), Lionet (08-06-2018), NightKing (07-24-2018), Pieter V (07-22-2018)