#1 | Blu-ray Knight
http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm
Pretty interesting read. Personally I'm not having this issue, and I usually watch with ambient lighting because my main TV is in the living room. My HDR settings on the 940C are standard except that I bumped the color up from 50 to 65. At night I lower the gamma from 0 to -1 or -2 (depending on the movie) and the black level (brightness) from 50 to 47. An option they don't mention in the article is adjusting the picture settings on the Samsung player itself. Dynamic mode bumps up the luminosity quite a bit from standard mode, so maybe they didn't do that. In any case, I feel the overall luminance of my UHD viewing is about the same as on regular Blu-ray.

That's the misconception many people have about HDR: they think it means a brighter overall picture, like watching in vivid mode, but it's really about having a higher range of luminance while the average brightness stays the same.
Thanks given by: Geoff D (04-11-2016)

#2 | Blu-ray Samurai
Bruce, the 940C is one of the brightest displays ever made, so you won't have an issue. It's TVs like the 850C that will have problems, which is why I've been advising against that TV for 4K Blu-rays. TVs that lack the brightness to overcome the dimness will cause issues for those people.
Thanks given by: bruceames (04-10-2016)

#3 | Blu-ray Knight

Quote:
#4 | Blu-ray Samurai

Quote:
I had a 2014 Samsung HU9000 that got HDR via the upgrade box, and it was too dark for me, which is why I upgraded. There have been a lot of complaints from 2014 Samsung owners who got the upgrade box that it's too dark.
#5 | Blu-ray Emperor
Interesting stuff, thanks for the link, Bruce. We already know that the much-vaunted 1000-nit mastering ceiling for HDR10 on UHD Blu is only for specular highlights (like light glinting off a reflective surface) and that the frame average light level should be 400 nits maximum. But that's all it is, a maximum value, so it follows that real-world HDR content will have a more variable output as per whatever the filmmaker's intent may be. So strangely I'm not surprised to hear that average light levels are almost the same as SDR Blu-ray at certain spots during HDTVTest's, um, tests.

Can't wait to get my grubby mitts on the Panasonic player though; I'm starting to think that this whole hoo-haa about the SDR transform that I've been blathering on about for ages is a storm in a teacup. And what's crazy is how incredibly similar the colour is between the SDR Blu-ray and the HDR UHD Blu-ray in most of those photos! [edit] Apart from Mad Max, but then the colour differences between the SDR and HDR versions have been mentioned before; it seems less vivid and slightly more realistic in the HDR version in those photos.

Last edited by Geoff D; 04-11-2016 at 12:07 AM.
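(Aside: the ceiling-vs-average distinction described here is what the HDR10 content light metadata encodes as MaxCLL, the brightest single pixel, and MaxFALL, the brightest frame-average. A toy sketch of how those two numbers fall out, with made-up per-pixel nit values, assuming frames are already decoded to linear light:)

```python
# Sketch: MaxCLL / MaxFALL from per-pixel luminance in nits.
# Frame data below is illustrative, not taken from any real disc.

def content_light_levels(frames):
    """frames: list of frames, each a flat list of per-pixel nits.
    Returns (MaxCLL, MaxFALL): the brightest single pixel anywhere,
    and the brightest frame-average light level."""
    max_cll = max(max(f) for f in frames)
    max_fall = max(sum(f) / len(f) for f in frames)
    return max_cll, max_fall

# A mostly ~100-nit scene with a single 1000-nit specular highlight:
frame_a = [100.0] * 99 + [1000.0]
frame_b = [50.0] * 100
cll, fall = content_light_levels([frame_a, frame_b])
print(cll)   # 1000.0 - driven entirely by the one highlight pixel
print(fall)  # 109.0  - the frame average barely moves
```

Which is the point above: a 1000-nit highlight ceiling is perfectly compatible with an average picture level close to SDR.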
Thanks given by: bruceames (04-11-2016)

#6 | Blu-ray Guru

Quote:
Really wish I had UHD screen capture capability; I love taking and sharing comparisons, and being unable to do so properly with UHD is a bummer.

Last edited by vincentric; 04-11-2016 at 12:35 AM.
#7 | Site Manager
I derived this chart from another I had lying around to show nits (aka candelas per square meter) on a log scale (yes Geoff, me and my deci f/stops).

The blue arrows show the range of a standard IPS LCD. VA LCDs might have about ~2 more f/stops. (An f/stop is a doubling or halving of a value: 1000 nits vs 500 nits, or 0.05 nit vs 0.1 nit.) CRTs, plasmas and OLEDs have a much wider range (15 f/stops or more) because of their deep blacks.

As I mentioned in the other thread, I'm finding the diffuse whites on HDR discs (for example the X-Men) around 100 nits, which is near the usual ones I find on SDR Blu-rays, which put whites at 70-100% (though BD discs mastered with ~70% whites look rather dim to me next to other BDs in a side-by-side comparison). If you put 100% SDR video (the white square of a color bar) at 100 nits, then 70% video at 2.2 gamma is 46 nits and at 2.4 gamma is 42 nits: more than an f/stop difference, less than half as bright. Probably they're trying to preserve every bit of highlight in the film? Or they're afraid of highlights, or don't know how to handle them, and leave a huge safety range, like those old CDs of yore having 90+ dB of dynamic range but mastered with the peaks not even reaching -16 dB, losing 2 bits of resolution!

HDR signals have a much higher range above the diffuse whites, reserved for highlights: metallic, water and crystal reflections, colored lights, superhero/laser/phaser/electricity ray beams, and light sources like the sun through windows, the clouds behind Spider-Man, and the Sun itself. (One I checked on a Sony disc would be around 4000 nits if displayed by the book.) Anyway, you can see the extended highlight range in the curves above the 100-nit level. Since the TVs we have today don't reach much above 1000 nits (unless your name is Sony or Dolby, etc.)...

Talking about the bottom: not only can HDR give you brighter specular highlights, it can also give you deeper shadow detail. With its 10-bit signal you get 4 steps in between each pair of 8-bit level jumps, and the PQ curve also massages the encoding to optimize the distribution of those levels.

So on SDR you have a minimum jump of 1 level, from level 16/0 (video/PC) to level 17/1, then 18/2, etc., while on 10-bit HDR it's smaller jumps from level 64/0 (0 nits) to 65/1 (~0.00004 nits), then 66/2 (0.0001 nits), 67/~3.5 (~0.00035 nits), 68/5 (0.0006 nits), 69/6 (0.0009 nits), 70/7 (0.0012 nits), 71/8 (0.0015 nits), 72/9 (0.0018 nits), covering the same range as just the first 2 SDR levels (17/1 and 18/2) up from zero light output (16/0). Of course, to achieve those deep black shadows and that detail you'd need a display that reaches them with that level of precision.

Take this, brothers, may it serve you well. - The Beatles number 9

Last edited by Deciazulado; 04-11-2016 at 02:13 AM. Reason: some f/stop clarifications and decitypos.
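The 70%-video arithmetic above is easy to check; a quick sketch under the post's assumption that 100% SDR video maps to a 100-nit white, using a simple pure power-law gamma:

```python
# Convert an SDR video level (0.0-1.0) to nits under a pure power-law
# gamma, assuming 100% video = 100-nit diffuse white (as in the post).
def sdr_level_to_nits(level, gamma, white_nits=100.0):
    return white_nits * level ** gamma

for gamma in (2.2, 2.4):
    print(f"70% video at gamma {gamma}: {sdr_level_to_nits(0.70, gamma):.0f} nits")
# gamma 2.2 -> ~46 nits, gamma 2.4 -> ~42 nits, matching the figures
# above: more than a full f/stop (factor of 2) below the 100-nit white.
```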
Thanks given by: bruceames (04-11-2016), Clark Kent (04-16-2016)

#8 | Site Manager
Successive exposures, each ~0.83 f/stop lighter, spanning the 6.64 f/stop range from 10,000 nits = 100% on the pic down to 100 nits = 100% on the pic.

Values top to bottom:
10000 = 100% on pic
5600
3200
1800
1000
560
320
180
100 = 100% on pic

If you open them in tabs and switch rapidly through them you can practically see them like a movie with the exposure increasing. Thanks to James Freeman for helping in getting these images made.

Last edited by Deciazulado; 04-16-2016 at 11:57 AM.
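The value ladder above is just a geometric series in 0.83-stop increments; a small sketch reproducing it (the post's numbers are these rounded to two significant figures):

```python
import math

# The exposure ladder: log2(10000/100) = 6.64 f/stops, split into
# 8 equal steps of 6.64/8 = 0.83 f/stops (one stop = factor of 2).
total_stops = math.log2(10000 / 100)   # ~6.64
steps = 8
ladder = [10000 / 2 ** (total_stops * k / steps) for k in range(steps + 1)]
print([round(v) for v in ladder])
# -> [10000, 5623, 3162, 1778, 1000, 562, 316, 178, 100]
# i.e. the 10000/5600/3200/1800/1000/560/320/180/100 series above.
```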
#9 | Blu-ray Emperor
Funnily enough, those middle pics of the 9 look just about perfect as far as SDR conversion goes; they bring out enough brightness without clipping the shit out of the highlights.

Are these things actually being mastered to disc at 10,000 nits, deci?
#10 | Site Manager
They are mastered so that the brightest highlights fall around the 1000-nit part of the signal, about 75%, which is inside the 10,000-nit 100% signal. I think I saw a lamp or light highlight in one scene that was brighter, but I have to find it again to check. Gwen's white shirt seems to be somewhere around 150-200 nits in that pic.

There's no clipping in the disc image; it's just how you set the max level. You can bring it all down with the player if it has adjustments. For example, with a 500-nit TV you could put the 500-nit signals at 500, or you could put the 1000-nit signals at the TV's 500-nit maximum, dimming the lower values proportionally going down from the 500 max. 320 nits is the middle in between 100 and 1000; 1000 is the middle between 100 and 10,000.
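The "middle" here is the midpoint in f/stops, i.e. the geometric mean, not the linear average; a quick check:

```python
import math

def stop_midpoint(lo, hi):
    """Midpoint between two luminances measured in f/stops
    (the geometric mean), not the linear average."""
    return math.sqrt(lo * hi)

print(round(stop_midpoint(100, 1000)))   # 316 - the "320 nits" above
print(stop_midpoint(100, 10000))         # 1000.0
```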
#12 | Site Manager
But it's not the conversion; it's the 8-bit, 100-nit, gamma-2.2 JPGs, from whatever you set the things to.

(on SDR monitor)
#13 | Blu-ray Emperor
Okay, I think I understand your position now: in order to create SDR conversion caps that are crunched down into the relevant 100-nit level (for comparison to the SDR Blu-ray), that means blowing out the highlights to shit. But purely from my personal perspective, I don't care that it may end up being a touch darker than the SDR Blu-ray if it means I can still preserve something of the higher range of the HDR edition.

To that end, the good thing about being able to receive this adjustable version of the converted signal is that my TV settings aren't locked out as HDR would normally do, so I can raise the backlight/contrast of my display accordingly to match the nits being output (up to about 450 nits peak on my TV).
#14 | Expert Member
I don't have a problem with the overall brightness of HDR on my LG G6, but I do have to turn the brightness up or I get crushed blacks.

I watched "Concussion" last night and was wowed by the saturated colors and how natural they looked, but I kept trying to keep the blacks from being crushed. I'll keep working until I get it right, or until they come up with a calibration disc or something so I can get a good ISF calibrator to get it perfect. Also, I left Deep Color off in the TV; I wasn't sure if that should be turned on or not. I also have Deep Color turned off on the Samsung UHD player.
#15 | Senior Member | Oct 2013

Quote:
I'll explain some basics first. The HDR screenshots are in 8-bit (0-255) instead of 10-bit (0-1023), both in full-range PC levels. The actual video on the UHD disc is TV-level 10-bit (64-940), and on the SDR disc 8-bit (16-235), common knowledge. So right off the bat we have a TV-to-PC and a 10-bit-to-8-bit conversion together.

Moreover, the color gamut of the HDR screenshots provided by this site is Rec.709 for some unknown reason, whereas on the disc they are Rec.2020. So we have another (much worse) conversion from 2020 to 709, which clips or compresses (with unknown rendering intent) the wide-gamut colors into the tiny standard 709.

Thirdly, all of the HDR screenshots on the site have a peak white level of 190 (out of 255): 190/255 ≈ 75% input, and 75% input on the ST.2084 EOTF curve is about 1000 nits. Step 255 is actually 10,000 nits, and from 190 to 255 there is no image information at all. From this alone I can tell that practically all the UHD movies we have so far were mastered on a 1000-nit peak display.

And lastly, to actually convert HDR to SDR from these 10,000-nit images, one should not simply clip the highlights just to raise the lower shades; that's not how it's done in Hollywood! What should be done is the following:
1. Convert the 709 image back to 2020 (it should be 2020 in the first place).
2. Clip everything above 190 using a tool called Levels; there is nothing there anyway for current UHD releases.
3. Adjust with a tool called Curves until it looks good.

The results can be quite good without clipping the whites. http://screenshotcomparison.com/comparison/169740

I should note that Sony did an amazing job on Amazing Spider-Man. All the highlights are there, nothing is clipped. Well done.

Last edited by James Freeman; 04-16-2016 at 06:55 PM.
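The 75%-to-1000-nits relationship can be checked against the ST.2084 (PQ) EOTF itself; a minimal sketch using the constants from the spec (note that exactly 1000 nits sits just above 75% signal, at roughly 75.2%):

```python
# ST.2084 (PQ) EOTF: normalized signal E' in [0,1] -> luminance in nits.
# Constants as defined in SMPTE ST 2084.
M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

print(round(pq_eotf(0.75)))  # ~984 nits: the ~1000-nit peak near 75%
print(pq_eotf(1.0))          # 10000.0 - the PQ ceiling at full signal
```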
Thanks given by: pawel86ck (04-16-2016)