Old 10-23-2018, 10:38 PM   #1
Fenevar
Member
Jun 2014

Why do 4K screenshots seem so dark?

I've recently found myself in the market for a 4K TV, and as a result, a 4K player and 4K movies. So I've been checking out the reviews and screenshots of 4K movies here, and it seems that all of the ones actually sourced from a 4K Blu-ray are very, very dark. Like, obtrusively dark. Does this look better on a 4K TV, and just look bad on a computer screen?
Old 10-23-2018, 10:49 PM   #2
antspawn
Active Member
Jun 2013

Pictures never do the source justice.
Old 10-23-2018, 10:53 PM   #3
Al_The_Strange
Blu-ray Prince
Apr 2009
Out there...past them trees...

Without HDR, screenshots on a PC will definitely look inferior (darker, hazier) than the real deal.
Thanks given by:
guachi (10-23-2018)
Old 10-23-2018, 10:56 PM   #4
OutOfBoose
Blu-ray Samurai
Aug 2015
The City

Since regular screens can't display HDR, an SDR conversion is used, and it makes the image look dim and desaturated in comparison. In HDR the image is more vivid and contrasty.
Old 10-23-2018, 11:00 PM   #5
Fenevar
Member
Jun 2014

Awesome, thanks for the info everyone!
Old 10-24-2018, 08:23 AM   #6
oddbox83
Blu-ray Champion
Sep 2013
UK

I've yet to see a 4K screenshot that accurately portrays an HDR disc.
Old 10-24-2018, 01:04 PM   #7
HD Goofnut
Blu-ray King
May 2010
Far, Far Away

Quote:
Originally Posted by oddbox83
I've yet to see a 4K screenshot that accurately portrays an HDR disc.
I am eventually going to get a 4K HDR monitor so I can do just that.
Old 10-24-2018, 02:04 PM   #8
StingingVelvet
Blu-ray Grand Duke
Jan 2014
Philadelphia, PA

Be aware that if you're a backlight blazer with a setting above 40% or so, then HDR is going to look REALLY dark at first, because it keeps a more accurate and "dark" baseline nit level.
Old 11-03-2018, 10:35 AM   #9
Deciazulado
Site Manager
Aug 2006
USiberia

Why do 4K screenshots seem so dark?

This has been touched on before, but the explanation is this. In photography/cinematography/video, you expose, record or capture a scene with a particular contrast onto a medium that lowers (reduces) the original contrast (or "gamma") so it fits in the recording. Then you expose, transfer and display this image onto a medium or device that expands this contrast to the maximum ability of the display/medium, restoring the intended contrast of the original image as closely as possible.

So in photography/cinematography you expose the film and develop it so it has a contrast (or gamma) of about 0.50 (half), and then expose it onto a print film that has a contrast (or gamma) of approximately 2, so that 0.50 x 2 = 1: unity, a contrast similar to the original image.

Gamma is the contrast slope. A gamma of 1.0 (unity), or "linear" gamma, is a 1:1 slope, where the image luminosity or density range going in has a 1:1 ratio to the luminosity or density range coming out. On a log/log scale, a gamma of 1 is represented by a 45° slope.
A gamma of 0.5 has a slope shallower than 45° and a gamma of 2 a slope steeper than 45°; the two are inverses of each other, so they "cancel" out and the image is displayed at the 45°, 1:1 ratio.
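As a quick check of that slope arithmetic, here is a minimal Python sketch (the sample values are made up for illustration): encoding with a 0.5 gamma and then displaying through a 2.0 gamma gives back the original linear values, because the exponents multiply out to 1.

Code:
# A 0.5 encoding gamma and a 2.0 display gamma cancel each other out.
# Sample scene values are arbitrary, 0..1 linear luminance.
linear_scene = [0.0, 0.1, 0.25, 0.5, 1.0]

encoded = [v ** 0.5 for v in linear_scene]    # negative/camera: gamma 0.5
displayed = [v ** 2.0 for v in encoded]       # print/display:   gamma 2.0

for v_in, v_out in zip(linear_scene, displayed):
    print(f"in {v_in:.2f} -> out {v_out:.2f}")   # out == in: overall 1:1 slope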

Photography/cinematography uses this low contrast slope in the taking stage (negatives) to record a higher contrast, or density range, of the original image within the linear or straight (non-distorted) portion of the film's exposure density range. Let's say, for example, a 13 f/stop (8,000:1) contrast range (a density range of 4.0 dLog) is recorded into a 2.0 dLog density range in the negative. Then print film expands this 2.0 dLog back to the 4.0 dLog density range the print film is capable of, by having a gamma of 2.0.

On video this is accomplished in a similar way: the SDR video camera or scanner records the image with a gamma of about 0.45 (it encodes the image values with a gamma of 0.45, from 0 to 100%; on video, levels 16 to 235, or in full-range computer levels 0-255, the 256 values of 8-bit). This is done for two reasons. The digital reason is that it's more efficient (fewer visible jumps between steps) to encode image signal values with a gamma of 0.45 (the inverse of gamma 2.2), which raises the shadows, in the 8-bit container than to encode them linearly (gamma of 1.0). The main (historical) reason is that CRTs had a display gamma (slope of signal in to luminance out) of approximately 2.2*. So an analog signal with 0.45 gamma sent straight to the 2.2 CRT displayed properly, directly. In the transition to digital and LCDs, etc., this CRT 2.2 display characteristic was kept or emulated by non-CRT displays.
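As a rough illustration of the "more efficient in 8 bits" point, here is a small Python sketch; the 10% shadow cutoff is an arbitrary choice of mine, used only to compare how many of the 256 codes end up describing the darkest tones.

Code:
# How many 8-bit codes describe the darkest 10% of linear luminance,
# with linear encoding vs. a 0.45 encoding gamma (the approximate inverse
# of the CRT's 2.2 display gamma). The 10% cutoff is just for illustration.
def codes_in_shadows(encode_gamma, shadow_cutoff=0.1):
    count = 0
    for code in range(256):
        encoded = code / 255.0
        linear = encoded ** (1.0 / encode_gamma)  # undo the encoding gamma
        if linear < shadow_cutoff:
            count += 1
    return count

print("linear encoding:", codes_in_shadows(1.0))    # ~26 codes
print("gamma 0.45     :", codes_in_shadows(0.45))   # ~91 codes: fewer visible jumps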

This worked well for SDR and the available displays of about 100 nits maximum, but a film image's density range still has to be compressed somewhat (mastered) to fit in this video range.

But digital and home theater technology advances, and a new system and standard was designed with a greater capability of recording and displaying contrast, color and brightness, commonly called HDR.

In HDR the same principles are used: encode the digital image values with a low, less contrasty slope; this time with an even lower contrast slope, and with a curve that's more efficient at compressing the high density/contrast tones of the desired reproduced image. The particular shape of this curve was chosen because it reduces or eliminates visible step jumps better than gamma does, and with increased range, so the result covers essentially all of the film range, or comes closer to the range of human vision than before.

The encoding (OETF) and corresponding decoding/display (EOTF) curve is called PQ. Compared to the nominal SDR encoding gamma of 0.45, or display gamma of 2.2, it has a much lower encoding contrast slope or "gamma", maybe closer to an equivalent of 0.25, with a peculiar curve; and its display counterpart has a much higher decoding contrast slope or "gamma", maybe a value like 4.0, with a shape akin to an inverted S instead of a straight slope, and with a range of 10,000 nits.
That is what causes HDR screenshots seen on SDR monitors that don't reach 10,000, or even 1,000, nits to look dim.
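For reference, here is a small Python sketch of the PQ curve, using the constants defined in SMPTE ST 2084; the prints at the end are only there to show how flat the encoding is.

Code:
# PQ (SMPTE ST 2084) conversion between a 0..1 signal and absolute nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # nits

def pq_to_nits(signal):
    """EOTF: PQ-encoded signal (0..1) -> display luminance in nits."""
    p = signal ** (1.0 / M2)
    return PEAK * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def nits_to_pq(nits):
    """Inverse EOTF: luminance in nits -> PQ-encoded signal (0..1)."""
    y = (nits / PEAK) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

print(nits_to_pq(100))   # ~0.51: SDR-white 100 nits already sits near mid-signal
print(pq_to_nits(0.5))   # ~92 nits: half the signal range is nowhere near 5,000 nits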

Here are some visual examples of the process:

For SDR images:
[Image: linear gamma (gamma = 1.0)]
What's actually in the SDR digital file (Blu-rays/DVDs) if you could see the 0-255 values directly (no monitor).
[Image: seen with 2.2 gamma]
The contrast added to the file by displaying the 0-255 values on a CRT monitor's natural 2.2 gamma slope, or by the LCD/OLED etc. emulation of a CRT's gamma. This is what you see on your SDR monitor.

As the image is designed to be seen correctly with 100% video level 235 (or full PC level 255) displayed at 100 nits with a display contrast of 2.2 gamma, it looks fine.


Now, for HDR images:
[Image: linear gamma (gamma = 1.0)]
What's actually in the HDR digital file (4K HDR UHDs) if you could see the 0-255 values directly (no monitor). A much higher contrast range has been encoded in the lower-contrast file. (Note: in 10-bit HDR, the values are 0-1023.)
[Image: raw HDR image seen with 2.2 gamma applied]
This is how the HDR file looks on a CRT monitor's natural 2.2 gamma slope, or with the LCD/OLED etc. emulation of a CRT's gamma, if viewed directly as-is on an SDR monitor.

The contrast and brightness are totally wrong because the file image is designed to be seen with the much higher contrast of the PQ curve and with the brightness of the 100% level at 10,000 nits. In simple terms, this is the wrong "gamma". This is what you'd see if your UHDTV didn't apply an HDR PQ curve.
Also, since the color is encoded in the super-wide Rec.2020 gamut color space, colors will look muted when displayed on SDR monitors with the standard Rec.709/sRGB gamut, and even on a P3-gamut computer monitor (like the latest iMacs, etc.) or a photography monitor with the Adobe 1998 gamut, because those color gamuts are still narrower than 2020.
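To see that muted-color effect in numbers, here is a rough Python sketch using the commonly published Rec.709-to-Rec.2020 conversion matrix (ITU-R BT.2087); take the exact figures as indicative only. A fully saturated Rec.709 red, re-expressed in Rec.2020 coordinates, has much smaller code values, so a Rec.709 monitor showing those values unconverted displays a duller red.

Code:
# Why Rec.2020-encoded colors look muted on a Rec.709 display.
# Linear-light RGB in 0..1; coefficients from the published 709 -> 2020 matrix.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def to_2020(rgb709):
    return [sum(m * c for m, c in zip(row, rgb709)) for row in M_709_TO_2020]

print(to_2020([1.0, 0.0, 0.0]))   # ~[0.63, 0.07, 0.02]
# Shown unconverted on a Rec.709 monitor, that is only ~63% red plus a touch
# of green and blue: a visibly less saturated red than the intended color.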



Now, if you apply the correct PQ curve and color gamut, the contrast and color become correct, but you have to look at the image on a 10,000 nit monitor for it to have the correct brightness. If your monitor is not that bright, the image will look very dim:
[Image: HDR for a 10,000 nit monitor]
If your computer monitor is like most SDR monitors, its brightness may be somewhere between 100 and 500 nits.
Even if you browse the images on your UHDTV, it might display them at between 300 and 1,000 nits, maybe 1,500 nits if you have the latest and greatest.

So on a 100 nit monitor, HDR 10,000 displays a white of 10,000 nits as 100 nits, and a white of 100 nits as 1 nit!
Even on a 1,000 nit monitor, 10,000 would display as 1,000 and 100 nits as 10, with white t-shirts rendered a darkish grey.
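The arithmetic behind those numbers is just the ratio of the monitor's peak to the 10,000 nit reference; a trivial Python sketch, using the values from the paragraph above:

Code:
# Displayed luminance when a monitor maps the 10,000 nit PQ reference
# down to its own peak brightness.
def shown_nits(content_nits, monitor_peak, reference_peak=10000):
    return content_nits * monitor_peak / reference_peak

print(shown_nits(10000, 100), shown_nits(100, 100))    # 100.0, 1.0
print(shown_nits(10000, 1000), shown_nits(100, 1000))  # 1000.0, 10.0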




So what can be done?
To make the screenshot acceptably bright on normal current monitors, it can be adapted to display more or less brighter, at the sacrifice of HDR highlight detail or of the representation of the highlights' relative ratios. That's why you sometimes see HDR shots labeled with a number like "150 nits".

Here's an early test sequence made approximating this, going from 10,000 nits to 100 nits in equal steps:

[Images: two example frames, each adapted for 10,000 / 3,200 / 1,000 / 320 / 100 nit monitors]

If you make them for 100 nit brightness they can blow out tremendously; if you make them for 1,000 nits they could be too dark, etc.
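As a rough idea of what "making them for N nits" involves, here is a simplified Python/NumPy sketch. This is only my approximation of the general approach (PQ-decode, clip at the chosen peak, re-encode for SDR), not necessarily how the site's screenshots are actually produced, and gamut conversion is ignored.

Code:
import numpy as np

# PQ constants (SMPTE ST 2084), as in the earlier sketch.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    p = np.power(signal, 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

def adapt_for_monitor(pq_frame, target_nits=320):
    """Very rough HDR -> SDR screenshot adaptation for a target_nits display.

    pq_frame: float array of PQ-encoded values in 0..1.
    Returns 8-bit gamma-2.2 values; highlights above target_nits clip to white.
    """
    nits = pq_to_nits(pq_frame)
    relative = np.clip(nits / target_nits, 0.0, 1.0)   # clip above the peak
    sdr = np.power(relative, 1.0 / 2.2)                # re-encode for SDR
    return np.round(sdr * 255.0).astype(np.uint8)

# The lower target_nits is, the brighter the mid-tones but the more highlight
# detail blows out; higher targets keep the highlights but look dim overall.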


The 4K ones currently on Br.com are in 8-bit 2020 HDR adapted for ~320 nit monitors of any gamut; a value in between SDR's 100 nits and 1,000 nits, chosen as an average/best fit for most current monitors while still preserving the impression of the HDR contrast. They will look somewhat darker than SDR 100 nit Blu-ray shots, but not incredibly blown out either, as they would be if made to match a 100 nit Blu-ray.

[Comparison screenshots omitted]
Hope this has explained it. A little.




*Actually, CRT gamma is closer to 2.35, and SDR gamma is now being specified at 2.4. For this post I use 2.2 in place of both.

Last edited by Deciazulado; 11-16-2018 at 04:34 AM. Reason: Shortened the nit "exposure" sequence
Thanks given by:
andreasy969 (12-05-2018), Lasse Reden (12-09-2018), UHDLoverForever (08-03-2024)
Old 11-06-2018, 10:07 PM   #10
Deciazulado
Site Manager
Aug 2006
USiberia

If a 4K screenshot looks too dark on your SDR computer monitor and you want to see it brighter, you can always open it in an image editor and increase its brightness until it's acceptably bright.


Do this by increasing the white level. Basically, to adapt a 320 nit screenshot to a 100 nit SDR monitor, you raise a white level of around 60% (or level ~153) to near 100% (or level 255).

Here are two ways I did this in my image editor (red arrows on the sliders):

[Editor screenshots omitted]

Be sure you're increasing the white level, not changing the overall contrast or a brightness/black-level slider.
Of course, you can also overdo it and blow all HDR highlights above a certain point out to the sky, if you want to.
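If you'd rather script it than drag sliders, here is a small Pillow sketch of the same white-level adjustment; the file names are placeholders, and the default of 153 (~60%) comes from the 320 nit case above. It's a plain levels stretch, so anything above the chosen input level clips to white, which is exactly the "overdo it" risk just mentioned.

Code:
from PIL import Image

def raise_white_level(path_in, path_out, white_in=153):
    """Map input level white_in up to 255, like dragging a white-point slider.

    Values at or above white_in clip to full white; everything below scales up.
    """
    img = Image.open(path_in).convert("RGB")
    scale = 255.0 / white_in
    brighter = img.point(lambda v: min(255, round(v * scale)))
    brighter.save(path_out)

# Hypothetical usage:
# raise_white_level("4k_screenshot.png", "4k_screenshot_brighter.png")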

Last edited by Deciazulado; 11-16-2018 at 03:51 AM.