#1
Member
I've recently found myself in the market for a 4K TV, and as a result, a 4K player and 4K movies. So I've been checking out the reviews and screenshots of 4K movies here, and it seems that all of the screenshots actually sourced from a 4K Blu-ray are very, very dark. Like, obtrusively dark. Do they look better on a 4K TV, and just look bad on a computer screen?
#2
Active Member
Jun 2013
Pictures never do the source justice.
#6
Blu-ray Champion
Sep 2013
UK
I've yet to see a 4K screenshot that accurately portrays an HDR disc.
#9
Site Manager
This has been touched on before, but the explanation is this. In photography, cinematography and video, you expose, record or capture a scene with a particular contrast in a medium that lowers (reduces) the original contrast (or "gamma") so it fits in the recording. Then you expose, transfer and display that image on a medium or device that expands the contrast back out to the maximum ability of the display, restoring the intent (contrast) of the original image as best as possible.
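A minimal Python sketch of that compress-then-expand idea, with purely illustrative gamma values (0.5 capture, 2.0 display) rather than any specific camera or display:

```python
import numpy as np

# Capture with a low-contrast slope (gamma 0.5), display with a high-contrast
# slope (gamma 2.0); the two cancel out to unity (1:1) overall contrast.
scene = np.array([0.01, 0.1, 0.5, 1.0])   # relative scene luminances (0-1)

captured  = scene ** 0.5                  # capture/encode: gamma 0.5 (contrast halved)
displayed = captured ** 2.0               # print/display: gamma 2.0 (contrast doubled)

print(displayed)   # ~[0.01, 0.1, 0.5, 1.0] -> 0.5 x 2 = 1, unity, original contrast back
```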
So in photography/cinematography you expose the film and develop it so it has a contrast (or gamma) of about 0.50 (half), and then expose it onto a print film that has a contrast (or gamma) of approximately 2, so that 0.50 x 2 = 1: unity, a similar contrast to the original image.

Gamma is the contrast slope. A gamma of 1.0 (unity), or "linear" gamma, is a slope of 1 to 1, where the luminosity or density range in has a 1:1 ratio to the luminosity or density range out. On a log/log scale a gamma of 1 is represented by a 45º slope. A gamma of 0.5 has a slope less steep than 45º and a gamma of 2 has a slope steeper than 45º; the two are inversely proportional, so they "cancel" each other and display the image at the 45º, 1:1 ratio. Photography/cinematography uses this low contrast slope in the taking stage (negatives) to record a higher contrast, or density range, of the original image within the linear, straight (non-distorted) portion of the film's exposure/density range. Say, for example, a 13 f/stop (8,000:1) contrast range (a density range of 4.0 dLog) is recorded into a 2.0 dLog density range in the negative. The print film then expands this 2.0 dLog back out to the 4.0 dLog density range the print film is capable of, by having a gamma of 2.0.

On video this is accomplished in a similar way: the SDR video camera or scanner records the image with a gamma of about 0.45; that is, it encodes the image values from 0 to 100% with a gamma of 0.45 (on video, levels 16 to 235; in full-range computer levels, 0-255, the 256 values of 8-bit). This is done for two reasons. The digital one is that it's more efficient (fewer jumps between perceived steps) to encode image signal values with gamma 0.45 (the inverse of gamma 2.2), which raises the shadows, in the 8-bit container than to encode them linearly (gamma of 1.0). And the main (historical) reason is that CRTs had a display gamma (slope of signal in to luminance out) of approximately 2.2*. So an analog signal with 0.45 gamma sent straight to a 2.2 CRT displayed properly, directly. In the transition to digital and LCDs etc., this CRT 2.2 display characteristic was kept, or emulated, by non-CRT displays. This worked well for SDR and the available displays of about 100 nits maximum, but a film image's density range still has to be compressed somewhat (mastered) to fit in this video range.

But technology and digital/home theater tech advance, and a new system and standard has now been designed with a greater capability of recording and displaying contrast, color and brightness, commonly called HDR. In HDR the same principles are used: encode the image's digital values with a low, less contrasty slope, this time an even lower one, and with a curve that's more efficient at compressing the high density/contrast tones of the desired reproduced image. The particular shape of this curve was chosen because it reduces or eliminates visible step jumps better than gamma does, and with increased range, so the result covers basically all of the film range, or something much closer to the range of vision than before.
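A rough numeric sketch of those two SDR points, using nominal values only (full-range 8-bit 0-255 for simplicity; real video uses limited range 16-235):

```python
import numpy as np

# 1) Encoding efficiency: a ~0.45 gamma encode spends far more of the 256
#    8-bit codes on the shadows (darkest 5% of scene luminance here) than a
#    linear (gamma 1.0) encode would, so perceived banding is reduced.
codes_gamma  = int(round((0.05 ** 0.45) * 255)) + 1   # codes 0..66 -> ~67 codes
codes_linear = int(round(0.05 * 255)) + 1             # codes 0..13 -> ~14 codes
print(codes_gamma, codes_linear)

# 2) CRT compatibility: a display gamma of ~2.2 undoes the ~0.45 encode,
#    since 0.45 x 2.2 ~= 1 (unity), restoring roughly the original contrast.
scene     = np.linspace(0, 1, 11)        # relative scene luminance
signal    = scene ** 0.45                # camera/scanner encode (OETF)
displayed = signal ** 2.2                # CRT-style display decode (EOTF)
print(np.allclose(scene, displayed, atol=0.01))   # True: the round trip ~cancels
```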
The encoding curve (OETF) and its corresponding display decoding curve (EOTF) are called PQ. Compared with SDR's nominal 0.45 encoding gamma and nominal 2.2 display gamma, PQ has a much lower encoding contrast slope or "gamma" (maybe closer to an equivalent of 0.25, with a peculiar curve shape) and a much higher decoding (display) contrast slope or "gamma" as its corresponding opposite, maybe a value like 4.0, with a display shape akin to an inverted S, something like this:

![]()

Which is what causes HDR screenshots, seen on SDR monitors that don't reach 10,000 or even 1,000 nits, to look dim. Here are some visual examples of the process.

For SDR images:

![]()
^ What's actually in the SDR digital file (Blu-rays/DVDs) if you could see the 0-255 values directly (no monitor).

![]()
^ Contrast added to the file by displaying the 0-255 values on a CRT monitor's natural 2.2 gamma slope, or by the LCD/OLED etc. emulation of a CRT's gamma. This is what you see on your SDR monitor. Since the image is designed to look correct with 100% video level 235 (or full PC level 255) displayed at 100 nits with a display contrast of 2.2 gamma, it looks fine.

Now, for HDR images:

![]()
^ What's actually in the HDR digital file (4K HDR UHDs) if you could see the 0-255 values directly (no monitor). A much higher contrast range has been encoded into the lower-contrast file. (Note: in 10-bit HDR, 0-1023.)

![]()
^ This is how the HDR file looks on a CRT monitor's natural 2.2 gamma slope, or on the LCD/OLED etc. emulation of a CRT's gamma, if viewed directly as-is on an SDR monitor. The contrast and brightness are totally wrong, because the file's image is designed to be seen with the much higher contrast of the PQ curve and with the 100% level at a brightness of 10,000 nits. In simple terms, this is the wrong "gamma". It's what you'd see if your UHDTV didn't apply the HDR PQ curve. Also, since the color is encoded in the super-wide rec.2020 color gamut, colors look muted when displayed on SDR monitors with the standard rec.709/sRGB gamut, and even on a P3-gamut computer monitor (like the latest iMacs, etc.) or a photography monitor with the Adobe 1998 gamut, because those gamuts are still narrower than 2020.

Now, if you apply the correct PQ curve and color gamut, the contrast and color become correct, but you have to look at the image on a 10,000-nit monitor for it to have the correct brightness. If your monitor is not that bright, the image will look very dim:

![]()
^ If your computer monitor is like most SDR monitors, its brightness is maybe between 100 and 500 nits. Even if you browse the images on your UHDTV, it might display them at between 300 and 1,000 nits, maybe 1,500 nits if you have the latest and greatest. So on a 100-nit monitor, HDR 10,000 displays a white of 10,000 nits as 100 nits, and a white of 100 nits as 1 nit! Even on a 1,000-nit monitor, 10,000 nits would display as 1,000 and 100 nits as 10, with white t-shirts a darkish grey.

So what to do? To make a screenshot acceptably bright on normal current monitors, it can be adapted to display more or less brighter, at the sacrifice of HDR highlight detail or of the representation of the highlights' relative ratios.
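A rough Python sketch of the PQ curve (the SMPTE ST 2084 formula, using its published constants), just to show the two numbers above: where a 100-nit "SDR white" sits in the PQ signal, and why linearly squeezing a 10,000-nit-referenced image onto a 100-nit monitor crushes it to about 1 nit. The function names here are mine, not from any particular library:

```python
import numpy as np

# PQ (SMPTE ST 2084) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_oetf(nits):
    """Absolute luminance (0-10,000 nits) -> PQ signal (0-1)."""
    y = np.asarray(nits, dtype=float) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def pq_eotf(signal):
    """PQ signal (0-1) -> absolute luminance in nits."""
    e = np.asarray(signal, dtype=float) ** (1 / m2)
    return 10000.0 * (np.maximum(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_oetf(100))    # ~0.51 -> a 100-nit "SDR white" lands near mid-signal
print(pq_oetf(1000))   # ~0.75
print(pq_oetf(10000))  # 1.0

# If a 10,000-nit-referenced image is scaled linearly onto a 100-nit SDR
# monitor, everything is divided by 100: a 100-nit white ends up at ~1 nit,
# which is why un-adapted HDR screenshots look so dark.
peak_sdr = 100.0
print(pq_eotf(pq_oetf(100)) * peak_sdr / 10000.0)   # ~1 nit
```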
That's why you sometimes see HDR shots labeled with a number like "150 nits". Here's an early test sequence made approximating this, going from 10,000 nits down to 100 nits in equal steps:

![]() ![]() ![]() ![]() ![]() ![]() ![]() ![]() ![]() ![]()

If you make them for 100-nit brightness they can blow out tremendously; if you make them for 1,000 nits they can be too dark, etc.

The 4K ones currently on Br.com are 8-bit 2020 HDR adapted for ~320-nit monitors of any gamut; a value in between SDR's 100 nits and 1,000 nits, as an average/best fit for most current monitors that still preserves the impression of the HDR contrast. They will look somewhat darker than the SDR 100-nit Blu-ray ones, but not incredibly blown out either, as they would be if made equal to a 100-nit Blu-ray. So instead of this:

![]() ![]()

or this:

![]() ![]()

you see this. Hope this has explained it. A little.

*Actually CRT gamma is closer to 2.35, and SDR gamma is now being specified at 2.4. For this post I use 2.2 in place of both.
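A simplified illustration of that target-peak trade-off. This is my own sketch, not the site's actual conversion pipeline: it just scales absolute luminance to a chosen peak, clips anything brighter, and re-encodes with an SDR 2.2 gamma.

```python
import numpy as np

def adapt_for_sdr(nits, target_peak=320.0, sdr_gamma=2.2):
    """Map absolute HDR luminance (nits) to 8-bit SDR code values.
    Anything brighter than target_peak clips to white (255)."""
    scaled = np.clip(np.asarray(nits, dtype=float) / target_peak, 0.0, 1.0)
    return np.round(255 * scaled ** (1 / sdr_gamma)).astype(int)

frame = np.array([1, 50, 100, 320, 1000, 4000])   # example pixel luminances in nits

print(adapt_for_sdr(frame, target_peak=100))   # bright SDR look, but everything at/above 100 nits blows out to 255
print(adapt_for_sdr(frame, target_peak=320))   # the ~320-nit compromise described above
print(adapt_for_sdr(frame, target_peak=1000))  # keeps more highlight detail, but the whole image looks darker
```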
#10
Site Manager
If a 4K screenshot looks too dark on your SDR computer monitor and you want to see it brighter, you can always open it in an image editor and raise its brightness until it looks acceptably bright on your screen.
![]()

By increasing the white level. Basically, to take a ~320-nit screenshot to a 100-nit SDR monitor, you raise a white level of around 60% (or level ~153) to near 100% (or level 255). Here are two ways I did this in my image editor (red arrows on the sliders):

![]()

Be sure you're raising the white level, not changing the overall contrast or a brightness/black-level slider. Of course you can also overdo it and blow all the HDR highlights above a certain point to the sky if you want to:

![]()
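A small sketch of that white-level stretch, equivalent to dragging the white input slider in a Levels tool. It assumes the Pillow library is available; the function name and file names are hypothetical, just for illustration. The ~60% figure follows from viewing a ~320-nit-adapted shot on a 100-nit display: (100 / 320) ** (1 / 2.2) is roughly 0.59, i.e. an 8-bit level of about 150-153.

```python
import numpy as np
from PIL import Image  # Pillow; any image library with array access would do

def raise_white_level(img, white_in=153):
    """Treat input level ~153 as the new white point (Levels-style stretch).
    Values at or above white_in clip to 255; everything below scales up linearly
    in the encoded (gamma) domain, just like the white input slider."""
    px = np.asarray(img, dtype=float)
    px = np.clip(px * (255.0 / white_in), 0, 255)
    return Image.fromarray(px.astype(np.uint8))

# Hypothetical file names, just for illustration:
shot = Image.open("hdr_screenshot.png").convert("RGB")
raise_white_level(shot).save("hdr_screenshot_brightened.png")
```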
Tags
dark, screen, screenshots |