#722 | Senior Member

Spider-Man (2002)

Code:
Mastering display luminance       : min: 0.0050 cd/m2, max: 4000 cd/m2
Maximum Content Light Level       : 9953 cd/m2
Maximum Frame-Average Light Level : 1478 cd/m2

Gamut Visualizations (Album)
HDR10 Plot

Last edited by Macatouille; 08-30-2023 at 01:54 PM.
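For anyone wanting to sanity-check numbers like these, CTA-861.3 defines MaxCLL as the light level of the brightest single pixel anywhere in the stream and MaxFALL as the highest per-frame average. A minimal sketch of the computation, assuming you already have per-pixel light levels in nits (the spec actually takes the max of the R, G, B components per pixel; this simplifies to one value per pixel):

```python
import numpy as np

def max_cll_fall(frames):
    """Compute (MaxCLL, MaxFALL) over an iterable of frames.

    Each frame is a 2-D numpy array of per-pixel light levels in nits
    (i.e. already converted from PQ code values to linear luminance).
    MaxCLL  = brightest single pixel anywhere in the stream.
    MaxFALL = highest frame-average light level.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Toy example: one dim frame, one with a single 4000-nit specular highlight.
dim = np.full((4, 4), 50.0)
bright = np.full((4, 4), 100.0)
bright[0, 0] = 4000.0
cll, fall = max_cll_fall([dim, bright])
```

Note how one hot pixel drives MaxCLL to 4000 while MaxFALL stays modest, which is why the two numbers can diverge so wildly on real discs.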
Thanks given by: aphid (08-30-2023), Audiovisualmente (08-30-2023), Grain Man (08-30-2023), Labor_Unit001 (08-31-2023), Mierzwiak (08-29-2023), professorwho (08-29-2023)
#723 | Senior Member

If you thought people were exaggerating when they said that Sony likes to use a light cannon, well... MaxCLL of 8901 (!) and a MaxFALL of 2370 (!). My highest rated shot came in at 2630.

Strangely enough, there's little to no use of the WCG in my shots. So it's essentially just a bright Rec. 709 release.
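A rough way to check the "no WCG use" observation yourself: convert linear BT.2020 RGB to BT.709 primaries with the commonly published 3x3 matrix and count pixels that land outside [0, 1]. A sketch, assuming linear-light input and an arbitrary tolerance:

```python
import numpy as np

# Standard 3x3 matrix converting linear BT.2020 RGB to linear BT.709 RGB.
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def out_of_709_fraction(rgb2020, tol=1e-4):
    """Fraction of pixels whose colour cannot be represented in BT.709.

    rgb2020: (..., 3) array of *linear* BT.2020 RGB in [0, 1].
    A pixel needs the wider gamut if any converted component falls
    outside [0, 1] by more than `tol`.
    """
    rgb709 = rgb2020 @ BT2020_TO_BT709.T
    outside = (rgb709 < -tol) | (rgb709 > 1 + tol)
    return float(outside.any(axis=-1).mean())
```

A neutral grey converts to itself and stays in range, while a fully saturated BT.2020 red maps to negative green/blue components, flagging it as genuinely wide-gamut. A release where this fraction stays near zero is, as noted above, effectively Rec. 709 in a 2020 container.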
Thanks given by: Audiovisualmente (08-30-2023), Grain Man (08-30-2023), Labor_Unit001 (08-31-2023), Mierzwiak (08-29-2023), mrtickleuk (08-29-2023), professorwho (08-29-2023), sojrner (08-29-2023)
#724 | Blu-ray Samurai

It makes sense, if you speculate a bit: Sony had a mass of 4K masters prepped well before the format existed, mastered and graded around 2012-2014 (if not earlier) with no final spec set in stone. The resolution target was obvious, but the colour grading standard was still SDR (those masters ended up downrezzed on BD). Still... gross. I'd be curious whether the rest of those "Mastered in 4K" titles are likewise missing any hint of WCG: disappointing.
#725 | Blu-ray Emperor

A list of Sony's costings to create 4K masters, from 2012. Note the references to P3 for the actual master; the 709 trim pass is created downstream when producing the consumer video master.
Thanks given by: Audiovisualmente (08-30-2023), fkid (08-31-2023), Grain Man (08-30-2023), Labor_Unit001 (08-31-2023), Macatouille (08-30-2023), Matt89 (11-09-2024), mrtickleuk (09-01-2023), professorwho (08-30-2023), Riddhi2011 (11-11-2024), RoboDan (11-09-2024)
#727 | Senior Member

Spider-Man 2 (2004)

Code:
Mastering display luminance       : min: 0.0050 cd/m2, max: 4000 cd/m2
Maximum Content Light Level       : 4471 cd/m2
Maximum Frame-Average Light Level : 1007 cd/m2

Gamut Visualizations (Album)
HDR10 Plot

Last edited by Macatouille; 08-31-2023 at 08:03 PM.
#728 | Senior Member

Significantly toned down from the first movie, even if it's still pretty strong: MaxCLL is down from almost 9k to 4k nits, and MaxFALL from 2k to 394 nits.

Brightest highlights from my shots:

With regard to the WCG, there were only two significant instances (both in Rec. 2020, no less):

And then a handful of extremely minor ones, like these:
#729 | Active Member

That field normally shows vanilla YCbCr. I guess there is something in the video signal letting the chain know that the disc is xvYCC (at least in theory).

That's Spider-Man 1 Mi4K, BTW... the colour bit depth is just being padded by my Oppo.

Last edited by trevorlj; 08-30-2023 at 03:02 PM.
Thanks given by: mrtickleuk (09-01-2023)
#731 | Blu-ray Baron

So it seems this is perfectly capable of being rendered with great results in plain old HDR10, if your display has the right blend of hardware capability and software processing.
#732 | Member

I'm curious what a DI that was explicitly graded for the filmout from that era would look like in this regard. Eternal Sunshine of the Spotless Mind would probably be a good test case, since we know that Kino's release was scanned from a filmout rather than pulling directly from the DI.

Given the way xvYCC works, those "Mastered in 4K" Blu-rays *should* even display as WCG SDR on a compatible system.

Last edited by Azurfel; 08-30-2023 at 07:48 PM.
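For the curious: xvYCC gets its wider gamut roughly by *not* clipping the RGB values that fall outside [0, 1] after the ordinary BT.709 YCbCr-to-RGB conversion; those out-of-range values correspond to chromaticities beyond the 709 primaries. A toy illustration using the full-range BT.709 conversion (real discs are limited-range, and the sample values here are made up, so this is illustrative only):

```python
import numpy as np

def ycbcr709_to_rgb(y, cb, cr):
    """Full-range BT.709 YCbCr -> RGB, without clipping.

    An xvYCC-aware chain keeps results outside [0, 1] instead of
    clamping them; that head-room and foot-room *is* the wide gamut.
    """
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return np.array([r, g, b])

# A hypothetical sample whose chroma pushes red past the 709 primary:
rgb = ycbcr709_to_rgb(0.5, -0.05, 0.4)
clipped = np.clip(rgb, 0.0, 1.0)  # what a non-xvYCC display would show
```

Here `rgb[0]` lands above 1.0: an xvYCC-aware display can render that as a red more saturated than the BT.709 primary, while a non-aware chain clamps it back to plain 709, which is consistent with the "should display as WCG SDR on a compatible system" point above.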
Thanks given by: Grain Man (08-31-2023)
#733 | Blu-ray Guru | Apr 2019

So we're seeing some 4K BDs with extremely high brightness, around 9k MaxCLL and 2k MaxFALL. I have to wonder:

* Are there really any mastering displays capable of showing those kinds of light levels? If not, why boost the brightness that much when the result cannot be judged correctly by those responsible for the production of the 4K BD anyway?
* Not even Geoff D's Sony Light Cannon™ TV is able to display 9k/2k brightness, so those levels will currently not be shown to anyone. But sometime in the future we will have displays capable of 10k nits, so how do we expect these movies to be handled then? Will the viewer be blinded, will TVs add some "brightness reduction mode", or how will it be dealt with?

Please share your thoughts.
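On the "how will displays handle it" question: TVs already tone-map rather than hard-clip, typically with a highlight roll-off in the spirit of BT.2390. A simplified, luminance-only sketch; the `display_peak` and `knee` parameters are made-up illustration values, not anything from a real TV:

```python
def tone_map(nits, display_peak=1500.0, knee=0.75):
    """Luminance-only tone-mapping sketch.

    Pass luminance through unchanged below a knee point, then roll
    off smoothly so the output approaches (but never reaches) the
    display's peak. A 9000-nit highlight thus still reads as "very
    bright" instead of clipping to a flat white patch.
    """
    start = knee * display_peak      # luminance where the roll-off begins
    if nits <= start:
        return nits
    headroom = display_peak - start  # output range left above the knee
    excess = nits - start            # input overshoot past the knee
    return start + headroom * excess / (excess + headroom)
```

The curve is continuous at the knee and monotonic, which are the main properties any such roll-off needs; real TVs use considerably more sophisticated, often scene-adaptive curves. A future 10k-nit display would presumably just move the knee up rather than blind anyone.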
#735 | Blu-ray Guru | Apr 2019

Quote:
"all about reducing input lag and improving the gaming experience. When enabled, the mode disables your TV's various processing effects, such as motion smoothing and noise reduction, to minimise latency and boost responsiveness."

And lower brightness is part of that, since lower brightness means less time for the pixel to switch its colour?
#737 | Senior Member

Going to do Spider-Man 3 tomorrow, then Last Jedi on Friday (by request). Next week I have Inglourious Basterds and Out of Sight planned.
#738 | Senior Member

Scarface (1983)

MaxCLL plot
MaxFALL plot
MLL peaks
FALL peaks
Gamut Visualisation

The shots in the peak sections are representative of the MLL & FALL peaks, not the averages.
Thanks given by: aphid (08-31-2023), Grain Man (08-31-2023), Labor_Unit001 (09-05-2023), professorwho (08-31-2023), teddyballgame (09-04-2023)
#739 | Senior Member

Exactly. The times I notice a larger gamut are usually when it's gone too far. It's partly why I'm hesitant to post gamut visualisation maps: I feel like I'm contributing to the idea that more chromatic = better. A larger colour gamut should aid in faithfully reproducing aesthetic intent, not showcase technological advances.

The argument is that real-world reflective and emissive light levels are far brighter, and while that is true, you have to keep in mind that our eyes take time to adapt to different luminance ranges. If you've been outside for a while, thousands of nits are no issue, but shut your eyes for a few minutes and see how uncomfortable the light is once you open them. Likewise, scenes in a film can arbitrarily jump from an average of 1 nit to hundreds or thousands.

I think 1000 nits is more than we'll ever need. Almost all content that exceeds 1000 nits has been clipped in the grading tone curve anyway. We can all agree that 1000 nits is bright, so why do we need brighter for the sake of brightness? Even putting that aside, 10,000 nits would have a significant impact on the lifespan of LEDs and might even require active cooling.

I'm glad people are realising that the endeavour for more and more nits, at least for consumer displays, is misguided. We're seeing it with UHDs too: many of the early UHDs had insanely bright peaks and averages, but now that the novelty has worn off, we're seeing much more reasonable presentations.
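For context on the "clipped in the grading tone curve" point: HDR10 stores PQ (SMPTE ST 2084) code values, which map the normalised [0, 1] signal range onto an absolute 0-10000 nit scale, so the container always has head-room above what a grade actually uses. A small sketch of the PQ EOTF, with constants taken straight from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Map a normalised PQ signal value e in [0, 1] to absolute nits."""
    p = e ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)
```

Signal value 1.0 maps to exactly 10000 nits and roughly the top quarter of the code range covers everything above 1000 nits, which is why a grade that tops out at 1000 nits still leaves most specular detail intact while the extreme head-room goes unused.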
Thanks given by: Labor_Unit001 (09-05-2023)
#740 | Active Member

So yeah... I agree that more nits is not always better, but I can't agree that the fun stops at 1000 nits, or at something like a 220-nit average. Not after what I've seen.