#1 | Blu-ray Guru | Apr 2019
Hello,
There's a recurring discussion about whether pronounced grain is a problem on 4K UHD BD (especially for older movies with more / coarser grain). The usual argument is that it's not a problem, because the grain is present in the original film negative, so keeping it visible is also closer to what the movie would have looked like when shown in movie theaters.

However, there's another aspect to this: the difference between 4K BD and the movie theater is that the former often has much higher brightness due to HDR. That also means higher contrast, and more noticeable grain (a rough numerical sketch of this follows below). So if a 4K BD should aim to keep the grain as visible as it was when the movie was originally shown in the theater, there are two options:

1. Use the same brightness level as the theatrical presentation. This means not making full use of the HDR capabilities available on 4K BD.
2. Use DNR.

Which of those options should be preferred? Or is this the option to select:

3. Accept that grain will be more noticeable on 4K BD than when the movie was originally shown in the theater.

But if we go with 3), then we have to give up the aim of having the grain on the 4K BD match how it originally looked in the theater, i.e. we accept that the grain will be more pronounced on the 4K BD. Is that what we want, and if so, why? Any thoughts on this would be appreciated.
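To put a rough number on the brightness point, here is a back-of-the-envelope Python sketch. All figures in it are invented for illustration (the ~3% grain level, the mid-grey scene level, the 100/1000 nit peaks), and the linear scaling to display light is a deliberate simplification, not a real transfer function:

```python
import numpy as np

rng = np.random.default_rng(0)

base = 0.5                                  # mid-grey scene level, normalized 0..1
grain = 0.03 * rng.standard_normal(10_000)  # ~3% RMS grain (an invented figure)
signal = np.clip(base * (1.0 + grain), 0.0, 1.0)

# Naive linear scaling to display light; real transfer functions (gamma, PQ)
# would change the exact numbers but not the direction of the effect.
for label, peak_nits in [("SDR-like, 100 nit peak", 100.0),
                         ("HDR-like, 1000 nit peak", 1000.0)]:
    nits = signal * peak_nits
    print(f"{label}: grain swing ~ +/-{nits.std():.1f} nits "
          f"around {base * peak_nits:.0f} nits")
```

Same relative grain in both cases, but ten times the absolute light fluctuation at the higher peak, which is one way of stating the "HDR makes grain more noticeable" worry.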
#2 | Blu-ray Knight
No they’re not, hope this helps.
Thanks given by: deme1138, DR Herbert West, eXtofer, Fodderstompf, frogmort, gigan72, ImBlu_DaBaDee, imnoteventhatfunny, jerclay, MartinScorsesefan, MifuneFan, mikeainslie, nateynate87, ntotoro, starmike, steel_breeze, teddyballgame
#4 | Blu-ray Guru | Apr 2019

[image post]
#5 | Blu-ray Count
No.
Thanks given by: mikeainslie
#6 | Blu-ray Samurai
lol
Thanks given by: mikeainslie
#7 | Special Member
Did you bump your head this morning?
Thanks given by: mikeainslie
#8 | Blu-ray Knight | Feb 2012 | NJ
Uses "haters" in the title. Oh, this will end well.
Thanks given by: eXtofer, mikeainslie
#9 | Blu-ray Ninja
If anything, films shot on (and projected from) film would likely have been far grainier in the theater, because you weren't watching the camera negative run through the projector (the negative being a first-generation element, hence the least amount of grain possible).
Back in the day you were watching a release print, which could be several generations removed from the original camera negative and could be far noisier/grainier and less sharp than the OCN. I've seen really crummy release prints, only to be blown away when a later home video release scanned from the OCN revealed that what I thought was a rough-looking movie was actually pin-sharp, crystal clear, and wonderfully saturated. Theatrical prints varied WILDLY in quality when things were still being shown on film. And I mean WILDLY. Sometimes a Blu-ray/4K from the camera negative is a downright revelation for those of us who only saw some of these titles in a theater.

Folks are far too caught up in nitpicking picture quality and perceived differences in grain structure, when they should just enjoy the damn movie for crying out loud.
Thanks given by: mikeainslie
#10 | Blu-ray Samurai
Watching a 35mm projection is completely different from watching something at home via projector, OLED, or whatever. There are so many variables that there will always be differences between 35mm prints, or even between the same print run through different projectors. Film and digital are two completely different mediums, and there will always be differences between the two, whether out of necessity, limitation, or choice.
Theatrical brightness levels in a home would look horrible and incredibly dull, because it's a completely different setting: the lower brightness of theatrical projection is compensated for by a giant-ass screen filling your visual field in a dark room. This doesn't need to be a "grain haters" vs. "grain fetishists" kind of showdown.

What seems to help is grading that tries to emulate the photochemical look via film-like curves. Once that's in place, it shouldn't matter how bright the image is, since everything stays relative and the grain is essentially forced to react the way it does on film (so it shouldn't spike or read as a "layer of grain" sitting on top). A toy sketch of that idea follows below.

Edit: Besides, if this thread is referring to some of my recent posts: scanners have improved greatly compared to what I was talking about. There is far more leeway when grading from Director scans without uncovering digital noise or unnaturally exacerbating grain. On something like the Scanity (which isn't being used anywhere near as much as the Director or ARRI for newer UHD releases), I've been told the blue channel always needed noise reduction because of how much noise it contained in underexposed areas of the film.

Last edited by JohnCarpenterFan; Today at 07:44 PM.
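For what it's worth, here is a toy Python sketch of the "film-like curve" idea. The S-curve below is a generic logistic stand-in, not any studio's actual grading curve, and the ~2% grain level is invented; the point is just that grain riding on the image gets compressed wherever the curve flattens (the toe and shoulder), instead of sitting on top as a uniform layer:

```python
import numpy as np

def s_curve(x, a=6.0):
    """Generic logistic S-curve rescaled to map 0..1 onto 0..1 (a stand-in
    for a film-like grading curve, not any real LUT)."""
    lo = 1.0 / (1.0 + np.exp(a / 2))
    hi = 1.0 / (1.0 + np.exp(-a / 2))
    return (1.0 / (1.0 + np.exp(-a * (x - 0.5))) - lo) / (hi - lo)

rng = np.random.default_rng(0)
grain = 0.02 * rng.standard_normal(100_000)   # ~2% RMS grain (invented figure)

# Grain on shadows and highlights lands on the flat toe/shoulder of the curve
# and comes out compressed; grain on mid-tones lands on the steep part.
for level in (0.1, 0.5, 0.9):
    out = s_curve(np.clip(level + grain, 0.0, 1.0))
    print(f"scene level {level:.1f}: grain RMS after curve ~ {out.std():.4f}")
```

The absolute numbers are meaningless; what matters is that the same input grain comes out smaller in the toe and shoulder, which is roughly how grain behaves photochemically.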
Thanks given by: mikeainslie
#12 | Senior Member
However, you are hitting close to something I've thought about before. Default TV settings usually have boosted sharpness, and that can indeed make grain appear more intrusive than it's supposed to be (a rough sketch of the effect follows below). That's why AV enthusiasts -- the ones who bother to change their TV's default settings -- tend to have a more positive view of film grain, whereas regular Joes -- who don't even know there are settings to change -- tend to be the ones up in arms about it. In a nutshell, most grain complaints can be resolved by setting Sharpness to 0.

So while I disagree with all of your reasoning, I do agree with your premise. Grain haters may be right about the grain as it appears on their own TVs. Their only fault may be in erroneously extrapolating their own experience to others who literally don't see the problem.
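Here is a rough Python illustration of why the Sharpness control matters so much for grain. The 5-tap blur and the gain values are invented, not any TV's actual processing, but most sharpening is some unsharp-mask variant, and grain is nearly pure high-frequency detail, so it gets boosted more than anything else in the picture:

```python
import numpy as np

rng = np.random.default_rng(0)
row = 0.5 + 0.02 * rng.standard_normal(2_000)   # flat grey scanline + ~2% grain

def unsharp(x, gain):
    """Basic unsharp mask: boost whatever a small blur removes."""
    blur = np.convolve(x, np.ones(5) / 5.0, mode="same")
    return x + gain * (x - blur)

for gain in (0.0, 1.0, 2.0):                    # gain 0 ~ "Sharpness 0"
    print(f"sharpen gain {gain:.0f}: grain RMS ~ {unsharp(row, gain).std():.4f}")
```

Even modest sharpening roughly doubles the grain amplitude in this sketch, which fits the "set Sharpness to 0" advice.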
Thanks given by: mikeainslie
#13 | Senior Member

[image post]

Thanks given by: ImBlu_DaBaDee, mikeainslie
Tags: 4k bd, grain, uhd