Old 10-17-2008, 02:25 AM   #5561
pellucidity
Member
 
Feb 2008

Quote:
Originally Posted by Bobby Henderson View Post
I think Apple customers have a right to be pretty angry about this.
I want to agree, but at this point I don't want Blu-ray in my computer enough to root for the DRM needed; I'm still hoping Steve's tantrums pay off again.

It would look pretty funny for Apple to support authoring discs but not playing them back, but if that's what it takes, I hope that is what they do.
 
Old 10-17-2008, 05:50 AM   #5562
Bobby Henderson
Power Member
 
 
Jan 2008
Oklahoma

Quote:
Originally Posted by Bobby Henderson
I think Apple customers have a right to be pretty angry about this.
Quote:
Originally Posted by Rob Tomlin
What about their potential customers, like me?
Potential customers looking for a new computer to use in making Blu-ray discs may choose to buy a different kind of machine. They can still go with an Apple-branded machine, but they'll have to use entirely third-party software and hardware to create the Blu-ray discs. Adobe has some information on their website about using Production Studio and/or Encore for creating Blu-ray discs on the Mac platform.

Quote:
Originally Posted by pellucidity
I want to agree, but at this point I don't want Blu-ray in my computer enough to root for the DRM needed; I'm still hoping Steve's tantrums pay off again.
Apple just wants their own flavor of DRM to dominate. They're one company against many other companies in movie production and electronics manufacturing as well as many rival computing companies who have already fit Blu-ray into their products.

Apple can pursue this gamble if they like. The strategy is not without possible consequences.

Apple tried pulling the same kinds of stunts in the early 1990s when 3D modeling and animation tools were rapidly maturing. Their "Not Invented Here" policy of snubbing any outside technology had Apple pushing the QuickDraw 3D API. Apple figured its sexy brand would be enough to win over everyone. Many casual computer users are taken in by fashion. Industrial-level users are not.

Silicon Graphics made deals with numerous companies, including Microsoft, to make OpenGL the industry standard for 3D acceleration. Apple refused to support OpenGL until the end of the 1990s, only after the Windows platform had grown to dominate professional 3D animation and computer gaming.

Quote:
Originally Posted by pellucidity
It would look pretty funny for Apple to support authoring discs but not playing them back, but if that's what it takes, I hope that is what they do.
They're going to lose some customers if they insist on going that route. Consumers don't give one iota of care about licensing issues, corporate politics, or any other boardroom garbage. They just want the computer to do what they expect it to do. If they want the MPEG-4 AVC HD footage they burned onto a BD to play in their new computer, that new computer had better play it. If the machine cannot do that, then they will buy a different brand of machine next time.
 
Old 10-17-2008, 04:03 PM   #5563
Penton-Man
Retired Hollywood Insider
 
 
Apr 2007

Quote:
Originally Posted by Penton-Man View Post
http://www.youtube.com/watch?v=YBH7kjQISks

So, nobody recognizes #4 ?
Probably one of his greatest contributions to cinematic society is that he taught students at the USC film school (of which he is an alumnus) for over 15 years, holding the rank of full professor at the time.

When I get the chance, I’ll try to find a webpage listing some notable lenser USC alumni to help folks focus their thinking.
He’s listed on this page further on down, heck he may even be talking on the video clip here…………
http://cinema.usc.edu/alumni/alumni-history/
not sure though, as I haven’t watched it yet.
 
Old 10-17-2008, 04:08 PM   #5564
Penton-Man
Retired Hollywood Insider
 
 
Apr 2007

^
It looks like you have to click on that speaker thingee on the right bottom part of the clip to hear any sound.
 
Old 10-17-2008, 04:14 PM   #5565
Penton-Man
Retired Hollywood Insider
 
 
Apr 2007

Quote:
Originally Posted by Penton-Man View Post
I will say, however, that since I know the screenshot *scientists* absolutely abhor even the thought of noise reduction or detail sharpening applied to a movie, and we all know that they read this forum and this thread religiously (despite never acknowledging it)……………

Journey to the Center of the Earth, which streets in a couple weeks, had both types of digital image processing applied, so happy artifact hunting all you magnifying glass types out there………….while the rest of us film aficionados enjoy the movie, in real time.
And along that same simpleton mindset of *scientists* who profess to be experts in grain reduction but obviously have never seen a film scan of 35mm, or even an archival video master of the same material, I can just hear them grinding their teeth when they read RAH’s review of Dr. No……………………

http://www.hometheaterforum.com/htf/...o-blu-ray.html

Brace yourselves, DNR screenshot simpletons, as all 20 of dem James Bond titles have had some degree of grain reduction. On that note, John L. has his own very interesting definition of "grain," which I can elaborate on a bit later.
 
Old 10-17-2008, 05:05 PM   #5566
JimSD
Expert Member
 
 
Sep 2007

Quote:
Originally Posted by Penton-Man View Post
And along that same simpleton mindset of *scientists* who profess to be experts in grain reduction but obviously have never seen a film scan of 35mm, or even an archival video master of the same material, I can just hear them grinding their teeth when they read RAH’s review of Dr. No……………………
It's already starting:
http://www.avsforum.com/avs-vb/showt...9#post14884969
 
Old 10-18-2008, 12:43 AM   #5567
Penton-Man
Retired Hollywood Insider
 
 
Apr 2007

Quote:
Originally Posted by JimSD View Post
A “little bit of DNR work” causes “clumpy” grain, eh?

Hell, the standard graininess of a color negative is random clumps of dye clouds.

The way the whole process works is that when a color film is exposed to light, the photons hit the silver halide crystals in the film emulsion and then some valence electrons get knocked loose. When the exposed film is then processed, the silver halide crystals with enough missing electrons become developed silver crystals and activate dye couplers in their surrounding areas……….forming clumps of dye-clouds around the crystals.

Then when the film processing is completely finished, the silver is washed away by a fixer and random clumps of dye clouds remain.
So, for those who watch their movies freeze-framed with magnifying glasses…………….clumps are normal in the routine photochemical process and occur with no digital processing whatsoever.

Perhaps what some of these *scientists* really desire is to see the silver still left on the image and for that you need ‘bleach bypass’.
For *scientists* who watch movies at such a subpixel level, I could then appreciate that some grain connoisseurs would actually find the silver retention process more attractive than the standard graininess of color negative, because of the difference between grains of silver versus clumps of dye clouds.

This fixation some hobbyists have with seeing grain particles really amazes me, because, based on filmmakers’ wishes, the trend at Kodak for decades has been to make finer and finer grained film (thus the current Vision3 stock).

P.S.
Oh, and guess what happens to the number and thusly the appearance of those clumped dye clouds (grain particles) when somebody applies digital grain reduction.

^
I bolded the operative word.

Last edited by Penton-Man; 10-18-2008 at 12:46 AM. Reason: added a P.S.
 
Old 10-18-2008, 02:08 AM   #5568
CptGreedle
Blu-ray Ninja
 
 
Jul 2007
Sworn super-hero now services Atlanta (and suburbs).

Well, I am a Professional Imaging Technician for the National Geographic Society (really, no joke). I work with ultra-high-quality images every day.
I also work with low-quality images, image restoration, compilation, enhancement, and more.
While the film I look at is not the same as the film used in movies, they are both made by the same companies, like Kodak.

I get a very up-close, personal look at images and the grain involved. The images you see on an HD screen are 1920x1080, in red, green, and blue. They are compressed, but each image, if uncompressed, would be about 6 MB per frame. The average files I work on are 50 MB-100 MB each. In other words, I am working with files much bigger than what you see. Looking closely at an image, you can really see its grain. And if there is one thing that is consistent in all images, it is grain. Without grain, you lose all the detail, all the subtle nuances of an image. Even digital photography adds digital grain to the image to allow for details in shadows and such.
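For anyone who wants to sanity-check those numbers: an uncompressed 8-bit RGB frame is just width x height x 3 bytes. A quick sketch (the large-scan dimensions below are my own illustrative assumptions, not our actual workflow):

Code:
# Back-of-envelope image sizes (illustrative numbers only).

def uncompressed_mb(width, height, channels=3, bytes_per_channel=1):
    """Size of one uncompressed image in megabytes."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

print(round(uncompressed_mb(1920, 1080), 1))        # ~5.9, the "about 6 MB" HD frame
print(round(uncompressed_mb(5000, 4000, 3, 2), 1))  # ~114.4, a 20 MP scan at 16 bits/channel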
Up close, say at 300%, the grain is everywhere; it is strong and makes the image almost look like an impressionist painting. But when you look at it at 100%, or print size (closer to 50%), the grain just melts into the image, and all the beauty of the colors and details comes out.

Any attempt to reduce the grain literally destroys the image. Sometimes a customer requests less noise, and there are various ways to get a good result, but they are never better than the original, in my opinion.
 
Old 10-18-2008, 03:46 PM   #5569
pellucidity
Member
 
Feb 2008

I'm well aware of what Apple was in the 90s. That said, I think a lot of things changed when Steve Jobs returned, and Apple's behavior with standards and open source has been far better than that recently: look at WebKit.

Quote:
Originally Posted by Bobby Henderson View Post
Apple just wants their own flavor of DRM to dominate. They're one company against many other companies in movie production and electronics manufacturing as well as many rival computing companies who have already fit Blu-ray into their products.

Apple can pursue this gamble if they like. The strategy is not without possible consequences.
I'm sorry, but in your main point you are comparing apples and oranges. Besides the DVD Player, OS X contains media DRM in one place only, as far as I am aware: where the QuickTime framework handles FairPlay authorization. It has the power to play or not play files, but no more. This is a far cry from what is required to allow Blu-ray playback under the BDA's rules.

Drivers would have to be cryptographically signed and trusted. Display devices would need to authenticate with HDCP. The window server would need to support mixing encrypted layers with non-encrypted layers.
All of this requires them to spend money to deliver features that don't actually do anything for you, and that would degrade performance. But the worst of it is, a hypervisor or other privileged process would be needed to control what's encrypted and what isn't.
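To make that concrete, here is a hypothetical sketch of what a protected-path policy check boils down to (the component names are invented for illustration; this is not a real OS X, AACS, or BDA API):

Code:
# Hypothetical "protected media path" gatekeeping; names are invented.

class Component:
    def __init__(self, name, signed, revoked=False):
        self.name, self.signed, self.revoked = name, signed, revoked

def playback_allowed(chain):
    """Every link must be signed/trusted and not revoked, or the player
    refuses to hand decrypted frames to the next stage."""
    return all(c.signed and not c.revoked for c in chain)

chain = [
    Component("disc drive firmware", signed=True),
    Component("graphics driver", signed=True),
    Component("window server", signed=False),   # one untrusted link...
    Component("HDCP display link", signed=True),
]
print(playback_allowed(chain))  # False: ...and the whole chain is refused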

There is the crucial difference. Yes, Apple cripples the user-level tools to try to prevent you from screenshotting DVDs (as they are supposed to do), but you can still read the contents of the screen easily with a third-party program. If you're smart you can get the lossless digital waveform from your own DRM'd iTunes music. If you are very smart there are ways to get the unencrypted compressed song without actually hacking. They're making a token effort, but you are still root and you can still read the memory and everything going to the display and sound card. You are still in control.

For Blu-ray playback Apple would ostensibly be required to create parts of your computer you don't control, so they can be trusted. If you control it, Big Content doesn't trust it. We can accept this in embedded devices, even in gaming consoles: it reduces piracy and makes it harder to cheat at games. But I believe we look at our computers differently. We expect to control them. 'Trust' means no open-source, for one thing, since you compiled it and they don't trust you.

All of this is especially pointless given that AACS has been proved vulnerable in theory, and software players have been compromised. The next step is a hardware device popular enough that the BDA will be reluctant to revoke it (e.g. the PS3 or S300) being attacked and giving up its keys. Even if AACS were unbreakable and the content protection worked, it would just mean you'd have to take the decompressed output, and you could attack HDCP. If HDCP were also invulnerable, you could Van Eck phreak the pixels off your display, a total pain but better than the last resort: a camera. We all know they can never protect content absolutely; people will still pirate some things and buy others. But this time they want some serious infrastructure in place, and we'll have considerably less control of our machines.

What I am really saying is: "This feature does not belong in my OS, if these are the conditions." I realize this is a forum of Blu-ray enthusiasts who might not be aware of the concern over Trusted Computing, so if Penton thinks this has gone on too long, I'll be happy to end it. As must be obvious by now, I feel strongly about this, and I am genuinely curious: would you accept the loss of control of your system for this? Is TPM/Palladium really where you want to go tomorrow?
 
Old 10-18-2008, 04:11 PM   #5570
Penton-Man
Retired Hollywood Insider
 
 
Apr 2007

Quote:
Originally Posted by CptGreedle View Post
Well, I am a Professional Imaging Technician for the National Geographic Society (really, no joke). I work with ultra-high-quality images every day.
I also work with low-quality images, image restoration, compilation, enhancement, and more.
While the film I look at is not the same as the film used in movies.....
Nice contribution, and I also appreciate film grain, so much so that I like B&W grain more than color grain, and color reversal grain more than color negative grain (really, no joke).

However, I was addressing the conclusion by a poster on another forum who assumes that DNR causes grain to be more “clumpy”.
Hell, re-graining of a linear patch after manual scratch removal doesn’t even cause “clumpy” grain, if good motion estimation tools are available to the colorist.

Keep in mind that your attitude with regard to grain must also be reconciled with the journey from the appearance of grain in 35mm theatrical exhibition at a Cineplex to digital presentation on high-def home media through a typical home theater display……. a journey that involves several steps which can accentuate the eventual appearance of grain on Blu-ray movies (compared to the theatrical presentation) if it is not mitigated to some degree.

One of the early steps in that chain is the fact that when you scan some film stocks, exposed in certain ways during principal photography, the grain seems to be emphasized and looks more prominent than it does in film-to-film optical printing with a resultant release print (release prints being the eventual source for your local Cineplex, and what the filmmaker anticipated as the public-viewing ‘gold standard’ for his creative intent with regard to grain, color, etc. when he was making the movie in the first place).

I think I mentioned in a post several months ago that, in my experience, when there has been an opportunity to show the 4:2:2 source of some feature films in a screening room on a typical high-end consumer display to the filmmaker…….he/she is quite surprised at how grainy the appearance of their film can sometimes be, especially with films shot on push-processed 500 ASA stock.

And the surprise is not one of delight but of concern, because what they see on current 1080p displays with the archival video masters running is, in many cases, not what they intended audiences to see, and not the appearance of the theatrical presentation, in which the grain is mitigated by film-to-film optical printing (or the film-out after a DI process) and by the inherent nature of theatrical presentation, with its weave, optics, and projector light output further down the chain.

Do you feel me………..do you read me……tell me, am I getting through to you?
http://www.youtube.com/watch?v=L2hyPSt05ek
 
Old 10-18-2008, 04:48 PM   #5571
dialog_gvf
Moderator
 
 
Nov 2006
Toronto

Quote:
Originally Posted by Penton-Man View Post
I think I mentioned in a post several months ago that, in my experience, when there has been an opportunity to show the 4:2:2 source of some feature films in a screening room on a typical high-end consumer display to the filmmaker…….he/she is quite surprised at how grainy the appearance of their film can sometimes be, especially with films shot on push-processed 500 ASA stock.

And the surprise is not one of delight but of concern, because what they see on current 1080p displays with the archival video masters running is, in many cases, not what they intended audiences to see, and not the appearance of the theatrical presentation, in which the grain is mitigated by film-to-film optical printing (or the film-out after a DI process) and by the inherent nature of theatrical presentation, with its weave, optics, and projector light output further down the chain.

Do you feel me………..do you read me……tell me, am I getting through to you?
What happened to the DP in all this? Isn't it part of their JOB to understand and discuss this?

Gary
 
Old 10-18-2008, 05:48 PM   #5572
Bobby Henderson
Power Member
 
 
Jan 2008
Oklahoma

Quote:
Originally Posted by pellucidity
I'm sorry, but in your main point you are comparing apples and oranges. Besides the DVD Player, OS X contains media DRM in one place only, as far as I am aware: where the QuickTime framework handles FairPlay authorization. It has the power to play or not play files, but no more. This is a far cry from what is required to allow Blu-ray playback under the BDA's rules.
Yet the Windows platform is able to make Blu-ray work, with or without the OS playing a critical part in the equation. Apple's inability or unwillingness to support Blu-ray could make Apple look bad in the eyes of customers shopping for a new computer.

I can sympathize with the licensing difficulties of accommodating Blu-ray. But this isn't much different than the many hundreds of licensing and patent issues computer manufacturers have had to deal with all throughout their histories. In the end, the customers are going to go with the computing brand that delivers the computing abilities they want.

Apple prides itself on delivering cutting edge quality computers that are also easy to use. It's a cornerstone to the company's advertising and a critical part of its image.

In recent years Apple has been behind the curve on the cutting-edge thing. The PC platform was ahead of Apple on symmetric multiprocessing, SLI-based accelerated graphics, OpenGL, and on and on. Blu-ray is yet another feature the PC platform now holds over Apple. It's bad enough that professional Blu-ray authoring is happening entirely on lowly Windows PCs and that Adobe is posing a greater threat to Final Cut Studio on the Mac platform. This situation is just not good at all. I believe Apple needs to deal with this problem urgently before it loses a lot of FCP customers to Adobe, or even loses Mac users to the PC platform.

If I'm in the market for a bleeding-edge desktop tower or notebook, I'm going to demand that the machine have a Blu-ray burner in it and that the drive can do everything from playing Blu-ray movies to burning BD-R discs. A list of reasons why Blu-ray can't work with a certain kind of computer is only going to result in me not buying that kind of computer.
 
Old 10-18-2008, 11:42 PM   #5573
4K2K
Special Member
 
Feb 2008
Region B

Quote:
Originally Posted by Penton-Man View Post
I think I mentioned in a post several months ago that, in my experience, when there has been an opportunity to show the 4:2:2 source of some feature films in a screening room on a typical high-end consumer display to the filmmaker…….he/she is quite surprised at how grainy the appearance of their film can sometimes be, especially with films shot on push-processed 500 ASA stock.

And the surprise is not one of delight but of concern, because what they see on current 1080p displays with the archival video masters running is, in many cases, not what they intended audiences to see, and not the appearance of the theatrical presentation, in which the grain is mitigated by film-to-film optical printing (or the film-out after a DI process) and by the inherent nature of theatrical presentation, with its weave, optics, and projector light output further down the chain.
There are quite a lot of posts in the forum where people say that film has a much higher resolution than 1920x1080 (I'm not talking about the resolution people might actually see at a poor-quality cinema). I don't like too much DNR or other processing that removes resolution and detail and makes people look waxy or like CGI. But if film (the original negative, or as close to it as possible) really has much higher resolution than 1920x1080, shouldn't the grains be invisible even without DNR or other processing? In that case, unless the film they scan is many generations old, a Blu-ray or PAL DVD shouldn't show film grain at all, just an incredibly detailed, sharp picture (unless they softened the picture purposely).

Are there types of film where, when the negative is scanned, each film grain would be less than or equal to a quarter the size of a 1920x1080 pixel, so that the film grains themselves would be totally invisible on Blu-ray without any DNR or other processing (is this possible with 35mm or 65mm)?
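For scale, here is what that would mean in microns on the negative, as a rough back-of-envelope (my own illustrative numbers; actual frame gauges and grain sizes vary widely):

Code:
# Pixel pitch on the negative when 35mm is scanned to 1920 columns
# (illustrative figures; actual frame gauge and grain sizes vary).

FRAME_WIDTH_MM = 24.9   # approx. 35mm full-aperture frame width
HD_COLUMNS = 1920

pixel_um = FRAME_WIDTH_MM * 1000 / HD_COLUMNS
print(round(pixel_um, 1))      # ~13.0 microns of negative per 1080p pixel

# A grain covering <= 1/4 of that pixel's area would have to be
# under about half the pixel pitch across.
print(round(pixel_um / 2, 1))  # ~6.5 microns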

Last edited by 4K2K; 10-19-2008 at 03:37 AM.
 
Old 10-19-2008, 12:44 AM   #5574
DaViD Boulet
Blu-ray Guru
 
Jan 2007
Washington, DC

Hey guys,

I've been out of the loop on this board for a while, so please forgive me if I'm asking a question that's already been belabored and answered. But just to save me hours of thread-sifting, can anyone tell me if we ever got a "why" as to the problem with Warner putting out lossy audio on recent Blu-ray Discs like Cool Hand Luke and Interview with the Vampire?

Have we gotten assurances that this nonsense will stop? I'm about to post my review of Interview at dvdfile and I'm *fuming* mad at their mishandling of this disc. I'm particularly upset because it doesn't seem (from their history) that Warner is even really aware that there's a problem with not providing lossless in the first place.
 
Old 10-19-2008, 02:03 AM   #5575
Robert Harris
Senior Member
 
 
Oct 2007

Quote:
Originally Posted by DaViD Boulet View Post
can anyone tell me if we ever got a "why" as to the problem with Warner putting out lossy audio on recent Blu-ray Discs like Cool Hand Luke and Interview with the Vampire?

Have we gotten assurances that this nonsense will stop? I'm about to post my review of Interview at dvdfile and I'm *fuming* mad at their mishandling of this disc. I'm particularly upset because it doesn't seem (from their history) that Warner is even really aware that there's a problem with not providing lossless in the first place.
The time it takes to get a major release out is a good six months, if not more, which means that these films were already in the hopper.

Warner has received the message and has already been making changes.
 
Old 10-19-2008, 04:29 AM   #5576
DaViD Boulet
Blu-ray Guru
 
Jan 2007
Washington, DC

Thanks RAH.

That makes sense.

I disagree with Warner's priorities, however. In my non-professional opinion, Warner should have delayed these releases in order to prepare new encodes with lossless audio rather than continue to shove existing, sub-par encode files through the pipeline. If a title can be pulled off the shelf and re-authored because some special feature doesn't work properly on a subset of players, I would consider a non-optimal audio track to be reason enough as well, especially considering the value that the current BD market places on both image and sound fidelity to the source.

I had really hoped to hear a full lossless track for Interview that would finally put my laserdisc to shame. Still waiting!



Thanks for the explanation, and hopefully we'll see Warner's future discs bear out the results. We've already been made promises on catalog titles that clearly weren't kept with these releases... unless of course when Warner reps said "all future titles" they meant "all future titles that do not yet have prepared, authored files"...

Hopeful for the future!

dave
 
Old 10-19-2008, 05:09 AM   #5577
Bobby Henderson
Power Member
 
 
Jan 2008
Oklahoma

Quote:
Originally Posted by 4K2K
I don't like too much DNR or other processing that removes resolution and detail and makes people look waxy or like CGI. But if film (the original negative, or as close to it as possible) really has much higher resolution than 1920x1080, shouldn't the grains be invisible even without DNR or other processing? In that case, unless the film they scan is many generations old, a Blu-ray or PAL DVD shouldn't show film grain at all, just an incredibly detailed, sharp picture (unless they softened the picture purposely).
Lots of variables are involved. Sorry if that sounds like an incomplete answer, but that's about where it rests. You have different kinds and speeds of film stock. Camera systems have many different lens systems, some of which are custom-ground for a particular feature. Different exposure settings, aperture settings, and lighting levels further affect the shooting situation.

Nonetheless, whatever lies in the original negative has all the image detail that's ever going to be made available. Grain reduction will simply further reduce whatever detail is present in the original image.

Some people may not be blown away by the vintage image quality of a movie like The Godfather, but the BD of that movie presents imagery that is probably as good as it will ever get. Blurring out all the grain would destroy a huge amount of native detail in the image and make matters far worse.

A well-photographed 35mm movie is still going to have more native detail present in its OCN elements than any movie shot with video cameras. Considering the rave reviews now pouring in for Baraka via its 8K digital intermediate from MPI, I hope Hollywood studios get a clue from that and start originating their very expensive productions in front of 65mm cameras once again.
 
Old 10-19-2008, 06:12 AM   #5578
4K2K
Special Member
 
Feb 2008
Region B

Quote:
Originally Posted by Bobby Henderson View Post
Grain reduction will simply further reduce whatever detail is present in the original image.

...Blurring out all the grain would destroy a huge amount of native detail in the image and make matters far worse.

A well-photographed 35mm movie is still going to have more native detail present in its OCN elements than any movie shot with video cameras.
I'm not asking for them to blur out the grains, as that would remove detail and resolution. But given that a film grain is sort of like film's equivalent of a pixel, and an original 35mm (or 65mm) negative is supposed to have such high resolution compared to HDTV, I thought the grains should be invisible or practically invisible.

But thinking about it, maybe the grains would still be visible even if 35mm film were 4x the resolution of HDTV. For example, if you have a white image with some black pixels and resample the image to half its original width and height, you can still see the pixels that were black, but they turn grey (less visible) rather than staying black. So I couldn't understand how, if 35mm is really so high-res, the grains would look so obvious on HDTVs to the people doing the encodes once the film negatives have been sampled down to HDTV resolution. If the grains really were 4x smaller than an HDTV pixel (half the width and half the height, or even smaller), then multiple grains would be blended/averaged together in such a way that individual grains should be invisible, unless the original had one black grain surrounded by lots of white grains, in which case it would appear grey on HDTV. (I'm not talking about DNR here, just resizing/resampling from the much higher-res 35mm/65mm film down to lower-res HDTV.)

So how do the 'pixels' (film grains) from a medium with higher resolution than 1920x1080 become more obvious and visible when that higher-resolution medium is down-sampled to 1920x1080? Why don't the 'pixels' (film grains) all blend together when resampled down, and so be invisible unless you had a hypothetical HDTV with a resolution equal to that of 35mm (or 65mm) film itself? If a film grain in a certain movie takes up more than one 1920x1080 screen pixel, wouldn't that mean the film scanned was lower resolution than 1920x1080?
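Here is the blending/averaging I am imagining, as a rough numpy sketch (made-up noise figures, not any real scanner or encoder pipeline). Even averaging four 'grains' into each pixel only halves the grain's amplitude rather than erasing it, which might be part of the answer:

Code:
# Simulated grain at twice HD resolution in each dimension,
# box-averaged down to 1920x1080 (made-up numbers).
import numpy as np

rng = np.random.default_rng(0)

# Grainy mid-grey field at 3840x2160 (4x the pixel count of HD).
hi = 0.5 + 0.2 * rng.standard_normal((2160, 3840))

# Naive downsample: average each 2x2 block into one 1080p pixel.
lo = hi.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

# Averaging 4 samples per pixel halves the noise amplitude (1/sqrt(4)):
# the grain is attenuated, not erased.
print(round(hi.std(), 3), round(lo.std(), 3))  # ~0.2 vs ~0.1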

Last edited by 4K2K; 10-19-2008 at 06:56 AM.
 
Old 10-19-2008, 04:11 PM   #5579
Rob Tomlin
Blu-ray Guru
 
 
Sep 2006

Quote:
Originally Posted by Bobby Henderson View Post

Some people may not be blown away by the vintage image quality of a movie like The Godfather, but the BD of that movie presents imagery that is probably as good as it will ever get.
..

Last edited by Rob Tomlin; 11-06-2008 at 12:41 AM.
 
Old 10-19-2008, 04:17 PM   #5580
Rob Tomlin
Blu-ray Guru
 
 
Sep 2006

Quote:
Originally Posted by 4K2K View Post
I'm not asking for them to blur out the grains, as that would remove detail and resolution. But given that a film grain is sort of like film's equivalent of a pixel, and an original 35mm (or 65mm) negative is supposed to have such high resolution compared to HDTV, I thought the grains should be invisible or practically invisible.

But thinking about it, maybe the grains would still be visible even if 35mm film were 4x the resolution of HDTV. For example, if you have a white image with some black pixels and resample the image to half its original width and height, you can still see the pixels that were black, but they turn grey (less visible) rather than staying black. So I couldn't understand how, if 35mm is really so high-res, the grains would look so obvious on HDTVs to the people doing the encodes once the film negatives have been sampled down to HDTV resolution. If the grains really were 4x smaller than an HDTV pixel (half the width and half the height, or even smaller), then multiple grains would be blended/averaged together in such a way that individual grains should be invisible, unless the original had one black grain surrounded by lots of white grains, in which case it would appear grey on HDTV. (I'm not talking about DNR here, just resizing/resampling from the much higher-res 35mm/65mm film down to lower-res HDTV.)

So how do the 'pixels' (film grains) from a medium with higher resolution than 1920x1080 become more obvious and visible when that higher-resolution medium is down-sampled to 1920x1080? Why don't the 'pixels' (film grains) all blend together when resampled down, and so be invisible unless you had a hypothetical HDTV with a resolution equal to that of 35mm (or 65mm) film itself? If a film grain in a certain movie takes up more than one 1920x1080 screen pixel, wouldn't that mean the film scanned was lower resolution than 1920x1080?
..

Last edited by Rob Tomlin; 11-06-2008 at 12:02 AM.
 