#5561
Member
Feb 2008

Quote:
It would look pretty funny for Apple to support authoring discs but not playing them back, but if that's what it takes, I hope that is what they do.
#5562
Power Member

Quote:
Apple can pursue this gamble if they like. The strategy is not without possible consequences.

Apple tried pulling the same kinds of stunts in the early 1990s, when 3D modeling and animation tools were rapidly maturing. Their "Not Invented Here" policy of snubbing any outside technology had Apple pushing the QuickDraw 3D API, figuring its sexy brand would be enough to win over everyone. Many casual computer users are eaten up with fashion; industrial-level users are not. Silicon Graphics made deals with numerous companies, including Microsoft, to make OpenGL the industry standard for 3D acceleration. Apple refused to support OpenGL until the end of the 1990s, only after the Windows platform had grown to dominate professional 3D animation and computer gaming.
#5563
Retired Hollywood Insider
Apr 2007

Quote:
http://cinema.usc.edu/alumni/alumni-history/

Not sure though, as I haven't watched it yet.
#5564
Retired Hollywood Insider
Apr 2007

^
It looks like you have to click on that speaker thingee on the bottom right part of the clip to hear any sound.
#5565
Retired Hollywood Insider
Apr 2007

Quote:
http://www.hometheaterforum.com/htf/...o-blu-ray.html

Brace yourselves, DNR screenshot simpletons, as all 20 of dem James Bond titles have had some degree of grain reduction. On that note, John L. has his own very interesting definition of "grain" which I can elaborate on a bit later.
#5566
Expert Member

Quote:
http://www.avsforum.com/avs-vb/showt...9#post14884969
#5567
Retired Hollywood Insider
Apr 2007

Quote:
Hell, the standard graininess of a color negative is random clumps of dye clouds. The way the whole process works is that when a color film is exposed to light, photons hit the silver halide crystals in the film emulsion and knock some valence electrons loose. When the exposed film is then processed, the silver halide crystals with enough freed electrons become developed silver crystals and activate dye couplers in their surrounding areas, forming clumps of dye clouds around the crystals. When processing is completely finished, the silver is washed away by a fixer and random clumps of dye clouds remain.

So, for those who watch their movies freeze-framed with magnifying glasses: clumps are normal in the routine photochemical process and occur with no digital processing whatsoever. Perhaps what some of these *scientists* really desire is to see the silver still left on the image, and for that you need 'bleach bypass'. For *scientists* who watch movies at such a subpixel level, I could appreciate that some grain connoisseurs would actually find the silver retention process more attractive than the standard graininess of a color negative, because of the difference between grains of silver versus clumps of dye clouds.

This fixation some hobbyists have with seeing grain particles really amazes me, because based on filmmakers' wishes, the trend at Kodak for decades has been to make finer and finer grained film (thus the current Vision3 stock).

P.S. Oh, and guess what happens to the number, and thusly the appearance, of those clumped dye clouds (grain particles) when somebody applies digital grain reduction.

^ I bolded the operative word.

Last edited by Penton-Man; 10-18-2008 at 12:46 AM. Reason: added a P.S.
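For anyone who wants to see that P.S. play out in numbers, here is a rough, purely illustrative sketch (not any studio's actual DNR pipeline): it scatters random "dye cloud" clumps on a synthetic frame, applies a simple blur as a stand-in for digital grain reduction, and counts how many clumps still rise above a visibility threshold. Every parameter (frame size, clump count, blur strength, threshold) is made up for the illustration.

```python
import numpy as np
from scipy import ndimage  # gaussian_filter / label used here only as crude stand-ins

rng = np.random.default_rng(0)

# Synthetic "frame": scatter random dye-cloud clumps (soft blobs) on a flat field.
frame = np.zeros((512, 512))
ys, xs = rng.integers(0, 512, size=(2, 3000))
frame[ys, xs] = 1.0
frame = ndimage.gaussian_filter(frame, sigma=1.0)  # turn points into small soft clumps

def visible_clumps(img, threshold=0.05):
    """Count connected regions that still rise above the visibility threshold."""
    _, count = ndimage.label(img > threshold)
    return count

before = visible_clumps(frame)

# Crude stand-in for digital grain reduction: a spatial blur that averages clumps away.
denoised = ndimage.gaussian_filter(frame, sigma=2.0)
after = visible_clumps(denoised)

print(f"clumps visible before 'DNR': {before}")
print(f"clumps visible after  'DNR': {after}")  # far fewer, fainter clumps remain
```

Both the count of clumps that clear the threshold and their contrast drop after the blur, which is the effect the P.S. is pointing at.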
#5568
Blu-ray Ninja
Well, I am a Professional Imaging Technician for the National Geographic Society (really, no joke). I work with ultra-high-quality images every day.

I also work with low-quality images, image restoration, compilation, enhancement, and more. While the film I look at is not the same as the film used in movies, they are both made by the same companies, like Kodak. I get a very up-close, personal look at images and the grain involved.

The images you see on an HD screen are 1920x1080, in red, green, and blue. They are compressed, but in the end each image, if uncompressed, would be about 6 MB per frame. The average files I work on are 50 MB-100 MB each. In other words, I am working with files much bigger than what you see.

Looking closely at an image, you can really see its grain. And if there is one thing that is consistent in all images, it is grain. Without grain, you lose all the detail, all the subtle nuances of an image. Even digital photography adds digital grain to the image to allow for details in shadows and such.

Up close, say at 300%, the grain is everywhere; it is strong and makes the image almost look like an impressionist painting. But when you look at it at 100%, or print size (closer to 50%), the grain just melts into the image, and all the beauty of the colors and details comes out. Any attempt to reduce the grain literally destroys the image. Sometimes a customer requests less noise, and there are various ways to get a good result, but they are never better than the original in my opinion.
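The "about 6 MB per uncompressed frame" figure checks out with simple arithmetic; a quick sketch, assuming 8 bits per channel (the post doesn't state the bit depth):

```python
# Back-of-the-envelope check of the "about 6 MB per uncompressed frame" figure,
# assuming 8 bits per channel (bit depth is an assumption; the post doesn't say).
width, height, channels = 1920, 1080, 3       # 1080p, RGB
print(width * height * channels / 2**20)      # ~5.93 MiB, i.e. roughly 6 MB

# At 16 bits per channel (often used in scanning/restoration work) the same frame
# is ~11.9 MiB - still far smaller than the 50-100 MB stills described above.
print(width * height * channels * 2 / 2**20)
```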
#5569
Member
Feb 2008
I'm well aware of what Apple was in the 90s. That said, I think a lot of things changed when Steve Jobs returned, and Apple's behavior with standards and open source has been far better than that recently: look at WebKit.
Quote:
Drivers would have to be cryptographically signed and trusted. Display devices would need to authenticate with HDCP. The window server would need to support mixing encrypted layers with non-encrypted layers. All of this requires them to spend money to deliver features that don't actually do anything for you, and would degrade performance. But the worst of it is that a hypervisor or other privileged process would be needed to control what's encrypted and what isn't.

There is the crucial difference. Yes, Apple cripples the user-level tools to try to prevent you from screenshotting DVDs (as they are supposed to do), but you can still read the contents of the screen easily with a third-party program. If you're smart you can get the lossless digital waveform from your own DRM'd iTunes music. If you are very smart, there are ways to get the unencrypted compressed song without actually hacking. They're making a token effort, but you are still root and you can still read the memory and everything going to the display and sound card. You are still in control.

For Blu-ray playback, Apple would ostensibly be required to create parts of your computer you don't control, so they can be trusted. If you control it, Big Content doesn't trust it. We can accept this in embedded devices, even in gaming consoles: it reduces piracy and makes it harder to cheat at games. But I believe we look at our computers differently. We expect to control them. 'Trust' means no open source, for one thing, since you compiled it and they don't trust you.

All of this is especially pointless given that AACS has been proved vulnerable in theory, and software players have been compromised. The next step is a hardware device popular enough that the BDA will be reluctant to revoke it (e.g. the PS3 or S300) being attacked and giving up its keys. Even if AACS were unbreakable and their content protection worked, it would just mean you'd have to take the decompressed output, but you could attack HDCP. If HDCP were also invulnerable, you could Van Eck phreak the pixels of your display, a total pain but better than the last resort: a camera.

We all know they can never protect content absolutely; people will still pirate some things and buy others. But this time they want some serious infrastructure in place, and we'll have considerably less control of our machines. What I am really saying is: "This feature does not belong in my OS, if these are the conditions."

I realize this is a forum of Blu-ray enthusiasts who might not be aware of the concern over Trusted Computing, so if Penton thinks this has gone on too long, I'll be happy to end it. As must be obvious by now, I feel strongly about this, and I am genuinely curious: would you accept the loss of control of your system for this? Is TPM/Palladium really where you want to go tomorrow?
#5570
Retired Hollywood Insider
Apr 2007

Quote:
However, I was addressing the conclusion by a poster on another forum who assumes that DNR causes grain to be more "clumpy". Hell, re-graining of a linear patch after manual scratch removal doesn't even cause "clumpy" grain, if good motion estimation tools are available to the colorist.

Keep in mind that your attitude in regards to grain must also be reconciled with the situation of going from the appearance of grain in the 35mm film exhibition of some motion pictures at a Cineplex to digital presentation with high-def home media on a typical home theater display, which involves several steps that can accentuate the eventual appearance of grain on Blu-ray movies (compared to the theatrical presentation) if it is not mitigated to some degree. One of the early steps in that chain of events is the fact that when you scan some film stocks, exposed in certain ways during principal photography, the grains seem to be emphasized and look more prominent than they do in film-to-film optical printing with an end-resultant release print (release prints eventually being the source for your local Cineplex, and what the filmmaker anticipated as the public-viewing 'gold standard' appearance of his creative intent in regards to grain, color, etc. when he was making the classic movie in the first place).

I think I mentioned in a post several months ago that, in my experience, when there has been an opportunity to show the 4:2:2 source of some feature films in a screening room on a typical high-end consumer display to the filmmaker, he or she is quite surprised at how grainy the appearance of their film can sometimes be, especially with films shot on push-processed 500 ASA stock. And the surprise is not one of delight but of concern, because what they see on current 1080p displays with the archival video masters running is, in many cases, not what they intended audiences to see and was not the appearance of the theatrical presentation, in which the grain is mitigated due to film-to-film optical printing (or the film-out after a DI process) and the inherent nature of theatrical presentation with its weave, optics and projector light output further on down the chain.

Do you feel me... do you read me... tell me, am I getting through to you?
http://www.youtube.com/watch?v=L2hyPSt05ek
#5571
Moderator

Quote:
Gary
#5572
Power Member

Quote:
I can sympathize with the licensing difficulties of accommodating Blu-ray. But this isn't much different from the many hundreds of licensing and patent issues computer manufacturers have had to deal with throughout their histories. In the end, customers are going to go with the computing brand that delivers the capabilities they want.

Apple prides itself on delivering cutting-edge quality computers that are also easy to use. It's a cornerstone of the company's advertising and a critical part of its image. In recent years Apple has been behind the curve on the cutting-edge thing. The PC platform was ahead of Apple on symmetric multiprocessing, SLI-based accelerated graphics, OpenGL, and on and on. Blu-ray is yet another feature the PC platform now holds over Apple. And it's bad enough that professional Blu-ray authoring is all happening on lowly Windows PCs and that Adobe is posing a greater threat to Final Cut Studio on the Mac platform.

This situation is just not good at all. I believe Apple needs to deal with this problem urgently before it ends up losing a lot of FCP customers to Adobe or even loses Mac users to the PC platform. If I'm currently in the market for a bleeding-edge desktop tower or notebook, I'm going to demand that the machine has a Blu-ray burner in it and that the drive can do everything from playing Blu-ray movies to burning BD-R discs. A list of reasons why Blu-ray can't work with a certain kind of computer is only going to result in me not buying that kind of computer.
#5573
Special Member
Feb 2008
Region B

Quote:
Are there types of film where, when the negative is scanned, each film grain would be less than or equal to a quarter the size of a 1920x1080 pixel, so that the film grains themselves would be totally invisible on Blu-ray without any DNR or other processing (is this possible with 35mm or 65mm)?

Last edited by 4K2K; 10-19-2008 at 03:37 AM.
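For a rough sense of scale behind this question, here is a back-of-the-envelope sketch of how much of the negative one pixel of a 1920-wide scan covers. The frame widths are approximate figures assumed only for illustration, and whether any stock's grain clumps are reliably smaller than a quarter of that footprint is exactly the question being asked.

```python
# How much of a 35mm negative does one pixel of a 1920-wide scan cover?
# The frame widths below are approximate and assumed only for illustration.
for name, width_mm in [("Super 35 (~24.9 mm)", 24.9), ("Academy (~21.9 mm)", 21.9)]:
    microns_per_pixel = width_mm * 1000 / 1920
    quarter_area_span = microns_per_pixel / 2  # a quarter the area = half the span per side
    print(f"{name}: one 1080p pixel spans ~{microns_per_pixel:.1f} um of film; "
          f"a grain 'a quarter the size' would be ~{quarter_area_span:.1f} um across")
```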
#5574
Blu-ray Guru
Hey guys,

I've been out of the loop on this board for a while, so please forgive me if I'm asking a question that's already been belabored and answered. But just to save me hours of thread-sifting, can anyone tell me if we ever got a "why" as to the problem with Warner putting out lossy audio on recent Blu-ray Discs like Cool Hand Luke and Interview with the Vampire? Have we gotten assurances that this nonsense will stop?

I'm about to post my review of Interview at dvdfile and I'm *fuming* mad at their mishandling of this disc. I'm particularly upset because it doesn't seem (from their history) that Warner is even really aware that there's a problem with not providing lossless in the first place.
#5575
Senior Member
Oct 2007

Quote:
Warner has received the message and has already been making changes.
#5576
Blu-ray Guru
Thanks RAH.

That makes sense. I disagree with Warner's priorities, however: in my non-professional opinion, Warner should have delayed these releases in order to prepare a new encode file with lossless audio rather than continue to shove existing, sub-par encode files through the pipeline. If a title can be pulled off the shelf and re-authored because of some special feature not working properly on a subset of players, I would consider a non-optimal audio track to be reason as well, especially considering the value that the current BD market is placing on both image and sound fidelity to the source.

I had really hoped to hear a full lossless track for Interview that would finally put my laserdisc to shame. Still waiting!

Thanks for the explanation, and hopefully we'll see Warner's future discs bear out the results. We've already been made promises on catalog titles that clearly weren't kept with these releases... unless of course when Warner reps said "all future titles" they meant "all future titles that do not yet have prepared authored files"... hopeful in the future!

dave
#5577
Power Member

Quote:
Nonetheless, whatever lies in the original negative has all the image detail that's ever going to be made available. Grain reduction will simply further reduce whatever detail is present in the original image. Some people may not be blown away by the vintage image quality of a movie like The Godfather, but the BD of that movie presents imagery that is probably as good as it will ever get. Blurring out all the grain would destroy a huge amount of native detail in the image and make matters far worse.

A well-photographed 35mm movie is still going to have more native detail present in its OCN elements than any movie shot with video cameras. Considering the rave reviews now pouring in regarding Baraka via its 8K digital intermediate from MPI, I hope Hollywood movie studios get a clue from that and start originating their very expensive productions in front of 65mm cameras once again.
#5578
Special Member
Feb 2008
Region B

Quote:
But thinking about it, maybe the grains would still be visible even if 35mm film were 4x the resolution of HDTV. For example, if you have a white image with black pixels and resample the image to half its original width and height, you can still see the original pixels that were black, but they turn grey (less visible) rather than staying black.

So I couldn't understand, considering how high-res 35mm is, why the grains would look so obvious on HDTVs to the people doing the encodes once the film negatives have been sampled down to HDTV resolution. If the grains really were 4x smaller than an HDTV pixel (e.g. half the width and half the height, or even smaller), then multiple grains would be blended/averaged together, I'd think, in such a way that individual grains should really be invisible - unless the original had one black grain surrounded by lots of white grains, in which case it would appear grey on HDTV. (I'm not talking about DNR here, just resizing/resampling from the much higher-res 35mm/65mm film down to lower-res HDTV.)

So how are the 'pixels' (film grains) from a medium of a higher resolution than 1920x1080 becoming more obvious and visible when that higher-resolution medium is down-sampled to 1920x1080? Why don't the 'pixels' (film grains) all blend together when resampled down, and so be usually invisible unless you had a hypothetical HDTV with a resolution equal to that of 35mm (or 65mm) film itself? If a film grain on a certain movie takes up more than one 1920x1080 screen pixel, wouldn't that mean that the film scanned was lower resolution than 1920x1080?

Last edited by 4K2K; 10-19-2008 at 06:56 AM.
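The parenthetical example at the start of this post can be shown directly. A minimal sketch, using plain 2x2 box averaging as the resampling filter (a simplification; real scaler kernels differ, but the averaging idea is the same): a lone black "grain" on a white field doesn't disappear when downsampled, it survives as a lighter grey pixel.

```python
import numpy as np

hi_res = np.ones((4, 4))   # white field (1.0 = white, 0.0 = black)
hi_res[1, 1] = 0.0         # a single black "grain"

# 2x2 box-average downsampling: each output pixel is the mean of a 2x2 block.
lo_res = hi_res.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(lo_res)
# [[0.75 1.  ]
#  [1.   1.  ]]
# The grain hasn't vanished - it shows up as a 25% darker pixel. Downsampling
# lowers the grain's contrast but doesn't make it invisible, which is part of
# why grain can still be seen on a 1080p encode of a higher-resolution scan.
```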
#5579
Blu-ray Guru
Sep 2006
..
Last edited by Rob Tomlin; 11-06-2008 at 12:41 AM.
#5580
Blu-ray Guru
Sep 2006

Quote:
Last edited by Rob Tomlin; 11-06-2008 at 12:02 AM.