View Poll Results: How Good is Good Enough For You?
I think that WM9 1080p24 @ 8Mbps is fine... | 0 | 0%
I'd like to see WM9 1080p24 @ 24Mbps... | 0 | 0%
I'd like to see WM9 & MPeg4 @ 24Mbps... | 12 | 100.00%
Voters: 12. You may not vote on this poll
#1
Junior Member
May 2004

Quote:
Another reason not to buy LCD just yet is that LCD technology is improving rapidly, particularly in the areas of colorimetry and response time. Some manufacturers are now claiming that the picture quality of their upcoming LCD panels is superior to that of their own plasma models.
#2
Active Member
Apr 2004
... and LG have just announced a 3.2" thick 76" 1920 by 1080p plasma display, so the fight's not over yet. It's generally still agreed that, apart from the flicker issue (which, with 24fps and 60fps progressive material, could be handled by running the display at 120Hz and repeating frames), CRTs still potentially offer the best picture quality, but as Rimmer commented, LCDs are improving rapidly.

Another reason not to buy an expensive display just yet (unless you want and can afford to replace it within a few years) is that organic light emitting diode [OLED] display technology, which is viewed by many as the next generation display technology (after LCDs), is developing rapidly. Unlike LCDs, where the backlight is on all the time, in OLEDs each pixel's brightness determines how much power it uses, so an OLED display will typically use less than half the power of an LCD display - potentially saving hundreds of watts with 42" and larger displays. OLED displays are also much thinner than LCD displays - there are even ones which can be rolled up!

For more details of advanced display technologies for panels, see the Power Consumption and Display Technologies ... posting at: http://blu-raytalk.com/forums/viewfo...ad0ffb979e8406
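To put rough numbers on the power claim, here is a minimal back-of-the-envelope sketch in Python. The wattages and the linear scaling of OLED power with average picture level are illustrative assumptions, not manufacturer figures:

```python
# Back-of-the-envelope LCD vs OLED power comparison.
# All wattages here are illustrative assumptions, not manufacturer data.

LCD_BACKLIGHT_W = 250.0    # assumed constant backlight draw for a large LCD
OLED_FULL_WHITE_W = 250.0  # assumed OLED draw on a 100% white field

def oled_power_w(average_picture_level: float) -> float:
    """Rough model: OLED power scales with average picture level (APL),
    since each pixel draws power only in proportion to its brightness."""
    return OLED_FULL_WHITE_W * average_picture_level

# Typical film and TV content averages somewhere around 20-30% APL.
for apl in (0.2, 0.3, 0.5):
    print(f"APL {apl:.0%}: LCD ~{LCD_BACKLIGHT_W:.0f} W, OLED ~{oled_power_w(apl):.0f} W")
```

Under these assumptions, typical content leaves the modelled OLED draw well under half the constant LCD backlight figure.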
#3
Active Member
Jun 2004
Quote:
#4
Active Member
Apr 2004
I'll have to investigate, but probable reasons include: for computer monitors, 15:9 gives a wider view than standard 4:3 but more vertical "desktop" space than 16:9; for TVs, a compromise between the full 16:9 aspect ratio and conventional 4:3 is probably preferable if a significant amount of older material is likely to be watched - a number of CRT TVs were produced with 14:9 aspect ratios several years ago precisely for this reason. :?

A number of ads claim that certain 1280 by 768 pixel displays have an aspect ratio of 16:9 whilst others have one of 15:9. I hope that these are typographical errors, because it would be a real mess if a number of displays didn't use square pixels - surely nobody would be that stupid?
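As a quick sanity check, reducing each resolution with Python's math.gcd (square pixels assumed) shows which of these marketing claims can actually be true:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a pixel resolution to its simplest aspect ratio,
    assuming square pixels."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

for w, h in [(1280, 768), (1280, 720), (1280, 1024), (1920, 1080)]:
    print(f"{w} x {h} -> {aspect_ratio(w, h)}")
# 1280 x 768  -> 5:3  (i.e. 15:9 - not 16:9)
# 1280 x 720  -> 16:9
# 1280 x 1024 -> 5:4  (not 4:3)
# 1920 x 1080 -> 16:9
```

So a 1280 by 768 panel with square pixels simply cannot be 16:9; any ad saying otherwise is either a typo or implies non-square pixels.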
#5
Junior Member
May 2004
As far as I know all LCDs use square pixels, so letterboxing is needed to achieve the correct 16:9 aspect ratio on a 1280 x 768 panel.

Another common non-standard resolution found in LCD monitors is 1280 x 1024. The quoted aspect ratio is 4:3, but the actual aspect ratio is 5:4 (or 1.25:1). This means that if you set your PC's graphics card to a resolution other than 1280 x 1024, everything will look slightly stretched vertically. Again, slight letterboxing is needed to get the correct aspect ratio on full screen 4:3 video. I've no idea what the reason for this is. There certainly are true 16:9 and true 4:3 LCD TVs and monitors out there, but 5:4 and 15:9 ones are probably more common.
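A short sketch of the letterboxing arithmetic (square pixels assumed throughout):

```python
def letterbox(panel_w: int, panel_h: int, content_aspect: float):
    """Active image height and total black-bar height when content of a
    given aspect ratio is shown full-width on a panel (square pixels)."""
    active_h = round(panel_w / content_aspect)
    return active_h, panel_h - active_h

# 16:9 video on a 1280 x 768 (15:9) panel
print(letterbox(1280, 768, 16 / 9))   # (720, 48) -> 24-line bars top and bottom
# 4:3 video on a 1280 x 1024 (5:4) panel
print(letterbox(1280, 1024, 4 / 3))   # (960, 64) -> 32-line bars top and bottom
```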
#6
Active Member
Apr 2004
Square pixels make a lot of sense, especially with progressive scan. As far as codecs are concerned, they also make compression a little easier and reduce a number of artefacts associated with both relative and absolute motion within scenes. There might be some psychovisual argument for having non-square pixels, but as long as the resolution is high enough it shouldn't matter, and the benefits in terms of fewer compression artefacts and improved compression performance are likely to outweigh the costs of providing greater resolution than might be required (in one dimension). :?
#7
Junior Member
May 2004
Interesting discussion of this topic here.
To summarise, "there are currently only two manufacturers (Samsung and one other) making the basic widescreen LCDs as components for the end-product manufacturers. Until Sharp start making widescreen LCDs, they'll probably all remain 15:9." |
#8
Active Member
Apr 2004
Well that certainly clears that one up. :roll: I guess it's WYSIWYG but WYGINNWYW…

And whilst we leave the experts to sort out the difficult problem of how to insert a square peg into a rectangular hole, when several consultants insist that it must fit because a square peg can be redefined as a rectangular peg with sides of equal length, it appears that other experts may also be having some problems with weights and measures…

Microsoft (you may have heard of them) may have been exaggerating the performance capabilities of WMV9 when compressing "full" 1920 by 1080 pixel high definition video, thereby reducing the apparent difference in quality between 720p24 and 1080p24 - see the Microsoft high definition WMV demos at: http://www.microsoft.com/windows/win...tShowcase.aspx

It appears that a number of Microsoft's demonstrations of high definition 1080p24 material - which require a 1920 by 1440 pixel display according to the blurb - are not actually true (1920 by 1080 pixel) high definition, but interpolated 1440 by 1080 pixel video. In contrast, the 720p24 material appears to be encoded at the correct 1280 by 720 pixel resolution. Hence the results of any comparisons between the 720p24 and the "1080p24" material might be misleading…

Misleading because:
#9
Blu-ray Guru
Microsoft is not the only one guilty of this.

Sony HDCAM, long the industry standard for HD tape, is only 1440x1080. Panasonic D5, which is the choice of Hollywood, is full res (in fact, flexible on res) 1920x1080, and that is used to master the DTheater tapes as I understand it. So, much of the HD material shot until recently was downres'ed when it went to tape. Sony now have a new version of HDCAM out that handles 1920x1080 and more...

Cheers!
DAve.
#10
Junior Member
May 2004
Back on topic (almost). One final OT comment: if you do a Google search there are lots of 1280 x 720 16:9 LCD panels, and plenty of 1600 x 1200 4:3 models (as well as other 4:3 resolutions). In my opinion, 5:4 and 15:9 panels should be avoided at all costs. Selling them as "4:3" and "16:9" is very misleading.

I read somewhere that Microsoft downconverted their HD clips in order to make them easier to download. However, the retail versions aren't true 1080p either. Furthermore, I'm fairly sure the resolution of WMVHD DVDs is not 1440 x 1080, but 1440 x 816. The T2: Extreme Edition DVD is 816p, and I think the others may also have been encoded at this resolution (I can't say for sure, though). If 8Mbps is insufficient for 1440 x 816, it is clearly unacceptable for 1080p. Some people claim that 15Mbps is more than enough for 1080p24, but I can't help wondering whether they were really looking at full resolution 1080p WMV video, and not 1440 x 816p.
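A rough way to sanity-check these bitrate claims is to scale pro rata by pixel count - a crude first-order model (real codec efficiency varies with content), but enough for the argument:

```python
def scale_bitrate(bitrate_mbps: float, src: tuple, dst: tuple) -> float:
    """Scale a bitrate pro rata by pixel count - a crude first-order model
    that ignores content and codec efficiency differences."""
    return bitrate_mbps * (dst[0] * dst[1]) / (src[0] * src[1])

# If 8 Mbps is marginal for 1440 x 816, full 1080p needs roughly:
print(f"{scale_bitrate(8, (1440, 816), (1920, 1080)):.1f} Mbps")   # ~14.1
# And 15 Mbps judged "enough" on 1440 x 816 material corresponds to:
print(f"{scale_bitrate(15, (1440, 816), (1920, 1080)):.1f} Mbps")  # ~26.5
```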
#11
Active Member
Apr 2004
Yes Phloyd, but these are demos purporting to show the capabilities of WMV9 and, one would assume, allowing comparison between 720p material and 1080p material using this codec - in fact, between 720p24 material and 1080p24 material. Yes, Sony's earlier HDCAM was only 1440 by 1080 pixel resolution, and yes, Sony didn't highlight this fact, but a number of Microsoft's demos are taken off IMAX, which is potentially capable of at least 12,000 pixel horizontal resolution - so there was nothing to prevent there being at least a few full 1920 by 1080p24 resolution demos - unless Microsoft can't afford to get their "ultra-high quality" three minute 1080p24 demos telecined at 2k... :P

To quote from the first paragraph of Microsoft's WMV HD Content Showcase, "To deliver ultra-high quality, these clips were encoded at 24 frames per second (fps), and at the resolutions noted of either 1280 by 720 (720p) or 1920 by 1080 (1080p). (Resolutions vary per clip.)" Stating that something is encoded at 1920 by 1080 pixels is quite different from claiming that something outputs data at the 1920 by 1080 pixel standard, in the same way that stating that a display has a native resolution of 1920 by 1080 pixels is not quite the same as claiming that it can display 1920 by 1080 pixel video.

At the time of writing, all the 720p demos are specified as 1280 by 720 pixels in their properties windows, and are labelled accordingly as Title_720. Similarly, all the 1080p demos are specified as 1440 by 1080 pixels and labelled as Title_1080, apart from one clip, specified as 1440 by 816 pixels, which, although labelled as Title1080 as a zip file, becomes Title_WMVHD_Extract when unpacked into a WMV file...

As an optimum (PC) configuration for playing back the 1080p demos, Microsoft recommends a screen resolution of 1920 by 1440 pixels. In fact, for all but one of the 1080p demos the optimum quality would actually be achieved using a screen of 1440 by 1080 pixels with an appropriate (16:9) physical aspect ratio, as otherwise the video has to be interpolated horizontally (scaled up by a third, from 1440 to 1920 pixels). Unfortunately, such displays don't appear to exist - guaranteeing that the 1440-pixel material is displayed sub-optimally.

Shame they couldn't have stretched to full 1920 by 1080p24, as 2k is pretty common now, 4k is not unusual and, if you want and can afford ultra-high quality, 6k and 8k are not difficult to get hold of. As mentioned earlier, carefully taken IMAX can go to 12k or more, whilst even 35mm is good for over 5k. :roll:
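The horizontal interpolation point is easy to see with a couple of lines of Python - a simplified nearest-grid view (real scalers use multi-tap filters, but the alignment problem is the same):

```python
# For a 1440 -> 1920 horizontal upscale (a 4/3 ratio), how many output
# pixels line up exactly with a source pixel?
SRC, DST = 1440, 1920

aligned = sum(1 for x in range(DST) if (x * SRC) % DST == 0)
print(f"{aligned} of {DST} output pixels fall exactly on a source pixel")
# -> 480 of 1920: only one in four; the other three in four are interpolated.
```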
#13
Active Member
Apr 2004
WMV performs very well at low bitrates, which might be useful for long play modes. However, based upon the demos that Microsoft has created to (one would assume) show WMV in its best light, I have concerns about the effect of motion vector estimation with full resolution video of natural objects. Published reports speak of calibration data, etc., but artefacts of insufficiently accurate motion vector estimation are most likely to be visible on objects of high fractal dimension such as trees, shrubs, desert tracks, and so on. I would love my concerns to prove unfounded, but material such as the scaled 1440 by 1080 pixel material on the Microsoft website is not helpful in letting people (with the appropriate hardware) see the potential quality for themselves. :roll:

Publishing demos which tend to favour 720p material over 1080p material is helpful for the cause of "high definition" on red laser products, but not so useful for high bandwidth, high capacity discs such as Blu-ray. At 8Mbps an 8.54GB DVD could store over two hours of high definition material, but if we scale for 1920 rather than 1440 pixels this falls to about an hour and three quarters - which would mean that most films would need either two discs or so-called "flippers". Although I prefer the Blu-ray Disc [BD] over the Advanced Optical Disc [AOD] because of its much greater capacity and the potential of its roadmap, amongst other things, both can support plenty of "full" high definition 1920 by 1080 pixel material - even if AOD has to use lower bitrates to achieve it. :thumbup:

Microsoft are pushing hard for a lower quality form of high definition, centred around resolutions much lower than full HD, and probably predominantly 720p. Most of Microsoft's 720p demo material is encoded at 7Mbps, giving nearly three hours on an 8.54GB DVD. Strangely, the 720p "Dust to Glory" demo, which involves a lot of natural objects and fast movement, reports a bitrate of 20Mbps - which equates to less than an hour on an 8.54GB DVD - and that's not even 1080p! Pro rata that would be around 45Mbps at full 1920 by 1080p24 HD resolution… Sadly, it still shows plenty of the artefacts I have raised concerns about. Of course, 20Mbps or even 30Mbps is not a problem for BD, but then perhaps the properties window was lying... :roll:
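The playing-time arithmetic above is easy to reproduce - a quick sketch, using decimal units (1 GB = 8000 Mbit) as disc capacities are normally quoted, and the nominal 25 GB per Blu-ray layer:

```python
def playtime_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    """Hours of video a disc holds at a constant bitrate (decimal units)."""
    return capacity_gb * 8000 / bitrate_mbps / 3600

for mbps in (7, 8, 20, 45):
    print(f"{mbps:>2} Mbps on an 8.54 GB DVD: {playtime_hours(8.54, mbps):.1f} h")
#  7 Mbps: 2.7 h | 8 Mbps: 2.4 h | 20 Mbps: 0.9 h | 45 Mbps: 0.4 h
print(f"45 Mbps on a 25 GB Blu-ray layer: {playtime_hours(25, 45):.1f} h")
# ~1.2 h - hence dual-layer discs for full-length films at high bitrates.
```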
#14
Active Member
Apr 2004
Note that Microsoft's 1080p24 clips are encoded using non-square pixels at 16:9 1440 by 1080 and then scaled up on display to 1920 by 1080 pixels. Unfortunately, where a CRT can continuously vary the aspect ratio of each "pixel" as required, flat panel displays are not as flexible. Therefore, not only is the horizontal resolution inherently 25% lower than stated, but the mapping of 1440 pixels on to 1920 (33% more) is likely to produce further artefacts, as well as an effective horizontal resolution which may even be below that of the 720p material (1280 pixels). :roll:

Displays of appropriate resolution which use discrete, truly square pixels - such as plasma, LCD, OLED, etc. - are best for displaying high definition material, but will also show up any positional errors, such as those produced by motion vector estimation processes, for example. Crucially for high definition, a display with a native resolution of 1920 by 1080 pixels can't perfectly display a 1280 by 720 pixel signal and vice-versa (a 1080p signal has to be scaled down by 33% in each direction to fit on to a 1280 by 720 pixel panel, whilst a 720p signal has to be scaled up by 50% for a 1920 by 1080p panel). If people buy 720p panels en masse, based upon the promise that they can support 1080p material, they will never enjoy the benefit of "full" high definition material without buying a new panel. 1080p material may even look worse than 720p material! Similarly, the numerous broadcasts in the US at 720p will not look their best displayed on a 1080p panel. :?

The only artefact-free solution to displaying both 1080p and 720p material on a single fixed resolution panel is to go to an 8.3 Megapixel panel at 3840 by 2160 pixels, as the sketch below shows. With this, each 1080p resolution pixel takes up 4 (2 by 2) high(er) definition pixels whilst each 720p resolution pixel takes up 9 (3 by 3) high(er) definition pixels. Of course, assuming that four-times-HDTV resolution (and a quarter of Ultra High Definition Video resolution) material isn't available, such a panel could be designed to distribute a standard 1080p or 720p signal appropriately amongst its "micro-pixels", so appearing to a 720p signal as a 720p panel and to a 1080p signal as a 1080p panel.

Next generation discs such as Blu-ray can easily support full 1080p24 resolution, but with a mixture of 720p and 1080p material already available in the US, displays of both resolutions are being sold as offering "high definition." Whilst 1080p displays may be "future proof," many broadcasters are keen on using 720p in order to maximise revenue, and so you have to decide for yourself which resolution you are prepared to compromise on. Now is probably not a great time to buy an expensive new display.
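A minimal sketch of the integer-mapping argument (the panel and source figures are just the standard resolutions discussed above):

```python
def integer_mapping(src: tuple, panel: tuple) -> bool:
    """True if each source pixel maps onto a whole block of panel pixels."""
    return panel[0] % src[0] == 0 and panel[1] % src[1] == 0

SOURCES = {"720p": (1280, 720), "1080p": (1920, 1080)}
PANELS = {"1280 x 720": (1280, 720), "1920 x 1080": (1920, 1080),
          "3840 x 2160": (3840, 2160)}

for pname, p in PANELS.items():
    for sname, s in SOURCES.items():
        if integer_mapping(s, p):
            print(f"{sname} on {pname}: clean {p[0] // s[0]} x {p[1] // s[1]} blocks")
        else:
            print(f"{sname} on {pname}: fractional scaling (artefacts)")
```

Only the 3840 by 2160 panel reports a clean mapping for both sources: 2 x 2 blocks for 1080p and 3 x 3 for 720p.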
#15
Junior Member
May 2004
A 3840 x 2160p panel would be pretty cool. The current situation certainly isn't ideal, with so many oddball panel resolutions, such as 1366 x 768, which requires 16/15 scaling of 720p source material. I'd be surprised if 1080p downconverted to 720p actually looked worse than true 720p, though. While the scaling would impact on picture quality, this would be countered by the fact that the 720p version would retain the superior colour resolution of the 1080p source (assuming 4:2:0 sampling).

Another issue with scaling regards the encoding of 1080p movies from digital masters. Most big budget movies nowadays are scanned at 2048 x 1556 (2k resolution) and then cropped to the desired widescreen ratio. 2048 is very close to 1920, so how do you downscale to 1920 pixels without introducing artefacts? Even with one-to-one pixel mapping between disc and display, there could be visible artefacts from the downconversion process.
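Python's fractions module makes it easy to see how awkward these ratios are - the smaller the denominator, the cleaner the scale:

```python
from fractions import Fraction

def scale_ratio(src_pixels: int, dst_pixels: int) -> Fraction:
    """Exact scaling ratio along one axis."""
    return Fraction(dst_pixels, src_pixels)

print(scale_ratio(720, 768))    # 16/15   - 720p lines onto a 1366 x 768 panel
print(scale_ratio(1280, 1366))  # 683/640 - the horizontal axis is even uglier
print(scale_ratio(2048, 1920))  # 15/16   - a 2k scan down to a 1920-wide frame
```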
#16
Active Member
Apr 2004
I guess the main point is that high definition was originally defined in the sixties as around 1000-line resolution, and properly displayed 1080p is clearly much better than 720p. Telecining at 2k resolution (or occasionally 4k resolution) has often been used to combine film footage with digital special effects, which is then transferred back to film for copying and distribution. Now that the 1920 by 1080 pixel high definition standard looks like it might go mass-market over the next few years, there is a good reason for telecining appropriately for this resolution - taking account of any anamorphic lenses used, etc. - so that no resampling is required. For a digital system to work optimally, there has to be a one to one (or one to n, where n is an integer) mapping between pixels in the camera (or scanner) and pixels in the display, with no reductions in resolution in between. Note that Sony recently introduced a 1920 by 1080 pixel telecine machine for transferring film to the HDTV standard.

Talking of 3840 by 2160 pixel resolution - also known as 2k line resolution (or, to confuse everyone, roughly 4k resolution) - NHK, who started working with Sony on HDTV in the early sixties, began investigating the next generation after HDTV (Ultra High Definition Video or UHDV) over ten years ago at 2k lines, and from 2000 on at 4k lines (roughly 8k resolution, or 7680 by 4320 pixels progressive, 60 frames per second), but it probably won't be in the home for at least five years - unless you happen to be Bill Gates ... :roll:
#17
Junior Member
May 2004
Telecining 35mm film prints at 1080p resolution isn't a problem; I was specifically referring to the increasingly common use of digital intermediates, where the entire movie negative is scanned, usually at 2k resolution, sometimes at 4k, for editing and post-production. Where a digital intermediate is used, the final version of the movie usually exists as a 2k digital master, which is then transferred back to 35mm film for cinema release. The digital master is also used to produce the DVD/SDTV/HDTV versions. Obviously it would have been better if film were scanned at the same horizontal resolution as HDTV (or at an integer multiple of that resolution), but it's too late now, with digital cinema specified as having 2048 x 1080 or 4096 x 2160 resolution. :?

Downscaling from 2k to 1920 pixels involves 15/16 scaling, which is not appealing from the point of view of picture quality. It would be even worse if the image has to be subsequently scaled down to 720p or 768p or any other native HDTV display resolution. In the long term, with 4k expected to become the digital cinema industry standard, I don't see this as a major issue, but it is a problem while filmmakers scan at 2k.
#18
Active Member
Apr 2004
True Rimmer, they probably couldn't make a bigger mess if they had planned it, but the (roughly) 150 current digital cinemas use 1280 by 1024 pixel resolution DLP projectors - introduced in 1998 - and represent around 0.1% of all cinemas, so there's still time to standardise on 1920 by 1080. Note that Sony and JVC have digital cinema products at 1920 by 1080 and 3840 by 2160, but as you say, if 1080p24 material is not taken off the negative but from a digital scan at a different resolution, the benefits of 1080p over 720p etc. will be compromised. Note that the spatial resolution of well exposed and focussed 35mm is around 6k by 4k, or 24Mpixels, so it could benefit from scanning and displaying at 3840 by 2160 pixels, and even 7680 by 4320 pixels in the future. 65mm IMAX film can be much higher resolution again...
#19
Junior Member
May 2004
Unfortunately it is too late, as the DCI has already approved a 2048 x 1080 (2k) and 4096 x 2160 (4k) delivery system. The best solution for high definition DVDs will be for the studios to move to 4k scanning and post-production as soon as possible, as 4096 x 2160 downscales to 1920 x 1080 more gracefully than 2048 x 1080 does.
#20
Active Member
Apr 2004
This could be viewed either as very short-sighted or as a good way of ensuring that the picture quality achieved in digital cinemas remains significantly higher than that achieved using high definition discs in homes. Of course, home high definition uses square pixels and a 16:9 aspect ratio with 1920 by 1080 pixels, so a typical film with an aspect ratio of 2.35:1 occupies only 816 lines - as Phloyd has mentioned. The use of anamorphic lenses to produce such wide aspect ratios means that care would have to be taken anyway to ensure a one to one mapping between scanned pixels and HD pixels. If Sony can get hold of the originals, it might want to rescan a range of popular films appropriately in order to best show off (1920 by) 1080p24 material on Blu-ray Discs. Note that IMAX scanners exist which can operate at 2k, 4k, 6k and 8k, and 8k will be required for Ultra High Definition Video (not exactly 8k, but 7680 by 4320 pixels).
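A quick check of the 816-line figure (a rough sketch; 1920-wide frame and square pixels assumed):

```python
FRAME_WIDTH = 1920  # "full" HD frame, square pixels

def active_lines(content_aspect: float) -> int:
    """Lines a film of the given aspect ratio occupies at full frame width."""
    return round(FRAME_WIDTH / content_aspect)

for name, aspect in [("16:9", 16 / 9), ("1.85:1", 1.85), ("2.35:1", 2.35)]:
    lines = active_lines(aspect)
    print(f"{name}: {lines} active lines, {1080 - lines} lines of letterbox")
# 2.35:1 gives 817 lines; encoders round down to a multiple of 16 (the
# macroblock size), which is where the 816-line / 1440 x 816 figure comes from.
```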