#1
Member
Aug 2006
I have a Blu-ray player (1080p), and the Toshiba HD DVD player only plays 720p (1080i is smaller, so I don't count it). Why are some people comparing 720p to 1080p? Shouldn't Blu-ray win hands down?
#2
Member
Aug 2006
I'm not trying to start a war here, but I'd like to know why they're comparing them if 1080p and 720p aren't even close.
#3
Blu-ray Guru
How is 1080i smaller than 720p? Even though each line isn't refreshed as often, 1080i has many more lines of resolution and is in fact superior to 720p. Also, those figures are the horizontal lines of resolution; don't ignore the added vertical resolution that 1080i provides. Getting to the point, though: while the A1 can output a maximum of 1080i, the XA2 will output 1080p, and besides, almost all HD DVDs are encoded at 1080p, so the only thing lacking is some of the players.
#4
Member
Aug 2006
OK, but right now you're telling me 1080i is still stronger than 1080p?

I bought a true 1080p Samsung 71-inch DLP and it looks great, but when I read reviews they aren't what I expected. I know that's all opinion, but still: until the XA2 comes out, how can they compare them?
#5
Banned
Aug 2004
Seattle
Both formats are encoded to disc at 1080p/24. That's all that matters; don't explode your head trying to quantify the efficiencies of the transfer.
#6
Blu-ray Samurai
I think they usually compare them by taking Blu-ray down to 1080i. The truth is, until the Sony or Pioneer player is released, these players are not producing TRUE 1080p. They take the image down to 1080i at 60fps and then upconvert it to 1080p at 60fps. The Sony and Pioneer players will let you watch the content straight from the disc at 1080p/24 (true film quality). Then the comparisons will truly be a difference in quality between 1080i/1080p at 60fps and 1080p at 24fps (mainly, no side effects from using 3:2 pulldown to reach 60fps). I'll let someone else fill in the missing pieces if they so choose.
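To make the 3:2 pulldown point concrete, here is a minimal Python sketch of how 24 film frames per second get spread across 60 interlaced fields per second. It's a simplification that ignores field-order details, and the frame tuples are hypothetical stand-ins for real image data:

```python
# Minimal sketch of 3:2 pulldown: spread 24 film frames per second
# across 60 interlaced fields per second by alternating 3-field and
# 2-field repeats. Frame/field objects are hypothetical stand-ins.

def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = frame  # (top_field, bottom_field) of one film frame
        if i % 2 == 0:
            fields += [top, bottom, top]   # this frame is held for 3 fields
        else:
            fields += [top, bottom]        # this frame is held for 2 fields
    return fields

# One second of film: 24 frames in, 60 fields out (12*3 + 12*2 = 60).
film = [(f"t{i}", f"b{i}") for i in range(24)]
assert len(three_two_pulldown(film)) == 60
```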
#8
Senior Member
Actually, by most accounts 720p is superior to 1080i.

Food for thought:
http://www.bluesky-web.com/numbers-mean-little.htm
http://alvyray.com/DigitalTV/Progressive_FAQ.htm
#9
Senior Member
ndirtdigler, as already mentioned, both BD and HD DVD are encoded on the disc at 1080p. Well, actually, HD video like Legends of Jazz is encoded interlaced because that's how it was shot (1080i). But film is inherently progressive, so most movies are being encoded as 1080p. Neither the BD machines nor the Toshiba HD players will give you the best possible results via progressive output, though, and neither will pull the 1080p signal straight off the disc. All current machines take it down to 1080i first, and based on the last report I saw, the XA2 will do the same.

I can't test the high-definition deinterlacing of the Panasonic, but it didn't do very well with 480i. The Samsung might actually be better at 1080p post-NR fix, as it's been tested and shown that it performs inverse telecine on 1080i, and does it well. But due to how the Faroudja chip downsamples chroma, it still won't do as well as outboard solutions like VRS, HQV, and Gennum. I believe Panasonic didn't use Faroudja for this very reason: it handicaps the chroma resolution/depth. But as long as the player, display, or video processor is applying proper inverse telecine to 1080i, you should be seeing about all there is to see from either format on your given display.

Last edited by Chad Varnadore; 10-22-2006 at 07:29 AM.
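Inverse telecine, which this post keeps coming back to, is the pulldown cadence run in reverse: drop the repeated fields and weave the rest back into 24 progressive frames. A rough sketch, assuming a fixed, known 3:2 cadence (real processors must detect the cadence and handle edits that break it):

```python
# Rough sketch of inverse telecine with a fixed 3:2 cadence: consume
# alternating runs of 3 and 2 fields, weave the two unique fields of
# each run into one progressive frame, and discard the repeated field.

def inverse_telecine(fields):
    frames = []
    i, span = 0, 3
    while i + 1 < len(fields):
        frames.append((fields[i], fields[i + 1]))  # weave top + bottom
        i += span                                  # skip any repeated field
        span = 2 if span == 3 else 3
    return frames

# One second of 60i video collapses back to 24 progressive frames.
fields = [f"field{i}" for i in range(60)]
assert len(inverse_telecine(fields)) == 24
```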
#10
Blu-ray Guru
I beg to differ.

For film sources, 720p has no advantage over 1080i, since 1080i effectively carries the 1080p/24 data and 720p does not. Likewise, there are many native 1080i sources, and 720p is not as good for those either. There are some, though very few, true 720p/60 sources; of course 720p is the right answer there, but such sources are rare in the real world.
#11
Banned
720p = 1280x720
1080i = 1920x1080

Even if all 1080 horizontal lines aren't onscreen at one time, all 1920 vertical columns are. 1080i has more active pixels than 720p. A good example: NBC's Sunday Night Football coverage is noticeably sharper than Fox's or ESPN's NFL games.
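For what it's worth, the raw numbers behind this comparison check out. A quick Python back-of-the-envelope, counting active pixels per 1/60th of a second:

```python
# Active pixels delivered every 1/60th of a second.
p720_frame  = 1280 * 720   # full progressive frame
i1080_field = 1920 * 540   # one interlaced field (half the 1080 lines)

print(f"720p frame:  {p720_frame:,} pixels")    # 921,600
print(f"1080i field: {i1080_field:,} pixels")   # 1,036,800 -> more than 720p
```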
#12
Blu-ray Samurai
#14
Active Member
Aug 2006
1920x540 is a lower resolution than 1280x720. This is why digital flat panels are advertised at 720p: because they are best viewed at 720p. LCD, plasma, and DLP are progressive in nature, meaning the entire screen must be rendered in a single pass, so video must be deinterlaced on the fly. High-end TVs are better at deinterlacing than low-end TVs. However, when you deinterlace 1080i on a flat screen, the video is converted to 540 lines first and then upsampled to 720p, whereas 720p is displayed as-is, which is what you want, since the filtering processes degrade quality. CRT-based HDTVs can actually output 1080i the way a standard TV outputs 480i; no deinterlacing necessary. Watching 1080i on a 720p flat screen = less quality. However, most people can't tell the difference, especially on DVD-upsampling players. People can believe what they want, but I know for a fact that my TV looks way better in 720p than 1080i. And I've got a 720p LCD HDTV.
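Here is a sketch of the cheap path this post describes (keep one field, scale it to the panel), assuming NumPy arrays as stand-ins for video frames. Real TVs do this in dedicated silicon with better filtering than the nearest-neighbor indexing used here:

```python
import numpy as np

# "Bob" style shortcut described above: keep one 540-line field of a
# 1080i frame, then rescale the 1920x540 result to a 1280x720 panel.

def field_to_720p(interlaced: np.ndarray) -> np.ndarray:
    field = interlaced[::2, :]                       # top field: 540 x 1920
    rows = np.arange(720) * field.shape[0] // 720    # map 720 rows onto 540
    cols = np.arange(1280) * field.shape[1] // 1280  # map 1280 cols onto 1920
    return field[np.ix_(rows, cols)]                 # nearest-neighbor resample

frame_1080i = np.zeros((1080, 1920), dtype=np.uint8)
assert field_to_720p(frame_1080i).shape == (720, 1280)
```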
#15
Blu-ray Guru
May 2006
Huh? I think we had this discussion in another thread. There is absolutely no downconversion. When you speak about 540 or 1080, you are actually talking about the resolution. The picture does not magically change from 1920x1080 to whatever weird number 540 is (960x540?).
#16
Senior Member
Jan 2005
720p is shot at 60Hz: a full 60 frames per second, with each frame being unique. I'm not sure why you would state something different as fact, but this is not at all correct, and you should reference your source on this.

The entire crux of 720p being superior to 1080i for video-based material, and the reason most sports-focused networks use 720p, is the full 720p resolution at 60 frames per second, which produces sharper images during motion scenes, e.g., football. 1080i is superior for non-motion events in general, because the lack of motion ends up rendering a scene that is closer to 1080p in appearance than to 1080i.

1080i, while often called 1080i/30, is in fact 60 true HALF frames per second. It is not shot as one full frame that is split in half and then shown at 30 full frames (60 halves) per second. It is truly one half frame shot, then 1/60th of a second later, the next half frame shot. There is NO proper way to lay two of these half frames on top of each other, if there is motion, without inducing stair-stepping in the image. I detest the term 1080i/30; it is incredibly inaccurate in my opinion, as 1080i is NOT shot at 30 frames per second, but at 60 half frames (interlaced) per second.
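The stair-stepping described here is easy to see in a toy model: weave two fields captured 1/60 s apart while an edge is moving, and adjacent lines disagree. The positions and the 4-pixel motion below are made-up illustration values:

```python
import numpy as np

# Toy model: a vertical edge moves 4 pixels between the two field
# captures (1/60 s apart). Weaving the fields into one frame puts the
# edge at different positions on alternate lines: combing.

def capture_field(edge_x: int, lines: int, width: int) -> np.ndarray:
    field = np.zeros((lines, width), dtype=np.uint8)
    field[:, edge_x:] = 255                   # white to the right of the edge
    return field

top    = capture_field(edge_x=100, lines=540, width=1920)  # time t
bottom = capture_field(edge_x=104, lines=540, width=1920)  # time t + 1/60 s

woven = np.empty((1080, 1920), dtype=np.uint8)
woven[0::2] = top
woven[1::2] = bottom
# Adjacent lines now disagree about where the edge is -> stair-stepping.
assert (woven[0] != woven[1]).sum() == 4
```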
#17
Banned
720p may indeed be 60fps for live broadcast, but I was referring to it as it applies to Blu-ray Disc.

There is no material on disc (other than recorded live broadcasts) that is 60fps, so the idea that a 1080i output from either BD or HD DVD would somehow be inferior to a 720p one doesn't hold. Plus, calling 1080i "540p in disguise" is ludicrous. I also feel that many 720p sports broadcasts may be 60fps, but at the expense of detail. The 1080i broadcasts have more detail, and that's what HDTV is all about.

Last edited by PeterTHX; 10-23-2006 at 04:11 AM.
#18
Member
Jun 2006
It is a fact that many (if not most) HD displays, rather than properly deinterlace 1080i, simply take half the fields, use them as 540p, and upscale that to their native resolution. (1920x540 is still more pixels than 720p, but not by much.)

It's a lot cheaper to convert 1080i to 540p than it is to properly deinterlace 1080i. Plus, this cheap/lazy method guarantees there won't be any combing artifacts. But the process loses half the original resolution. 1080i, when properly deinterlaced and displayed on a 1080p panel, looks gorgeous. It can look a lot better than 720p, but much of that depends on the source material and the bitrate of the MPEG-2 encoder. I saw some 1080i NASCAR yesterday that looked awful; that was the fault of the broadcaster.

There are very, very few "true" 720p panels out there. Many "HD" plasmas are 1024x768 (yikes!), and newer HD plasma/LCD panels are 1366x768. This so-called 768p is suboptimal because every single video mode must be scaled to fit the panel, including 720p! No video mode is native to a 768p display. The new 1080i/p panels coming out have the advantage that they can display 1080i and 1080p material with no scaling at all: full dot-by-dot precision. Sadly, not all 1080p TVs take advantage of this; the recent 1080p Samsungs stretch the image anyway, creating artifacts in the process.

Anyway, 60p is nice for sports, but 24p or 30p is fine for movies and TV.

In summary:
- Lousy 720p/768p panels (most of them) display 720p better than 1080i.
- 1080i can be deinterlaced to 1080p without losing much (if any) quality, but usually isn't. That is changing.
- Most HDTVs are 720p/768p, but that is changing, too.

I have a "full" 1080p TV, and within a few years most HDTVs sold will be 1080p. I have a feeling the deinterlacing silicon will be improved too, and then 720p will have but one advantage over 1080i: framerate. And that will only help sports. Eventually I think the market will shift from 720p to 1080i and maybe even 1080p. Don't forget that ATSC is capable of 1080p! When the average Joe's TV can display those signals, broadcasters may re-evaluate their positions.

This argument reminds me of anamorphic vs. letterboxed DVDs. James Cameron refused to release anamorphic DVDs of his movies because most DVD players did a poor job scaling the video and inserting the letterbox bars for 4:3 TVs, which is what the average consumer had. Consumers with high-end 16:9 TVs had to suffer a much worse picture, but folks with a plain TV got a better one. (I have no idea if Cameron has changed his position, but it wouldn't surprise me given how commonplace widescreen TVs are now.)
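The 768p complaint is easy to verify with plain arithmetic: none of the standard modes maps 1:1 onto a 1366x768 panel. A quick sketch, using the commonly published resolutions:

```python
# No broadcast mode is native on a 1366x768 panel: every one needs scaling.
panel = (1366, 768)
modes = {"480p": (720, 480), "720p": (1280, 720),
         "1080i/1080p": (1920, 1080)}

for name, (w, h) in modes.items():
    sx, sy = panel[0] / w, panel[1] / h
    native = (w, h) == panel
    print(f"{name}: scale x{sx:.3f} horizontally, x{sy:.3f} vertically"
          f" -> native: {native}")
```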
#19
Senior Member
Jan 2005
So, it is important to specify the source. The source is never "Blu-ray", but rather a 1080p/24 source (a typical film). Some TV shows are shot at 30fps and others, I have heard, at 60fps. Yet it's the actual encode that matters most and that will affect the quality of the output the most. In an ideal world, you have a 1080p/24 conversion from film going to a display that accepts and properly shows that source at 1080p/24 (48/72/etc. Hz refresh).

HDTV has no single "all-about". It covers many levels: color, color accuracy, digital perfection, motion, and detail. Any one of those areas could be considered more important than another, but it is the combination of all of them that HDTV is really about. What also MUST be considered is broadcast bandwidth, along with a simple breakdown of the numbers that shows what 1080i and 720p deliver:

1080i = 60 half-frame 1920x1080 shots per second, encoded as MPEG-2 fields of 1920x540 pixels. This is 1,036,800 pixels every 1/60th of a second.

720p = 60 full-frame 1280x720 shots per second, encoded as MPEG-2 frames of 1280x720 pixels. This is 921,600 pixels every 1/60th of a second.

So it appears 1080i carries more pixels from the get-go, right? Except we aren't talking about uncompressed delivery; we are talking about bandwidth-limited, MPEG-2-compressed distribution. The fewer pixels in an image, the better the compressed image will look at a given file size. Since broadcast bandwidth tends to be fairly constant, 720p delivery can look a bit less "compressed" and carry more detail overall per 1/60th-of-a-second shot. As well, 1080i compression must accurately handle variances from one interlaced field to the next, so that details properly mesh and don't turn into an unidentifiable blob.

FYI: I'm not trying to pick 720p as my favorite. It is just a tool with appropriate and inappropriate times of usage. 1080i is NOT the best format under all circumstances, and neither is 720p. 1080p/60 is ideal under ALL circumstances if the bandwidth is high enough and the source is 1080p/60. 1080p/24 is ideal for all films shot at 24fps. But output had better match input!
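Following this post's numbers one step further: at a fixed broadcast bitrate, 1080i's extra pixels mean fewer bits per pixel. The 15 Mbps figure below is a hypothetical MPEG-2 rate chosen for illustration, not a measured broadcast number:

```python
# Pixels delivered per 1/60 s, per the breakdown above.
i1080_field = 1920 * 540     # one 1080i field
p720_frame  = 1280 * 720     # one full 720p frame

print(f"1080i: {i1080_field:,} pixels per 1/60 s")   # 1,036,800
print(f"720p:  {p720_frame:,} pixels per 1/60 s")    # 921,600

# With a fixed broadcast bitrate, more pixels means fewer bits per pixel.
# 15 Mbps is a hypothetical MPEG-2 broadcast rate, just for illustration.
bitrate = 15_000_000
for name, px_per_sec in (("1080i", i1080_field * 60),
                         ("720p", p720_frame * 60)):
    print(f"{name}: {bitrate / px_per_sec:.3f} bits/pixel at 15 Mbps")
```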