#1 | Special Member | Aug 2006 | Rhode Island, USA
I have heard before that if you have a true 1080p TV receiving a 1080i signal, the upconversion would result in a true 1080p image, without interlacing. Is this true?
If so, could you then say that any game that will be in 1080i will really be 1080p?
#2 | Member | May 2006 | Raleigh, NC USA
That falls under the "yes and no" category. When the TV converts a 1080i image to 1080p, there is some loss of image quality because the image has to be processed; you can't make something out of nothing. Some sets do a better job of this than others. If you feed the set a native 1080p image, though, the TV doesn't have to touch it, and you get a better picture than if the TV had to build the 1080p frame itself. Can you tell the difference? That's something you have to see for yourself to decide.
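(A toy way to see the "something out of nothing" point, not how any real scaler is implemented: throw away every other line of a detailed signal and let simple interpolation guess them back. A minimal Python/NumPy sketch; the array size and the sine "detail" are just illustrative assumptions.)

```python
# Toy illustration: once half the lines are gone, the set can only estimate
# the rest, so fine detail is never recovered exactly.
import numpy as np

original = np.sin(np.linspace(0, 20, 1080))   # stand-in for one column of fine 1080p detail
field = original[0::2]                        # the 540 lines a single 1080i field carries
rebuilt = np.interp(np.arange(1080),          # the TV interpolates the missing lines
                    np.arange(0, 1080, 2), field)

print(np.max(np.abs(rebuilt - original)))     # nonzero: the recreated lines are only guesses
```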
#3 | Special Member | Aug 2006 | Rhode Island, USA
Ah. I see. That makes sense. Thanks for the insight!
#4 | Active Member | Aug 2006
It all depends on whether the 1080i source was filmed in 1080i or 1080p. If the source was shot in 1080i, all the frames are natively interlaced; in other words, you lose half the vertical resolution. If the source was shot in 1080p, then the true progressive source can be recovered.

DVDs can be displayed at 480p because they are usually filmed progressively at 24 fps. The video is then telecined (interlaced using 3:2 pulldown), which basically turns 480p video at 24 fps into 480i video at 29.97 fps. Upconverting DVD players use advanced inverse-telecine filters that return the video to its original 480p state. All 1080i sources from HD DVD and Blu-ray are filmed at 1080p, so theoretically you can inverse-telecine the 1080i video back to its original 1080p state. However, the quality of that inverse telecine depends heavily on the TV's video-processing chip: if the chip is poor, the video will be poor. In fact, the Samsung Blu-ray player's built-in noise-filter chip is suspected of causing a drop in quality in Blu-ray playback. From my experience with encoding, there is always a drop in quality when you re-encode, no matter how good the encoding techniques are. Outputting true 1080p to the TV ensures the video is at the quality the studio that produced it intended. In short, true 1080p is like lossless audio: it's the closest to the actual recording.
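To make the 3:2 (more precisely 2:3) pulldown and inverse-telecine idea concrete, here is a minimal Python sketch. It only shuffles frame labels around rather than real video, and it assumes a clean, unbroken cadence; a real player or TV chip also has to detect that cadence, which is where the quality differences come from.

```python
# Minimal sketch of 2:3 ("3:2") pulldown and its inverse, using frame labels
# instead of real pixel data. Assumes a clean, unbroken cadence; real
# inverse-telecine chips also have to detect the cadence and survive edits.

def split_fields(frame):
    """Split a progressive frame into (top_field, bottom_field)."""
    return (frame + "-top", frame + "-bot")

def telecine(frames_24p):
    """Turn each group of 4 film frames (24 fps) into 5 interlaced frames (~30 fps)."""
    out = []
    for i in range(0, len(frames_24p), 4):
        a, b, c, d = [split_fields(f) for f in frames_24p[i:i + 4]]
        out += [(a[0], a[1]),   # A-top / A-bot
                (b[0], b[1]),   # B-top / B-bot
                (b[0], c[1]),   # B-top / C-bot  <- mixed ("dirty") frame
                (c[0], d[1]),   # C-top / D-bot  <- mixed ("dirty") frame
                (d[0], d[1])]   # D-top / D-bot
    return out

def inverse_telecine(frames_30i):
    """Re-pair the matching fields to recover each original frame as (top, bottom)."""
    out = []
    for i in range(0, len(frames_30i), 5):
        f1, f2, f3, f4, f5 = frames_30i[i:i + 5]
        out += [f1,              # A's fields arrive together
                f2,              # so do B's own pair
                (f4[0], f3[1]),  # C = C-top (from frame 4) + C-bot (from frame 3)
                f5]              # D's fields arrive together
    return out

video = telecine(["A", "B", "C", "D"])
print(video)                     # 5 interlaced frames, 2 of them mixing two film frames
print(inverse_telecine(video))   # the 4 original frames, each as its matched field pair
```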
#5 | Site Manager
There's actually a little more to it.

IF a program is mastered at 1080p while being monitored on a 1080p display, the transfer is optimized for 1080p display and then encoded as 1080p on the disc, the disc player does nothing to the image except slice the 1080p frame into two interlaced fields, and the 1080p display recognizes the sliced 1080i input as coming from a 1080p frame and simply re-weaves the two fields into one full 1080p frame without further processing, then a 1080i player output would look equal to a 1080p one. If any of these things is not done, the image WILL be softer.

For example, if the image is being created/transferred/evaluated using a 1080i CRT, the transfer will probably come out softer than it should, vertically filtered to prevent interlace flicker/twitter. Or, if the 1080p display treats the incoming 1080i source as 1080i video, it will have to deinterlace it using one of the various methods for that, the worst being the "bob" method, which halves the vertical resolution, and the best being the pixel/motion-adaptive ones, which are better but still not perfect.

You think DVD is 480p? Many DVD transfers (if not the majority!) are optimized for interlaced displays and not done in true 480p quality. Look at the Lady and the Tramp DVD: the 2.55:1 image has been so filtered it looks soft, probably to avoid complaints from parents by preventing any violent twitter on 4:3 TVs when watching a Cinemascope 2.55:1 image shrunk into a tiny, approximately 260-pixel-tall interlaced image. DVD is basically 480i, because that's what NTSC is. Could that be one of the reasons some of the first high-definition film transfers look softer than they should? Maybe some were already-done transfers optimized or created in 1080i facilities used for HDTV broadcasts...

On the other hand, for true 1080i material, any 1080i source (live camera, etc.), since it is 60 interlaced fields per second (or 50 in PAL), has to go through a deinterlacing chip at some point, be it at the 1080p player's output or the 1080p display's input. The better the deinterlacing method, the sharper and more artifact-free the output, but the best chips cost $$$ and the result is still not the same as a true 1080p 60 fps source would be.

Last edited by Deciazulado; 09-04-2006 at 11:28 PM.
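To illustrate the weave vs. bob distinction described above, here is a rough NumPy sketch (the array size and function names are made up for illustration, not taken from any real chip or library): weaving two fields that came from the same progressive frame loses nothing, while bob keeps only one field and line-doubles it, halving the real vertical detail.

```python
# Rough sketch of weave vs. bob deinterlacing on a stand-in "frame".
# Pure NumPy, no real video I/O; the 8x4 array is just a small example.
import numpy as np

def split_into_fields(frame):
    """Split a progressive frame into its top (even lines) and bottom (odd lines) fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one full-height frame
    (lossless only if they really came from the same progressive frame)."""
    frame = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field):
    """Line-double a single field: full height again, but half the real vertical detail."""
    return np.repeat(field, 2, axis=0)

original = np.arange(8 * 4, dtype=np.uint8).reshape(8, 4)   # stand-in for a 1080p frame
top, bottom = split_into_fields(original)

print(np.array_equal(weave(top, bottom), original))   # True: nothing was lost
print(np.array_equal(bob(top), original))             # False: half the lines are guesses
```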
Thread | Forum | Thread Starter | Replies | Last Post
1080p & optical cable or 1080i & PCM | Display Theory and Discussion | Furley_Ghost | 5 | 09-06-2009 08:35 PM |
1080i & 1080p | Display Theory and Discussion | Peaky | 19 | 12-01-2008 05:43 AM |
Need help, NIN disc doing 1080i & not 1080p | Newbie Discussion | lateralus85 | 14 | 01-19-2008 05:04 AM |
1080i & 1080p question | Newbie Discussion | jayson decambra | 5 | 12-27-2007 06:24 PM |
All BD players downconvert 1080p to 1080i/60 then upconvert to 1080p/60? | Blu-ray Players and Recorders | mainman | 8 | 11-23-2006 07:55 PM |