How far apart in quality is 1080i vs. 1080p?


Just wondering if I should stick with my upsampling Sony recorder or purchase a PS3 or a true Blu-ray player.
tabl10s

Showing 1 response by johnnyb53


How far apart in quality is 1080i vs. 1080p?
Just wondering if I should stick with my upsampling Sony recorder or purchase a PS3 or a true Blu-ray player.
As worded, your two questions together are rather loaded and raise further questions.

What do you mean by "How far apart in quality is 1080i vs. 1080p?" Are you thinking that an upsampling player produces 1080i output while Blu-ray produces 1080p? And thinking that the only difference between the resolution of an upsampler vs. a Blu-ray source is the difference between 1080i and 1080p? If so, that is simply not the issue here.

No matter what resolution it's upsampled to, the maximum resolution of any standard DVD-sourced signal is 480p. The signal may be upconverted to 720p or 1080i/1080p, but upconversion doesn't create any new picture information to fill in those extra lines.
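To make that concrete, here's a minimal Python sketch of the idea (the function name and the nearest-neighbor method are mine, purely for illustration; real players use much better interpolation filters, but the principle is the same):

    # Illustrative only: a naive nearest-neighbor upscale of the lines
    # in a frame. Every output line is copied from one of the 480 lines
    # already on the disc -- no new detail is created.
    def upscale_lines(frame_480, target_lines=1080):
        src_lines = len(frame_480)  # 480 for standard DVD
        return [frame_480[row * src_lines // target_lines]
                for row in range(target_lines)]

    # The result has 1080 lines, but still only 480 distinct lines
    # of real picture information.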

In actual practice, in most cases the difference between 1080i and 1080p is moot. Whether the data arrives interlaced or progressive, both formats carry 1080 lines of resolution, and none of the popular display technologies--plasma, LCD, DLP, or LCoS--interlaces at all. When one of these displays receives an interlaced video signal, it converts it to progressive. So, for example, if a 1080p LCD flat panel receives a 1080i signal, it de-interlaces it to 1080p in order to display it.
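As a rough sketch of that de-interlacing step (assuming the simplest "weave" method; the function below is hypothetical, and real displays use smarter motion-adaptive algorithms), the two 540-line fields of a 1080i frame get interleaved back into one 1080-line progressive frame:

    # Illustrative "weave" de-interlace: merge the two 540-line fields
    # of a 1080i signal into one 1080-line progressive frame.
    def weave(top_field, bottom_field):
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)     # lines 1, 3, 5, ... of the frame
            frame.append(bottom_line)  # lines 2, 4, 6, ...
        return frame                   # 1080 progressive lines

Either way, 1080 lines come out of the process, which is why 1080i vs. 1080p matters so little on these displays.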

Conversely, if you have a hi-def disc player that outputs 1080i or 1080p but your HDTV has a 1280x720 display engine, the TV takes that 1080i HD video signal and downconverts it to its native resolution, which is 720p.
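The line math of that downconversion is simple; a nearest-neighbor version in Python (again hypothetical--real scalers filter more carefully) looks like this:

    # Illustrative downconversion for a 1280x720 native panel:
    # 1080 incoming lines map onto 720 display lines (a 3:2 reduction),
    # so one of every three lines is discarded outright here.
    def downscale_lines(frame_1080, native_lines=720):
        src_lines = len(frame_1080)  # 1080
        return [frame_1080[row * src_lines // native_lines]
                for row in range(native_lines)]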

When these TVs advertise that they are compatible with 480i/480p, 720p, and 1080i, what they actually mean is that they'll accept any of those signal formats but will display them all in the set's native mode. In the case of older plasma, LCD, DLP, and LCoS TVs, that native mode is 720p. To my knowledge, the only HDTVs that display 1080i as their native mode are CRT-based, and very few people use those for HDTV these days.

I have a Toshiba HD-A2, which outputs at 1080i. When I play a standard DVD on it, it upconverts the signal to 1080i, but the underlying resolution is still 480p. My TV receives the signal and downconverts it for display in its native mode--720p. When I play a real HD DVD, the player sends the full-resolution video signal to the TV at 1080i, and the TV downconverts that 1080i signal to display in its native mode, 720p.
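A quick pixel-count comparison (using the standard frame sizes for each format) shows why the HD source still wins even after that downconversion:

    # Pixel budgets of the formats involved (width x height):
    formats = {"480p (DVD)":  (720, 480),
               "720p panel":  (1280, 720),
               "1080i/1080p": (1920, 1080)}
    for name, (w, h) in formats.items():
        print(f"{name}: {w * h:,} pixels")
    # 480p (DVD):  345,600 pixels
    # 720p panel:  921,600 pixels
    # 1080i/1080p: 2,073,600 pixels

An upconverted DVD has to stretch roughly 346,000 real pixels across the 720p panel, while an HD disc feeds the panel more detail than its roughly 922,000 pixels can even show.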

Can I see a difference between a standard DVD upconverted to 720p and an HD DVD downconverted to 720p? You bet. To me, the difference is not subtle. The HD DVD signal, even downconverted to 720p, is very sharp and detailed, with no jaggies, noticeable video artifacts, or mushy backgrounds. Played on a true 1080p display, HD DVD or Blu-ray looks even better, but for me, watching a 55" screen from 6-8 feet, that difference isn't so noticeable. The difference between a 480p source and a 1080i source *IS*.