How far apart in quality is 1080i vs. 1080p?


Just wondering if I should stick with my upsampling Sony recorder or purchase a PS3 or true Blu-ray player.

How far apart in quality is 1080i vs. 1080p?
Just wondering if I should stick with my upsampling Sony recorder or purchase a PS3 or true Blu-ray player.
As worded, your two questions together are rather loaded, and raise further questions.

What do you mean by "How far apart in quality is 1080i vs. 1080p?" Are you thinking that an upsampling player produces 1080i output while Blu-ray produces 1080p? And thinking that the only difference between the resolution of an upsampler vs. a Blu-ray source is the difference between 1080i and 1080p? If so, that is simply not the issue here.

No matter what it is upsampled to, the maximum resolution of any standard-DVD-sourced signal is 480p. It may be upconverted to 720p or 1080i/1080p, but the upconversion doesn't create any new detail in the process.
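A quick way to see why, as a toy sketch in plain Python (the pixel values are made up): "upscaling" a row of 4 pixels to 9 produces more pixels, but no more distinct information than the input had.

# Toy illustration: upscaling copies/interpolates existing samples;
# it cannot recover detail that was never captured.
def upscale_nearest(row, factor):
    # Each output pixel is a copy of the nearest input pixel.
    return [row[int(i / factor)] for i in range(int(len(row) * factor))]

source = [10, 200, 40, 90]                # 4 "pixels" of real detail
upscaled = upscale_nearest(source, 2.25)  # 9 pixels, same 4 values
print(upscaled)            # [10, 10, 10, 200, 200, 40, 40, 90, 90]
print(len(set(upscaled)))  # still only 4 distinct values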

In actual practice, in most cases the difference between 1080i and 1080p is moot. Whether the data is interlaced or progressive, both formats have 1080 lines of resolution. None of the popular display technologies (plasma, LCD, DLP, or LCoS) interlaces at all. When they receive an interlaced video signal, they convert it to progressive. So, for example, if a 1080p LCD flat panel receives a 1080i signal, it de-interlaces it to 1080p to be able to display it.
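For the curious, here is a minimal sketch (plain Python, tiny made-up frame data) of the simplest form of de-interlacing, a "weave": the display interleaves the two fields back into one progressive frame.

# Weave de-interlace: interleave two fields into one progressive frame
# (a real 1080i frame weaves two 540-line fields into 1080 lines).
def weave(top_field, bottom_field):
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # odd lines come from the top field
        frame.append(bottom_row)  # even lines come from the bottom field
    return frame

top = [["T1"], ["T3"]]     # lines 1 and 3 of the image
bottom = [["B2"], ["B4"]]  # lines 2 and 4 of the image
print(weave(top, bottom))  # [['T1'], ['B2'], ['T3'], ['B4']]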

Conversely, if you have a hi-def disc player with 1080i or 1080p output but your HDTV has a 1280x720 display engine, then even when it receives a 1080i HD video signal, it downconverts it to its native resolution, which is 720p.

When these TVs advertise that they are compatible with 480i/480p, 720p, and 1080i, what they actually mean is that they'll accept any of those signal formats but will display them in the set's native mode. In the case of older plasma, LCD, DLP, and LCoS TVs, that native mode is 720p. To my knowledge, the only HDTVs that display 1080i as their native mode are CRT-based, and very few people use those for HD these days.

I have a Toshiba HD-A2 with 1080i output. When I play a standard DVD on it, it upconverts the signal to 1080i, but still based on a resolution of 480p. My TV receives the signal and downconverts it for display in its native mode, 720p. When I play a real HD DVD, the player sends the full-resolution video signal at 1080i to the TV. My TV then downconverts this 1080i signal to display in its native mode, 720p.
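To put that chain in plain terms, here is a rough sketch (plain Python, using the numbers from my setup above): the detail you actually see is capped by the lowest-resolution link, regardless of what signal format passes between the boxes.

# Effective detail through a video chain is capped by its weakest link.
# Arguments are the vertical line counts at each stage.
def effective_lines(*stages):
    return min(stages)

# Standard DVD -> upconverted to 1080i -> displayed on a 720p panel:
print(effective_lines(480, 1080, 720))   # 480 (source-limited)

# HD DVD -> 1080i output -> displayed on a 720p panel:
print(effective_lines(1080, 1080, 720))  # 720 (display-limited)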

Can I see a difference between a standard DVD upconverted to 720p and an HD DVD downconverted to 720p? You bet. To me, the difference is not subtle. The HD DVD signal, even downconverted to 720p, is very sharp and detailed, with no jaggies, noticeable video artifacts, or mushy backgrounds. Played on a true 1080p display, HD DVD or Blu-ray looks even better, but for me, watching a 55" set from 6-8 feet, that difference isn't so noticeable. The difference between a 480p source and a 1080i source *IS*.
As far as viewing distance is concerned, it depends on your eyesight as well as the ratio of your viewing distance to your screen's width (i.e., your distance from the screen divided by your TV or projector's screen width, in the same unit of measure). The further you sit from the screen, the less relevant high definition becomes, though in reality I have found that even when you are too far away to take advantage of the detail in HD, you still get more vibrant colors.

There are recommended minimum and maximum distances for the usefulness of that ratio, but if you have excellent visual acuity, you may resolve detail from further away than the recommendations suggest (I know I do with the plasma TV in my bedroom setup).

My viewing ratio is ~1.59. I don't have any issues with pixelization on HD content, and once I'm into an SD DVD movie, the lack of detail doesn't bother me. That doesn't mean it isn't obvious when I look for it, but it usually only bothers me if I go looking for it intentionally. I specifically set up my HT system for HD playback close to THX and SMPTE standards, so I could take full advantage of the higher resolutions with a very large screen at a decent distance (I went with a 128.5" diagonal screen and sit about 14.85' away).

I'll try to post a link to a handy Excel calculator spreadsheet I used recently for fun. If I can't post it, e-mail me and I'll send you the link via e-mail... I'll also post a link to a simpler online version as well.

http://www.carltonbale.com/home-theater/home-theater-calculator/

http://myhometheater.homestead.com/viewingdistancecalculator.html
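If you just want the core math behind those calculators, here is a bare-bones Python version of the ratio calculation (my numbers from above are plugged in; the 16:9 aspect assumption is mine):

import math

# Viewing ratio = viewing distance / screen width (same units).
# Screen width is derived from the diagonal for a 16:9 screen.
def viewing_ratio(diagonal_in, distance_ft, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return (distance_ft * 12) / width_in

# Example: 128.5" diagonal screen, ~14.85 ft seating distance.
print(round(viewing_ratio(128.5, 14.85), 2))  # ~1.59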
The PS3 is an excellent Blu-ray player... I bought the Sony flagship Blu-ray player when it was initially released and compared it to the PS3; the only advantage of owning the flagship player was its analog output of high-definition sound. The picture quality was the same (they both use the same video processing chip and output a digital signal via HDMI to the TV, or in my case a projector). I decided to return the flagship player and just use the PS3 instead...

There is no comparison between upsampled and Blu-ray disc resolutions. You are talking around 345,600 pixels vs. 2,073,600 pixels; if you have a high-definition TV, that is 6 times more "real" pixels from the original analog/digital signal, which means Blu-ray has 6 times the resolution of a normal SD DVD (standard definition)... Upsampling can't add material that is missing; it can only blend in approximations, using digital algorithms, to come up with those 2,073,600 pixels.

So, though you may have the same number of pixels in use with both signals, visually there is no comparison on a good screen, because the fill-in from upsampling is only an approximation, not an actual original signal at 1920 x 1080 resolution. Upsampled video often ends up looking foggy and smeared with blending artifacts, even with the best algorithms on the market, because detail that is gone can't be given back; the only option is to stretch the information that is already there to fill the empty pixel space. Compare the image quality of an upsampling player and an HD format side by side and you will find that the difference is night and day (I've made such comparisons on my HT system).
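The pixel math behind that "6 times" claim, as a trivial Python check:

# Pixel counts for standard DVD vs. Blu-ray 1080p.
sd_pixels = 720 * 480    # standard DVD:  345,600 pixels
hd_pixels = 1920 * 1080  # Blu-ray 1080p: 2,073,600 pixels
print(sd_pixels, hd_pixels, hd_pixels / sd_pixels)  # 345600 2073600 6.0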

If you do decide to get an upsampling player, get the Oppo DV-981HD or DV-980HD. I use the 980 and the (discontinued) 970 myself in the living room and bedroom (I use a Denon DVD-5910 in my HT, but that's for its audio features...). These players aren't just a personal preference; they are among the best-measuring upsampling players available, if not the best.
Just to get back to the question, the difference between 1080i and 1080p may or may not be perceptible depending on what material you are watching, what type of display or front projector you are using, screen size, and seating distance from the screen.

At one extreme, if the video material is static, let's say a still picture, then 1080i is effectively equal to 1080p. In other words, it doesn't matter whether the frames are interlaced or progressive, because what you see never moves.

On the flip side, if the video material is in constant fast motion, you might see a difference: 1080p can potentially provide less blur and more clarity.
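A toy sketch of why motion hurts interlacing (plain Python, with made-up positions): the two fields of a 1080i frame are captured a fraction of a second apart, so a moving edge sits in a different spot in each field, and weaving them together produces the familiar "combing."

# The two fields of an interlaced frame are sampled at different moments.
# For a moving object, the halves no longer line up when woven.
def field(line_count, object_x):
    # Each "line" just records the horizontal position of an edge.
    return [object_x for _ in range(line_count)]

top = field(2, object_x=100)     # object at x=100 when field 1 is sampled
bottom = field(2, object_x=104)  # object has moved to x=104 by field 2

frame = []
for t, b in zip(top, bottom):
    frame += [t, b]              # weave the fields together
print(frame)  # [100, 104, 100, 104] -> alternating lines disagree: combing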

If your display cannot natively show 1080p material (i.e., it's not a 1920 x 1080 panel), then the difference is probably nothing: the 1080 material has to be downscaled and can't be displayed at 1920 x 1080, due to a lack of display resolution (or pixels, in the case of LCD and plasma).

Smaller screens and longer seating distances make the difference between 1080i and 1080p imperceptible in most instances.

The source material's native resolution also makes a difference. You can't view 480i or 480p material at 1920 x 1080, i or p, and expect anything better than the original resolution; if anything, qualitative weak links somewhere in your setup can make it worse (this happens).

So, essentially, 1080p is "better" than 1080i when you are viewing 1920 x 1080 source material (like hi-def discs) that has a lot of motion, on a large screen.

Everything else gets a bit fuzzy, but if you compare 1080i vs. 1080p viewing at a reasonable distance (say 9-10 feet), you most likely won't see much of a difference.
I love the PS3 and it has superb Blu-ray playback. It has the fastest load times of any Blu-ray player, standalone models included.
1080i from an upsampling player is usually just standard-def (480p) material upscaled. 1080p via a Blu-ray/HD DVD player is true HD.
The difference between either Blu-ray or HD DVD and upsampled SD DVDs is huge, especially on larger sets. I run the 70" XBR and at 12 ft there is no comparison; friends with 55" sets say the same.

As for the PS3, I own one as well as several other Blu-ray players. The PS3 is a cool game machine, but a mediocre Blu-ray player. If you want a true player, buy one of the standalones.