1080i vs 720p


If one has a 720p projector that can accept 1080i, is it better, when using a regular DVD source or HDTV, to watch in 1080i, or to use my Faroudja processor and watch in 720p? Technically speaking, that is.
jcbower

Showing 5 responses by dg1968


Only CRTs can display interlaced signals. All digital displays (plasma, LCD, LCD projectors, etc.) are progressive. As mentioned above, a 720p projector always displays 720p, just as a 1080p plasma always displays 1080p. There is no such thing as a 480i or 1080i digital display. Of course these devices can accept interlaced signals-- standard-def TV (480i) and high-def TV (1080i).

However, in these cases the display has to de-interlace the signal in order to show it progressively. The quality of de-interlacing varies, so a 1080i source on a display that does a relatively poor job of de-interlacing might look noticeably worse than a 720p source on the same display.
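To give a rough idea of what de-interlacing does, here is a toy Python sketch of the simplest possible method ("weave"), which just interleaves the two fields of a frame. Real displays use smarter, motion-adaptive processing; the field sizes and array contents here are just assumptions for illustration.

```python
import numpy as np

def weave_deinterlace(top_field, bottom_field):
    """Combine two 540-line fields into one 1080-line frame.

    Weave looks perfect on static images; on motion it shows
    "combing," because the two fields were captured 1/60 s apart.
    """
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height, top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2] = top_field     # top field -> frame lines 0, 2, 4, ...
    frame[1::2] = bottom_field  # bottom field -> frame lines 1, 3, 5, ...
    return frame

# Two hypothetical 1080i fields, 540 lines x 1920 pixels each
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.zeros((540, 1920), dtype=np.uint8)
print(weave_deinterlace(top, bottom).shape)  # (1080, 1920)
```

How gracefully a display (or processor) handles the motion case is exactly where the quality differences show up.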

With most newer displays, the consensus seems to be that 720p and 1080i sources look very similar.

So to the OP, first of all, your processor should be set to output 720p; otherwise it will be sending a signal that your PJ has to process further.

720p sources (some HD channels) need no processing whatsoever to display on your PJ. These should leave the cable box as native 720p, pass through your processor unchanged, and reach your PJ, which will display them natively.

480p from a standard DVD doesn't have to be de-interlaced, but it needs to be scaled to 720p for your PJ. You need to figure out which of your player, processor, or PJ does this best.

As for 1080i from your cable box, this needs to be both de-interlaced and scaled down to 720p for display by your PJ. Most likely either your processor or your PJ will do this better than the cable box, and you need to figure out which one actually results in a better picture.
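For a rough idea of the scaling involved, here is some back-of-the-envelope arithmetic against a hypothetical 1280x720 native panel. It ignores DVD's non-square pixels and aspect-ratio handling, and the actual resampling quality matters far more than the ratios themselves.

```python
# Scaling ratios needed to reach an assumed 1280x720 native panel
sources = {
    "480p DVD (720x480)":     (720, 480),
    "720p HDTV (1280x720)":   (1280, 720),
    "1080i HDTV (1920x1080)": (1920, 1080),
}
target_w, target_h = 1280, 720

for name, (w, h) in sources.items():
    print(f"{name}: width x{target_w / w:.2f}, height x{target_h / h:.2f}")

# 480p DVD (720x480):     width x1.78, height x1.50  (upscaled)
# 720p HDTV (1280x720):   width x1.00, height x1.00  (native, untouched)
# 1080i HDTV (1920x1080): width x0.67, height x0.67  (de-interlaced, then downscaled)
```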

If your PJ is significantly newer than your processor, it is possible that it does all of these things better, and you might be able to ditch your processor.

Hope this helps

dave

Bigbucks5,

Your set most assuredly is not displaying anything at 1080i. As Prpixel said, it is displaying at whatever the native resolution of its panels is. In the case of an 8-year-old Sony, the panels are probably something like 1366 x 768 or so.

Getting back to 720p vs. 1080i, when I said the consensus is that these 2 resolutions look very similar, that was a generalization for most people with most "normal"-sized displays.

Besides the display and processing, there are many other factors to take into account. But the end result is that with most equipment right now, these 2 resolutions end up looking quite similar to most people.

However, as Prpixel was saying, as the quality of the processing and the size of the display go up, 1080i starts to look better than 720p. In fact, theoretically, if de-interlacing is *perfect*, 1080i should approach, but not quite equal, 1080p.

dave
Elizabeth, correct, but don't forget that when shown on a 1080p display, 1080i is displayed at 1080p-- roughly twice the information is actually on the screen compared with 720p.

So as the quality of processing and screen size increases, 1080i should, and often does, look better than 720p. However, in most circumstances, these 2 resolutions look remarkably similar.
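For the curious, the per-frame pixel counts behind "roughly twice" actually work out to a bit more than double:

```python
# Pixels on screen per displayed frame
p720 = 1280 * 720     # 921,600
p1080 = 1920 * 1080   # 2,073,600 (what a 1080p panel shows, even from a 1080i source)

print(p1080 / p720)   # 2.25
```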

Bigbucks5, yes, but don't forget that (in the US) video is recorded and displayed at 60 Hz. So with 720p, 60 frames per second are recorded at the full 720-line resolution. 1080i is also 60 Hz, but each full frame takes 2 cycles to record (1 cycle each for the odd and even lines of the frame).

So over the course of one second, in terms of the total amount of data handled, 1080i carries exactly half of what 1080p does, and approximately the same amount as 720p. But like you said, on the screen at any given moment, the amount of data is the same between 1080i and 1080p.
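Putting rough numbers on that (raw pixel counts only, ignoring color subsampling and broadcast compression):

```python
# Approximate raw pixel throughput per second for each US broadcast format
fps = 60  # fields or frames per second

p720_per_sec = 1280 * 720 * fps            # 720p: 60 full frames
p1080i_per_sec = 1920 * (1080 // 2) * fps  # 1080i: 60 half-height fields
p1080p_per_sec = 1920 * 1080 * fps         # 1080p: 60 full frames

print(f"720p : {p720_per_sec:,} pixels/s")    # 55,296,000
print(f"1080i: {p1080i_per_sec:,} pixels/s")  # 62,208,000
print(f"1080p: {p1080p_per_sec:,} pixels/s")  # 124,416,000
```

1080i is exactly half of 1080p per second, and only about 12% more than 720p, which is why the two end up looking so comparable on most gear.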