1080i vs 720p


If one has a 720p projector which can output at 1080i, is it better, when using a regular DVD source or HDTV, to watch in 1080i, or to use my Faroudja processor and watch in 720p? Technically speaking, that is.
jcbower
1. 1080P
2. 720P
3. 1080i

But...use your eyes, maybe you will see no real difference?

Dave
Post removed 
Do you mean that your 720P projector can accept 1080i input? I've never heard of a 1080i projector. All input on your 720p projector will be scaled to 720p no matter what the input resolution is.

I have to agree with Elizabeth; 1080i looks better than 720p.
Elizabeth:

The picture quality generally is better in 720p.

Prpixel:

I have to agree with Elizabeth; 1080i looks better than 720p.

???????
It depends on what kind of material you're watching, too. The real advantage of progressive scan (the "p" in 720p) is that the integrity of the image is basically maintained even during periods of fast action. When frames are displayed interlaced (the "i" in 1080i), you'll get a combing effect during action scenes. This is because on one refresh of the screen you get lines 1, 3, 5, 7, 9, etc., and on the next refresh you get lines 2, 4, 6, 8, etc. With progressive you get the lines of information in sequential order, painted from the top of the screen to the bottom, so at the very worst you would see only one visual rift on the screen (for example, where the new lines 1 to 325 meet the old lines 326 to 720, you would see a rift between lines 325 and 326).
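
If it helps, here's a quick sketch in Python (just my own toy illustration, not how any actual display or broadcast chain is implemented) of splitting a frame into its two fields and weaving them back together. When the two fields were captured at different instants, that weave is exactly where the combing comes from.

# A "frame" here is just a list of scan lines (strings, for illustration).
def split_into_fields(frame):
    """Return (odd_field, even_field): lines 1,3,5,... and 2,4,6,..."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Interleave the two fields back into a full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

# If the subject moved between the moment the odd field and the even field
# were captured, weaving them back produces the comb effect: adjacent lines
# disagree about where the subject is.
odd = ["ball at x=10 (line 1)", "ball at x=10 (line 3)"]
even = ["ball at x=14 (line 2)", "ball at x=14 (line 4)"]
print(weave(odd, even))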

I personally use 720p even though my TV maxes out at 1080i. If your TV is under 50", you won't even be able to tell a difference between 720p and 1080p from a normal viewing distance. There is a great article about this on CNET. If you're interested, I'll pass along the link.

-Dusty

Only CRTs can display interlaced signals. All digital displays (plasma, LCD, LCD projectors, etc.) are progressive. As mentioned above, a 720p projector always displays 720p, just as a 1080p plasma always displays 1080p. There is no such thing as a 480i or 1080i digital display. Of course these devices can accept interlaced signals: standard-def TV (480i) and high-def TV (1080i).

However in these cases, the display has to de-interlace the signal to display it progressively. The quality of de-interlacing can be variable. So perhaps 1080i on a display that does a relatively poor job of de-interlacing might look noticeably worse than a 720p source on the same display.
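
To give a feel for why that quality varies, here's a rough Python sketch of "bob" de-interlacing, about the simplest method there is: each field gets line-doubled into its own frame, which avoids combing but throws away half the vertical detail. Better processors use motion-adaptive or motion-compensated methods. This is only my own toy example, not any particular chip's algorithm.

def bob_deinterlace(field):
    """Line-double a single field into a full frame (the "bob" method).
    No combing on motion, but vertical resolution is halved."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # a real processor would interpolate, not repeat
    return frame

# A smarter (motion-adaptive) de-interlacer weaves the two fields where the
# picture is static and bobs only where it detects motion, which is why two
# displays fed the same 1080i signal can look quite different.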

With most newer displays, the consensus seems to be that 720p and 1080i sources look very similar.

So to the OP: first of all, your processor should be set to output 720p; otherwise it will be sending a signal that your PJ has to process all over again.

720p sources (some HD channels) need no processing whatsoever to display on your PJ. These sources should be sent out 720p native from the cable box, pass through your processor unchanged, and arrive at your PJ, which will display them natively.

480p from standard DVD doesn't have to be de-interlaced, but needs to be scaled to 720p for your PJ. You need to figure out which of your player, processor, or PJ does this better.

As for 1080i from your cable box, this needs to be both de-interlaced and scaled down to 720p for display by your PJ. Most likely either your processor or your PJ will do this better than the cable box, and you need to figure out which one actually results in a better picture.
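
To pull those three cases together, here's a hedged little Python sketch of the decision logic. The source formats and the 720p target come from what's above; the function name and structure are just my own illustration.

def steps_to_720p(source):
    """Processing needed to show a given source on a native 1280x720 projector."""
    if source == "720p":    # some HD channels
        return ["pass through untouched"]
    if source == "480p":    # progressive-scan output from a standard DVD player
        return ["scale 720x480 up to 1280x720"]
    if source == "480i":    # standard-def TV
        return ["de-interlace", "scale up to 1280x720"]
    if source == "1080i":   # most HD cable/satellite broadcasts
        return ["de-interlace to 1080p", "scale down to 1280x720"]
    raise ValueError("unknown source format: " + source)

print(steps_to_720p("1080i"))  # ['de-interlace to 1080p', 'scale down to 1280x720']

Whichever box in the chain does these steps best (player, processor, or PJ) is the one that should be doing them.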

If your PJ is significantly newer than your processor, it is possible that it does all of these things better, and you might be able to ditch your processor.

Hope this helps

dave
Dg1968

So what does it mean that my LCD is 1080i if it doesn't actually display 1080i? This is an 8-year-old rear-projection Sony LCD. It doesn't display 1080p, only 1080i. If "All digital displays (plasma, LCD, LCD projectors, etc.) are progressive," then it would be able to display 1080i and 1080p, now wouldn't it?
Dg1968,

Dave, very good explanation. It all comes down to which device in the chain has the best scaler.

Linkster,

Yes, 1080i broadcasts generally look better to me than 720p. They seem to have more defined edge detail, i.e., they look sharper and therefore have more depth of field.

Dusty,

I have to disagree with the statement that there's no difference between 720P and 1080P on 50" and smaller screens at normal viewing distance. In side-by-side comparisons, even my wife could detect a difference on screens as small as 32". The difference is most detectable in edge detail and with text. I'm curious whether I'll be able to see a difference on 23" panels. I'm dying to replace the 23" 720P LCD in the kitchen with a 1080P IPS panel. I sit less than 3' from this set while eating.
Recently, I was in Best Buy (not the best lighting conditions) and they had a 42" 720P Panny Plasma sitting right next to a 42" 1080P Panny Plasma. I calibrated them as closely as I possibly could and then started asking strangers if they could see a difference between the two sets. Six out of seven people I asked could see a difference, and then my wife dragged me out of the store by my ear. The source was a 1080P demo loop with scenes from Avatar.

My video system consists of a TivoHD, set to output the native broadcast format, running into an Anthem Statement D2v processor. My projector is a Mitsu HC6800 and my screen is a 120" Da-Lite High Contrast Cinema Vision. The projector has been ISF calibrated. The Sigma Designs VXP broadcast-quality video processor in the Anthem takes care of the scaling duties. In addition, I also have a Pioneer Blu-ray player. In the bedroom it's a TivoHD running directly into a 42" Panny 1080P Plasma. The 42" Plasma goes away next Wednesday and will be replaced with a 50" Panny Plasma (TC-P50G25) based on the new infinite black panel.
Bibucks5,

You have to determine what the native resolution of the panel in your set is. There's a difference between the input resolutions a set will accept and the resolution it actually scales them to and displays. Post the model # of your set and we can look it up. Given that your set is 8 years old, I would suspect that its native resolution is 720P.
Prpixel,

I'm just passing along the tests of the highly trustworthy reviewers at CNET. Here is the article:

http://reviews.cnet.com/720p-vs-1080p-hdtv/

Thanks,
Dusty

Bibucks5,

Your set most assuredly is not displaying anything at 1080i. As Prpixel said, it is displaying at whatever the native resolution of its panels is. In the case of an 8-year-old Sony, the panels are probably something like 1366 x 768 or so.

Getting back to 720p vs. 1080i, when I said the consensus is that these 2 resolutions look very similar, that was a generalization for most people with most "normal" sized displays.

Besides the display and processing, there are many other factors to take into account. But the end result is that with most equipment right now, these 2 resolutions end up looking quite similar to most people.

However, as Prpixel was saying, as the quality of the processing and the size of the display go up, 1080i starts to look better than 720p. In fact, theoretically, if de-interlacing is *perfect*, 1080i should approach, but not quite equal, 1080p.

dave
Dusty,

Sorry if it sounded like I was attacking you.

I have some problems with that CNET article. The first is that the optimum viewing distance for a 50" is 6' 3", not 8'. As you move past the optimum viewing distance, you will lose detail. Another thing they fail to mention is which TVs they were comparing. Are they from the same manufacturer? They compare prices of two Panasonic Plasmas, but they don't tell you what sets were actually used in the test. In another part of the article, they say that they can see more detail and fewer jagged edges, but then they say that it doesn't justify the added cost, and that the only real benefit of extra resolution is that you can sit closer to the set. Finally, they say that resolution is resolution and it's the determining factor when it comes to detail. Well, I guess the quality of the scaler has nothing to do with it. I'd say that this article, while better than Consumer Reports, still has some shortcomings. I'll give them props for using Blu-ray as a source and comparing the sets side by side. I just wish they had mentioned which sets/manufacturers were compared and been more scientific in their methodology.
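
For what it's worth, that 6' 3" number is consistent with the common 1.5x-screen-diagonal rule of thumb. A quick back-of-the-envelope sketch in Python (the 1.5 multiplier is the rule I'm assuming here; other charts use different factors):

def optimum_viewing_distance(diagonal_inches, multiplier=1.5):
    """Rule-of-thumb viewing distance: multiplier x screen diagonal."""
    inches = diagonal_inches * multiplier
    feet, remainder = divmod(inches, 12)
    return int(feet), round(remainder)

print(optimum_viewing_distance(50))  # (6, 3) -> 6' 3"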

I know from my own experience that a 720P set can have a sharper picture than a 1080P set. Case in point: I owned a Panasonic PT-AX100u 720P projector about 4 years back. I got the upgrade bug for 1080P, so I "upgraded" to a Mitsu HC4900, which on paper looked better than the Panny. In reality, the 720P Panny had a better picture all the way around: more detail, better contrast, and a more natural image. I dumped the HC4900 as soon as the HC5500 became available.

I have a friend who has a high-end/custom install business, so I get to play around with many different TVs from many different manufacturers. In addition, I get to compare, side by side, different size/resolution sets within a manufacturer's product line. Sometimes the difference is noticeable but not a big deal, and sometimes the difference jumps out and bites you in the ass.
Dusty,

One other note about CNET reviews. I've noticed that lately their reviews are becoming more mainstream, focusing more on features/value and less on performance. I guess you can tell that I'm not a big fan. YMMV

BTW, my home page is news.cnet.com. LOL.
Post removed 
Elizabeth, correct, but don't forget that when shown on a 1080p display, 1080i is displayed at 1080p: twice the information is actually on the screen compared with 720p.

So as the quality of processing and screen size increases, 1080i should, and often does, look better than 720p. However, in most circumstances, these 2 resolutions look remarkably similar.
OK, here's where I've got to call foul...

"1080p has TWICE the data of either 720p OR 1080i"

1080p and 1080i transmissions have the SAME amount of data (1080 lines). The only difference is that the 1080i is transmitted to the display in 2 groups of 540 lines (even & odd) while the 1080p is transmitted in one pass, top to bottom. The amount of data is the same. It's not like 1080i is repeating the same 540 lines.

In theory, 1080i is better than 720p, especially if you are talking about a static picture. The problem arises when motion is thrown into the equation.

Bigbucks5, yes, but don't forget that (in the US) video is recorded and displayed at 60 Hz. So with 720p, 60 frames per second are recorded at full 720-line resolution. 1080i is also 60 Hz, but each frame takes 2 cycles to record (1 cycle each for the even and odd lines of the frame).

So over the course of one second, in terms of the total amount of data handled, 1080i carries exactly half as much as 1080p, and approximately the same amount as 720p. But like you said, on the screen at any given moment, the amount of data is the same between 1080i and 1080p.
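
Putting rough numbers on that (raw pixel counts only, before any compression; the arithmetic here is just back-of-the-envelope):

# Pixels per displayed frame, and raw pixels handled per second at 60 Hz (US):
per_frame_720p = 1280 * 720    #   921,600 pixels
per_frame_1080 = 1920 * 1080   # 2,073,600 pixels (same frame for 1080i and 1080p)

per_sec_720p60  = per_frame_720p * 60         # ~55.3 million pixels/s
per_sec_1080i60 = (per_frame_1080 // 2) * 60  # ~62.2 million pixels/s (half the lines per pass)
per_sec_1080p60 = per_frame_1080 * 60         # ~124.4 million pixels/s

# Per second, 1080i handles exactly half of what 1080p does and roughly the
# same as 720p; per frame on the screen, 1080i and 1080p are identical.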
Post removed 
Agreed, 1080i is noticeably better than 720P on my display (Pioneer Elite Kuro 60"). When sporting events are transmitted in 720P, it is actually disappointing to me.
"no flat panel TV today actually displays a 1080i signal. They all buffer 1080i and display 1080p."

True, but if they did, then 720p should be better in spite of its lower resolution, since the eye is extremely sensitive to horizontal misalignment.
I would think 720p, theoretically, then, should equal 1440i, but it doesn't. Inherent limitations of bandwidth, and time, if I am not mistaken?
I don't agree that "no flat panel displays 1080i...". Mine does. Precisely. As for my post, I don't agree that a progressively scanned signal is half of the interlaced one. Resolution isn't that simple. To me, 720p is a knock-off of 1080i (due to broadcasters not wanting to pay for the bandwidth and, perhaps, actually thinking a 720-line progressively scanned image is better than a 1080-line interlaced image; the Fox network comes to mind); until 1080p, 1080i was the purest HD resolution. I am not dismissing the advantage of a progressively scanned image, just the size of the bandwidth it travels in.
Cerrot - HDTV programs change format many times while being transferred between stations. The standard OTA HDTV signal is digital and compressed to fit into 6 MHz bandwidth slots. No matter what this signal is, my DLP TV always converts it to its own internal format, where individual pixels of the same color are addressed at once, then the next color comes, and so on, 90 times a second (like color interlacing). Even the update frequency of 90 Hz suggests that it has nothing to do with 60 Hz interlacing. Other displays like plasma or LCD are perhaps simpler and address each individual pixel, the whole screen at once. Screen resolution and update rate are arbitrary and can be rescaled. New Samsung LCD TVs rescale the update rate (and interpolate pixels) to 120 Hz, providing more fluid motion.

My TV has a resolution of 1280x720 pixels. With a higher-resolution panel, the amount of detail would be the same, since it is limited by the bandwidth.
Post removed