It depends on which format your 1080p set does the best job of converting. Try both. In all probability you won't see any difference between 720p and 1080i.
If you have to choose, I would send 1080i over 720p; there may be times the difference is noticeable, though I agree that on smaller sets it's harder to discern.
With DirecTV HD satellite I get some channels in 1080i and the others in what I think is 720p, and there is no comparison. I will now only watch the HD channels... but I am using a 60" Elite plasma.
BTW Tgrisham, would your friend recommend calibrating the Elite 151 plasma?
Your eyes and display should tell you what's best.
Interlaced vs. progressive are two different animals entirely. The refresh rate also plays a big part with interlaced... more so than with progressive (480p, 720p, 1080p). Distance, too, will overcome minor annoyances or artifacts.
The channel you are watching is usually going to make the decision for you. Although I've got my box set to 720p, there are stations that don't send out 720p and use 1080i instead, and vice versa... some are also still 4:3, not 16:9.
Upsampling one format to a higher resolution simply doesn't always make the image better... you are telling the device to fill in missing info by 'guessing'.
Sometimes the image is degraded with artifacts, not improved at all. Much of the time it is just made to look smoother on static images, and it reveals its true self on fast-moving scenes. The display, the technology employed, and the components are all responsible for overcoming such errors when upsampling video images.
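To illustrate the 'guessing' described above: one of the simplest ways a scaler invents missing pixels is linear interpolation, averaging between the real samples on either side. This is just a minimal sketch in Python (the function name and the simple 1-D case are my own for illustration; real video scalers use more sophisticated 2-D filters):

```python
def upscale_row(row, factor):
    """Upscale one row of pixel values by an integer factor
    using linear interpolation between known samples."""
    n = len(row)
    out = []
    for i in range((n - 1) * factor + 1):
        pos = i / factor            # position in the original row
        lo = int(pos)               # nearest real sample to the left
        hi = min(lo + 1, n - 1)     # nearest real sample to the right
        frac = pos - lo             # how far between the two we are
        # every non-integer position is a guess blended from neighbors
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# Two real pixels, one invented in between:
print(upscale_row([0, 100], 2))  # → [0.0, 50.0, 100.0]
```

The invented middle value (50.0) is a plausible guess, not information that was in the source, which is why upscaled material can look smooth on static images yet break down on detail the interpolation can't predict.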
Progressive images should provide more vibrant colors, and an easier to view image... and I do mean 'should'.
I think the format is converted many times between stations, but the bandwidth (usable information) stays the same. 720p should look better on a CRT-style TV, since the human eye is very sensitive to the horizontal misalignment that comes from interlacing, but in digital TV everything is converted to individually addressed pixels anyway. I have a 42" DLP TV and have switched between 1080i and 720p many times; I could not see any difference at all. I don't know why many stations transmit in 1080i since, according to a friend who knows more about image processing, 720p is easier to compress.
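For what it's worth, the raw pixel rates of the two broadcast formats are fairly close, which may be part of why the difference is so hard to see. A quick back-of-the-envelope calculation, assuming standard 60 Hz ATSC timings (720p at 60 full frames/s, 1080i at 60 half-height fields/s):

```python
# 720p: 1280x720 full progressive frames, 60 per second
pps_720p = 1280 * 720 * 60

# 1080i: 1920x1080 frames delivered as 60 interlaced fields
# per second, each field carrying 540 of the 1080 lines
pps_1080i = 1920 * 540 * 60

print(pps_720p)   # → 55296000 pixels per second
print(pps_1080i)  # → 62208000 pixels per second
```

So 1080i pushes only about 12% more raw pixels per second than 720p, and interlaced material is generally harder for MPEG-style encoders to compress cleanly, which matches the friend's comment above.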
I have a friend who is a certified ISF technician and has calibrated thousands of televisions. I stand by his advice. At 50" or less there is no difference to the human eye between 720 and 1080. Larger than that and the difference MAY be apparent, or not, depending on your distance from the screen.
I believe you meant to say that at 50" or MORE there is little difference and that differences may be apparent at LESS than that distance. While I'm not sure that 50" is the threshold, the differences in higher-resolution picture quality, i.e. 1080p vs. 1080i or 720p, are much more discernible at closer viewing distances.
Thank you for the clarification. I meant to say that on displays of 50" diagonal or smaller, the differences between 720 and 1080 are insignificant. I did not mean at a 50" viewing distance. And although most displays are good out of the box, all displays can be improved with calibration. The spec sheets and reviews show that some displays come with virtually perfect calibration already, while others benefit greatly; it varies from model to model. The room in which you are watching and the source can also determine the quality. Some displays have individual settings for each source (some user accessible, others only technician accessible). There can be a great difference between the calibration for SD-DVD vs. Blu-ray vs. over-the-air HDTV. I am not an expert, but this is what I understand to be true.