If your display is digital (most modern panels and projectors are), there is no analog "decoding" at all. Other digital operations are still performed (de-interlacing, scaling, etc.).
Plenty of stuff goes on inside a Blu-ray player (or DVD player) besides the final digital-to-analog conversion. Using the HDMI out DOES cut out two conversions of the signal: if you use the component video out, the Blu-ray player has to convert the signal to that analog standard, then the screen (if it's not an old CRT) converts it BACK to a digital signal.
So the HDMI is WAY better no matter what. As an aside, the Blu-ray player does more than just pull a perfect stream off the disc and send it to the HDMI out, so a better player is gonna do a LOT better than a cheap one. The best low-price player by all accounts is the Oppo 83, which (if you have heard) was just put in a fancy case and sold for $3,500 under another brand name.
So I guess the Oppo IS pretty good.
Elizabeth, you've gone completely the other way from what Aberyclark was saying. He said video quality only (audio conversion and your Oppo 83 player are a different subject for another day). And what's with the Oppo 83? That subject gets beaten to death every day here on Agone, enough already. Now back to the question: make sure your player can put out 1080P. That's the STANDARD display format for true Blu-ray discs; as long as it does, and your HD display does as well, then yes, the picture quality is as good as it can get. Audio conversion is another story for another day.
If your source is 1080P and your display is 1080P then the display will not process the signal. The same goes if both are 720P.
If your source is 720P but your display is 1080P, then the display may convert it to 1080P if you allow it to in the display's setup menu. If your source is 1080P and your display is 720P, then the display will convert the signal, BUT only as long as the 720P display can accept a 1080P signal. Most 720P displays will not accept 1080P but will accept 1080i.
If you do have to have the signal processed, try both the source and the display to see which one does the better conversion.
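The source/display matching rules above can be sketched as a small decision function. This is just an illustrative sketch of the logic described in this thread; the resolution strings, the `display_accepts_1080p` flag, and the function name are my own assumptions, not any real player's or display's API.

```python
# Hypothetical sketch of the conversion rules described above.
# All names here are illustrative assumptions, not a real device API.

def who_scales(source_res: str, display_res: str,
               display_accepts_1080p: bool = False) -> str:
    """Decide which device (if any) ends up converting the signal."""
    if source_res == display_res:
        # 1080P -> 1080P or 720P -> 720P passes straight through
        return "no conversion"
    if source_res == "720P" and display_res == "1080P":
        # only if enabled in the display's setup menu
        return "display upscales"
    if source_res == "1080P" and display_res == "720P":
        if display_accepts_1080p:
            return "display downscales"
        # most 720P sets reject 1080P but will take 1080i
        return "source must output 1080i or 720P"
    return "unsupported combination"

print(who_scales("1080P", "1080P"))  # no conversion
print(who_scales("1080P", "720P"))   # source must output 1080i or 720P
```

The last case is the one to watch for: with a 720P set that rejects 1080P input, the player has to do the conversion, so it is worth comparing the player's scaler against the display's.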
A standard DVD's 480P signal upscaled to 1080P will not look as good as a true 1080P signal.
THEN.....how much difference would I see (not hear), if any, from the 80 to the 83? Basically, from what I see, the 80 is the same or a step up in sound and a major step up in video from what I have now. My Elite 45-A universal player and Elite receiver are both about 7 years old. I will have to A/B CDs (not as concerned about movie audio) with both the analog and optical outs to see if I can hear a major difference.
It's (video) all moot if you don't have a very good display. Like speakers and AV gear, if your system isn't revealing, you won't hear certain differences. So, if you don't have a very good display, you may not see the difference between the 80 and the 83. Keep in mind, power cords, conditioners, quality of electricity and quality of HDMI cables all make a difference in being able to see the difference between BD/DVD players.
Many of the tests done by those magazines are error-handling tests. They show how well the player handles discs with mastering errors, such as applying video cadence to film material. Such tests are easily done, and the results can be compared quantitatively. But I don't see many tests in those magazines that deal with the quality of the video decoder, which is the subject of the discussion here.
The 24FPS vs. 30FPS issue is a good point, and it's the display that needs to handle such input, not the player. Many of the displays sold today support 120Hz or 240Hz refresh rates, which are multiples of 24, so they have no issue handling 24FPS source material.
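The arithmetic behind that point is quick to check: a refresh rate that is an exact multiple of 24 can show every film frame the same whole number of times, while 60Hz cannot, which is why 60Hz sets fall back on 3:2 pulldown. A minimal sketch:

```python
# Check which refresh rates divide evenly by 24FPS film.
# An even ratio means each frame is repeated a whole number of
# times (smooth cadence); an uneven one needs 3:2-style pulldown.
for hz in (60, 120, 240):
    repeats = hz / 24
    cadence = "even cadence" if repeats.is_integer() else "needs pulldown"
    print(f"{hz}Hz: {repeats} refreshes per film frame -> {cadence}")
```

120Hz repeats each frame 5 times and 240Hz repeats it 10 times; 60Hz works out to 2.5, so frames must alternate between 2 and 3 repeats.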