Technical Question

I know that if I use the "analog audio" outs of a CD player, the player itself is doing the DAC. If I use a digital out (optical, coaxial), the receiver does the DAC. Now, does the same apply to video? If I use the HDMI connection on a Blu-ray player and connect to my TV, does the TV do the actual video conversion? Just curious... If the TV does the conversion, why spend more on a top-line BD player (disregarding features... just talking about video quality)?
If your display is digital (most modern panels and projectors are), there is no digital-to-analog "decoding" step at all. There are other digital operations that still take place (de-interlacing, scaling, etc.).

Plenty of stuff goes on inside a BluRay player (or DVD player) besides the final digital-to-analog conversion. Using the HDMI out DOES cut out converting the signal twice. If you use the component video out, the BluRay player has to convert the signal to analog for that standard, and then the screen (if it's not an old CRT) converts it BACK to a digital signal.
So the HDMI is WAY better no matter what. As an aside, the BluRay player does more than just pull a perfect stream off the disc and send it to the HDMI. So a better player is gonna do a LOT better than a cheap one. The best low-priced player by all accounts is the Oppo 83, which (as you may have heard) was just put in a fancy case and sold for $3,500 under another brand name.
So I guess the Oppo IS pretty good.
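As a side illustration of the double-conversion point above: here is a minimal sketch (my own, not from any poster) that simply counts the extra digital/analog hops in each path from a player to a modern flat-panel display. The stage names are illustrative, not from any spec.

```python
# Compare the number of extra digital<->analog conversions in each
# output path from a disc player to a digital flat-panel display.
paths = {
    # HDMI carries the decoded video digitally end to end.
    "HDMI": ["decode (digital)", "HDMI out (digital)", "panel (digital)"],
    # Component video forces a D->A in the player and an A->D in the display.
    "Component": ["decode (digital)", "DAC in player (analog out)",
                  "ADC in display (back to digital)", "panel (digital)"],
}

for name, stages in paths.items():
    conversions = sum(("DAC" in s) or ("ADC" in s) for s in stages)
    print(f"{name}: {conversions} extra conversion(s): {' -> '.join(stages)}")
```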
Thanks for clarifying things. I'm probably going to order the Oppo 80.
Elizabeth, you've gone completely the other way from what Aberyclark was asking. He said video quality only (audio conversion and your Oppo 83 player are a different subject for another day). And what's with the Oppo 83? This subject gets beaten to death every day here on Agone, enough already. Now back to the question: make sure your player can put out 1080P. That's the STANDARD output for true Blu-ray discs; as long as it does, and your HD display does as well, then yes, the picture quality is as good as it can get. The audio conversion is another story for another day.
If your source is 1080P and your display is 1080P, then the display will not process the signal. The same goes if both are 720P.
If your source is 720P but your display is 1080P, then the display may convert it to 1080P if you allow it to in the display's setup menu. If your source is 1080P and your display is 720P, then the display will convert the signal, BUT only as long as the 720P display can accept the 1080P signal. Most 720P displays will not accept 1080P but will accept 1080i.
If you do have to have the signal processed, try both the source and the display to see which does a better conversion.
A standard DVD 480P upsampled signal will not look as good as a true 1080P signal.
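To pull the "who does the scaling?" logic from the post above into one place, here is a minimal sketch (my own illustration, using the thread's examples as assumptions) where resolutions are given as vertical line counts:

```python
# Decide which box ends up processing the picture, per the rules
# described in the post above. This is an illustration, not a spec.
def who_scales(source: int, display: int, display_accepts_source: bool = True) -> str:
    if not display_accepts_source:
        # e.g. many 720P panels reject 1080P (but accept 1080i),
        # so the player must convert before sending the signal.
        return "player must convert before output"
    if source == display:
        return "no scaling needed"
    if source < display:
        return "display upscales (if enabled in its setup menu)"
    return "display downscales"

print(who_scales(1080, 1080))        # no scaling needed
print(who_scales(720, 1080))         # display upscales
print(who_scales(1080, 720))         # display downscales
print(who_scales(1080, 720, False))  # player must convert
```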
Have to disagree that just because a product has a 1080p output means they are "all the same." NOT!
The digital processors in the player for video make a huge difference. THAT is why the Oppo 83 is the player. The video chip in the Oppo 83 is very, very good.

How much difference would I see (not hear), if any, from the 80 to the 83? Basically, from what I see, the 80 is the same or a step up in sound and a major step up in video from what I have now. My Elite 45-A universal player and Elite receiver are both about 7 years old. I will have to A/B CDs (not as concerned about movie audio) with both the analog and optical outs to see if I can hear a major difference.
From the Oppo site:
For video output over HDMI, the BDP-83 sports a highly advanced video processor (ABT2010 with VRS by Anchor Bay Technology). In contrast, BDP-80 handles all video processing tasks using its main decoder chip.

The VRS video processor is as good as it gets, like the HQV Reon & Realta, so given the "right" source material you'll likely be able to detect artifacts in the display from the 80 that would be handled better by the 83.

Scan the player reviews at Home Theater magazine's web site and look at the video processing test results. There is quite a bit of variation. HQV and VRS, almost universally, pass all the tests and offer excellent scaling of DVD (std def) to high def. The most recent Panasonic BR players have also tested very well.
It's (video) all moot if you don't have a very good display. Like speakers and AV gear, if your system isn't revealing, you won't hear certain differences. So, if you don't have a very good display, you may not see the difference between the 80 and the 83. Keep in mind, power cords, conditioners, quality of electricity, and quality of HDMI cables all make a difference in being able to see the difference between BD/DVD players.
I agree, the quality of the TV is more important than the quality of the BluRay player. It was more important with DVD, as the player had to deal with interpolation and deinterlacing of video. With 1080 content and a 1080 display there is no such need.
With 1080 content and a 1080 display there is no such need.

Then I guess the folks at Home Theater magazine are wasting their time doing HD video tests. At the very least, frame-rate differences between source (24 FPS) and display (30 FPS) have to be handled. Yes, there are more displays able to handle 24 FPS material adequately today, but many cannot, and so the player will have to do the conversion.
Many of the tests done by those magazines are error-handling tests. They show how well the player handles discs with mastering errors, such as video cadence on film material. Such tests are easily done, and the results can be easily compared quantitatively. But I don't see a lot of tests in those magazines that deal with the quality of the video decoder, which is the subject of the discussion here.

The 24 FPS vs 30 FPS issue is a good point, and it's the display that needs to handle such input, not the player. Many of the displays being sold today support 120Hz or 240Hz refresh rates, which are multiples of 24 and don't have any issue handling 24 FPS source material.
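The refresh-rate arithmetic behind that claim is easy to check. A minimal sketch of my own (not from the thread): a panel can show each film frame for a whole number of refreshes only when its refresh rate is an integer multiple of the source frame rate.

```python
# Why 120 Hz / 240 Hz panels handle 24 FPS film cleanly while a
# 60 Hz panel cannot hold every frame for the same number of refreshes.
for refresh in (60, 120, 240):
    for fps in (24, 30):
        repeats = refresh / fps
        cadence = "even cadence" if repeats.is_integer() else "uneven (needs pulldown)"
        print(f"{refresh} Hz panel, {fps} FPS source: "
              f"each frame held {repeats:g} refreshes -> {cadence}")
```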
Many of the tests done by those magazines are error-handling tests.
Actually, the ones I'm referring to are deinterlacing tests for both HD and SD. See this for more details:

The 24 FPS vs 30 FPS issue is a good point, and it's the display that needs to handle such input, not the player.
But, if your display can't handle it, then it has to be done in the player.
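For anyone curious what that conversion actually looks like, here is a minimal sketch (my own illustration, not from any poster) of classic 3:2 pulldown, the standard way 24 FPS film is mapped onto 60 interlaced fields per second (roughly the "30 FPS" case) when the chain can't run at a multiple of 24 Hz:

```python
# 3:2 pulldown: alternate holding each 24 FPS film frame for
# 3 fields, then 2 fields, so 4 film frames fill 10 fields
# (1/6 second at 24 FPS -> 1/6 second at 60 fields/s).
def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

film = ["A", "B", "C", "D"]      # 4 film frames
print(three_two_pulldown(film))  # 10 fields:
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Whether the player or the display does this (or a cleaner 5:5 repeat on a 120 Hz panel) is exactly the handoff being debated above.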