No, I don't think so. As far as transmitting hi-rez audio goes, the HDMI interface still has some kinks to be worked out. Some would say the format is irreparably flawed; I don't know if that is true. I think that with future versions HDMI will eventually become the de facto standard for transmitting hi-rez audio and hi-def video signals. For the here and now, I don't think you are missing anything. I use multi-channel analogue RCA connections and find they sound better. I personally would only use HDMI for video signals, directly to my monitor.
I don't think you're missing out, if you're able to get lossless via RCAs. Lossless is lossless. I will say that I love HDMI, as much for the convenience as anything else. Using one cable to replace as many as seven is huge for me. I've been using HDMI exclusively for two years and I've never had a glitch. Right now I have a Blu-ray player, a PS3 and a cable box running into an Onkyo receiver, with the video out of the receiver going to a Samsung LCD. Over the past two years, I've had as many as five different Blu-ray players hooked up without the first glitch. So, speaking solely from personal experience, I see no reason to shy away from HDMI due to fears of compatibility.
Hang on to your DSP-A1 - if you're happy with your setup, that's all that matters. I have the same unit and setup you do, and enjoy it immensely. Rcrerar said it right!
I would agree that HDMI is not a necessity. Multichannel out from a high quality Blu-ray player can sound fantastic.
For video, though, HDMI does offer some very real benefits - but there's no need to route that through a receiver anyway... best quality will come directly from the source to the display (TV or projector).
I have decided to skip out on HDMI audio since I love the sound of my Krell HTS7.1. I route all HDMI through my DVDO video processor. For BR, the HDMI video is passed through the processor (nothing but a switcher then) and sound through multichannel RCA. Sound is glorious!
Frankly I'm shocked at how little attention this very topic has received from the general public. I don't hear reviewers emphasizing this point either. While it's well understood that there are a couple of ways to play the HD codecs (either decoded internally in the source and connected via analog, or via an HDMI connection to a pre/pro that has the decoding), it has not been made common knowledge what the benefits and downsides of each method are.
Can't someone simply set up an A/B scenario to compare? Then give their impressions? I see/hear none of this.
Why is that?
Once and for all, could someone PLEASE put to rest what is doing what on both setup methods (i.e., an extensive assessment of sound quality differences and attributes)! Lol. Er, at least point us to some pro reviews and such where this is spelled out in detail.
There are actually multiple issues at play here and there is no one easy answer.
For audio quality, the main issues are decoders, DACs and post-processing modes.
The decoders for the major formats (whether traditional DD or DTS or the new lossless formats) should be the same whether they are in the receiver or in the player. As long as the player has all the decoders, from a purely decoding perspective it really should not matter where the decoding happens. One nuance is that BR discs can carry a secondary audio stream - typically for commentary during the movie. That secondary audio stream can only be accessed when using the decoder in the player.
Once the audio signal is decoded it needs to be converted to analog for playback. The only way to send the lossless digital signals is through HDMI. For the digital-to-analog conversion, one main issue is the quality of the DACs and associated electronics. Some BR players (like the higher end Pioneer and Denon players) have been optimized for their analog performance. Many entry level ones have not - they assume HDMI will be used. In any case, you need to compare the D-to-A conversion in the player versus the D-to-A conversion in your receiver/processor. So, if you use the analog outs, pay attention to the D-to-A conversion. If you are a believer in differences in analog cables, that also enters into the equation.
On most receivers/processors the 6 or 8 channel analog inputs get no post-processing. So, things like room correction and PLIIx cannot be done on the analog inputs. Those receivers/processors that do allow post-processing on analog inputs need to do an A-to-D conversion before doing that post-processing. So, in that case you end up doing two D-to-A conversions (plus the A-to-D step in between). Fortunately, most of the receivers/processors that convert the analog inputs to digital for post-processing are pretty high quality and do a good job at this. Since I believe your Yamaha does not do room correction or PLIIx, I do not think this is an issue for you. But newer HDMI receivers/processors typically do have both room correction and PLIIx, both of which can be very nice to have.
As I said, there are lots of issues at play here. You really need to do this evaluation based on the properties of a specific receiver/processor and a specific player, and on whether things like room correction and PLIIx are important to you. Many people report preferences for where the decoding happens and whether HDMI is used or not. But I think that most of these differences come from issues like the ones above, not from an inherent problem with HDMI or analog connections.
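Purely as an illustration of the trade-off above (this is a hedged sketch with made-up function names, not anything from a real product), you can think of it as counting conversion stages for each hookup method - one DAC stage for HDMI, versus a player DAC plus a possible re-digitize/re-convert cycle on the analog inputs:

```python
def conversions(hookup, post_processing_on_analog=False):
    """Count digital<->analog conversion stages for a given hookup.

    hookup: "hdmi"   - bitstream/PCM sent to the receiver; decoding,
                       post-processing and the single DAC stage happen there.
            "analog" - decoding and the DAC happen in the player, with
                       multichannel RCA out to the receiver.
    Numbers are illustrative assumptions, not measurements.
    """
    if hookup == "hdmi":
        # Everything stays digital until one D-to-A stage in the receiver.
        return {"d_to_a": 1, "a_to_d": 0}
    if hookup == "analog":
        stages = {"d_to_a": 1, "a_to_d": 0}  # DAC in the player
        if post_processing_on_analog:
            # Receiver re-digitizes to run room correction / PLIIx,
            # then converts back: one extra A-to-D and one extra D-to-A.
            stages["a_to_d"] += 1
            stages["d_to_a"] += 1
        return stages
    raise ValueError(f"unknown hookup: {hookup}")

print(conversions("hdmi"))
print(conversions("analog"))
print(conversions("analog", post_processing_on_analog=True))
```

The point of the sketch is simply that analog inputs plus post-processing mean two D-to-A passes and an extra A-to-D, which is why the quality of those conversion stages matters so much in that configuration.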
As I understand it, digital SACD output is available only through HDMI, and that seems a good reason to want HDMI for high quality stereo. As far as multi-channel is concerned, few would tolerate having a cable for each channel - a "get a bigger hammer" solution.
The issue really comes down to where you get the best decoding... a higher end SACD player may actually sound better connected to the pre/pro or receiver via analog connections (due to superior decoding). A lower end SACD player could be much improved if you use it as a digital transport only and connect it via HDMI to do the decoding in the pre/pro or receiver...
For the least amount of money compared to a reasonable sound quality, HDMI does have some benefits... but for high end audio, you could actually end up sticking to analog and get better results.
When given the choice to hook up something like a CD player, people tend to always go with Toslink or a coaxial digital cable. As I understand it, the main reason is to avoid interference?
Assuming you don't have HDMI, why do manufacturers use plain old RCA cables as an option for multichannel input? Why didn't they use three (or more) optical or digital cables instead?
It seems like, by using RCA cables to hook up Blu-ray players, we are regressing back to analog connections and all the problems that digital cable technologies were developed to combat?
You ask "Assuming you don't have HDMI, why do manufacturers use plain old RCA cables as an option for multichannel input? Why didn't they use three (or more) optical or digital cables instead?"
The answer is simple. Optical/coax connections do not offer content protection (i.e., HDCP) which is deemed essential by the content providers. That said, Meridian does do this but, for higher than CD resolution, it is encrypted.
"The decoders for the major formats (either traditional DD or DTS or the new lossless formats) "SHOULD" be the same whether they are in the receiver or in the player. As long as the player has all the decoders, simply from the decoding perspective, it really should not matter where the decoding happens." - Dtc
Well, let me just add that this has simply not been the case over the years - at least with regard to old Dolby Digital and DTS processing in the past. Back when those codecs could be processed either internally or via outboard processors, my experience with standard lower-rez Dolby and DTS was that it always sounded better using the DACs in the outboard processor rather than those in a standard DVD player! In fact, not only did AV reviewers frequently mention this in magazine articles and reviews (often hinting that lesser processors were used in the players, among other reasons), but I never found a single instance with any DVD player I ever owned where that was not the case, either!
So, while I've not experimented with the latest codecs and the differing hookup options - pertaining to the latest Dolby Digital Plus and DTS-HD Master Audio codecs - I'm not certain that things are so simple as to say it should be equal sound and results regardless of which hookup method you choose! As I said, my past experience has personally shown that there are, at the very least, factors to consider. And all things are not necessarily equal when it comes to processing, hookup, preamplification, etc. I guess what I'm saying is that I don't think it's as straightforward as is being implied here. I'd like to hear from some review professionals, at the very least.
Can anyone point me to a review article or two? Perhaps someone who has done some A/B'ing of their own?
Quefee - Please read the rest of my post, including the title. The part you quote is about DECODING. It should be the same in both the receiver and the player. The next paragraph is about DACs. And the third is about post-processing.
Your statement "... it should be equal sound and results, regardless of which hook up method you chose" is not what I was saying at all. Decoding should be the same, although there are lots of other factors, including DACs and post-processing.
There are certainly DVD players and Blu-ray players that have better DACs than some receivers. The DACs in high-end players (Cary, high-end Pioneer, Denon, dCS and others) are certainly better than those in low-end receivers (Sony, Onkyo, etc.).
So I agree that this is a complicated issue. And I definitely believe it depends on the specific equipment you have. Everyone needs to consider all the options.