Running a CD player directly into power amps,

good, deleterious, dangerous or simply stupid? Since I never listen to my tuner, am too lazy to bother with vinyl anymore, and never got that tape deck (thank God!), can I go the direct route? An "audio consultant" (a.k.a. "salesperson") told me it was unthinkable because of some mismatch between the output of one and the input of the other... He was trying to sell me a preamp. Since, long ago and far away in a different audio galaxy, it was believed that the shortest signal route (all other factors being equal) would provide the best, least degraded signal, I thought, and still think for that matter, that my idea is swell. A better CD player + a better power amp + new earthshaking speakers, and voilà! Am I missing some great truth here?
I see no problem with doing so, as long as the input impedance of the amp is at least equal to, and preferably much higher than, the output impedance of the CD player. In other words, the CD player might have a 50 ohm output; the input impedance of the amp should then be AT LEAST 50 ohms and preferably much higher. If you were to use an amp with a lower input impedance than the CD player's output impedance, sonics could be severely compromised, along with circuit stability. Something else to take into account: the smaller the margin between the two figures, the more audible the sonic differences between interconnects may become. Sean

PS.... The above numbers are strictly "gibberish" and were used for explanation purposes only.
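The "input impedance at least as high as the output impedance" advice comes down to a voltage divider. Here is a hedged sketch using the same admittedly generic numbers as the post above (no specific gear implied): the source's output impedance and the amp's input impedance divide the signal, so the fraction reaching the amp is Zin / (Zout + Zin).

```python
# Generic voltage-divider sketch -- the figures are illustrative, like the
# "gibberish" numbers in the post, not measurements of any real equipment.

def delivered_fraction(z_out_ohms, z_in_ohms):
    """Fraction of the source's open-circuit voltage seen at the amp input."""
    return z_in_ohms / (z_out_ohms + z_in_ohms)

# Equal impedances (the worst-case 50 ohm example): half the signal is lost.
print(delivered_fraction(50, 50))                  # 0.5
# A more typical case: 300 ohm CD output into a 47k amp input -- near lossless.
print(round(delivered_fraction(300, 47_000), 4))   # 0.9937
```

This is why "much higher" matters: at a 10:1 ratio or better, the loss (and its variation with cable and volume setting) becomes negligible.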
If you buy either a new Resolution Audio CD55 or, better yet, a used CD50, you will never go back to a preamp. I run my 50 direct into my Lamm 1.1's. I have tried many different combos of high-end CD players and preamps. I'd have to spend $20K to even get close to what the Res does, and it still would not do it better. Removing the preamp and an extra set of interconnects from the signal path just makes a huge difference. The Resolution Audio is nice because it has a true, dedicated analog preamp in the player.
You might or might not blow your speaker drivers, depending on how powerful your amplifier is and on the output voltage of your CD player. If your player produces no more than 1.5 V of output, then you'll probably get a loud but still listenable sound out of your speakers. It is good practice to experiment with passive preamplifiers, which attenuate the signal and give you the option to match the output impedance of the preamp to the input impedance of the power amp. The ideal case would be infinite input impedance at the power amplifier and zero output impedance at the preamp. Things aren't ideal in this world, so if your power amp has >=50 kOhm input impedance, then you're OK with a passive preamplifier, which isn't really an amplifier since it doesn't have any gain. If you have a tube amp it'll work out great.
I assume that your CD player has a variable output so that you can control volume. The sound will depend on the quality of your CD player's volume control, the length of the cables it needs to drive, and whether its output is powerful enough to drive your power amp. It can work, but in most situations a preamp of some sort will sound better.
Most modern CD players can run straight into an amp with no problem. However, the lack of volume control is often a real inconvenience. Some CD players have built-in volume controls to solve this very problem (a passive attenuator is another possible solution).
Yeah, i guess i should have commented that you will either need a variable output on the CD or input attenuators on the amp. Running line level out of the CD into most amps will either blow the speakers or the amp. So much for "assuming" : ) Sean
I have recently found a CD player which bests the sound through a preamp and comes damn close to vinyl. If all you use is digital, the Audio Aero Capitole 24/192 may be perfect for your situation. It has a gain control with remote and a huge amount of output (4.5 V). I hear no loss in dynamics and am getting much better clarity.

Call Jim at Chambers Audio, he is a gentleman and will be more than happy to take care of you. Tell him Jonathan says hello.
Amazing how many people talk nonsense.
Every CD player output has a static output impedance below 300 ohms, and every power amp has an input impedance of several tens of thousands of ohms, save for the two or so esoteric designs from the Far East that you would not pay for anyway. Equal (matched) impedances are only relevant in high-frequency domains (radio/TV/digital signal processing), where power has to be transferred without loss and signal reflections have to be avoided.
In audio, only signal VOLTAGES are transferred, and essentially no power is involved.
So remarks like "the CD player's output has to have enough power to drive the input of the power amp" or "beware of impedance mismatches" are complete bullshit.

Be sure that your CD player has an analog volume control. Why?
Because digital volume controls in CD players are usually implemented in the digital filter circuit preceding the D/A converter chip, and in this configuration every 6.02 dB of attenuation from full volume down provokes the loss of 6.02 dB (= 1 bit) of resolution. Normal listening level is around -20 to -30 dB below maximum volume, meaning that your highly paid-for 20 bits of resolution, 100+ dB (including noise floor), drops by 20-30 dB (while the noise floor does NOT change), leaving you with 70-80 dB of real digital resolution, which is a mere 13-14 bits.
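The dB-to-bits arithmetic above can be sketched numerically. Note this hedged sketch only converts attenuation into lost bits via 6.02 dB per bit; the 13-14 bit figure in the post additionally folds the fixed noise floor into the count.

```python
import math

# One bit of resolution spans 20*log10(2) ~= 6.02 dB, so a digital volume
# control attenuating by N dB ahead of the DAC discards roughly N/6.02 bits.
DB_PER_BIT = 20 * math.log10(2)   # ~6.0206 dB

def effective_bits(total_bits, attenuation_db):
    """Remaining resolution after digital attenuation (illustrative model)."""
    return total_bits - attenuation_db / DB_PER_BIT

# A 20-bit player run 25 dB below full scale keeps only ~15.8 bits.
print(round(effective_bits(20, 25), 1))   # 15.8
```

An analog volume control after the DAC avoids this entirely, which is the post's point.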

The whole issue of compatibility has nothing to do with imaginary output or input impedances, or output levels (as long as your player has an analog volume control built in), but with signal-ground pollution, spurious HF noise on the signal lines, and the proper phase behaviour of the CD player's output stage configuration, i.e., how well this stage can drive the cable capacitance. This is sometimes very difficult, especially for a player with budget opamps in its output stages that are heavily compensated for stability. Their output impedance can be low enough, but that is only the result of lots of negative feedback: a sure recipe for phase errors and instability.

A good preamplifier can vastly improve on the sound of such a CD player because it acts as a 'corrector' for all those small disturbances, buffering the unstable outputs of otherwise fine CD players and thereby presenting a much cleaner signal to the power amp.

And with a good preamp you can leave the CD player's internal digital volume control at maximum output, preserving your 20 bits of resolution.

Only a CD player with an analog volume control and a decent output stage (they mostly go hand in hand) will show its full potential of sound quality when connected directly to a power amp.

Do not listen to self-educated charlatans with distorted technical orientations, but check with reliable retailers and listen to the equipment yourself.
No technical explanation can replace a good listening experience.
A rule of thumb is to have the input impedance 10 times the value of the output impedance of the driving equipment. If the input impedance of the amp is 10K ohms (most are at least that) then the output impedance could be as high as 1000 ohms.

An analogy to a battery might help in understanding the situation. A battery can be modeled as a perfect voltage source (one that maintains its voltage no matter how much current is drawn) in series with a small-value resistor. As you pull more current out of the battery, the voltage drops a little because of the low-value resistor that is "inside" the battery. As the resistance of the load gets smaller, it will try to draw more current from the battery, but the "resistor" limits that current somewhat. One can see that the "resistor" inside a 12-volt lantern battery is much larger than the one inside an automotive battery. If a low-resistance load (like an automotive headlight) is connected to an automotive battery, it is going to be brighter than when hooked up to the little flashlight battery.

The idea is the same in a passive volume control. (Several CDPs use this on the output, while others put active stages after the volume control, and then there are the ones that use digital attenuation in the output stage.) A passive's maximum output impedance TYPICALLY occurs at mid position (electrically) and is approximately one quarter of the value of the variable resistor. This means that a 10K pot will have up to about a 2.5K output impedance and might be suitable for use with a tube-input amp as long as the interconnect cables were low in capacitance. If the volume adjustment is followed by an electronic stage, then the design of that stage determines the output impedance (typically 200 ohms or below).
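For what it's worth, the wiper impedance of a passive pot can be sketched under the idealization of a zero-impedance source (a real source's output impedance raises it somewhat):

```python
# At wiper position a (0 = full attenuation, 1 = full volume), the output
# impedance seen at the wiper is the two halves of the pot in parallel:
#   Zout(a) = a * (1 - a) * R
# which peaks at the electrical midpoint, a = 0.5, giving R/4.

def pot_output_impedance(position, r_total_ohms):
    """Idealized wiper impedance of a passive pot, zero-impedance source."""
    return position * (1 - position) * r_total_ohms

# Sweep a 10k pot across its travel: the worst case is 2.5k at midpoint.
worst = max(pot_output_impedance(a / 100, 10_000) for a in range(101))
print(worst)   # 2500.0
```

Even at that worst case, a high-impedance tube input with short, low-capacitance cables stays comfortably inside the 10:1 rule of thumb.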

I do agree that a well-designed CDP could be better than a preamp, just because it will have fewer connections in the signal path. The big question is what constitutes a "well designed" CDP. I believe that a good output stage will sound good, and I look forward to comments from others regarding the sound of CDPs with variable outputs. Chris
One comment on Croese's dislike of digital volume controls: Accuphase and Wadia have taken steps to overcome their USUAL defects, and in my experience with an Accuphase DP-75, they have succeeded. I used my DP-75 with excellent cables and an excellent passive preamp, and still lost detail versus the DP-75 "straight into" my power amp(s). An internal stepped attenuator might be better yet, saving the cables and their connectors, but I can't hear any degradation if I keep the digital volume attenuation below 12 decibels, which in my system is always enough. Accuphase claims you can go up to 20 decibels of attenuation and I have no reason to disbelieve this, but if you can arrange to use as little as possible, that couldn't hurt.
Croese, i'm assuming that your commentary pertaining to "bullshit" and "charlatans" was aimed at myself or possibly one other post previous to yours. While i can't respond for the other participants, I was simply trying to cover all the bases in a very "general" statement.

Since i am not familiar with ALL of the equipment out there, nor do i think that anyone else is, i used an example that those not electronically inclined could follow along with. I was trying to make sure that the analogue output of the CD player or DAC would not be "loaded down" under ANY circumstances. As such, i chose to post a "worst case scenario" using generic figures and figured that ANYTHING above that would perform acceptably. While i'm sorry if my use of 50 ohms threw you for a loop or led to a misunderstanding, re-reading my post should clarify things. I specifically stated that it was preferred that the input impedance of the amp be higher than the output impedance of the source. If i had used non-RF-based impedance figures, or figures in the thousands or tens of thousands of ohms range, would that have made you happier ???

As Chris stated, it is common practice in the audio field to shoot for a "X10" difference in impedance from the output of one piece to the next gain stage. As such, i was not suggesting that one need worry about VSWR in audio gear, although i do not doubt / deny that this factor MIGHT come into play in different ways. This is a subject that still has a LOT of "grey areas" and is subject to much debate by folks FAR more knowledgeable and experienced than i will ever be.

As far as that goes, i know my limitations and try to work within those confines. If i am making suggestions or comments based on anything but verifiable data or first-hand experience, i typically state as much. There is nothing wrong with presenting one's opinion, but it should not be presented as "fact" unless one has done all of the necessary testing under various conditions to verify it. Sean

**do not listen to self-educated charlatans with distorted technical orientations, but check with retailers and listen to the equipment**

since you choose to be so obnoxiously pedantic, perhaps you'd like to reveal your qualifications. while you're at it, why don't you register so we can obtain more of your "knowledge." BTW, you've obviously not heard, or heard of, the accuphase cdp's. the dp-75v, for example, has a digital volume device that drops NO bits. -kelly