Why does a preamp improve sound quality?


I recently listened to a Mark Levinson No.390S CD processor directly connected to a pair of Quad II-forty five tube amps. When a matching Quad preamp (much cheaper than the Levinson) was placed between the Levinson and the amplifiers, the sound improved dramatically, even at moderate volume: it became much more clear and transparent. Why would this be the case? Wouldn't adding an extra piece of equipment add more distortion?
no_slouch

Showing 2 responses by atmasphere

One thing that no one seems to be mentioning is that the line section of a preamp also has to control the interconnect cable. How well it does this is often part of the measure of quality a preamp can portray. Passive controls do not control the cable at all (hence the difference in sound from low to high volume settings), so if a preamp is effective at controlling the cable *and* is competent at the other tasks of a line section (low distortion, wide bandwidth, etc.), then it can *easily* sound better than a passive.

Some might look at this as an impedance issue, and to a large degree it is, but it has less to do with how the unit drives the amplifier than with whether the construction issues of the interconnect are swamped well enough to be minimized.

And mind you, not all line sections are up to this task. If not, a passive will be better...

OK- Here's how the preamp line stage should function.

The problem is that the interconnect cable has electrical parameters, capacitance for the most part, that interact with the signal. The higher the impedance, particularly of the source, the greater the interaction. Reducing the input impedance of the amplifier also serves to reduce the interaction.
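A quick back-of-the-envelope sketch can illustrate the scale of this interaction. The values here are assumptions for illustration only: a 2 m interconnect at roughly 100 pF/m, and three representative source impedances (a low-impedance buffer, a typical tube line stage, and a high-impedance passive control near mid rotation). Source impedance and cable capacitance form a simple RC low-pass:

```python
import math

def rc_cutoff_hz(source_ohms: float, cable_farads: float) -> float:
    """-3 dB corner of the low-pass formed by source impedance
    driving the cable's shunt capacitance: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * source_ohms * cable_farads)

# Assumed values: 2 m interconnect at ~100 pF per meter.
CABLE_C = 2 * 100e-12  # 200 pF total

for src in (100.0, 2_000.0, 25_000.0):
    corner = rc_cutoff_hz(src, CABLE_C)
    print(f"{src:>8.0f} ohm source -> corner {corner/1000:,.0f} kHz")
```

As the source impedance climbs, the corner frequency falls toward the audio band (around 32 kHz for a 25k source into 200 pF), which is the interaction described above.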

The problem is that amplifier inputs have to be fairly high in impedance to accommodate various preamplifiers, some of which would be unhappy with a low input impedance. Thus most amplifiers are about 100k ohms at the input. This is generally high enough not to interact with preamps, but it will interact quite a bit with interconnect cables. The input connectors themselves can play a role at this impedance as well.

Thus it comes down almost entirely to the source to control the capacitive (and other) effects of the cable. A passive control tends to have a significant impedance in series with the cable at low volume settings, and this causes audible interactions, which are reduced as the volume setting is increased. A line stage, OTOH, has a fixed output impedance (hopefully fairly low) that has less interaction with the cable, and if the impedance is low enough it will 'swamp' most of the adverse effects the cable may have.
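To see why a passive's interaction changes with the volume setting, here is a small sketch. The values are assumptions for illustration: a 100k pot (a common passive value) driving a 200 pF interconnect, with the upstream source impedance ignored for simplicity. The Thevenin source impedance the pot presents to the cable is the parallel combination of the resistance above and below the wiper, so it varies with rotation:

```python
import math

def pot_output_impedance(total_ohms: float, position: float) -> float:
    """Thevenin source impedance of a potentiometer volume control at a
    given wiper position (0 = full attenuation, 1 = full volume),
    ignoring the upstream source impedance."""
    upper = total_ohms * (1.0 - position)   # resistance above the wiper
    lower = total_ohms * position           # resistance below the wiper
    return upper * lower / (upper + lower)  # the two appear in parallel

CABLE_C = 200e-12  # assumed 200 pF interconnect
for pos in (0.1, 0.25, 0.5, 0.9):
    z = pot_output_impedance(100_000.0, pos)  # assumed 100k passive pot
    corner = 1.0 / (2.0 * math.pi * z * CABLE_C)
    print(f"wiper at {pos:.2f}: {z/1000:5.1f}k source -> corner {corner/1000:7.1f} kHz")
```

The impedance peaks at R/4 (25k for a 100k pot) near mid rotation, so the RC corner with the cable moves as the volume knob turns, which is one mechanism behind the tonal change from low to high settings.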

IMO, this is one of the more important functions of the line stage: to provide a low impedance buffer to control the cable, while allowing for some gain and the use of a proper volume control, in such a way that the setting of the control does not affect tonality, only volume. It's a bit of a task.

The input impedance of the amplifier could be reduced, which would help, but many tube preamps in particular would suffer low frequency loss as the input impedance of the amplifier is reduced, due to the coupling cap at the output of the preamp.
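The bass-loss mechanism mentioned here is also a simple RC calculation, this time a high-pass formed by the preamp's output coupling cap and the amplifier's input impedance. The cap value below is an assumption for illustration (1 uF is a plausible tube-preamp coupling cap, but designs vary):

```python
import math

def highpass_corner_hz(coupling_farads: float, amp_input_ohms: float) -> float:
    """-3 dB corner of the high-pass formed by the preamp's output
    coupling cap and the amplifier's input impedance."""
    return 1.0 / (2.0 * math.pi * amp_input_ohms * coupling_farads)

CAP = 1e-6  # assumed 1 uF output coupling cap
for z_in in (100_000.0, 47_000.0, 10_000.0):
    corner = highpass_corner_hz(CAP, z_in)
    print(f"{z_in/1000:>5.0f}k input -> bass corner {corner:5.1f} Hz")
```

Into 100k the corner sits well below the audio band (about 1.6 Hz), but as the amp's input impedance drops toward 10k the corner rises toward 16 Hz, which is why lowering amplifier input impedance trades away bass extension with cap-coupled preamps.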

Balanced lines tend to reduce some of the noise pickup problems that are an issue with single-ended cables. To really take advantage of what balanced lines can offer, a low source impedance from the preamp is really nice to have. Then it is possible to send signals over 50-100 feet without degradation.

The recording and broadcast industries have known this for decades, but it is not that well known to audiophiles. But if you think about it, most of the significant recordings from the late 50s and early 60s that are still revered today were done with mic cables that carried the microphone signal as far as 200 feet! And this was before the age of exotic cables. None of it would have been possible with a passive control interjected; you need a robust low impedance to make it happen.