Why do some fine solid state amplifiers like Soulution have such low input impedance?

I was looking at an excellent deal on a 5 series Soulution stereo amplifier to mate with my VAC Renaissance 5 preamplifier.  I then found out that the Soulution has an input impedance of 2000 Ohms balanced. Although my VAC is transformer coupled at the output, I am going to pass on the Soulution. Each component is too good alone to worry about a compromised “marriage”.  Do SS amplifier manufacturers find sonic benefit in such low input impedances, or is it really to discourage use with tube preamplifiers and encourage sales of their own preamplifiers?   
Well, most SS preamps can drive 600 ohms without issue. The lower the impedance, the less EMI/RFI noise can be picked up.

There may be other benefits, for instance removing a gain stage, or minimizing the amp's own noise contribution. It's been a very long time since I did this math.
I should add, you probably should listen to the combination.

The worst that happens is a loss of highs at the extremes of the volume knob.

You may not care, or even like it. :)
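For what it's worth, that treble loss is just the low-pass filter formed by the source's effective output impedance and the shunt capacitance of the cable plus the amp input. A quick back-of-envelope sketch in Python (the impedance and capacitance values here are illustrative assumptions, not measurements of any product in this thread):

```python
from math import pi

def corner_hz(r_source_ohms, c_load_farads):
    """-3 dB point of the low-pass formed by source impedance and shunt capacitance."""
    return 1.0 / (2 * pi * r_source_ohms * c_load_farads)

# 2 kohm effective source impedance into 1 nF of cable + input capacitance
print(round(corner_hz(2_000, 1e-9)))   # ~79577 Hz -- far above the audio band
# 50 kohm effective source impedance into the same 1 nF
print(round(corner_hz(50_000, 1e-9)))  # ~3183 Hz -- audible treble loss
```

With a low source impedance the corner sits far above the audio band; as the effective source impedance climbs, the corner can fall into audible territory.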


Almarg, your assistance is needed...
I was going to guess -- "noise".
It's not how I design, but it's legitimate. It may also be defeatable or easily changed. Often it's just a resistor.
Thanks for the mention, John (Roxy54). Following are some excerpts from this thread, in which the same question was discussed several years ago:

Almarg 2-11-2014

... having never designed an audio power amplifier, I can't speak knowledgeably about what the tradeoffs would be if a solid state one were designed with a high input impedance. Certainly it's readily doable, but I don't have a good feel for what the inevitable tradeoffs would be.

In addition to those tradeoffs ... I don't doubt that in many cases a significant factor is a lack of motivation to provide compatibility with tube preamps.

Kirkus 2-13-2014

... Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.

If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback to match, and this would reduce offset drift, but then the noise would increase as a result of the Johnson noise developed on the feedback resistor ladder. He/she could reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW insufficient input-stage current is the true source of TIM - not global feedback.)

One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.

A designer could also add a servo to improve offset, but this is far from free, given the fact that it almost always requires lower-voltage supply rails from the rest of the amp. For that matter, he/she could also add an input buffer or balanced input stage . . . but again, there are more tradeoffs.

But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?

Almarg 2-13-2014

... see what I meant when I said that "I don't have a good feel for what the inevitable tradeoffs would be." :-)

-- Al 
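Kirkus's point about Johnson noise scaling with resistor value is easy to put numbers on. A rough sketch of the RMS thermal noise of an input or feedback resistor over a 20 kHz bandwidth (the resistor values are illustrative, not taken from any actual design):

```python
from math import sqrt

K_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_uV(r_ohms, bandwidth_hz=20_000, temp_k=300):
    """RMS thermal (Johnson) noise of a resistor over the given bandwidth, in microvolts."""
    return sqrt(4 * K_B * temp_k * r_ohms * bandwidth_hz) * 1e6

print(round(johnson_noise_uV(2_000), 2))    # ~0.81 uV for a 2k resistor
print(round(johnson_noise_uV(100_000), 2))  # ~5.76 uV for a 100k resistor
```

Raising the resistance 50x raises the noise floor by sqrt(50), roughly 7x, which is the tradeoff Kirkus describes when the feedback network is scaled up to match a high input resistor.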
I noticed that at least some SMcAudio-revised McCormack amps have much lower input impedance than the stock models. I don’t know what specifically Steve does in the revisions that produces this, but given the results he clearly feels it’s a worthwhile tradeoff for improved sonics. And given Steve’s practical nature and Kirkus’s point about not limiting the sonic benefit for 95% of audiophiles just to accommodate a few outliers, the choice of a lower input impedance seems like a pretty reasonable one in the scheme of things. No?

As a stupid follow-up question, does increasing the bias in an SS amp have an effect on input impedance? Or, put another way, could you increase the operating bias in an amp without raising the input impedance, and if you did, what would be the sonic tradeoffs?  Of course there are Class-A amps that seem to have reasonably low input impedances, so maybe I answered my own question.  Yes I know, I’m a moron.
I'm not sure I would agree with Kirkus' assessment of FETs limiting bandwidth. It's all in how you design the circuit.

I can't think of a good reason for such a low input impedance, unless the equipment was originally designed for studio use, in which case I can think of some excellent reasons. When we built our MP-1 preamp, despite it being the first balanced-line preamp made for home use, it did not occur to us not to support the balanced standard (A.K.A. AES48).

So our preamps (which are tube and have a patented direct-coupled output, as a means to get around having to use a line transformer) have no problems driving input impedances this low, since this sort of thing is common with the balanced standard.

This being the simpler explanation (support of the balanced standard), I suspect it to be the correct one. 

At any rate you are not limited to solid state preamps, you can use tube preamps and the latter does not have to have a line output transformer to do the job.
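The level lost into a low input impedance is just a voltage divider between the preamp's output impedance and the amp's input impedance. A rough sketch (the impedance values are illustrative, not specs of any product mentioned here):

```python
from math import log10

def divider_loss_db(z_out, z_in):
    """Level lost to the divider formed by source output impedance and amp input impedance."""
    return 20 * log10(z_in / (z_in + z_out))

# typical-ish output impedances into typical-ish input impedances (ohms)
for z_out in (100, 600, 2_000):
    for z_in in (2_000, 10_000, 47_000):
        print(f"{z_out} ohm out -> {z_in} ohm in: {divider_loss_db(z_out, z_in):.2f} dB")
```

By the 10-to-1 rule of thumb, keeping the output impedance below about a tenth of the input impedance keeps the divider loss under roughly 1 dB; a 2k output into a 2k input loses half the signal (about -6 dB).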
... does increasing the bias in an SS amp have an effect on input impedance? Or, put another way, could you increase the operating bias in an amp without raising the input impedance, and if you did, what would be the sonic tradeoffs?

"Increasing the bias" usually refers to the biasing of the output stage of a power amp or integrated amp. So it would not have any effect on the input impedance, which is determined by the input stage. Increasing the bias can of course affect sonics in various ways, depending on the specific design, but the range of acceptable bias points is limited by some combination of thermal considerations, the characteristics of the particular output devices, the voltages they are operated at, and various other aspects of the specific design.

Best regards,
-- Al
I can attest to Steve McCormack's use of low input impedance on his amp revisions.
When I got my DNA-1 monoblocks back, the input impedance was so low that I now need to use a Jensen transformer to convert the XLR signal to RCA in order to use my Vandy crossovers, since the impedance is lower than I can work with using balanced. A bit of a bummer. And, I still haven't had time to install everything. Ralph's MA-1s/MP-3 are so good that I have little impetus to change things.
Oh well, something to work on later...
Hi Guys - I just saw this thread and thought that I ought to chime in to (hopefully) clear up any potential misunderstanding. All of the McCormack DNA amplifiers except the DNA-500 and DNA-750 monoblocks have a 100kohm input impedance, and this is true whether they are stock or upgraded. It is *only* when adding balanced inputs or converting a stereo amp to a monoblock that the input impedance drops to 10kohm, and this is because of the nature of the transformers I use for balancing and phase-splitting.

I prefer these transformers to any other technique I have found, but it is true that the 10k input can occasionally cause some impedance-matching issues. Usually this only comes up when a speaker requires the use of a passive high-pass filter before the amp input and the capacitor values must be matched accordingly (as in the case of the Vandersteen speakers with self-powered woofer sections). There have also been a few cases where a tube preamp had an unusually high output (source) impedance (over 1k), which means that the impedance match (following the 10-to-1 rule of thumb) is less than optimal (even though it may still work well and sound good).
The technique I use for creating balanced inputs or a monoblock amp treats both the XLR and RCA inputs the same way, so both are the same 10k input load. I have made a few stereo amps with a switchable input feature so the user could have both a balanced 10k XLR input and an unbalanced RCA at 100k, and that has worked well for those folks.
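The capacitor-matching issue Steve mentions is first-order filter math: the series cap and the amp's input impedance set the high-pass corner, so a 10x drop in input impedance calls for roughly 10x the capacitance to keep the same corner. A sketch (the 100 Hz corner here is a hypothetical example, not a Vandersteen spec):

```python
from math import pi

def highpass_cap_uF(corner_hz, z_in_ohms):
    """Series capacitance (microfarads) giving a first-order high-pass at corner_hz into z_in."""
    return 1e6 / (2 * pi * corner_hz * z_in_ohms)

# same hypothetical 100 Hz corner into two different amp input impedances
print(round(highpass_cap_uF(100, 100_000), 3))  # ~0.016 uF into a 100k input
print(round(highpass_cap_uF(100, 10_000), 2))   # ~0.16 uF into a 10k input -- 10x the cap
```

This is why a stock 100k amp and a 10k balanced conversion need different filter capacitor values to keep the crossover point where the speaker designer intended.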
Bob, I apologize but I am confused by your comment and don't understand why you felt the need for an additional transformer conversion. I'm glad you are enjoying your Atmasphere setup but if I can help you with your DNA-1 monoblocks, please let me know.
And, as always, if anyone else has any questions about my gear, feel free to contact me at SMc Audio.
Cheerio -
Steve McCormack