... does increasing the bias in a SS amp have an effect on input impedance? Or, put another way, could you increase the operating bias in an amp without raising the input impedance, and if you did that, what would be the sonic trade-offs?
"Increasing the bias" usually refers to the biasing of the output stage of a power amp or integrated amp, so it would not have any effect on the input impedance, which is determined by the input stage. Increasing the bias can of course affect sonics in various ways, depending on the specific design, but the range of acceptable bias points is limited by some combination of thermal considerations, the characteristics of the particular output devices, the voltages at which they are operated, and various other aspects of the specific design. Best regards, -- Al
Thanks for the mention, John (Roxy54). Following are some excerpts from this thread, in which the same question was discussed several years ago:

Almarg 2-11-2014
... having never designed an audio power amplifier, I can't speak knowledgeably about what the tradeoffs would be if a solid state one were designed with a high input impedance. Certainly it's readily doable, but I don't have a good feel for what the inevitable tradeoffs would be.

In addition to those tradeoffs ... I don't doubt that in many cases a significant factor is a lack of motivation to provide compatibility with tube preamps.
Kirkus 2-13-2014
... Since the input bias current of a bipolar transistor varies with temperature, if the DC source impedances of each side (that is, the input side and the feedback side) are different, then the voltages developed as a result of the bias current are different, leading to an offset condition.
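To put rough numbers on that offset mechanism (an illustrative sketch only; the bias current and resistor values below are assumptions, not figures from Kirkus's post):

```python
# Illustrative: DC offset created when a bipolar input pair's base bias
# current flows through unequal DC source impedances on the two inputs.
# All component values are assumed for illustration.

def offset_voltage(i_bias, r_input_side, r_feedback_side):
    """Input-referred offset: the bias current develops a different DC
    drop on each side of the differential pair."""
    return i_bias * (r_input_side - r_feedback_side)

i_bias = 0.5e-6    # 0.5 uA base bias current (assumed)
r_in = 47_000      # 47k input bias resistor (assumed)
r_fb = 2_000       # 2k effective feedback-divider impedance (assumed)

v_os = offset_voltage(i_bias, r_in, r_fb)
print(f"Input-referred offset: {v_os * 1e3:.1f} mV")  # 22.5 mV
```

With those assumed values the mismatch alone contributes tens of millivolts of input-referred offset, which a DC-coupled power amp then multiplies by its full gain, and the bias current (hence the offset) drifts with temperature.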
If a high value for the input bias resistor is desired, then the designer could raise the impedance of the feedback network to match, and this would reduce offset drift, but then the noise would increase as a result of the Johnson noise developed across the feedback resistor ladder. He/she could reduce the standing current in the input pair to reduce the bias current, but this would dramatically reduce the slew rate and input-stage transconductance. (BTW, insufficient input-stage current is the true source of TIM, not global feedback.)
One could use FETs for the input stage to raise input impedance, but they have a lower transconductance than bipolars, and most of the good ones have voltage ratings a bit on the low side. Their higher impedance is offset at higher frequencies by higher capacitance, which can be reduced by cascoding, but this in turn introduces another HF pole in the input stage's response.
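The capacitance point is the Miller effect: the gate-drain capacitance appears at the input multiplied by (1 + stage gain), and a cascode helps by pinning the drain so that multiplication mostly disappears. A rough sketch, with assumed JFET capacitances and gains:

```python
# Illustrative Miller-effect arithmetic for a FET input stage.
# Capacitance and gain figures are assumed for illustration only.

def effective_input_capacitance(c_gs, c_gd, stage_gain):
    """Gate-drain capacitance is seen at the input multiplied by
    (1 + voltage gain of the stage) -- the Miller effect."""
    return c_gs + c_gd * (1 + stage_gain)

c_gs, c_gd = 15e-12, 5e-12  # 15 pF and 5 pF (assumed JFET figures)

plain = effective_input_capacitance(c_gs, c_gd, stage_gain=40)  # common source
cascode = effective_input_capacitance(c_gs, c_gd, stage_gain=1)  # drain pinned by cascode

print(f"without cascode: {plain * 1e12:.0f} pF")  # 220 pF
print(f"with cascode:    {cascode * 1e12:.0f} pF")  # 25 pF
```

Under these assumptions the cascode cuts the effective input capacitance by almost an order of magnitude, at the cost of the extra HF pole the post mentions.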
A designer could also add a servo to improve offset, but this is far from free, given that it almost always requires lower-voltage supply rails than the rest of the amp. For that matter, he/she could also add an input buffer or balanced input stage . . . but again, there are more tradeoffs.
But the real question is, what kind of source impedance should an amplifier designer reasonably expect to see from the driving source? And in a world where only the wimpiest of preamplifiers have an issue with a 10K-50K load, how much extra cost and design effort is one willing to spend in order to satisfy a few oddball cases?
Almarg 2-13-2014
... see what I meant when I said that "I don't have a good feel for what the inevitable tradeoffs would be." :-)

Regards, -- Al