How does the input impedance of an amplifier affect it sonically?


I understand the effects of an output-to-input impedance mismatch, but what I don’t understand is
why there’s such a wide range in (especially input) impedances. Most tube amplifiers have a very high input impedance. Solid state, on the other hand, has impedances that range from 5k to 250k. Why so much variance, and how does it affect the sound of an amplifier, if it does at all?
hiendmmoe
One thing that for sure will affect SQ is a LOW input impedance, like the old VTLs, .75 or something (most are 1.2?). My modded Macs would overdrive them to over 320 watts per 300-watt-rated monoblock.

When you played at low volumes they didn't sound right. If I remember, it was a Pro-preferred setting. I had to do a couple of things to make them work. The fixed output sucked too; I fixed that as well. Had a 3.2 and a 5.9 tap.

Just the way they did it. Not a very good idea, in hindsight; kinda cheap, kinda lazy. You had to match those for sure. You should hear them push good/big horns though... my lord, 115-120 on the ol' SPL meter.

Regards
That's a very good question. The simplified answer is that the resistor which sets the input impedance is usually sized to the same value as the global negative feedback resistor in order to minimize DC offset from a gain mismatch between the two differential input transistors. The value of the gnfb resistor depends on many variables, and those differ among topologies. That's the main reason.
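To make the matching idea concrete, here is a rough sketch using the textbook op-amp-style bias-current analysis (not any specific amplifier; the 500 nA bias current and resistor values are made-up illustration numbers, and equal bias currents at both inputs are assumed):

```python
# Input-referred DC offset from bias current flowing through unequal
# DC source resistances at the two inputs of a differential pair.
# Simplification: both inputs are assumed to draw the same bias current.

def offset_from_mismatch(i_bias, r_plus, r_minus):
    """Input-referred offset voltage (V) = I_bias * (R+ - R-)."""
    return i_bias * (r_plus - r_minus)

i_bias = 500e-9      # hypothetical 500 nA input bias current
r_input = 10e3       # input resistor to ground: 10k
r_fb_dc = 10e3       # feedback network DC resistance, sized to match

# Matched resistances: the bias-current offset cancels.
print(offset_from_mismatch(i_bias, r_input, r_fb_dc))   # 0.0

# Mismatched (feedback network only 1k): 4.5 mV input-referred,
# which the amplifier's DC gain then multiplies at the output.
print(offset_from_mismatch(i_bias, r_input, 1e3))       # 0.0045
```

This is why the input resistor tends to track the feedback resistor value rather than being chosen freely for its own sake.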

As far as SQ goes, the smaller the resistance the better. This is way over-generalized, but resistor (Johnson) noise power is proportional to the product of resistance, temperature and bandwidth. So it is desirable to go for a lower resistance in both the input and gnfb resistors. A lower-value feedback resistor also allows a smaller-value resistor on the feedback shunt, lowering noise further. Noise at the input is carried all the way through to the output. However, overall linearity (lower distortion) matters more for sound quality than resistor noise, hence the tradeoff as to why not all amps have 10K input resistances.
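The noise point above can be sketched with the standard Johnson-Nyquist formula (the 20 kHz bandwidth, 300 K temperature and resistor values here are just example numbers):

```python
import math

# Johnson-Nyquist thermal noise of a resistor:
#   v_rms = sqrt(4 * k * T * R * B)
K_BOLTZMANN = 1.380649e-23   # J/K

def thermal_noise_vrms(r_ohms, bandwidth_hz, temp_k=300.0):
    """RMS thermal noise voltage of a resistor over a given bandwidth."""
    return math.sqrt(4 * K_BOLTZMANN * temp_k * r_ohms * bandwidth_hz)

# Roughly the audio band, near room temperature:
for r in (10e3, 100e3, 250e3):
    print(f"{r/1e3:>5.0f}k ohms -> {thermal_noise_vrms(r, 20e3)*1e6:.1f} uV rms")
# ~1.8, 5.8 and 9.1 uV: noise grows only as the square root of R,
# which is part of why designers can trade resistance for other goals.
```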
Ya know when I read your question again. It has to do with Pro vs Home stereo gear, too. The difference between the two.. You could  plug something into a lot of the old amps for amplification, Macs, VTL, for use in a band (LIVE), PA, all kinds of apps, ay? Some could get by doing both, but not both well, Mac is the exception..Sure there were a few more.

There are a lot of music guys that are stereo guys too.. They probably know....

Regards
+1 gs5556! A good engineering-based answer! 
The simplified answer is that the resistor which sets the input impedance is usually sized the same value as the global negative feedback resistor in order to minimize DC offset from a gain mismatch between the two differential input transistors. The value of the gnfb resistor depends on many variables and they are different among different topologies.
This is true of opamps, but not really true for most power amplifiers. The input resistance is dominated by a resistor that goes to ground at the input of the amplifier. With tubes this part is usually what is called a ’grid leak resistor’ as it prevents electrons from building up on the grid of the input tube and causing it to shut off. With signal tubes this value can be quite high.

Depending on what is at the input of a solid state amp, the input resistance can be one of several things. Quite often it is part of an input network to the amplifier to limit Radio Frequency Interference (RFI) and also to block DC (which is done by an accompanying capacitor). Many solid state amps have transistors at their input that can be easily damaged, so a coupling cap to block DC voltages from the outside world is helpful (and also helpful if there are DC voltages at the input of the input device to set its correct operating point).


Since high frequencies can be rolled off by stray capacitance (such as the input cable) and something called Miller Effect (the source impedance interacting with the input capacitance of whatever active device is at the input of the amplifier), it is useful to keep the input resistance values low. Since solid state amps are often driven by solid state preamps, this low resistance (10K to 30K or so) can be easily driven. But tube preamps (not all, mind you; our preamps can drive very low impedances) can often have a problem driving resistances this low. So if you plan to use a tube preamp with a solid state amplifier, you have to pay attention to the input impedance of the amp and the output impedance of the preamp, which generally speaking should be 1/10th or less that of the amplifier's input impedance.


Any other combination (tube preamp with tube amplifier, or solid state preamp with tube amplifier) is no worries, as the 1/10th rule is easily met.
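The 1/10th rule amounts to a simple voltage divider between the preamp's output impedance and the amp's input impedance; a quick sketch, treating both as purely resistive (the impedance values are made-up examples):

```python
import math

def insertion_loss_db(z_out, z_in):
    """Level loss (dB) when a source with output impedance z_out drives
    a load of impedance z_in: V_load = V_src * z_in / (z_in + z_out)."""
    return 20 * math.log10(z_in / (z_in + z_out))

# 1/10 rule satisfied: 1k preamp output into a 10k amp input
print(round(insertion_loss_db(1e3, 10e3), 2))   # -0.83 dB

# Rule violated: 5k tube preamp output into the same 10k input
print(round(insertion_loss_db(5e3, 10e3), 2))   # -3.52 dB
```

A flat loss of a dB or so is harmless on its own; the trouble is that a real preamp's output impedance is not flat with frequency, so the loss isn't flat either.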

Beyond these issues the input resistance does not affect the sonic character of the amplifier at all.
If the preamp is struggling, doesn't that affect the amp's output in the bass more than in other regions? I seem to remember this from John Atkinson's comments.
That depends on what is meant by 'struggling', as there can be several issues. One is that the amp needs a lot of input voltage, and this has nothing to do with the input impedance. The other is that if the preamp has an output coupling capacitor, its output impedance at 20Hz can be quite a lot higher than it is at 1000Hz, and this can affect the bass if that 1/10th rule I mentioned is not observed while taking the 20Hz output impedance into account.
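A quick sketch of why the coupling cap matters at 20Hz (the 1 uF cap value is just an example, not any particular preamp):

```python
import math

def cap_impedance(c_farads, f_hz):
    """Magnitude of a capacitor's impedance: 1 / (2*pi*f*C)."""
    return 1 / (2 * math.pi * f_hz * c_farads)

c = 1e-6   # hypothetical 1 uF output coupling cap
print(round(cap_impedance(c, 1000)))   # 159 ohms at 1 kHz
print(round(cap_impedance(c, 20)))     # 7958 ohms at 20 Hz
```

That capacitive impedance adds (in quadrature) to the preamp's resistive output impedance, so the effective 20Hz source impedance can be many times the 1kHz spec, and the 1/10th rule has to be checked against the 20Hz number.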
So I assume 1/10 isn't going to be enough of a safety net when matching preamp to amplifier input impedance. Since impedance can vary with frequency and most manufacturers don't provide accurate numbers, 1/20 sounds like a safer bet when matching pre to amp.
@hiendmmoe If the manufacturer can't provide that 20Hz output impedance value, you probably should look elsewhere! The 20Hz output impedance can be a multiple of the 1KHz value; it could easily be 10x as high. We got around this issue with our preamps by direct-coupling, so the output impedance of our balanced preamps is the same at all frequencies.
Is it then more important to know the correct value of the output impedance of the preamp at 20Hz than to know how accurate the input impedance of the amp is?
As for the amplifier input impedance, is 20Hz also the number I should look for when a specific input impedance is given? If I understand correctly, 20Hz is where the amplifier will show its minimum value, and any impedance specified at a higher frequency is not its true (worst-case) value.