difference from higher input impedance on amp?


My preamp output impedance is 300 ohms (AR LS-15) or 1k ohms (Aesthetix Janus). In either case, the default 60k input impedance on my Baron amp is well over 10x, so it should be acceptable. Some comments on these forums imply that a higher ratio is better. The Baron's input impedance is adjustable to either 60k or 110k (with effort, via an internal resistor). What sonic or operational change could be anticipated from raising the input impedance of the amp? Worth the bother to experiment?
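For what it's worth, the "10x rule" can be sanity-checked with quick arithmetic: the preamp output impedance and amp input impedance form a voltage divider, and the loss it causes is easy to compute. Here's a rough sketch (the impedance values are the ones from the post; the function name is just mine):

```python
import math

def loading_loss_db(z_out, z_in):
    """Voltage loss (dB) from the divider formed by the preamp
    output impedance z_out and amp input impedance z_in (ohms)."""
    ratio = z_in / (z_out + z_in)
    return 20 * math.log10(ratio)

for z_out in (300, 1_000):          # AR LS-15, Aesthetix Janus
    for z_in in (60_000, 110_000):  # Baron's two settings
        print(f"{z_out} -> {z_in}: {loading_loss_db(z_out, z_in):.3f} dB")
```

Even the worst case (1k into 60k) loses only about 0.14 dB, and going to 110k recovers less than 0.1 dB of that, which suggests any audible change would come from something other than simple voltage loss.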
lloydc
I do not think you will hear a difference, considering the output impedances of your preamps.
Don't know how you were intending to experiment, but I keep Rothwell IC attenuators on hand, which simply plug into the amp's inputs. I think Parts Express has cheaper ones. The Rothwells are usually 10 dB, but they do make a 20 dB version, which you usually have to order from a distributor.
If they use a standard configuration, using a plug-in attenuator is not the same as raising the input impedance of the amp internally. I'm assuming the resistor being talked about is from the input to ground. The plug forms a voltage divider, which does increase the load R that the preamp sees, but it also lowers the voltage that goes into the amp. Whatever changes are heard could be due to the increased R that the preamp sees, or the fact that the volume has to be turned up more to reach the same level, or a combination of the two. If the system gain is high enough, then it should be no problem trying the attenuators. You can build your own with a few dollars' worth of parts from Radio Shack as a cheap experiment.

I agree with 4est; unless you try it, you will always wonder about it.
