Another Basic Question about Preamps

Is it correct that the output impedance of the preamp should be less than the input impedance of the amp?

Is there any rule of thumb about the relationship between these two numbers?
Yes, preamp output impedance should be much less than power amp input impedance.

The commonly cited rule of thumb is a 10x ratio. However, a 10x ratio is truly comfortable (in assuring that frequency response flatness will not be significantly affected by the interaction of the two impedances) only if the worst case (maximum) output impedance of the preamp across the 20Hz to 20kHz frequency range is known. Manufacturers' specs commonly indicate only a nominal output impedance, presumably at 1kHz or some other mid-range frequency.

If the preamp has been reviewed by Stereophile, John Atkinson's measurements usually indicate the worst case output impedance. If only a nominal output impedance is known, IMO a minimum ratio of at least 50x, and ideally 75x, should be used, especially if the preamp may have a coupling capacitor at its output (as most tube preamps and some solid state preamps do). An output coupling capacitor will typically cause a large rise in output impedance at deep bass frequencies.
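To illustrate the point about coupling capacitors, here is a rough sketch of how a series output cap raises the magnitude of a preamp's output impedance at low frequencies. The 600 ohm / 1 uF values are illustrative assumptions, not specs from any particular preamp:

```python
import math

R_OUT = 600.0   # assumed nominal (resistive) output impedance, ohms
C_OUT = 1e-6    # assumed output coupling capacitor, farads

def output_impedance(freq_hz, r_out=R_OUT, c_out=C_OUT):
    """Magnitude of Zout for a resistance in series with a coupling cap."""
    xc = 1.0 / (2.0 * math.pi * freq_hz * c_out)  # capacitive reactance
    return math.hypot(r_out, xc)                  # |R + 1/(jwC)|

for f in (20, 100, 1000, 20000):
    print(f"{f:>6} Hz: {output_impedance(f):8.0f} ohms")
```

With these example values, the "600 ohm" preamp actually presents roughly 8,000 ohms at 20Hz, which is why a nominal mid-band spec can be so misleading for deep bass.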

-- Al
At least a ratio of 10. The output impedance of the preamp indicates its reluctance to deliver current: the higher the impedance, the harder it is to send current. The input impedance of the amp is associated with its need for current: the lower the input impedance, the more current it requires to perform the same task.
A voltage divider effect ensues, which is minimized as the ratio of input to output impedance increases. I have heard that for good sound, 10 is the minimum for this ratio, but the higher the better.
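The voltage divider effect described above can be put in numbers. This is a hedged sketch treating the preamp output and amp input as a simple resistive divider; the 600 ohm output impedance is an illustrative assumption:

```python
import math

def divider_loss_db(z_out, z_in):
    """Signal loss (in dB, negative) from the Zout/Zin voltage divider."""
    return 20.0 * math.log10(z_in / (z_in + z_out))

# Loss at various Zin:Zout ratios, assuming a 600 ohm preamp output
for ratio in (10, 20, 50, 100):
    z_out = 600.0
    z_in = ratio * z_out
    print(f"{ratio:>3}:1 ratio -> {divider_loss_db(z_out, z_in):6.3f} dB")
```

At a 10:1 ratio the fixed loss is under 1dB, which by itself is harmless; the real trouble is when Zout varies with frequency (as with an output coupling cap), because then the loss varies with frequency too and the response is no longer flat.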
I have always gone by 20:1 as general rule of thumb, though there are exceptions.
Thanks - this helps a lot. Are there other key specs that you find to be important?