Theoretical Pre Amp Question


The real-world answer would be to listen to it both ways and pick, because execution matters, but theoretically...

If a source offers a choice of high (2V) or low (1V) output, then at typical listening levels the preamp will be attenuating the signal to much less than 1V. Which source output level SHOULD be better? Is a preamp likely to produce more distortion or noise at the lower or the higher input level, even though either case uses less than unity gain? If specifically using a tube preamp, SHOULD the source level affect how much “tubiness” comes through, even though the net gain is negative? What about potential interconnect effects? Wouldn’t a higher-level signal be more resistant to noise as a percentage?
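To put a rough number on that last question, here is a back-of-the-envelope sketch (plain Python, with a made-up interconnect noise figure purely for illustration) showing why a 2V source sits about 6 dB further above the same cable noise floor than a 1V source, and why attenuating at the preamp scales the signal and the noise already picked up on the cable by the same factor:

```python
import math

def db(ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Hypothetical noise induced on the interconnect run (value is made up
# for illustration -- real pickup depends entirely on the installation).
cable_noise_v = 0.0001  # 100 uV of hum/RF

for source_v in (1.0, 2.0):
    snr_at_pre_input = db(source_v / cable_noise_v)
    # Attenuating at the preamp (say -20 dB for a normal listening level)
    # scales the signal and the noise already on the cable together,
    # so the SNR established on the interconnect is preserved.
    atten = 10 ** (-20 / 20)
    snr_after_atten = db((source_v * atten) / (cable_noise_v * atten))
    print(f"{source_v:.0f} V source: SNR at preamp input = {snr_at_pre_input:.1f} dB, "
          f"after -20 dB attenuation = {snr_after_atten:.1f} dB")

# The 2 V source arrives 20*log10(2/1) ~= 6 dB higher above the same cable
# noise floor. Noise added *after* the volume control is a separate matter
# and depends on the preamp's own output stage.
```

This only covers noise injected ahead of the volume control; it says nothing about distortion or about noise generated inside the preamp itself.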

In an ideal theoretical case there is no distortion or noise. In a real-world, empirical test, the implementation dictates the results. I’m just curious about the in-between case: typical expected results based on standard practice and other people’s experience.


cat_doorman

Response from kijanki:

Benchmark recommends the highest gain setting at the preamp and the lowest at the power amp. They provide the option of a 22 dBu (9.8 VAC) input sensitivity on the AHB2 power amp. I understand that they want to move gain from a noisy environment (the power amp) to a quieter environment (the preamp), not to mention making the interconnects less sensitive to ambient electrical noise (better S/N). A long time ago, mostly in Europe, there was a -10 dBV (0.316 VAC) standard for line level. The belief was that it would save money, since only one item (the amp) needed more gain stages while multiple sources needed fewer. I assume it didn't work out (too noisy?). The most common line level in the US is likely +4 dBu (1.23 VAC), but I assume that preamp output has to be higher, since the AHB2's lowest input-sensitivity setting is 8.2 dBu (2 VAC). Is there any standard for power amp input sensitivity? Most of the time 2 VAC is mentioned.
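For reference, the voltages quoted above follow from the standard references: 0 dBu = 0.7746 V RMS (1 mW into 600 ohms) and 0 dBV = 1 V RMS. A quick sketch to check the arithmetic:

```python
import math

DBU_REF = math.sqrt(0.6)  # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)
DBV_REF = 1.0             # 0 dBV = 1 V RMS

def dbu_to_volts(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def dbv_to_volts(dbv):
    return DBV_REF * 10 ** (dbv / 20)

print(f"+22 dBu  = {dbu_to_volts(22):.2f} V")   # ~9.75 V, the high-sensitivity setting quoted above
print(f"+8.2 dBu = {dbu_to_volts(8.2):.2f} V")  # ~1.99 V, the lowest setting quoted above
print(f"+4 dBu   = {dbu_to_volts(4):.2f} V")    # ~1.23 V
print(f"-10 dBV  = {dbv_to_volts(-10):.3f} V")  # ~0.316 V
```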