In the old days, amplifiers like my old Quad 33 had interchangeable input boards with gain and other characteristics that could be matched precisely to the cartridge for optimum S/N, frequency response and distortion. Quad supplied them ready-made for the most common high-end cartridges of the day, and others could easily be made from blank boards. That same amplifier had similar adjustment options for the tape input and output. As you are discovering, this is indeed very important.
The same issue also exists with the analogue output of CD players or DACs. The Red Book CD standard specifies a 2 V output, but many amplifiers have much more sensitive line inputs, so the input stage may be driven into clipping, with increased distortion and compression of signal peaks. The solution is simple: just fit inline attenuators or an attenuating cable. The result is often a far smoother sound. The giveaway sign that this is indeed a problem is when you only have to turn the volume control up a little to get a loud signal.
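If you want to work out how much attenuation you need, the arithmetic is simple. Here is a small Python sketch; the 0.5 V input sensitivity is just an assumed example, not a figure from any particular amplifier:

```python
import math

def attenuation_db(v_source, v_sensitivity):
    """dB of attenuation needed so the source's full output
    just reaches the amplifier's input sensitivity."""
    return 20 * math.log10(v_source / v_sensitivity)

# Red Book 2 V output into a (hypothetical) 0.5 V sensitivity input:
print(round(attenuation_db(2.0, 0.5), 1))  # about 12 dB
```

A roughly 12 dB inline attenuator (a common off-the-shelf value) would bring such a source back into the amplifier's comfortable range.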
The commercial reason behind this strange situation is that the human brain interprets louder as better. So if one CD player or DAC is slightly louder than another, this gives it an unfair advantage in the demo room, because the innocent listener will interpret the difference in level as a difference in quality (that is simply how the brain works). The same applies to amplifier input sensitivity: the more sensitive amplifier will be louder and hence seem better. A proper comparison should be level-matched to within 0.2 dB, but that requires a serious voltmeter, which I at least have never seen in a demo room.
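To see how tight a 0.2 dB match actually is, you can convert it back to a voltage ratio. A rough sketch:

```python
import math

def db_difference(v1, v2):
    """Level difference in dB between two measured output voltages."""
    return 20 * math.log10(v1 / v2)

# A 0.2 dB window corresponds to only about a 2.3% voltage difference:
ratio = 10 ** (0.2 / 20)
print(round((ratio - 1) * 100, 1))  # ~2.3 %
```

In other words, matching two sources by ear or with the volume knob's detents is hopeless; you need to measure the actual output voltages.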