Cable 101: input and output impedance


My preamp has an output impedance of 100 ohms across the board. I have a few different amps with input impedances of 20K ohms, 65K ohms, and 2 megohms. Which amps would theoretically be the most interconnect dependent?

All connections are XLR, and the interconnects are 6 meters long.

Thanks
I second Al's input.
I'm using a 100 Ohms output impedance preamplifier into a 10K Ohm amplifier with very good results.
IMO, given that you are driving balanced cables from an output impedance that is low and constant, the connection to all of the amps should be essentially cable independent, at least assuming that cable capacitance is not unusually and unreasonably high.
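A quick sanity check of that claim: the source impedance and the cable's total capacitance form a low-pass filter, and its corner frequency shows how little the cable matters here. This is a minimal sketch, assuming a typical figure of roughly 100 pF per meter for balanced interconnects (not a value from the original post):

```python
import math

def rc_corner_hz(r_source_ohms: float, c_cable_farads: float) -> float:
    """-3 dB corner of the low-pass formed by source impedance and cable capacitance."""
    return 1.0 / (2.0 * math.pi * r_source_ohms * c_cable_farads)

# Assumption: ~100 pF/m is a typical capacitance for balanced interconnects.
cable_c = 6 * 100e-12                 # 6 m run -> ~600 pF total
corner = rc_corner_hz(100.0, cable_c)
print(f"{corner / 1e6:.2f} MHz")      # corner lands in the MHz range
```

With a 100 ohm source the corner comes out well above 2 MHz, orders of magnitude beyond the audio band, which is why the cable choice should make essentially no frequency-response difference into any of these amps.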

Conceivably the amps could differ somewhat in terms of the common mode noise rejection capability of their input stages, and in terms of their susceptibility to ground loop effects. To the extent that those differences are significant, there could be differences between the amps in cable sensitivity. But given the balanced interface I doubt that those differences would be significant for components that are reasonably well designed, and in any event they would have no predictable correlation with input impedance.

Also, while obviously much less current would flow through the cable when connecting to the 2M ohm amp than when connecting to the other amps, I don't think that would result in any difference in cable sensitivity, at least not a predictable difference.
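The loading difference between the three amps can also be put in numbers: the output and input impedances form a voltage divider, and the resulting level loss is negligible in every case. A minimal sketch using the impedances from the thread:

```python
import math

def divider_loss_db(z_out_ohms: float, z_in_ohms: float) -> float:
    """Level loss (in dB) from the output/input impedance voltage divider."""
    ratio = z_in_ohms / (z_in_ohms + z_out_ohms)
    return 20.0 * math.log10(ratio)

# 100 ohm preamp output into each amp's input impedance.
for z_in in (20e3, 65e3, 2e6):
    print(f"{z_in / 1e3:>6.0f}K input: {divider_loss_db(100.0, z_in):.4f} dB")
```

Even the worst case (the 20K ohm amp) loses well under a tenth of a dB, and the spread between the amps is smaller still, consistent with the point that input impedance predicts no meaningful difference in cable sensitivity here.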

Regards,
-- Al