Signal degradation over length, source vs. output?


Which signal is more likely to degrade over length, source to amp, or amp to speaker? I have a pair of 300B monos on the way and don't know where to place them relative to preamp and speakers. Is it generally better to place the amps close to the pre-amp and use a short interconnect and a longer speaker cable or the other way around? Any insight would be appreciated.
jamesddurkin
You might just try it both ways. If you notice no difference, then just do what is convenient. Having said that, I agree with the above post. 8 to 10 feet is no big deal either way.
Kr4: Since your concern is with loss of power transfer, which do you think "loses" more power / signal?

A) A long run of 20 gauge or thinner wire (as commonly found in most interconnects)

OR

B) A long run of 14 gauge or heavier wire (as commonly found in most speaker cables)

As for the "voltage losses" that can be compensated for with the volume control: that is not just "voltage" you've lost, it is a dynamic part of the signal. Since the losses are most likely to occur when the least signal / voltage is present, the likely effect is a loss of low-level information, which masks subtle details. Some may mistake this loss of signal or noise transfer for a reduction in the system's noise floor, i.e. a "blacker background" because no noise or signal is present at very low levels, but in actuality it is a reduction in resolution and dynamic range.

Obviously, there are pros and cons to each method. If your system is carefully thought out and uses conductors that are suitable for passing the quantity of signal that will be in operation without incurring measurable amounts of series resistance, chances are either method will work "okay". Sean
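To put rough numbers on the A-vs-B comparison above, here is a minimal sketch using standard AWG copper resistance figures. The 8 ohm speaker load and 47 kohm amplifier input impedance are illustrative assumptions, not values from this thread:

```python
import math

# Approximate DC resistance of bare copper wire, ohms per foot,
# one conductor (standard AWG table values)
OHMS_PER_FT = {20: 0.01015, 14: 0.002525}

def loss_db(run_ft: float, awg: int, load_ohms: float) -> float:
    """Insertion loss (dB) caused by cable series resistance driving a load.

    Loop resistance = 2 conductors x run length.  This ignores
    inductance, capacitance, and skin effect -- DC resistance only.
    """
    r_cable = 2 * run_ft * OHMS_PER_FT[awg]
    return 20 * math.log10(load_ohms / (load_ohms + r_cable))

# Case A: 10 ft of 20 AWG interconnect into an assumed 47k amp input
print(f"interconnect loss:  {loss_db(10, 20, 47_000):.5f} dB")
# Case B: 10 ft of 14 AWG speaker cable into an 8 ohm speaker
print(f"speaker cable loss: {loss_db(10, 14, 8):.5f} dB")
```

Both losses come out tiny at these lengths, which supports the "either method will work okay" conclusion; note, though, that the speaker-side loss is orders of magnitude larger in relative terms, because the load is only 8 ohms rather than tens of kilohms.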
We have seen gain in our long cable runs. Why? Because signal degradation is mostly a product of field contamination and poor resonance tuning, NOT CABLE LENGTH. Remember: alternating current is a complementary technology that's in diametric opposition. Think outside the box!
Actually, my concern is the harmonic change due to very long speaker cables rather than power or voltage losses. In addition, having lived with a 10-meter run for one or the other, I have experimented with both and found my preference. Of course, I generally use a preamp with a <50 ohm output impedance and excellent voltage output.

In the OP's case, (and indeed in any other), best advice is to try both ways if one can.
If you have a solid state preamp with output impedance in the 50-100 ohm range, interconnects are not all that critical. As Kr4 says, impedances for this interface (preamp to power amp) are well defined, whereas the power amp-to-speaker impedance relationship is variable. Therefore go with long interconnects. I note that at recording sessions it is not uncommon to have a hundred feet of interconnect wire between the microphones, the mic preamp, and the recorder.

If you have a tube preamp, with the typical 600 ohm output impedance, keep the interconnects short.

These general rules apply for "decent" (affordable) wire. The disadvantages of long wires of either type can be overcome (so they say) by various kinds of exotic wire configurations, for which you will pay dearly.