How to go from RCA to XLR?


I've got an Aragon Stage One processor with RCA outputs and two Aragon Palladium 1K monoblocks with XLR inputs. I know there are a lot of RCA-to-XLR cables available, but a fabricator told me you have to know which XLR pins are "hot," and that these have to match the amps' input circuitry or you can damage the amp.
So how do you know which pins to make hot when you order the cables? When you buy these cables "off the shelf," are you just hoping you get lucky and they match up with your equipment?
Thanks
noslop
Hi Rwwear,

First, I would not assume that the monoblock amps are designed internally as bridged stereo amps, unless you know that to be true for the specific model.

But in order to answer your question, let's assume they are. I'm pretty certain that a mono amp which is internally a bridged stereo amp, and which has a balanced differential input, would NOT have one of its two amplifier sections driven off of the positive-going input line and the other off of the negative-going input line, which I think is what you are envisioning.

If it were done that way, it would defeat the fundamental noise-reduction advantage of balanced interconnect cables and interfaces. As you probably realize, by feeding both the positive and negative signal lines into a differential receiver device, common-mode noise that can be expected to be present equally on both lines gets cancelled.
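As a quick numeric sketch of that cancellation (my own toy illustration, not anything from the circuit in question): the receiver amplifies the difference between the two lines, so noise riding equally on both drops out while the signal doubles. The sample values here are arbitrary.

```python
# Toy illustration of common-mode noise rejection on a balanced line.
# The "hot" leg carries +signal, the "cold" leg carries -signal, and the
# same induced noise rides on both legs equally.
signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
noise  = [0.25, -0.25, 0.5, 0.75, -0.5, 0.25, 0.5, -0.75]

hot  = [s + n for s, n in zip(signal, noise)]
cold = [-s + n for s, n in zip(signal, noise)]

# The differential receiver takes the difference hot - cold:
received = [h - c for h, c in zip(hot, cold)]

# The noise cancels completely; what remains is 2x the original signal.
assert received == [2 * s for s in signal]
```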

I believe that the design of a mono amplifier that is internally bridged and has differential inputs would be conceptually similar to the following diagram. Note that the second diagram actually shows a single-ended input and a differential output, from the same circuit that can also be used to receive a differential input:

http://www.edn.com/contents/images/84302f4.pdf

The op amp used in this case (the same concepts would apply to an amplifier stage made up of discrete transistors) receives either a balanced differential input or a single-ended input referenced to ground, amplifies the difference between its two inputs, and in either case outputs an out-of-phase (balanced differential) pair of signals that would then branch off to the two amplifier sections as you envisioned.

So both output amplifier sections would still be driven, and therefore the result would be no reduction in output power capability, just a 6 dB reduction in gain, as Atmasphere and I indicated. Perhaps Atmasphere or someone else can confirm that I am envisioning the design correctly.
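(For anyone wondering where the 6 dB figure comes from: driving only one leg of the balanced input halves the effective input voltage relative to a true differential drive, and a factor of two in voltage is about 6 dB. A one-line check:)

```python
import math

# A factor-of-two change in voltage corresponds to 20*log10(2) dB,
# which is the ~6 dB gain reduction mentioned above.
db = 20 * math.log10(2)
assert round(db, 2) == 6.02
```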

Thanks for your good question!

Regards,
-- Al
Thanks Al. The reason I suspect the amps are converted from a stereo design is that the tech sort of alludes to the theory. There are many high-end mono amps out there that are done the same way. I look back at Audio Research's Classic 60 and Classic 120 as examples; I'm not sure, but I suspect the 60s were converted in a like manner. Krell's KSA 300 can be converted to mono in a similar fashion.
Al, if truly balanced operation is best achieved with mirror-imaged circuits, what better way is there than taking a stereo amp and letting one channel drive the positive signal and the other drive the negative? I know this seems like a simplistic approach, but it should work, maybe at the cost of low-impedance drive.
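For what it's worth, the basic arithmetic of bridging helps explain both the appeal and the "cost of low-impedance drive" mentioned above. With the two sections swinging out of phase, the load sees double the voltage, hence four times the power, but each section effectively sees half the load impedance. The voltage and load values below are arbitrary examples, not specs for any of the amps discussed:

```python
# Back-of-the-envelope math for a bridged (balanced-drive) output stage.
V = 20.0   # example per-section output voltage swing (arbitrary)
R = 8.0    # example speaker load in ohms (arbitrary)

p_single  = V ** 2 / R        # one section driving the load alone
p_bridged = (2 * V) ** 2 / R  # both sections driving it out of phase

# Doubling the voltage across the same load quadruples the power...
assert p_bridged == 4 * p_single

# ...but each section effectively works into half the load impedance,
# which is why bridged amps are harder-pressed by low-impedance speakers.
effective_load_per_section = R / 2
```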