RCA to XLR adapters?


I currently have a complete single-ended (RCA) input system, but I've been looking at other amps that are fully balanced and accept only balanced (XLR) connections. My question is: will these adapters give you the full benefit of a balanced amp or preamp? Or will it simply work OK? I have expensive cables that I will not replace, but they are terminated with RCAs.
bobheinatz

Showing 2 responses by kirkus

"I assume that is the architecture used in Ralph's designs. So the question becomes whether or not the input stage of all or at least most other 'fully balanced' amps can be expected to have been designed in an equivalent manner. I don't know the answer to that question."
Al, you're exactly correct -- the design of what constitutes a "balanced" input stage, and a "balanced amplifier" circuit, varies wildly among different manufacturers and circuit designers.

There *should* really be two different, independent aspects to "balanced" equipment design -- the first being the rejection of noise as a result of equipment interconnection, the second being any performance benefits/demerits of differential circuit operation. From what I've seen in high-end audio, designers and engineers get these two goals confused the overwhelming majority of the time.

I know you're familiar with Whitlock's papers on the subject of balanced interconnection - these give excellent insight into the issues of balanced input-stage design and how common-mode rejection is dependent on the balance of impedance (NOT voltage!) presented by the line driver and cable/connectors. Basically, the input stage's tolerance to source imbalance is a function of the ratio of the common-mode and differential-mode impedances. That is, for a given differential-mode (signal) input impedance, the higher the common-mode impedance . . . the greater tolerance the balanced input has for source impedance balance errors. This is how a balanced input circuit with a high common-mode impedance (i.e. a transformer) can indeed give substantial rejection of ground noise from an unbalanced source, provided it's wired with a properly-terminated balanced adapter cable.
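If it helps to see the arithmetic, here's a quick back-of-envelope sketch in Python of that bridge relationship (the impedance values are illustrative round numbers, not taken from any particular piece of gear):

```python
import math

def cmrr_db(z_cm, z_src_plus, z_src_minus):
    """Rejection set by the bridge formed by the source impedances and the
    input's common-mode impedance (all in ohms, resistive for simplicity)."""
    v_plus = z_cm / (z_cm + z_src_plus)     # common-mode divider, hot leg
    v_minus = z_cm / (z_cm + z_src_minus)   # common-mode divider, cold leg
    error = abs(v_plus - v_minus)           # what leaks into the signal
    return float("inf") if error == 0 else 20 * math.log10(1 / error)

# A 10-ohm source-impedance imbalance into a typical active input stage
# (~50k ohm common-mode) versus a transformer input (~50M ohm common-mode):
print(cmrr_db(50e3, 300, 310))   # ~74 dB
print(cmrr_db(50e6, 300, 310))   # ~134 dB
```

Note how the transformer's enormous common-mode impedance buys roughly 60 dB of extra rejection for the very same source imbalance.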

Where the necessity for balanced voltages comes into play is for the rejection of even-order distortion products -- this occurs by enforcing symmetry in the circuit's transfer function . . . and the majority of designers confuse this goal with that of interconnection noise rejection.
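A toy numerical illustration of that symmetry argument (my own sketch, with an arbitrary made-up transfer function, not any real amplifier stage):

```python
import numpy as np

# A toy transfer function with both even- (x^2) and odd-order (x^3) terms.
def stage(x):
    return x + 0.05 * x**2 + 0.01 * x**3

t = np.linspace(0, 1, 4096, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)              # a 10-cycle test tone

def harmonic(sig, n):
    return np.abs(np.fft.rfft(sig))[10 * n] / len(sig)

print(harmonic(stage(x), 2))                # single-ended: ~0.0125
print(harmonic(stage(x) - stage(-x), 2))    # symmetric pair: ~0 (cancelled)
print(harmonic(stage(x) - stage(-x), 3))    # odd-order terms survive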

A common example is this notion of a "balanced amplifier" being simply two amps stuck in a chassis, wired to different pins of an XLR connector. In this instance (as you correctly postulated) any voltage imbalance between the two amplifier stages will disturb the null necessary for cancellation of even-order harmonic distortion products. And this voltage imbalance will be affected by any impedance OR voltage imbalances in the preceding equipment, cables, or the input termination resistances . . . not to mention any differences in gain, distortion performance, or bandwidth between the two amplifier stages. It's even common to see this approach used with two otherwise conventional solid-state power-amp circuits, which makes very little sense at all . . . because the differential input stage alone (with high open-loop gain) is very effective at eliminating even-order distortion products, and each amplifier then effectively sees half the load impedance, making the output stage much less linear (everything else being equal).
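To put a number on how fragile that null is, here's the same toy model as above, with the two halves given a 2% gain mismatch (again, an arbitrary illustrative figure):

```python
import numpy as np

# Same toy stage as the earlier sketch, now as two "halves" of a bridged
# design with a 2% gain mismatch between them.
def stage(x):
    return x + 0.05 * x**2 + 0.01 * x**3

t = np.linspace(0, 1, 4096, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)

def h2(sig):                                   # 2nd-harmonic magnitude
    return np.abs(np.fft.rfft(sig))[20] / len(sig)

print(h2(stage(x) - stage(-x)))                # matched halves: ~0
print(h2(1.00 * stage(x) - 0.98 * stage(-x)))  # 2% mismatch: ~0.00025,
                                               # i.e. the even-order null is
                                               # only ~34 dB deep
```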

I think it's exceptionally bad form to design a preamp or amplifier in this manner, but it's alarmingly ubiquitous . . . I guess because this idea of "balanced" is so in vogue right now. There's also a common variation where the two amplifier stages share a common differential feedback ladder (like an "instrumentation" op-amp), making the common-mode gain unity, and (hopefully) substantially lower than the signal gain. This latter topology can be somewhat successful, but IMO it's still a bad choice to rely on it solely for input-stage noise rejection . . . the noise is still present throughout the entire circuit, and can intermodulate with the signal to a degree that depends on the circuit's linearity.
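Rough arithmetic of why I call that a bad choice to rely on (the gain figure below is an assumed typical value, not from any specific design):

```python
# With a shared feedback ladder, common-mode gain sits near unity while the
# signal gain is set by the feedback network -- so relative to the signal,
# common-mode noise is only pushed down by the ratio of the two gains.
signal_gain_db = 26        # e.g. an assumed ~20x power-amp voltage gain
common_mode_gain_db = 0    # ~unity for this topology

print(signal_gain_db - common_mode_gain_db)  # 26 dB of effective rejection --
                                             # modest next to the 70+ dB a
                                             # good balanced input can manage
```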

I believe Ralph's amps are based on differential amplifiers for each stage, and this is one (rare) instance where a single approach can work reasonably well for both interconnection noise rejection and even-order distortion cancellation. Here, the amplifier's tolerance of source impedance imbalances is set mainly by the choice and tolerance of the input termination resistors, and its tolerance of source voltage imbalances is a function of the transconductance of each stage and the matching of certain circuit elements (i.e. input triodes and plate-load resistors).
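For a feel of the magnitudes, here's a back-of-envelope with assumed part values (not taken from Ralph's actual schematics): the termination resistors set both the common-mode impedance and, via their tolerance, a floor on the achievable balance.

```python
import math

def cmrr_db(z_in_plus, z_in_minus, z_src_plus, z_src_minus):
    """Full bridge: each leg is a divider of input impedance vs. source."""
    v_plus = z_in_plus / (z_in_plus + z_src_plus)
    v_minus = z_in_minus / (z_in_minus + z_src_minus)
    err = abs(v_plus - v_minus)
    return float("inf") if err == 0 else 20 * math.log10(1 / err)

r = 100e3   # assumed per-leg input termination
# 0.1% resistor mismatch, perfectly balanced 300-ohm source:
print(cmrr_db(r * 1.001, r * 0.999, 300, 300))   # ~105 dB
# the same resistors with a 10-ohm source-impedance imbalance:
print(cmrr_db(r * 1.001, r * 0.999, 300, 310))   # ~80 dB
```

In other words, with decent termination resistors it's usually the source-side imbalance, not the resistor tolerance, that dominates.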

So the short answer to the original poster's question is . . . how well the amplifier works with a simple adapter is dependent on how well the amplifier is designed . . . and as always, simply having a brand name with "audiophile-cred" doesn't mean it's well-designed.
Mikey, I can relate to wanting to preserve the resale value . . . But having the correct termination for your application pretty much trumps all other cable-quality parameters. Also, the additional mechanical stress that Atmasphere noted from using an adapter is a very real and common issue; one errant tug on a big XLR cable/RCA adapter can wreck the preamp's connector in pretty short order, leading to a much greater loss in value.

But if you're just wanting to try out the CJ preamplifier before making a final decision, here's what I would do. First, contact Krell and see if their power amp will work with an adapter cable. If they say yes, then use the Markertek adapter (pin 3 grounded) and your existing XLR cables. If not, then use a Jensen DM2-2RX with a short RCA cable from the preamp.
Then just use an RCA cable to connect the DAC to the preamp via unbalanced in/out.
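And for anybody wiring their own adapter, this is the pinout I mean by "pin 3 grounded" (the common convention -- do verify it against the actual Markertek part before ordering):

```python
# The usual "pin 3 grounded" RCA-to-XLR adapter convention, for reference:
XLR_TO_RCA = {
    "XLR pin 1 (shield)": "RCA shell (ground)",
    "XLR pin 2 (hot)":    "RCA center pin",
    "XLR pin 3 (cold)":   "tied to pin 1, i.e. grounded",
}
for xlr, rca in XLR_TO_RCA.items():
    print(f"{xlr:20s} -> {rca}")
```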

And if you keep this setup, then you can upgrade RCA cables as you see fit, and/or reterminate your amp cable.