Balanced cables


Do different brands/levels of balanced XLR ended cables going to and from differentially balanced components make a difference?
stringreen

Showing 10 responses by atmasphere

Actually, EEs are not that rare in the recording world. There are also those who work entirely by gosh and by golly. But as in any field, they also attend the school of hard knocks. So it's unwise to assume that someone without a formal education doesn't know what they are doing.


I must also add that I have tried the ground cable from the T/T on the pre-amp when using the standalone phono. So if the cheater plug works, what is the procedure then?
An isolation power transformer could be used to isolate the grounds. Or you could send the unit back to the manufacturer and see if he can fix it. Hint: if the resistance you measure between the chassis and the ground side of the RCA connectors is about 1 ohm or less, the unit is vulnerable to ground loop issues.
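To put a rough number on why that matters, here is a small back-of-the-envelope sketch (Python, with assumed illustrative figures rather than measurements of any particular unit): a modest leakage current circulating in the ground loop develops a voltage across that low-resistance shield path, and on a single-ended (RCA) connection that voltage appears in series with the signal.

```python
from math import log10

# Back-of-the-envelope sketch with assumed numbers: why a ~1 ohm (or less)
# path between chassis and RCA ground invites an audible ground-loop buzz.
loop_current = 0.001       # 1 mA of AC leakage current circulating in the loop (assumed)
shield_resistance = 0.5    # ohms; chassis-to-RCA-ground resistance per the hint above
phono_signal = 0.005       # 5 mV, a typical moving-magnet cartridge level (assumed)

hum = loop_current * shield_resistance   # voltage developed across the shield,
                                         # added directly to a single-ended signal
print(f"hum injected into the signal: {hum * 1000:.2f} mV")
print(f"hum relative to the 5 mV phono signal: {20 * log10(hum / phono_signal):.0f} dB")
# ~0.5 mV of hum is only about 20 dB below the program level - a gross, obvious buzz.
```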

There are recording studios, recording engineers, mastering engineers, etc... who really go out of their way to use premium cables in their studios. Audiophile approved cables and gear.
There are, but as many recording engineers (many of whom do have a four-year degree or more) know, if the balanced standard (AES48) is observed in the studio there is no need for ’audiophile approved’ cables: Mogami, Canare and Belden cables will indeed work just fine, and by that I mean they won't sound any different than audiophile cables that cost $1000/foot.

Again, if you are hearing big differences between balanced line cables, what that means is your equipment is somehow failing to support the balanced standard.
The post is way over the top in projected falsehoods: cast and directed scorn, vitriol, and denigration, combined with unsubtle blunt force trauma delivered appeals to authority.
Actually the post to which you refer is pretty truthful whether we audiophiles like it or not. I straddle both sides of this issue, as I simultaneously operate a high end audio company where we routinely see cables making a difference (as I have previously harped, we see both balanced and single-ended operation with our gear) and I also run a recording studio complete with LP mastering capability. In that studio we are careful to do things that will result in good sound, but that does not include high end cables (and don't think we haven't tried), as such cables are completely unnecessary if the balanced line standard is observed. This is a very easy thing to do in the studio, as the equipment designed for it observes AES48. So the sonic problems we encounter aren't cable-related; they are equipment-related and usually pretty easy to spot.

I also agree that far too much compression is used! We had a guest engineer in our studio ruin a session because he over-used a compressor. Yuk.
I get buzzing if I use 2 sets of RCA cables with my standalone Phono stage. i.e. if I run RCA's from turntable into Phono and then I run RCA's into PreAmp I get buzzing. If I run Phono direct to Power Amps (phono has vol. control), no buzzing. Same if I go direct to Pre-Amp with inbuilt Phono no buzzing, but go standalone Phono into same Pre-Amp I get buzzing, so it appears two sets of RCA's from Phono sets off buzzing.
@initforthemusic This sounds like you may have a ground loop between the phono preamp and the line stage. This is usually caused by poor circuit grounding practice.

To test for this, try installing a ground cheater adapter, which you can get at the hardware store, and use it to lift the power cord ground connection, first on the phono section and then also on the line stage. My guess is the phono stage will be the culprit.

Please note that you should not run the system this way permanently, as it can be a shock or fire hazard should things go wrong. This is just for testing.

if I use the XLR>RCA adaptors from the RCA output of the Krell preamp to the PrimaLuna amp's RCA inputs will all be OK?
Maybe. I would talk to Krell about this, otherwise if in doubt use a Jensen transformer as suggested.
They don't require shielding to achieve low noise.
This is true. I've used balanced Kimber, which has no shielding at all, over a 30-foot run with no hum or buzz whatsoever.
Warren Gehl to my knowledge does some very limited design work for ARC and is key in QC, reportedly listening to every piece of gear hooked up to his reference system of a Ref 6 and Ref 150se and some old, large Magnepans before the piece leaves the factory. It was, however, Ward Fiebiger who took over the reins from Bill Johnson in the actual engineering/circuit layouts of ARC's top-level gear, including the Ref series of preamps and amps. Now does that make a difference? Probably not. Surely Warren knows enough to answer the question being posed here.
Warren has made a lot of contributions to the sound of ARC, and little of it has been to circuitry: mostly vibration control, tube choice and the like. Bill hired him for his ears, as he was/is very astute.

Ward (RIP) may have been influenced by the Italians who have since learned to let the company do what it does best. I'm not sure where your amp sits in this since it is still current.

I have seen some pretty expensive cables over at ARC (they are a 1/2 hour drive from here) so I've had pretty good reason to suspect that they didn't support the balanced standard (else you wouldn't need the pricey cables). Here is a nice bit about how balanced line works, from the Rane (a popular pro audio manufacturer) website:

http://www.rane.com/note110.html


but I’ve had the suspicion that instead of using differential stages they basically have a separate signal path through the amp (up to the output transformer primary) for each of the two signals in the balanced signal pair they receive for each channel. That would be consistent with a very dramatic reduction of power capability as well as an increase in distortion if the amp were to be provided with unbalanced inputs via RCA-to-XLR adapters or adapter cables, as was found to be the case with the Ref 150 used by the OP in this thread we had participated in some time ago. I believe it would also be consistent with low CMRR, due to the gain and other characteristics of the two paths not matching precisely.
Al, that makes a lot of sense to me, but both Warren Gehl (of ARC) and Kalvin told me that the amp employed a differential amplifier at its input. But they may have been reading off of the same cue sheet, which may not have been accurate. Although I've known both of them for 40 years, on this point I'm more willing to believe your theory, as it is consistent with the amp's behavior. I never had the heart to tell them that if what they said was true, a lot of performance was being left on the table. But who knows- maybe that will be part of the next iteration.

At any rate, that aspect of the amp's performance is well-known and acknowledged by ARC. So in the case of a balanced interconnect, if noise were able to impinge on the cable, the amp would not be very good at rejecting it (in the old days this was often handled by an input transformer, which is usually very good at CMRR). So this would seem to make the characteristics of the cable more audible. For such an amplifier, I would recommend a cable that is double shielded.
It's probably not the amp that is causing you to hear the cables so much as the preamp. But there is a recent period of ARC amps that had me scratching my head. I'm not sure which models they are. They were out when Kalvin Dahl (with whom I went to school) was still at ARC (about 2-3 years ago). Apparently the amp has a very low CMRR (Common Mode Rejection Ratio), and it only has balanced inputs. Apparently also if you try to run it single-ended, the power goes down and the distortion goes up.

I can't think of a good reason for a low CMRR in a differential amplifier (which is what these amps use). You wind up leaving performance on the table (I've been designing differential circuits since the mid 1980s).
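To put a number on that, here is a small sketch (Python, with assumed gain figures, not measurements of any ARC product) of how the CMRR of a stage built from two separately amplified signal paths is set entirely by how well those paths match, and why a true differential pair, whose halves share a tail current and track each other closely, does so much better:

```python
from math import log10

def cmrr_db(gain_plus, gain_minus):
    """CMRR of a stage whose output is gain_plus*v_plus - gain_minus*v_minus.

    With v_plus = vcm + vd/2 and v_minus = vcm - vd/2:
      differential gain  Ad  = (gain_plus + gain_minus) / 2
      common-mode gain   Acm = gain_plus - gain_minus
    so any mismatch between the two paths lets common-mode noise through.
    """
    a_d = (gain_plus + gain_minus) / 2.0
    a_cm = abs(gain_plus - gain_minus)
    return 20 * log10(a_d / a_cm)

# Two independent amplifier paths matched only to a few percent (assumed numbers):
print(f"{cmrr_db(20.0, 19.0):.0f} dB")    # ~5% gain mismatch -> about 26 dB of CMRR
# The same stage with 0.1% matching, as a shared-tail differential pair might manage:
print(f"{cmrr_db(20.0, 19.98):.0f} dB")   # -> about 60 dB of CMRR
```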

If that kind of amp is used with a preamp of fairly high output impedance as per any of the ARC Ref series, the result will be that cable differences will be heard.

That last bit is the part I don't get- why defeat the purpose of balanced line? At that point you might as well run single-ended, as the ability to run long cables is lost as well.

It is things like this that are (IMO) why the single-ended/balanced debate continues! Think about it for a moment: from whom do you most often hear the old saw that 'cables don't make a difference'? Usually it's someone with an audio engineering background. Well, most of those people work in pro audio, where balanced lines are used, the balanced standard is observed, and cables thus really don't make a difference. But in high end audio, for some reason (my guess is the difficulty), the balanced standards have been largely ignored, so this conversation continues...
Just wanted to reiterate that if a designer lays out his signal path to be more direct to the XLR and the gain is set higher to that output, it will always sound better, which is the intention; if the opposite is done for the RCA out, it will always sound better as well.
This statement is false.

Balanced and single-ended (RCA) operations are inherently incompatible. So if RCAs are used it's not balanced, and if the XLR outputs are used (and the preamp is properly balanced) then the use of the RCA connections will result in a buzz.

IOW it's one or the other and never both, unless additional active circuitry is used.

The gain has nothing to do with it whatsoever. That is like saying that to make something sound better, you just make it louder.
So if a balanced dual mono/stereo preamplifier has both XLR and RCA outputs, and both outputs from XLR or RCA are moving signal from a separate mono channel for left and right, then in essence they're both balanced cables doing the same thing. Interconnects are nothing more than ground connectors. As I already pointed out, no one called XLR cables balanced cables for almost forty years until the 80's, when dual mono/stereo components were on the rise having the option of XLR or RCA outputs. If XLR cables were invented in the 80's for the sole purpose of use with high end dual-mono components, then technically it would be a balanced cable only, not a cable that was given the nickname "balanced" due to its great ground properties, which works best with noisy components, especially noisy tube amplifiers.
Al addressed this correctly. I do have the feeling though that you did not read my post carefully. XLR connections were in wide use in the 1950s- my Ampex 351-2 tape machine, built in 1957, uses XLRs exclusively.

They were used by the recording and broadcast industries beginning in the 1950s, and their introduction to high end audio was made by me in the late 1980s (we introduced the first balanced line product for high end audio in 1987).

The reason XLRs are used for balanced operation is that the relationship of both the non-inverted and inverted signals with respect to ground is identical. This is important for a proper balanced connection and is something an RCA connector simply can't do.

A very well designed solid state preamp is quiet as a tomb and it's redundant to use a 1 meter pair of XLR's since there is no noise to deal with. To create this myth that XLR cables have an effect on the quality and quantity of the music signal is outright fraud. The quality is in the recording itself, whether it's vinyl or CD, and has nothing to do with the wire or the connector. If it's a very bad recording it's going to sound like crap regardless of what cable you're using, XLR or RCA. Now if you have a poorly designed preamp with a high level of crosstalk and noise, then the XLR will help to flush out the noise at the output. It's just wire with a good ground; it's not a "mini preamp," a "processor," or a buffer, like many in high end retail continue to perpetuate to make more money.
This paragraph is full of outright falsehoods so I will attempt to set the record straight.

The noise of the preamp is a different thing from the noise that can enter a cable. It does not matter if the cable is 6" or 60 meters. Balanced operation still has a noise advantage with respect to the cable, and the additional advantage of being able to eliminate cable artifact. If you had to pay big dollars for a single-ended cable because that was the one that sounded right, that's the kind of artifact I'm talking about!

Now balanced operation within something like a preamp can also have lower noise, but for entirely different reasons. For example, we use differential amplifiers in our preamps; for a given stage of gain, a differential amp can have 6 dB less noise than its single-ended counterpart.

Differential amplifiers are in common use in many solid state power amps and many opamps. They are used because they offer lower noise and also greater power supply noise rejection. They can be executed in tubes as well (the first production opamps were made in the 1950s by George Philbrick and were all-tube).

The bottom line is that balanced operation is used to reduce or eliminate the sound an interconnect cable might impose on the system, and also to reduce or eliminate noise that might be impinged on the cable by power cords, magnetic fields and the like- these are things single-ended cables cannot do. This is why all recordings since the 1950s employ balanced line connections- it's not just so that the cables can be run a long way, but if you sit and think about it, the fact that the technology prevents the cable from modifying the signal does also imply you can run the cable much longer distances without trouble. This can be quite advantageous in the home; I keep my amps right by my speakers with short speaker runs, and run the interconnect cables about 30 feet to my preamp, which is located at the spot in the room with the least bass (the room's bass nadir). In this way I get considerably more definition and less coloration.
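If it helps to see the arithmetic, here is a minimal sketch (Python with numpy, using idealized values I've assumed purely for illustration) of why noise coupled onto the cable cancels at a differential receiver but rides along with a single-ended signal:

```python
import numpy as np

t = np.linspace(0, 0.02, 2000)                  # a 20 ms window
signal = 1.0 * np.sin(2 * np.pi * 1000 * t)     # 1 kHz program material, 1 V peak (assumed)
hum = 0.1 * np.sin(2 * np.pi * 60 * t)          # 60 Hz field coupled onto the cable (assumed)

# Balanced pair: the hum lands equally (common-mode) on both conductors,
# and the receiving differential input responds only to the difference.
plus_leg = +signal / 2 + hum
minus_leg = -signal / 2 + hum
balanced_out = plus_leg - minus_leg             # = signal; the hum drops out

# Single-ended: the same hum is in series with the lone signal conductor.
single_ended_out = signal + hum

print("worst-case hum residue, balanced:     ", np.max(np.abs(balanced_out - signal)))
print("worst-case hum residue, single-ended: ", np.max(np.abs(single_ended_out - signal)))
# In practice the rejection is limited by the receiver's CMRR rather than being perfect.
```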

Again, you'd think that audiophiles would be all over that!

I knew Robert Fulton as he lived here in town. He was the guy that founded the high end audio cable industry. Back in the late 1970s he had a high end RCA cable, and his Fulton Brown and Fulton Gold speaker cables.

If you run RCAs, the cables to make the connections are the hidden cost of any preamp. If you run balanced, and the equipment supports the balanced standard, the cables are cheap but the sound is better than the best RCAs.

So Ralph, based on your engineering knowledge, why would ARC make those design choices? Can you hazard a guess as to what the most likely reasons would be? My experience tells me that there must be trade-off considerations; at this level of audio, there always are.
You are correct. There are several ways to do balanced operation with tubes. If you want to support the balanced standard though, your options become limited because of the low input impedances the standard requires you to be able to drive, and also there is that issue with ignoring ground as I stated in my first post.

In the old days of tubes, an output transformer was employed. That is how my Ampex recorders (which are single-ended internally) drive balanced lines. When transistors came along, and in particular solid state opamps, it became possible to direct-couple the output. But even with solid state, transformers are still in common use even today.

We developed a third means, which is a direct-coupled balanced vacuum tube output, which we also patented. I'm pretty sure ARC had no interest in infringing the patent, nor in using output transformers or a solid state output, so they took the only route left to them, which was to not support the balanced standard. They knew they had to do something because balanced operation offers too many advantages to ignore!

As a result, you can easily hear differences in balanced cables while using their equipment. This is entirely because the balanced standards are not being observed.
Do different brands/levels of balanced XLR ended cables going to and from differentially balanced components make a difference?
They shouldn't. The question really should be: does the equipment in use support the balanced line standard? If no, then cables make a difference. If yes, the cables won't affect things even in long runs.

I feel compelled to set the record straight in a couple of areas:

BTW- There is no such thing as a "balanced cable". That term implies that an XLR/Cannon connector was only designed to be used with fully balanced left/right channel audio components, which is very misleading and completely false. There are solid state components that are fully balanced with both RCA and XLR connectors. The inventor of the XLR never used the term "balanced" for his connector. When the very first stereo receiver was invented by Sidney Harman in the 1950's, the Festival 1000, it was a fully balanced design in twin cabinets with the left channel in one cabinet and the right in the other, with a control panel on the front of each unit. A classic dual mono design. The unit had RCA connectors only. The term "balanced" was a label put on XLR cables by audiophiles in the 80's. You can label an RCA cable as a "balanced cable" as well if it's used between fully balanced components.

The above post is 100% false. The Festival 1000 employed single-ended circuits (combined with a Williamson-style power amp section); thus it used RCA connections. RCA connections are not balanced- the shield connection is used to shield the 'signal', but in reality the shield is important because it completes the circuit. In a balanced connection the shield is ignored (unless the equipment in use does not support the balanced standard, AKA AES48). The phone company was using balanced line connections decades before that (they are what made trans-continental phone calls possible), and the term 'balanced' was in use way back then.

As applied to audio, the balanced connection is used to minimize the effect of the interconnect cable. You would think audiophiles would jump on this like a hobo on a ham sandwich! But I have been surprised at the amount of pushback over the decades since we introduced the idea. Regardless, if your gear actually supports balanced operation as intended, the cable isn't going to be something that requires audition; it's simply going to work, without editorializing.

Since I mentioned the balanced standard, here are the important bits again for those who did not read the thread Al linked above:
1) the source is low impedance, able to drive 2 kOhms with no worries
2) the signal is on pins 2 and 3 of the XLR connection, and travels as a twisted pair within the cable
3) ground is the shield only, and is ignored by both the source and the receiving end

It's that last bit that gets so many in trouble- if it is not followed, then the shield (just like with single-ended operation) becomes part of the overall sound and you start to 'hear' 'differences'. ARC, as pointed out, does not support the standard, as they have no preamps that can drive 1 or 2 kOhms, and their outputs (pins 2 and 3) occur with respect to ground (pin 1) rather than with respect to each other. With such a preamp the choice of cable will be important- ***which means the point of balanced line operation is defeated***, even if the preamp is internally balanced!
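To give a feel for what 'able to drive 1 or 2 kOhms' implies, here is a small sketch (Python, with assumed round-number output impedances, not any particular product's specifications) of the level lost across the voltage divider formed by the source's output impedance and a 2 kOhm balanced input:

```python
from math import log10

def level_loss_db(z_out, z_load):
    """Loss across the divider formed by the source output impedance and the
    receiving input impedance: V_load / V_source = Z_load / (Z_load + Z_out)."""
    return 20 * log10(z_load / (z_load + z_out))

# Assumed round numbers: a true low-impedance source up through a high-impedance preamp output.
for z_out in (50, 600, 20_000):
    print(f"Z_out = {z_out:>6} ohms into 2 kOhms: {level_loss_db(z_out, 2000):6.1f} dB")
# A low-impedance source barely notices the load; a 20 kOhm output loses over 20 dB,
# and with a coupling cap in series the loss also becomes frequency-dependent (bass rolloff).
```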

We got around this problem with a floating output that ignores ground. In the old days this was done with an output transformer (usually set up to drive 600 ohms, which is the old balanced standard input impedance); we are the OTL guys, so we did it OTL: our outputs are balanced and direct-coupled (thus also eliminating the output coupling cap; the presence of the latter often indicates that the circuit does not support the standard...).