Digital XLR vs. Analog XLR - Balanced Cables


What is the difference between a digital XLR/balanced cable and an analog XLR/balanced cable?

What if I used an analog XLR/Balanced cable to carry a digital signal from the digital output of one device to the digital input of another device?

Any risks/damage, etc. . .
ckoffend

Showing 15 responses by kijanki

Mmike84 - It all depends on the DAC. If you use an asynchronous upsampling DAC that rejects jitter, like the Benchmark, it won't make much difference what cable you use. I built my RCA-to-BNC cable using 75 ohm Canare coax.
Small correction to my previous post - the signal will travel 2 x 1.5m = 3m in 15ns (not 30ns), just clearing about half of the typical 25-30ns transition (where the threshold is).
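The round-trip timing in the correction above is just a multiplication; here it is as a minimal sketch, assuming the ~5 ns/m propagation delay used in the post (real cables vary with the dielectric):

```python
# Round-trip reflection timing for a digital cable (illustrative sketch).
# Assumes ~5 ns/m propagation delay, as stated in the post above.

PROP_DELAY_NS_PER_M = 5.0

def round_trip_ns(length_m: float) -> float:
    """Time for a reflection to travel to the far end and back."""
    return 2 * length_m * PROP_DELAY_NS_PER_M

# A 1.5 m cable: the reflection returns after 2 x 1.5 m = 3 m -> 15 ns,
# about half-way through a typical 25-30 ns logic transition.
print(round_trip_ns(1.5))  # 15.0
```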
Digital and analog cables are constructed differently. In an analog cable, low inductance, low capacitance, a low dielectric constant, conductor purity and good shielding play the important roles, while a digital cable should have an exact characteristic impedance (usually 110 ohms for XLR), fast propagation and excellent shielding.

If you use an analog XLR cable for a digital link you'll most likely get slightly fuzzy sound, since jitter is noise in the time domain (to be exact, jitter creates sidebands not harmonically related to the root frequency).

Don't try to save money on this. If you don't need the analog cable, sell it and get a good digital cable.
I don't know why DCS wants to transfer the signal on two cables (Benchmark uses one for 24-bit/192kHz), but its jitter-rejection properties might make the cable discussion irrelevant.
Musicnoise - of course there is a difference in construction and materials. Digital cable geometry is tailored to deliver 110 ohms characteristic impedance and therefore eliminate reflections. How reflections add up and corrupt the edge of the signal (causing jitter) can be calculated using Bergeron diagrams.

As for analog cables, one of the most important factors is the dielectric constant of the insulation material. The lowest dielectric constant, close to that of air (=1), is obtained by using oversized tubes made of foamed teflon. Foamed teflon has an even better (lower) dielectric constant than solid teflon, while oversizing keeps the wire inside away from the dielectric.

Another factor is the purity of the metal - not important with digital cables but very important with analog. The best is very pure zero-crystal copper or silver. Purity reaches 99.9999999% (9N) for copper and 99.99999% (7N) for silver. The zero-crystal process is simply casting the metal into hot molds instead of cold ones. Very slow cooling prevents the formation of crystals (impurities reside between crystals). Zero-crystal copper has just one or two crystals per foot, while regular oxygen-free copper has a few thousand.

On top of this, many cables have anti-vibration shields and some even have fluid inside. All this is important, of course, if you believe, like I do, that cables make a real and big difference. If you don't, you might as well use lamp cord - it saves a lot of money.
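The dielectric-constant point can be made concrete: signal velocity in a cable scales as 1/sqrt(dielectric constant), so a lower constant means faster propagation. A minimal sketch, with illustrative (approximate, not cable-specific) values:

```python
import math

C_M_PER_NS = 0.2998  # speed of light in vacuum, metres per nanosecond

def velocity_factor(eps_r: float) -> float:
    """Fraction of the speed of light at which a wave travels
    in a dielectric with relative permittivity eps_r."""
    return 1.0 / math.sqrt(eps_r)

# Illustrative eps_r values (assumed for this sketch):
# air = 1.0, foamed PTFE ~1.5, solid PTFE ~2.1
for name, eps in [("air", 1.0), ("foamed PTFE", 1.5), ("solid PTFE", 2.1)]:
    vf = velocity_factor(eps)
    delay_ns_per_m = 1.0 / (vf * C_M_PER_NS)
    print(f"{name:12s} eps_r={eps:.1f}  vf={vf:.2f}  delay={delay_ns_per_m:.1f} ns/m")
```

A lower dielectric constant buys both speed and (in analog use) less dielectric absorption, which is why the oversized foamed-teflon tubes described above are attractive.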
Kal - If you think that characteristic impedance can be different (and it is completely different for an analog cable), why don't you connect your TV to the roof antenna using any cheap shielded cable? The reflections that will appear are pretty much what causes jitter in digital transmission. As for shields and fluids - I once had an inexpensive IC, the Audioquest Topaz. I read on the internet that this cable transfers vibrations (audibly). So I turned the volume up, hit the cable a few times with a stick (a pen), and to my surprise I could hear it in the speakers.

As for a digital cable being optimal as an analog IC - you must be kidding! Digital cables are made with complete disregard for the quality of materials. Dielectric constant is ignored, since above 100kHz only the ratio of inductance and capacitance defines characteristic impedance. The metal is also secondary, since signal at these frequencies travels only on the surface (usually silver plated).
Kal - That's a tricky question, since there are no parameters for cables at all (other than characteristic impedance - irrelevant for analog).

If you like the sound of a digital cable as an IC, then use it. I'm merely suggesting that you won't find good metal or a fancy dielectric (like foamed teflon) there. The other way around, you might find an IC that is close to 110 ohms, or have a DAC like the Benchmark that ignores the quality of the cable. By all means use it. It is also possible that the differences are there but you don't hear them - even better, because it saves a lot of money.

I tend to do things by the book. When it says digital cable, I go to the store and buy a digital cable, not an analog one.

Sometimes things are not audible because they are masked by other factors, and improving a system is like peeling layers of pink from pink sunglasses - you don't notice each single peel, but eventually you get a clear, uncolored picture.
I don't see a reason why any company would use oversized foamed-teflon tubes and 99.9999999% pure copper in a digital cable, but if you say they do - I trust you.
Ckoffend - The Purcell, being an upsampling (and not oversampling) DAC, most likely rejects jitter. The quality and type of cable might not be very important (it isn't with my Benchmark - also an upsampling DAC).
Here is a quote from the Stereophile article "A Transport of Delight: CD Transport Jitter":

"While we're on the subject of the digital interface, I should point out that the engineering for transmitting wide-bandwidth signals was worked out nearly 50 years ago in the video world. In video transmission, the source has a carefully controlled output impedance, the cable and connectors have a precisely specified characteristic impedance and are well-shielded, and the load impedance is specified within narrow tolerances. If these practices aren't followed, reflections are created in the transmission line that play havoc with video signals. This issue is so crucial that a whole field called Time Delay Reflectometry (TDR) exists to analyze reflections in transmission lines.

The audio community should adopt the standard engineering practices of video engineering for digital interfaces. This means designing transports with a carefully controlled 75 ohm output impedance, precisely specified characteristic impedance of the cable (75 ohms with a narrow tolerance), and junking RCA connectors in favor of true 75 ohm BNC connectors. By applying standard video engineering techniques—in use for decades—the high-end product designer can greatly improve the performance of the transport/processor interface. We've seen what happens with a poorly implemented interface with the SV-3700 and different cables: higher jitter in the recovered clock and degraded sound quality. The engineering needed to optimize the digital interface is readily available. Let's use it."

As far as I know, the bandwidth of a cable determines its losses in dB/ft, while jitter is strictly a property of mismatched characteristic impedance (Z0 = SQRT(L/C)). Antenna/video 75 ohm cables might have different losses (RG59, RG6, RG11 etc.) but won't create reflections as long as they are exactly 75 ohms. Please correct me if I'm wrong.
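The SQRT(L/C) relation and the resulting reflections can be put into numbers. A sketch; the per-metre L and C values below are ballpark assumptions for a 75 ohm video coax, not measurements of any particular cable:

```python
import math

def char_impedance(L_per_m: float, C_per_m: float) -> float:
    """Lossless-line characteristic impedance Z0 = sqrt(L/C)."""
    return math.sqrt(L_per_m / C_per_m)

def reflection_coeff(Z_load: float, Z0: float) -> float:
    """Fraction of the incident wave reflected at an impedance boundary."""
    return (Z_load - Z0) / (Z_load + Z0)

# Assumed ballpark values: ~300 nH/m and ~53 pF/m give roughly 75 ohms.
Z0 = char_impedance(300e-9, 53e-12)
print(round(Z0, 1))  # ~75.2

# A matched load reflects nothing; a 110-ohm load on a 75-ohm line
# reflects roughly 19% of the wave back toward the source.
print(reflection_coeff(Z0, Z0))            # 0.0
print(round(reflection_coeff(110, Z0), 3)) # ~0.188
```

This is why an exact 75 ohms matters more than the loss class (RG59 vs RG6 vs RG11): losses attenuate, but a mismatch reflects.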
Ckoffend - I would say that most, if not all, digital cables have a particular targeted characteristic impedance - in audio, 75 ohms for unbalanced and 110 ohms for balanced. But just look at a typical 75 ohm video coax - it is very different from an analog cable. First of all, an analog cable uses a separate wire for the ground. Carrying ground through the shield is a really bad idea, and grounding the shield at both ends is even worse (XLR does, but it was a mistake). The metal matters only at the surface (plating) because of the skin depth at these frequencies. A lower dielectric constant is important, but not as much as in analog cables. The dielectric constant of polyethylene, most likely used in a digital cable, is about 2.3, while foamed teflon gets close to 1.5 (and oversized air tubes can bring it even a bit closer to 1).

I understand the need for exploration and experimentation, but when I buy cooking oil I am not tempted to try motor oil instead just because it has zero cholesterol and no saturated fat (and no other parameters to differentiate them). It was designed for the car, and I have no reason to question it. It is a matter of taste, so to speak, but we can easily get lost here, since the changes between cables are very small. Going by the book shields us from many mistakes.
Jitter transfers from the time domain as noise. It won't change the sound other than making the background less black. That is what I noticed with the jitter-rejecting Benchmark. Its jitter bandwidth is on the order of a few Hz, and at the frequencies of interest (kHz) it gives -100dB rejection of noise that was at -80dB to start with - practically complete rejection. Cables don't make any difference here - similar with your Purcell, but if instead of an upsampling DAC you have an oversampling DAC, or even a NOS DAC, then the digital cable will make a huge difference.

There is an excellent article in Stereophile (available online) on jitter, explaining how sidebands are created, why they are audible, and showing everything in numbers with a typical transport/CD. Just educational - you can use any cable with the Purcell (if I understand it right).
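For the curious, the level of those jitter sidebands can be estimated with the standard narrowband-FM approximation. The formula and the numbers below are my own illustration, not taken from the article:

```python
import math

def sideband_level_db(f_signal_hz: float, jitter_peak_s: float) -> float:
    """Approximate level of each jitter sideband relative to the signal,
    for sinusoidal jitter of peak amplitude jitter_peak_s
    (narrowband-FM approximation: ratio = pi * f * tj)."""
    return 20 * math.log10(math.pi * f_signal_hz * jitter_peak_s)

# A 10 kHz tone with 1 ns of sinusoidal jitter puts sidebands near -90 dB;
# higher signal frequencies and more jitter raise them proportionally.
print(round(sideband_level_db(10e3, 1e-9), 1))
```

Since the sidebands sit at the signal frequency plus/minus the jitter frequency, they are not harmonically related to the music, which is part of why even low-level jitter can be objectionable.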
Whenever an electromagnetic wave encounters a change in impedance, some of the signal is transmitted and some is reflected (an impedance boundary). The reflected signal creates all sorts of shape distortions - overshoots, oscillations and staircases (Bergeron diagrams). A rule of thumb says you can consider a line (cable) to be in the low-frequency domain when trise > 6t, where t is the line delay. Signal travels through the conductor at about 70% of the speed of light, covering 1m in 4.8ns. Multiplying this by 6 gives us 29ns; for a 2m interconnect it is 58ns, and for 3m it's 87ns (50ft would be a disaster - 438ns). Most output drivers switch below 29ns (much less 438ns), therefore we have transmission-line effects. Selecting a slower driver wouldn't do the designer any good, because it creates noise-induced jitter on the receiving end. The receiving end has either asynchronous reclocking (in upsampling DACs) or a dual PLL (in the rest of them). A PLL, even a dual one, works poorly for fast jitter.
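The rule of thumb above reduces to a few lines of arithmetic. A sketch, assuming the 4.8 ns/m delay (70% of c) stated in the post:

```python
# trise > 6t rule of thumb: a cable can be treated as a lumped element
# only when the driver's transition time exceeds ~6x the one-way delay.
# Assumes ~4.8 ns/m propagation (70% of the speed of light), as above.

PROP_DELAY_NS_PER_M = 4.8

def min_safe_rise_time_ns(length_m: float) -> float:
    """Slowest transition that still avoids transmission-line effects."""
    return 6 * length_m * PROP_DELAY_NS_PER_M

for metres in (1.0, 2.0, 3.0, 15.24):  # 15.24 m = 50 ft
    print(f"{metres:6.2f} m -> need rise time > "
          f"{min_safe_rise_time_ns(metres):.0f} ns")
```

With typical driver transitions of 10-25 ns, even a 1 m cable fails the test, which is why every S/PDIF or AES/EBU link must be treated as a transmission line.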

I still recommend the Stereophile article - it might not be up to your standards (as an engineer and/or scientist), but at least it is not as boring as IEEE stuff, and one can even understand it for a change. And it is audio related - have I mentioned that?
I do not know what doesn't make sense to you. The frequency of the transmitted signal in the cable has nothing to do with transmission-line effects, no matter what multiplier you put on top of it. What matters is the fastest transition appearing. You can transfer a 10Hz square wave and still have transmission-line effects. A transition time of about 25ns is very common in output drivers (most of them), and some are even on the order of 10ns. Using tr > 6t is a very common test of whether a line is a transmission line, and you will find it in many publications. Your 50ft cable (analog or digital) is a transmission line (a very bad one) for a typical output driver. And no - reflections do not round the edges; they create havoc with overshoots, ringing, staircases etc. If you believe that cables make no difference, just say so, and do not bring pseudo engineering/scientific arguments here, because somebody will always call you on it. As for the IEEE - I don't read their journal, but I was at their meetings - not eager to go back.
Ckoffend - Maybe the whole thing is too technical, but it is important to understand that a 192kHz signal is really transmitted at about 25MHz. Maybe the part of the article below will explain better why digital cables exist. I don't want to engage further in the discussion here since it's becoming counterproductive, and I'm signing off.

"This article is from the Audio Professional FAQ, with numerous
contributions by Gabe M. Wiener.

5.8 - What kind of AES/EBU or S/P-DIF cables should I use? How long
can I run them?

The best, quick answer is what cables you should NOT use!

Even though AES/EBU cables look like ordinary microphone cables, and
S/P-DIF cables look like ordinary RCA interconnects, they are very
different.

Unlike microphone and audio-frequency interconnect cables, which are
designed to handle signals in the normal audio bandwidth (let's say that
goes as high as 50 kHz or more to be safe), the cables used for digital
interconnects must handle a much wider bandwidth. At 44.1 kHz, the digital
protocols are sending data at the rate of 2.8 million bits per second,
resulting in a bandwidth (because of the biphase encoding method)
of 5.6 MHz.

This is no longer audio, but falls in the realm of bandwidths used by
video. Now, considerations such as cable impedance and termination become
very important, factors that have little or no effect below 50 kHz.

The interface requirements call for the use of 110 ohm balanced cables for
AES/EBU interconnects, and 75 ohm coaxial unbalanced interconnects for
S/P-DIF interconnects. The use of the proper cable and the proper
terminating connectors cannot be overemphasised. I can personally testify
(having, in fact, looked at the interconnections between many different
kinds of pro and consumer digital equipment) that ordinary microphone or
RCA audio interconnects DO NOT WORK. It's not that the results sound
subtly different; it's that much of the time the receiving equipment
is simply unable to decode the resulting output, and simply shuts
down."
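The FAQ's bandwidth figures follow from simple arithmetic. A sketch, assuming the standard frame layout of 2 subframes of 32 bits per sample (the 192kHz case corresponds to the "about 25MHz" mentioned earlier in the thread):

```python
# Rough S/PDIF / AES-EBU line-rate arithmetic (illustrative sketch).
# Assumed frame layout: 2 subframes x 32 bits = 64 bits per sample frame.
# Biphase-mark coding can toggle twice per bit, doubling the bandwidth.

def line_rate_mbps(sample_rate_hz: float) -> float:
    """Raw channel bit rate in Mbit/s before biphase encoding."""
    return sample_rate_hz * 64 / 1e6

def biphase_bandwidth_mhz(sample_rate_hz: float) -> float:
    """Approximate bandwidth after biphase-mark encoding."""
    return 2 * line_rate_mbps(sample_rate_hz)

print(line_rate_mbps(44_100))          # ~2.82 Mbit/s
print(biphase_bandwidth_mhz(44_100))   # ~5.6 MHz
print(biphase_bandwidth_mhz(192_000))  # ~24.6 MHz
```

Megahertz-range signals are firmly video territory, which is why the FAQ insists on video-style impedance matching and termination.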
Renato13 - It is a system thing. XLR protects from induced noise by using a twisted pair and a differential signal of much higher amplitude, but at the same time it might slow down the transition if the drivers have a limited slew rate, because they swing a higher voltage. In addition, the shield is grounded on both ends - a possible source of ground loops. Jitter creation is always system dependent. It is usually wise to use a 1.5m cable, because the signal travels forth and back (reflection) in about 30ns (5ns/m propagation), just clearing the original transition that lasts a typical 25-30ns. A longer cable adds noise pickup.