The digital cable is actually AES/EBU and the pinout will not be the same. I've never tried a balanced XLR as a digital link and can't tell you if it will cause any damage. I would recommend caution.
Digital and analog cables are constructed differently. In an analog cable, low inductance, low capacitance, low dielectric constant, purity of the conductor and good shielding play an important role, while a digital cable should have an exact characteristic impedance (usually 110 ohms for XLR), fast propagation and excellent shielding.
If you use an analog XLR cable for a digital link you'll most likely get a slightly fuzzy sound, since jitter = noise in the time domain (to be exact, jitter creates sidebands not harmonically related to the root frequency).
Don't save money on this. If you don't need this analog cable then sell it and get a good digital cable.
Thanks for clarifying this for me. I checked other answers that you have submitted and you appear to have a very good understanding of electrical signal processes. While I had planned on getting a pair of balanced digital cables, I had hoped to do an initial test with analog balanced cables. My intention is to use two cables to carry a 24/192 kS/s digital signal, as single digital cables (outside of 1394) do not have sufficient bandwidth to accurately carry this signal. I am going from an upconverter to a DAC.
I've played with this exact scenario a lot over the past few months. I've been using a Cardas Golden Ref analog XLR and a Cardas digital XLR cable. First, as has been discussed here already, the connection pattern is exactly the same, meaning they are interchangeable as far as termination and connection. Secondly, the analog XLR does almost as good a job as the digital XLR cable. I'm no expert in explaining what the differences are electrically, but I can give you my opinions based on actual sound comparisons. Third, in one of your follow-up posts you mention "getting a pair of digital XLR cables". Unless you have more than one DAC you'll only need a single digital XLR. Lastly, I have found significant sonic improvements by keeping the digital signal in the digital realm as far down the chain as you can get it. If you do, the digital realm can compete favorably with even the best analog rigs. There are some companies that make digital preamps and amps which, if you don't need an analog option for a turntable or have a very large budget, will improve your end sound as it relates to price point.
I am running a dCS Purcell to a Delius. The Purcell upsamples to 24/192 kS/s, but the only cabling method capable of delivering this highest upsampling rate is a pair of XLR (AES/EBU) cables, each carrying a 96 kHz stream, combined to deliver the full 192 kHz via the two cables - at least for the dCS equipment. The ideal cable would be a 6" cable plus the terminations.
I have plenty of spare balanced analog cables, but have not yet purchased a pair of balanced digital cables for this application.
The analog vs digital XLR cable construction, materials etc. are not going to make any difference - if indeed there is any difference in construction. As to carrying digital signals, the cable is not going to introduce jitter. Other than compatibility with connectors, there is no benefit to using a balanced connector for transmission of digital signals - there would have to be an incredible amount of noise to corrupt the digital signal, and if there were, the effect would not be in the least bit subtle. There is benefit in balanced connections for analog signals - assuming, of course, that the connections are actually to balanced inputs.
Musicnoise - of course there is a difference in construction and materials. Digital cable geometry is tailored to deliver 110 ohms characteristic impedance and therefore eliminate reflections. How reflections add up and corrupt the edge of the signal (causing jitter) can be calculated using Bergeron diagrams. As for analog cables, one of the most important factors is the dielectric constant of the insulation material. The lowest dielectric constant, close to that of air (=1), is obtained by using oversized tubes made of foamed Teflon. Foamed Teflon has an even better (lower) dielectric constant than solid Teflon, while oversizing keeps the wire inside away from the dielectric. Another factor is the purity of the metal - not important with digital cables but very important with analog. The best is very pure zero-crystal copper or silver. Purity reaches 99.9999999% (9N) for copper and 99.99999% (7N) for silver. The zero-crystal process is simply forging metal into hot forms instead of cold ones. Cooling very slowly prevents formation of crystals (impurities reside between crystals). Zero-crystal copper has just one or two crystals per foot, while regular oxygen-free copper has a few thousand. On top of this, many cables have anti-vibration shields and some even have fluid inside. All this is important, of course, if you believe, like I do, that cables make a real and big difference. If you don't, you can just as well use lamp cord - it saves a lot of money.
You point out the parameters which are important for digital and those which are important for analog. However, none of these parameters are incompatible, making it entirely possible for a single cable to be optimal for both.
As for anti-vibration shields and fluids, I will not bother to comment.
Kal - If you think that characteristic impedance can be different (and it is completely different for analog cable), why don't you connect your TV to a roof antenna using any cheap shielded cable? The reflections that will appear are pretty much what causes jitter in digital transmission. As for shields and fluids - I once had an inexpensive Audioquest Topaz IC. I read on the internet that this cable transfers vibrations (is audible). So I turned the volume up, hit the cable a few times with a stick (pen) and, to my surprise, I could hear it in the speakers.
As for a digital cable being optimal as an analog IC - you must be kidding! Digital cables are made with complete disregard for the quality of materials. Dielectric constant is ignored, since above 100 kHz only the ratio of inductance and capacitance defines characteristic impedance. The metal is also secondary, since signal at these frequencies travels only on the surface (usually silver plated).
For the record, I did end up trying a non-digital pair of balanced cables and they worked great. Yes, I will end up replacing them with a digital pair of balanced cables, but I wanted to confirm that everything would perform as reported with the dual digital cables from the upconverter to the DAC.
Thanks for all your input and if anybody has any recommendations on balanced digital cables, please feel free to offer them.
First, we have a basic disagreement about the significance of some parameters, but just because a parameter is ignored or not specified for a particular application does not mean that it cannot share that parameter with the other application. So, making a digital cable with regard for quality of materials and dielectric constant is entirely possible, making it suitable for both analog and digital applications.
Second, you suggest "If you think that characteristic impedance can be different (and it is completely different for analog cable) why don't you connect your TV to a roof antenna using any cheap shielded cable." That's a red herring, but let me suggest to you that a defined-impedance cable suitable for that would also work for analog, since there is no defined impedance for analog that would prevent this parameter from being suitable.
So, despite our differences, let me ask you if there is a specific parameter necessary for digital that would make it unusable for analog? Or vice versa?
Kal - That's a tricky question, since there are no parameters for cables at all (other than characteristic impedance - irrelevant for analog).
If you like the sound of a digital cable as an IC then use it. I'm merely suggesting that you won't find good metal or a fancy dielectric (like foamed Teflon) there. The other way around, you might find an IC that is close to 110 ohms, or have a DAC like the Benchmark that ignores the quality of the cable. By all means use it. It is also possible that the differences are there but you don't hear them - even better, because it saves a lot of money.
I tend to do things by the book. When it says digital cable, I go to the store and buy the digital and not the analog cable.
Sometimes things are not audible because they are masked by other factors, and improving a system is like peeling layers of pink film from pink sunglasses - you don't notice each single peel, but eventually you'll get a clear, uncolored picture.
OK. IMHO, it is hard "to do things by the book" when there are no definable parameters except characteristic impedance for digital and LC values suitable for the loads in both cases.
I tend to ignore the labels that manufacturers put on a cable unless they say what their reasons are for putting on that particular label. I do know that some have used the same cable/connector for digital and analog.
The temporary cables that I have tried, to confirm that I am getting the full 24/192 upsampling, sound very good. As I previously mentioned, to upconvert to this level it is required that I use two balanced digital cables to handle the full bandwidth. DCS indicates that it is impossible to transmit this amount of bandwidth on either a coax digital cable, a glass optical digital cable or a single balanced AES/EBU cable. I don't necessarily question this, but before spending several hundred dollars per cable, I wanted to be sure that the two appropriate cables would actually deliver the upsampling level I wanted to hear/test. Since I have plenty of balanced analog cables, I was just seeking a reasonable and fast opportunity to test.
The only thing about cables that is going to make any difference as to jitter is whether the cable has sufficient bandwidth so as not to cause jitter due to the inherent characteristics of the data signal itself. Even then there should be correction circuitry at the converter stage to correct for any transmission induced jitter. Likely, the analog and digital versions of the XLR cables both exceed the bandwidth necessary to avoid transmission induced jitter.
Here is the quote from Stereophile article "A Transport of Delight: CD Transport Jitter"
"While we're on the subject of the digital interface, I should point out that the engineering for transmitting wide-bandwidth signals was worked out nearly 50 years ago in the video world. In video transmission, the source has a carefully controlled output impedance, the cable and connectors have a precisely specified characteristic impedance and are well-shielded, and the load impedance is specified within narrow tolerances. If these practices aren't followed, reflections are created in the transmission line that play havoc with video signals. This issue is so crucial that a whole field called Time Delay Reflectometry (TDR) exists to analyze reflections in transmission lines.
"The audio community should adopt the standard engineering practices of video engineering for digital interfaces. This means designing transports with a carefully controlled 75 ohm output impedance, precisely specified characteristic impedance of the cable (75 ohms with a narrow tolerance), and junking RCA connectors in favor of true 75 ohm BNC connectors. By applying standard video engineering techniques, in use for decades, the high-end product designer can greatly improve the performance of the transport/processor interface. We've seen what happens with a poorly implemented interface with the SV-3700 and different cables: higher jitter in the recovered clock and degraded sound quality. The engineering needed to optimize the digital interface is readily available. Let's use it."
As far as I know, the bandwidth of a cable determines losses in the cable in dB/ft, while jitter is strictly a property of mismatched characteristic impedance (SQRT(L/C)). Antenna/video 75 ohm cables might have different losses (RG59, RG6, RG11 etc.) but won't create reflections as long as they are exactly 75 ohms. Please correct me if I'm wrong.
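The SQRT(L/C) relation mentioned above is easy to check numerically. The per-metre inductance and capacitance below are illustrative values I'm assuming for the sake of the arithmetic, not measurements of any particular cable:

```python
import math

def char_impedance(L_per_m, C_per_m):
    """Lossless characteristic impedance: Z0 = sqrt(L/C).

    L_per_m in henries/metre, C_per_m in farads/metre; the per-metre
    lengths cancel, so Z0 does not depend on cable length.
    """
    return math.sqrt(L_per_m / C_per_m)

# Illustrative values only: a 110-ohm AES/EBU twisted pair might run
# around 0.605 uH/m and 50 pF/m.
print(round(char_impedance(0.605e-6, 50e-12)))  # -> 110
```

Note that only the ratio of L to C matters, which is consistent with the earlier point that dielectric quality per se is not what sets the impedance.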
Kijanki, I hear you. But one of my concerns with many of these cable companies is when does the marketing end and the real engineering begin. Even with digital cables, there are certainly some cable manufacturers who clearly point out that the cable is a "true 75 ohm" or "true 110 ohm" cable, while many others just call the cables digital cables.
Ckoffend - I would say that most, if not all, digital cables have a particular targeted characteristic impedance - in the case of audio, 75 ohms for unbalanced and 110 ohms for balanced. But just look at a typical 75 ohm video coax - it is very different from an analog cable. First of all, an analog cable uses a separate wire for the ground. Carrying ground through the shield is a really bad idea, and grounding the shield at both ends is even worse (XLR does, but it was a mistake). The metal is important only on the surface (plated) because of the skin depth at these frequencies. A lower dielectric constant is important, but not as much as in analog cables. The dielectric constant of polyethylene, most likely used in a digital cable, is 3.3, while foamed Teflon gets close to 1.5 (and oversized air tubes can bring it even a bit closer to 1).
I understand the need for exploration and experimentation, but when I buy cooking oil I am not tempted to try motor oil instead just because there is zero cholesterol and no saturated fat in it (and no other parameters to differentiate them). It was designed for the car and I have no reason to question it. It is a matter of taste, so to speak, but we can easily get lost here since the changes between cables are very small. Going by the book shields us from many mistakes.
Jitter will transfer from the time domain as noise. It won't change the sound other than making the background less black. That was what I noticed with the jitter-rejecting Benchmark. Its jitter bandwidth is on the order of a few Hz, and at frequencies of interest (kHz) it gives -100 dB rejection of noise that was at -80 dB to start with - practically complete rejection. Cables here don't make any difference - similar with your Purcell. But if instead of an upsampling DAC you have an oversampling DAC or even a NOS DAC, then the digital cable will make a huge difference.
There is an excellent article in Stereophile (available online) on jitter, explaining how the sidebands are created and why they are audible, and showing everything in numbers with a typical transport/CD. Just educational - you can use any cable with the Purcell (if I understand it right).
As to characteristic Z and BW: First, the reason to set a characteristic impedance of a cable is to reduce transmission line effects. T-line effects amount to standing waves. These only become important when the wavelength of the signal approaches the length of the cable, i.e., how far the signal has to travel. So, whether or not a given and specified characteristic impedance of a cable will matter depends on the cable length. So does the bandwidth of the cable, for that matter, because the total capacitance is determined by the length of the cable. As between the two, as I will explain below, the bandwidth is going to be more important for the lengths we are talking about.
As to the reference to the Stereophile article: not exactly a reference that is going to add validity to a technical position when posing such a position to an engineer. Next time try something a little more accepted in the scientific/engineering community, such as an IEEE journal, or even something published by the AES or the ARRL.
As to the transmission line effect, we are talking about interconnects here. I made the assumption that the lengths are somewhere in the neighborhood of less than 10 feet. T-line effects only kick in when the wavelength of the highest signal component approaches the length of the cable. Standing waves, if they are present, will tend to round off the edges of the square pulse; this is what causes the jitter due to T-line effects. The purpose of selecting the characteristic impedance to match the source and the load impedances is to get rid of T-line effects.
The highest signal component in the case of digital audio will be about 10 times the fundamental frequency of the signal because at that frequency you have a nicely shaped square wave.
The wavelength of a 100 MHz signal is just under 10 feet, so you really aren't getting T-line effects until you approach that cable length, if we are talking about a signal with 100 MHz components. A safe rule of thumb is a 1 to 10 ratio, so there one could argue that to completely eliminate the possibility of T-line effects the cable should be less than 1 ft long. However, the transmission rate of digital audio at a 96 kHz sample rate isn't 100 MHz. If you go out two decades, you are still at only 10 MHz, which is a wavelength of just under 100 ft. Hence, a 6 foot interconnect will not be a source of jitter due to T-line effects.
More likely (but still not very likely) is that the rounding of the pulse will be due to bandwidth limitations. If the cable has too high a capacitance value, it is possible to create a low-pass filter that will start rounding the square wave and create jitter. The chances of that happening are also slight at the lengths we are talking about, but it is more likely, and it does not depend on the creation of a standing wave. For that reason the bandwidth of the cable is more important. A subtle difference, but there is a difference.
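A rough sketch of that low-pass argument, treating the source resistance and total cable capacitance as a single RC corner. The 75-ohm source and 100 pF/m figures are assumptions for illustration, not data for any specific gear:

```python
import math

def rc_cutoff_hz(r_source_ohm, c_total_farad):
    """-3 dB corner of the source-resistance / cable-capacitance
    low-pass: fc = 1 / (2 * pi * R * C)."""
    return 1.0 / (2 * math.pi * r_source_ohm * c_total_farad)

# Assumed: 75-ohm source driving a 6 ft (~1.8 m) cable at ~100 pF/m,
# i.e. about 180 pF total shunt capacitance.
fc = rc_cutoff_hz(75, 180e-12)
print(round(fc / 1e6, 1))  # -> 11.8 (MHz)
```

With the corner up around 11.8 MHz, a short interconnect indeed leaves plenty of margin over the few-MHz digital audio signal, supporting the "chances are slight" claim for short runs.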
But, on the practical side, it just doesn't matter: a cable made for analog transmission will work fine up to about 50 feet, and most interconnects for home audio are not that long.
Remember also that jitter only becomes a problem at the conversion. Circuitry at the convertor should reconstitute the clock and reject jitter that is not extreme.
Is it a big deal? No, not if you are purchasing the ICs new. I haven't priced digital vs analog ICs, but there is no reason that one should be significantly more expensive than the other. Furthermore, since the 110 ohm low-capacitance cable is not going to cost significantly more in 6 foot lengths, and it will work just as well for analog, my guess is that reputable sellers simply make up all their cables out of the same cable and connectors and just charge a small amount more to sell you one rather than two cables; i.e. $60/pair vs $35 each. Not unfair.
Whenever an electromagnetic wave encounters a change in impedance, some of the signal is transmitted and some is reflected (impedance boundary). The reflected signal creates all sorts of shape distortions, producing overshoots, oscillations and staircases (Bergeron diagrams). A rule of thumb says that you can consider the line (cable) to be in the low-frequency domain when trise > 6t, where t is the line delay. Signal travels through the conductor at about 70% of the speed of light, covering 1 m in 4.8 ns. Multiplying this by 6 gives us 29 ns; for a 2 m interconnect it is 58 ns, and for 3 m it's 87 ns (50 ft would be a disaster - 438 ns). Most output drivers switch below 29 ns (much less 438 ns), therefore we have transmission line effects. Selecting a slower driver wouldn't do the designer any good, because it creates noise-induced jitter on the receiving end. The receiving end has either asynchronous reclocking in upsampling DACs or a dual PLL in the rest of them. A PLL, even a dual one, works poorly for fast jitter.
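The arithmetic behind those 29/58/87/438 ns figures can be reproduced directly (small differences from the post's numbers are just rounding):

```python
PROP_NS_PER_M = 4.8   # ~70% of c, as stated in the post

def min_safe_rise_ns(length_m):
    """Rule-of-thumb threshold: the line can be treated as lumped
    (not a transmission line) only if t_rise > 6 * line delay."""
    return 6 * PROP_NS_PER_M * length_m

for metres in (1, 2, 3, 15.24):           # 15.24 m = 50 ft
    print(metres, round(min_safe_rise_ns(metres), 1))
# -> 28.8, 57.6, 86.4 and 438.9 ns respectively
```

Against a typical 25-30 ns driver edge, even a 1 m cable sits right at the boundary, which is the post's point.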
I still recommend the Stereophile article - it might not be up to your standards (as an engineer and/or scientist) but at least it is not as boring as IEEE stuff, and one can even understand it for a change. And it is audio related - have I mentioned that?
I do not know what doesn't make sense to you. The frequency of the transmitted signal in the cable has nothing to do with the transmission line effect, no matter what multiplier you put on top of it. What is important is the fastest edge appearing. You can transfer a 10 Hz square wave and still have a transmission line effect. A rise time of about 25 ns is very common in output drivers (most of them), but some are even on the order of 10 ns. Using tr > 6t is a very common test of whether a line is a transmission line, and you will find it in many publications. Your 50 ft cable (analog or digital) is a transmission line (a very bad one) for a typical output driver. And no - reflections do not round the edges but create whole havoc by creating overshoots, ringing, staircases etc. If you believe that cables make no difference just say so, and do not bring pseudo engineering/scientific arguments here, because somebody will always call you on it. As for the IEEE - I don't read their journal but was at their meetings - not eager to go back.
Ckoffend - Maybe this whole thing is too technical, but it is important to understand that a 192 kHz signal is really transmitted at 25 MHz. Maybe part of the article below will explain better why digital cables exist. I don't want to engage more in the discussion here since it's becoming counterproductive, and I'm signing off.
"This article is from the Audio Professional FAQ, with numerous contributions by Gabe M. Wiener.

5.8 - What kind of AES/EBU or S/P-DIF cables should I use? How long can I run them?

The best, quick answer is what cables you should NOT use!

Even though AES/EBU cables look like ordinary microphone cables, and S/P-DIF cables look like ordinary RCA interconnects, they are very different. Unlike microphone and audio-frequency interconnect cables, which are designed to handle signals in the normal audio bandwidth (let's say that goes as high as 50 kHz or more to be safe), the cables used for digital interconnects must handle a much wider bandwidth. At 44.1 kHz, the digital protocols are sending data at the rate of 2.8 million bits per second, resulting in a bandwidth (because of the biphase encoding method) of 5.6 MHz.

This is no longer audio, but falls in the realm of bandwidths used by video. Now, considerations such as cable impedance and termination become very important, factors that have little or no effect below 50 kHz.

The interface requirements call for the use of 110 ohm balanced cables for AES/EBU interconnects, and 75 ohm coaxial unbalanced interconnects for S/P-DIF interconnects. The use of the proper cable and the proper terminating connectors cannot be overemphasised. I can personally testify (having, in fact, looked at the interconnections between many different kinds of pro and consumer digital equipment) that ordinary microphone or RCA audio interconnects DO NOT WORK. It's not that the results sound subtly different, it's that much of the time the receiving equipment is simply unable to decode the resulting output, and simply shuts down."
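The FAQ's 5.6 MHz figure, and the ~25 MHz quoted earlier for 192 kHz, both follow from the 64-bit AES/EBU-S/PDIF frame (two 32-bit subframes per sample) and the fact that biphase-mark coding doubles the transition rate:

```python
BITS_PER_FRAME = 64   # 2 subframes x 32 bits per stereo sample

def line_rate_mhz(sample_rate_hz):
    """Approximate bandwidth of the biphase-mark-coded stream in MHz:
    raw bit rate times two for the encoding."""
    bit_rate = sample_rate_hz * BITS_PER_FRAME
    return 2 * bit_rate / 1e6

print(line_rate_mhz(44100))   # -> 5.6448  (the FAQ's "5.6 MHz")
print(line_rate_mhz(192000))  # -> 24.576  (the "~25 MHz" cited above)
```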
The last explanation conflates slew rate and frequency and ignores the relationship between the two. A little background in Fourier theory may clear up the lack of understanding as to the relationship between signal shape, frequency, and slew rate.
Your home audio equipment is not going to miss pulses by the changes caused by sending a pulse with fast rise and fall times through a path with a bandwidth typically available through analog interconnects.
As to T-line effects, the explanation misses the forest for the trees. The ultimate problem caused by standing waves is the rounding of the pulses.
The text proffered in the last few posts was simply lifted from elsewhere and offered as an explanation. But it is out of context and inapplicable to the discussion at hand, which is whether digital vs analog ICs make any difference in home audio interconnects. If the person who initially posted the question is using the cables to transfer audio data in a typical fashion, i.e. from, for example, a CD transport to an outboard DAC, he doesn't need a 110 ohm interconnect to do so - which was my original statement.
What is amazing is that these posts from Kijanki started out with a statement that bandwidth made no difference when it comes to jitter, yet now offer quotes that refer to the importance of bandwidth. The reason for this contradiction appears to be a lack of a firm grounding in the meaning of the terms and effects discussed, i.e. slew rate, frequency, bandwidth, and T-line effects.
OK... So, in a pistachio shell: say you have the option to use either an AES/EBU digital 110 ohm cable out from a USB-to-SPDIF converter to a Wyred4Sound DAC2 AES/EBU input, OR a 75 ohm digital SPDIF interconnect from the same USB-to-SPDIF converter into the same W4S DAC2. I keep hearing two tales: 1) that there's no difference since both are digital, and 2) that there is a difference, with the claim that the AES/EBU connection is superior.
Any opinion or truth is truly appreciated.
Renato13 - It is a system thing. XLR protects from induced noise by using a twisted pair and a differential signal of much higher amplitude, but at the same time it might slow down transitions if the drivers have limited slew rate, because they swing a higher voltage. In addition, the shield is grounded on both ends - a possible source of ground loops. Jitter creation is always system dependent. It is usually wise to use a 1.5 m cable, because the signal travels forth and back (reflection) in about 30 ns (at 5 ns/m propagation), just clearing the original transition, which typically lasts 25-30 ns. A longer cable adds to noise pickup.
01-17-12: Renato13
There will often be a difference. If there is a difference, it can be expected that more often than not the AES/EBU connection will be superior, but in many cases the opposite will be true. The technical issues that are involved are system dependent, as Kijanki indicated, and largely unpredictable. See my post here and my two posts here.