Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that hasn't been satisfactorily answered thus far. So... I'm asking the experts on the forum to pitch in. This has probably been asked before, but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me, and maybe send along some URLs for my edification. Thanks, Dan
danielho

Showing 9 responses by blues_man

First of all, the different digital interfaces have different bandwidth. Within a single interface type, even slight imperfections can cause signal loss. Only AT&T optical has enough bandwidth to handle all the data correctly. Yes, this is lame by today's standards, but it was leading edge 20 years ago.
Gmkowal, I don't believe that you understand the signal transfer in CD playback. Most other digital transactions people have discussed are NOT real-time. CD playback is real-time. The protocol is one-way, with 44,100 transmissions per second. Signal is lost. Bits are dropped. In other real-time digital applications, say telephony, there is both a retry mechanism (if the protocol allows) and data recovery. The CD protocol has no such provisions. Again, remember this is technology that is almost 20 years old. The improvement in technology from the CD standard to the present is greater than the improvement from Mr. Bell and Edison to the start of CD technology. I suggest that your claimed knowledge of physics is undermined by your lack of understanding of this signal transfer. Before you go blasting a lot of other people, you'd better know what you're talking about. As for some background, I've been working with digital audio on and off for over 30 years. I have spent over 8 years designing commercial digital audio, video, and data transfer interfaces. So I'm not someone who has just read a few articles in StereoShill.
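The real-time constraint described above can be sanity-checked with a little arithmetic. This sketch uses the standard CD/S/PDIF figures (44,100 stereo frames per second, 32-bit subframes on the wire), which are not stated in the post itself:

```python
# Back-of-the-envelope check of the real-time constraint of CD playback.
FRAMES_PER_SEC = 44_100      # CD sampling rate (stereo frames per second)
CHANNELS = 2
SUBFRAME_BITS = 32           # S/PDIF carries each sample in a 32-bit subframe

bit_rate = FRAMES_PER_SEC * CHANNELS * SUBFRAME_BITS   # bits on the wire per second
frame_deadline_us = 1e6 / FRAMES_PER_SEC               # time budget per stereo frame

print(bit_rate)                      # → 2822400 (about 2.8 Mbit/s)
print(round(frame_deadline_us, 2))   # → 22.68 (µs per frame, no time for retries)
```

With under 23 µs per frame and no back-channel, there is simply no room in the protocol for a retry, which is the point being made above.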
Gmkowal, just a follow-up to my previous response. If you don't believe me, prove it yourself. Attach a scope to the output of your DAC. Play some test tones on your transport (use frequencies above 8 kHz; that's where the effect starts to be more noticeable). Get some different cables and measure the signals that come out of the DAC. I have measured some pretty significant differences. I've always believed that if you can measure a difference, it exists.
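One way to quantify "significant differences" between two scope captures of the same test tone is the RMS of their sample-by-sample difference. This is a generic sketch, not the poster's method; the capture arrays are hypothetical stand-ins for digitized scope traces:

```python
import math

def rms_difference(capture_a, capture_b):
    """RMS of the sample-by-sample difference between two equal-length captures."""
    assert len(capture_a) == len(capture_b)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(capture_a, capture_b))
                     / len(capture_a))

# Hypothetical captures of the same 10 kHz tone through two cables,
# sampled at 1 MHz; pretend cable 2 lost a little level:
cable_1 = [math.sin(2 * math.pi * 10_000 * t / 1_000_000) for t in range(100)]
cable_2 = [0.98 * s for s in cable_1]

print(rms_difference(cable_1, cable_2))  # nonzero → a measurable difference
```

Identical captures give exactly 0.0, so any nonzero result is a measured difference in the sense used above.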
The problem is really in the transmitter and receiver. Even though there is a "standard", there are always differences in implementation of the hardware. I started out believing that there was no difference in digital cables. That was before I was familiar with the CD standard, which isn't very robust. After a while I started measuring lots of cables and interfaces just out of curiosity. It's not easy to determine which cable sounds best with a particular transport/DAC combo; you have to rely on what other people have tried. First, definitely go with the AES/EBU interface over an RCA cable; it's not that much more. The one I liked best was the MIT reference. It's very expensive, $800; I don't believe it's worth the money, but if cost is no object... I'll be putting the one I tested up for sale here soon (at least half off). As I posted in another thread, the best thing is to get a transport/DAC with a custom interface. I got the Spectral because I thought it sounded best. If you have access to a scope, use the method I suggested above with a high-frequency tone. Whichever cable gives the most accurate signal is probably the best.
I no longer have an analyzer, but when I did, it was hooked up to CD players and transport/DACs. You just can't send bit patterns with your equipment, because it's probably far more sophisticated than what's in a CD playback system. Remember, you also have to send 44,100 samples per second. A scope works fine at the output of the DAC to determine differences in the analog output of test signals. If you use the same transport, same DAC, and same source, any differences must be from the cable. Also, disconnect the destination end of the cable, put a volt meter on it, and notice the large differences.
Oops, almost forgot: in your test, how did you verify that all samples sent were received and stored? If you sent 500,000 samples, did 500,000 samples get stored?
On someone's recommendation I purchased a Radio Shack digital RCA cable. They claimed it was very good; it was so bad there was audible static. I once worked with an engineer who screwed up the implementation of my DAC driver so that it was always 2 bits off on each word. This causes static. On most protocols I've worked with, if they don't sync up, they just drop the data. I assume the thought here is that dropping a sample here or there is no big deal. I'm pretty sure they didn't realize how audible the inadequacies were going to be. If they really wanted to do this right, they should have gone with a laser pickup on analog discs.
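The "2 bits off on each word" failure described above is easy to demonstrate in a toy serializer: shift the word boundary of a serial bitstream by two bits and every decoded sample becomes an unrelated value, which comes out of a DAC as static. This is an illustrative sketch, not any real S/PDIF framing:

```python
def serialize(samples, bits=16):
    """Pack 16-bit samples into one serial bitstring."""
    return "".join(format(s & 0xFFFF, f"0{bits}b") for s in samples)

def deserialize(stream, bits=16, offset=0):
    """Unpack 16-bit words, optionally starting `offset` bits into the stream,
    which models a receiver that synced on the wrong bit boundary."""
    stream = stream[offset:]
    return [int(stream[i:i + bits], 2)
            for i in range(0, len(stream) - bits + 1, bits)]

samples = [1000, 1010, 1005, 995]     # a smooth, quiet waveform
stream = serialize(samples)
print(deserialize(stream))            # aligned: the original samples come back
print(deserialize(stream, offset=2))  # 2 bits off: unrelated garbage values
```

The misaligned decode is not "slightly wrong" audio; each word mixes bits from two neighboring samples, so the output is essentially noise.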
Using a FIFO and reclocking is not the be-all and end-all, and is not so simple. When the Genesis Digital Lens came out, Bob Harley assumed that all transports would sound exactly the same. He found that he was wrong. Rather than reclock, the best solution, given the current CD methodology, is to have a custom data transfer mechanism like Levinson and Spectral use. I believe that the SPDIF interface is inadequate for high-quality playback.

I did some testing about 7-8 years ago to prove that there was no difference in transports or digital cables, and published that data on the net. At the time I had only worked with proprietary digital record and playback protocols and was not that familiar with the CD playback mechanism. My tests proved just the opposite of what I expected. Not only that, but subjectively the tests seemed to show that jitter was not the most important of the digital flaws; both read errors from discs and signal loss at high frequencies seemed to be more of a problem.

While buffering is universal in Discman-type players, and basically required for a device that bounces around, every one I've ever owned said to turn off the buffering when not needed, because the sound was not as good as the direct signal. I still believe that the fewer components between the transport and the DAC, the better. Buffering of CD data has all sorts of other issues that need to be handled, like skipping forward/back, etc., that make it impractical, since you always have to handle the case when the buffer is empty. All the systems I have designed, and most commercial systems with proprietary protocols in general, send raw data to the DAC, and the DAC interface has the only clock, so jitter is at an absolute minimum.

Jitter is a fact of life in any digital medium. Clocks have been part of digital technologies since the 60s. I always get a kick out of manufacturers who claim jitter reduction devices that produce 0 (zero) jitter. It's physically impossible to measure jitter with perfect accuracy anyway.
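The "empty buffer" problem mentioned above can be sketched with a toy FIFO sitting between a transport and a DAC. The class and names here are hypothetical, purely to show why a reclocking buffer must decide what to output on underrun:

```python
from collections import deque

class ReclockBuffer:
    """Toy FIFO between transport and DAC, illustrating the underrun problem."""
    def __init__(self):
        self.fifo = deque()
        self.underruns = 0

    def push(self, sample):
        """Transport side: samples arrive on the transport's clock."""
        self.fifo.append(sample)

    def pull(self):
        """DAC side: pulled on a local low-jitter clock; the DAC cannot wait."""
        if not self.fifo:            # empty, e.g. right after a skip or seek
            self.underruns += 1
            return 0                 # must output *something*; here, silence
        return self.fifo.popleft()

buf = ReclockBuffer()
for s in (10, 20, 30):
    buf.push(s)
out = [buf.pull() for _ in range(5)]  # DAC pulls more samples than were pushed
print(out, buf.underruns)             # → [10, 20, 30, 0, 0] 2
```

Every skip, seek, or transport hiccup drains the FIFO, and the fallback (silence, repeats, muting) is audible, which is the practical objection raised above.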
We can improve the accuracy, but there is always some small error in the measurement. This error will decrease as technology improves. By the way, jitter errors have a much more detrimental effect than just causing pitch errors. Lack of perfect integrity of the audio signal affects soundstage and imaging, and if jitter is so bad that the DAC syncs incorrectly on the signal, severe static can be produced. See my previous postings.
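The audible harm of jitter can be put in numbers with the standard slew-rate argument (not stated in the post): sampling a sine wave a little early or late produces an amplitude error of at most 2·π·f·A·Δt, worst at the zero crossing where the signal slews fastest.

```python
import math

def worst_case_jitter_error(freq_hz, jitter_s, amplitude=1.0):
    """Worst-case amplitude error when a sample lands jitter_s too early or late.
    The maximum slope of A*sin(2*pi*f*t) is 2*pi*f*A, at the zero crossing."""
    return 2 * math.pi * freq_hz * amplitude * jitter_s

# 1 ns of jitter on a full-scale 20 kHz tone:
err = worst_case_jitter_error(20_000, 1e-9)
print(err)   # ≈ 1.26e-4 of full scale, i.e. about 4 LSBs of a 16-bit word
```

Note the error grows with frequency, which fits the earlier observation that high-frequency content shows the effects first.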
Gmkowal, two points here. It ought to be obvious that your test is inadequate; simply use a CD source. It should take about 4 seconds, if memory serves me, to fill a 1 MB buffer with SPDIF data. That's real time. When I say jitter cannot be measured accurately, I'm simply pointing out that any device that measures jitter has to have a clock; that clock has jitter, so your measurement has to be off by some amount. That amount may be small, but it always exists. My point was that many claims for jitter reduction devices are total BS.

1439bhr, I thought the Levinson also had a proprietary data transfer interface. My point was that some component combinations do, and they seem to do the best job. My points were simply that jitter reduction devices are a poor substitute for a good implementation between a DAC and transport; they may even increase jitter. My second point is that the jitter on most "good" systems is low enough not to be a major factor in the sonic degradation. I believe jitter is third behind errors in the transport and signal loss of high-kHz signals.
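The "about 4 seconds to fill a 1 MB buffer" figure can be cross-checked, and the answer depends on what you count. Using standard CD/S/PDIF rates (my assumption, not the post's): counting raw 32-bit subframes on the wire gives roughly 2.8 s, counting only the 16-bit audio payload gives roughly 5.7 s, so the remembered ~4 s sits between the two accountings.

```python
# How long does it take real-time CD playback to produce 1 MB of data?
SPDIF_BYTES_PER_SEC = 44_100 * 2 * 4   # 32-bit subframes, both channels
AUDIO_BYTES_PER_SEC = 44_100 * 2 * 2   # 16-bit audio payload only
MEGABYTE = 1_000_000

spdif_fill_s = MEGABYTE / SPDIF_BYTES_PER_SEC   # counting every bit on the wire
audio_fill_s = MEGABYTE / AUDIO_BYTES_PER_SEC   # counting audio payload only

print(round(spdif_fill_s, 1), round(audio_fill_s, 1))  # → 2.8 5.7
```

Either way, the order of magnitude supports the point: filling a buffer of any useful size from a CD source takes seconds of real time, not microseconds.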