Why do digital cables sound different?


I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different from another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
danielho
Just got the Stereovox, and there's really a night and day difference. The Monsters are warmer but details are lost and smeared - not bad for music listening, but they were just a bit too euphonic. The Stereovox (burnt in by the Cable Co) have excellent detail, depth, soundstage, imaging, etc., but they're not without their faults either. They can sound bright, edgy, and digital sounding.

As it is, I'll probably keep both, using the Monster for older brighter soundtracks. But for incredible steering, imaging and surround effects on good soundtracks, I'll keep the Stereovox. And for music, I'll just keep using analog interconnects.

So digital cables sound pretty much the same? To my ears the difference is as dramatic as any interconnect I've used. In fact they're almost polar opposites of each other soundwise. As if each parameter is on the opposite end of the same spectrum.

Hmm, anyway I'm now a confirmed believer, though I can't help being a little disappointed by the Stereovox after all the hype ("most amazing deal in audio - ever" etc.). I'm hoping the Cable Co just didn't cook them long enough or something, since the bright digital sound I'm hearing is a lot like analog cables that aren't fully broken in. We'll see.
This is a very interesting post. Everyone who posted a comment is right; well, maybe except the one about Nicole Kidman.

Ideally, it shouldn't make a difference which digital cable you use, but even though the signal carries ones and zeros, it is still an analog waveform on the wire. It has a slew rate, overshoot, undershoot, and ringing. Any mismatch in cable impedance, terminations, or source impedance can cause anomalies in the digital word. When it gets bad enough, the 1's become zeros and the 0's become ones. The same is true for optical links, too.
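To make that concrete, here's a toy Python sketch (my own illustration, not real S/PDIF): each bit is sent as a voltage plus noise, and the receiver just slices against a threshold. With a clean link the margin is huge and every bit comes back exactly; shrink the amplitude and add noise, and bits start to flip.

```python
import random

def transmit(bits, amplitude=1.0, noise=0.0, seed=0):
    # Model the "analog reality" of a digital signal: each bit is sent
    # as a voltage (0 V or `amplitude` V) plus Gaussian noise.
    rng = random.Random(seed)
    return [b * amplitude + rng.gauss(0, noise) for b in bits]

def receive(voltages, threshold=0.5):
    # A line receiver just slices against a threshold.
    return [1 if v > threshold else 0 for v in voltages]

bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Clean cable: plenty of margin, bits recovered exactly.
assert receive(transmit(bits, noise=0.05)) == bits

# Lossy cable: attenuation shrinks the eye, and noise can now flip bits.
degraded = receive(transmit(bits, amplitude=0.6, noise=0.4))
errors = sum(a != b for a, b in zip(bits, degraded))
print(f"bit errors on the lossy link: {errors}")
```

The point is that "digital" only holds as long as the analog margin at the slicer is bigger than the junk on the line.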

I wish I could post a picture here, I have some eye diagrams of high speed digital signals in the 2.5 GHz and 5.0 GHz range which are very enlightening. When the cable is lossy enough, there is very little difference in a one or a zero.

Do note that it is datastream dependent; that is, a long series of ones won't degrade the same way as several ones followed by a single zero, etc.
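You can see this data dependence with another toy sketch (again my own illustration): model a lossy cable as a simple first-order low-pass filter and look at the voltage at each bit's decision instant. The same zero lands closer to the decision threshold when it follows a long run of ones, because the line hasn't had time to discharge - that's inter-symbol interference.

```python
def lowpass(samples, alpha):
    # First-order low-pass: a crude model of a bandwidth-limited cable.
    # Smaller alpha = lossier cable (slower edges).
    out, v = [], 0.0
    for x in samples:
        v += alpha * (x - v)
        out.append(v)
    return out

SPB = 4  # samples per bit period

def send(bits):
    # Hold each bit's voltage for SPB samples.
    return [float(b) for b in bits for _ in range(SPB)]

def sampled(bits, alpha):
    # Voltage at each bit's decision instant (end of its period).
    v = lowpass(send(bits), alpha)
    return [round(v[i], 3) for i in range(SPB - 1, len(v), SPB)]

# The same final zero, with different history:
print(sampled([1, 0], alpha=0.3))              # zero after a single one
print(sampled([1, 1, 1, 1, 1, 0], alpha=0.3))  # zero after a run of ones
```

Run it and the zero after the long run sits at a higher voltage (less margin) than the zero after a single one - the degradation depends on the bit pattern, exactly as described above.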
A digital waveform can be very badly distorted, as viewed by the eye, but a well designed line receiver will still properly distinguish ones from zeros. Furthermore, if an error is made, or even a group of errors, perhaps due to some unrelated power glitch or a scratch on the CD, a data stream with error correction encoding (like a CD's) will still be recovered exactly. Them's the facts.
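The "recovered exactly" part is worth a quick demonstration. CDs actually use CIRC (interleaved Reed-Solomon), which is far more powerful, but a toy repetition code shows the principle: add redundancy, and isolated bit errors vanish on decode.

```python
def encode(bits):
    # Toy repetition code: send each bit three times. (A real CD uses
    # CIRC / Reed-Solomon; this just illustrates the principle.)
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1, 0, 1, 0, 0]
coded = encode(data)
coded[4] ^= 1   # flip one bit: a "scratch" on the disc
coded[9] ^= 1   # and another, in a different code word

assert decode(coded) == data   # recovered exactly despite the errors
print("data recovered exactly:", decode(coded) == data)
```

So as long as the error rate stays within what the code can correct, the bits that reach the DAC are bit-for-bit identical regardless of the cable.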
Precisely. Which is why it is so curious that digital cables do seem to sound different from one another.