Why do digital cables sound different?

I have been talking to a few e-mail buddies and have a question that isn't being satisfactorily answered thus far. So...I'm asking the experts on the forum to pitch in. This has probably been asked before but I can't find any references for it. Can someone explain why one DIGITAL cable (coaxial, BNC, etc.) can sound different than another? There are also similar claims for Toslink. In my mind, we're just trying to move bits from one place to another. Doesn't the digital stream get reconstituted and re-clocked on the receiving end anyway? Please enlighten me and maybe send along some URLs for my edification. Thanks, Dan
Dan, you've just asked the $60,000 question. There is debate, even among audiophiles, as to whether digital cables make any difference at all. For the longest time I was using an el-cheapo RCA wire that I think came with a VCR I had purchased. I didn't have any complaints about it, but I'd read enough reviews of digital cables claiming a marked difference among them. Well, I took the plunge and purchased Harmonic Technology's Platinum-Link coaxial digital cable to replace the "junk" I had been using. I have a high-end system, but I can't honestly say that I can hear any difference at all between the two cables. I have to agree with you on this. I don't see how cables make any difference when all they have to do is transfer 1's and 0's. I know others will disagree with me. Ultimately, the only thing that matters in this hobby is what your own ears can detect.
What an honest and measured response Macm just made. I think that the differences may be ones of impedance between different cables and also the connectors. I have yet to hear any improvement that even begins to justify the costs involved.
Since we are talking about a digital signal (digits/numbers) and not analog sound you would think any cable would do. However, a bad or weak cable can affect the timing and pace of the digital signal, making your digital clock and jitter control work overtime among other things.
gotta disagree guys. my dac/pre (accuphase dc-330) allows multiple connections between it and a transport (accuphase dp-90, in my case). i have connected between the two: 6 9's (.99999996 pure copper) coax, toslink and optical cables (i can also use aes/ebu but my transport doesn't have an output for such a connection). with the remote for the 330, i can toggle between the connections virtually instantaneously. there's a big perceived difference among the three connections that even the uninitiated appreciate; the optical always wins. this may be one of 'em subjective/objective mysteries but, nonetheless, everybody, i mean everybody, can hear the difference among these connections. i think maybe some of our inscrutable cosmologists are right: despite what einstein theorized, the speed of light ain't really constant.
I think that the placebo effect can be blamed for most of it. Analog cables can be very affected by the impedance match with the components they are connected to, and by construction approach. Any properly functioning digital cable would be indistinguishable from any other. The only people benefiting from overpriced digital cables are the manufacturers who make them and the retailers who sell them. Let the retailer take a blind test and see if they can really tell the difference.
dtf: yeah, and i bet your wife/girlfriend is indistinguishable from nicole kidman. or is that just the placebo effect?
I have to say they DO sound different. I don't know why. They DO! I tried 4 different cheap to mid-priced cables and they all sound different. I am using the Music Meter and MIT T3 for DVD and CD player respectively. If you can't tell any difference, probably your system is mid resolution or your ears might be......
What amazes me about the cynics' justifications for not hearing differences in digital cables is that, when all is said and done, they are the same arguments that drove the "all (analog) cables sound the same" debate of several years ago. First of all, not everyone's hearing is as developed or sensitive as that of some. So it is possible that some simply can't and never will be able to hear these differences. More interesting however is how reluctant many are to give music, that which all of these toys are trying to reproduce, the proper respect. Music (sound) is so complex, so beautifully subtle. Is it so difficult to imagine that there are still many aspects of sound as it relates to recorded music that have yet to be properly explained or even identified? Not to me. Why do we assume that there has to be an explanation "now"? The very things that make us want to listen to certain music, the emotion, the mystery; how on earth can these things be quantified? They can't be. Not yet anyway.
I understand that, in theory, there shouldn't be a difference. However, in my system there are clear (but sometimes subtle) differences between HAVE Canare Digiflex Gold, Transparent (regular old 75 ohm video cable), generic BNC and the Monarchy (solid teflon core) that I'm using now. It is also easy to distinguish between the BNC and S/PDIF outputs (this is simply a matter of impedance). The best combination is using the BNC (true 75 ohms with adapter) and the Monarchy. The solid teflon core really does make a difference -- there are white papers available to explain why this works; contact Monarchy Audio for copies or more information. By the way, this interconnect is not expensive. Under $100 and there really is a significant and very audible difference. This is the only digital interconnect that I noticed a huge improvement with -- something about the solid teflon core actually makes a difference. Go figure.
I don't care what any of the critics say there is a difference. I replaced an audioquest digital pro with an illuminati D60 with Meridian gear years ago and the difference was gigantic for me. How that skinny little wire can make such a difference is unbelievable, I liken it to power cord upgrades from stock - immediate and significant improvement.
I wasn't trying to start a religious war here when I asked the question. I'm just confused. With analogue, it makes sense that if there is an impedance mismatch (among other things), these factors can change the signal being transmitted. The problem is that with digital and such a short distance to move the bits, unless you have a bad cable (where you get a lot of errors), shouldn't ALL good cables move them the same way and thus represent sound the same way? With all the techie types out there involved in audio, I was thinking that someone would have already taken measurements at the sending and receiving side to see IF the digital stream is the same. Isn't it ONLY if the streams are different at the two ends, with various cables, that there will be a difference in sound? This is fairly basic, no? There is no magic involved...and if the digital stream at the receiving end is different than at the sending end, regardless of how it sounds, it is just wrong! Or am I missing something here and not understanding what's involved??? What I am interested in are the possible reasons that cables COULD sound different. My only guess so far is that a weak signal or interference makes it hard for the receiving end to distinguish what is a 0 and what is a 1... I think this was implied by Sugarbrie. It is also my understanding that the pros use a different digital signal, and that the voltages are higher (making the difference between a 0 and 1 more easily discernible?). Can anyone hear differences in using different cables on AES/EBU? Thanks!
I share others' puzzlement here. Differences between coax and AES/EBU are perhaps easier to accept, but when comparing two different brands of coax, what can possibly be going on? In a conversation with a leading DAC designer earlier this week, he agreed there should be no differences as long as the cables are properly designed, which includes hitting the right impedance measurements. And he added that not all cables are designed properly. (There's also the camp that contends that cable length is very important in digital cables due to reflections or something.) But increasingly, the DACs do such a good job of handling all of the timing issues that cable differences should be diminishing. I think I'm just repeating what others have said, but here's my new thought: my BS detector will really start ringing if I hear a claim that some brand's digital cables have sonic properties similar to their analog interconnects and speaker cables. E.g., Cardas Lightning has the characteristic Cardas warmth and fullness, Nordost digital cable is ultra fast but lean, etc. To my mind, whatever it is that lends cables their different sonic characteristics (if you believe that), will not lend the same characteristics to digital cable except by pure, unlikely coincidence. -Dan
Danielho, again I think it is important to look at these issues from the standpoint of the message not the messenger. There simply is too much information in a musical event to encode and then decode for our existing recording and playback equipment. That is not debatable. It is the reason that even the very best systems still don't sound like the real thing. I believe that there are still many types of distortions, still unidentified, that affect a musical waveform whether in the digital or analog domain. While I realize that it is "simply" ones and zeros involved here, is it not plausible that the "code" needed to carry the information that distinguishes whether say, a saxophonist is playing a Selmer or a Yamaha instrument is so inadequate that any problem, or simply difference, in the way that code is transmitted would further distort the sound? I assure you those differences can be heard. I think those things would have to include cable materials and all that we don't know about why say, silver can have a generally identifiable sonic quality. Why is it that I hear a certain family resemblance between the Kimber digital cable (silver) and their KCAG analog interconnect when introduced into my system? There has to be something at work here that we just don't get yet. But my ears tell me so. Happy listening!
There is a reason. The digital signal is transferred at a much higher frequency than the audio signals. At these higher frequencies, the output impedance of the transport, the characteristic impedance of the cable, and the input impedance of the DAC must be matched in order to get all of the energy sent by the transport to be absorbed by the DAC. Any mismatch in impedance will result in some energy being reflected back to the transport. This results in standing waves which distort the signal; that is, it is different than what would ideally arrive at the DAC if all was matched. Some of the energy is dissipated in the output stage of the transport, and some in the cable. There are other cable losses, such as dielectric loss and radiation loss, that are minimized by a well-designed cable. This area of electronic theory is called transmission line theory and will be covered in any book on basic electronic communication theory. This is an oversimplified explanation, but my point is that this is an extremely complex subject that can't be explained by statements like, "I don't see how a wire can make any difference." I taught this subject for 9 years and the more I learned the more I realized how little I knew.
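Bruce's impedance-mismatch point can be sketched with a few lines of arithmetic. The 50 ohm load below is a purely hypothetical mismatch for illustration, not a measurement of any gear mentioned in this thread:

```python
# Sketch of the transmission-line arithmetic behind impedance matching.
# A 75 ohm S/PDIF coax feeding a hypothetical 50 ohm DAC input.

def reflection_coefficient(z_load: float, z_line: float) -> float:
    """Gamma = (ZL - Z0) / (ZL + Z0) for a terminated transmission line."""
    return (z_load - z_line) / (z_load + z_line)

z_line = 75.0   # characteristic impedance of the cable (ohms)
z_load = 50.0   # assumed mismatched DAC input impedance (ohms)

gamma = reflection_coefficient(z_load, z_line)
reflected_power = gamma ** 2  # fraction of incident power reflected

print(f"Gamma = {gamma:.3f}")                      # -0.200
print(f"Reflected power = {reflected_power:.1%}")  # 4.0%
```

With a perfect 75-into-75 match, Gamma is zero and nothing is reflected back toward the transport; the worse the mismatch, the more energy bounces around the line as the standing waves Bruce describes.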
Bravo Bruce, precisely my point.
Danielho - to answer your latest post, no, there is nothing going on beyond what you're supposing - the transport generates a digital stream of data and the receiver receives it. From a technical point of view, it's either received the same as it was sent or it's not, and the benefit of a premium cable would have to be dependent on its ability to allow the receiver of the data to get it "the same" more often than a lesser cable does. The objectivist's point of view would suggest that the only way for the music to sound different would be for the bits to be different (or absent).

One of the severe limitations of the current transport / DAC technology is that there is no redundancy or error correction built into the transfer - the transport sends the data and "hopes" it gets there. If it doesn't, there is no way to fill in the hole or to recognize that it has been changed. Digital audio in the future most assuredly will be transported with a scheme that has complete error correction and redundancy which again, technically speaking, should basically eliminate any instance where the receiver gets something different than what was sent.

Then you have to decide whether the way digital data gets transported, assuming it doesn't get changed at all, can have an effect on the way the music ultimately sounds. I'm certainly not saying that people who say they hear a difference don't, but I can't explain why it would be and I haven't seen a lot of explanations. Perhaps there is something going on that is not yet explainable.

Finally, I agree that somebody in the world has to have run a test where they "capture" the data received by the DAC and compare it to the data sent by the transport, but I have never seen anyone document such a test. With all the communications advances of the past few years, I doubt very seriously that recent transmit and receive circuits / chips are not capable of sustaining the data rates necessary for CD playback over a 1 meter cable in a controlled environment, and therefore there is little sonic degradation based on the digital data being lost or changed between transport and DAC.
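The "capture and compare" test proposed above boils down to counting bit errors between the two ends. A minimal sketch; the byte strings here are made-up stand-ins for real captures from a transport and a DAC input:

```python
# Sketch of the test Danielho proposes: capture the data sent by the
# transport and the data received at the DAC, then count differing bits.

def bit_errors(sent: bytes, received: bytes) -> int:
    """Number of differing bits between two equal-length captures."""
    assert len(sent) == len(received)
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

sent     = bytes([0b10110010, 0b01101100])
received = bytes([0b10110010, 0b01101101])  # one flipped bit in transit

print(bit_errors(sent, received))  # 1
```

If this count comes out zero for every cable under test, any audible difference between those cables cannot be blamed on lost or altered bits, and the debate moves to timing (jitter) instead.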

Pops or anybody: Is the Illuminati D60 the same as the currently advertised Illumination D60? I have been watching this thread, but have nothing to add as I will be purchasing my "first" coaxial cable shortly to add a Bel Canto DAC to my system. I have been in contact with the Cable Co. in regard to 1.0 meter IC's, but now am considering a 12.0 foot (about 3.7 meter) IC, which will allow me to shorten my speaker cable runs (I could then place the DAC and amp very close to my speakers). If anyone has tried 12.0 foot coaxial 75 ohm cables, I would also appreciate comments on your experience. Does the added length degrade the sound?
Ask and ye shall receive. Frogman set off my BS detector. I'm open minded here and do not want to attack anyone personally, but a family resemblance suggests that the listener is projecting a family persona on the family members, because the job of a digital cable is quite different from the job of an interconnect or speaker cable. If an interconnect rolls off high frequencies, will its digital counterpart also roll off JUST THE BITS THAT CARRY THE HIGH FREQUENCY INFORMATION?
Drubin: No. This is a case where "bits is bits" -- while the bitstream obviously carries the musical information, one cannot point to particular bits and say, "here, these are the ones carrying the high frequency content". A role that digital cable may play, as has been pointed out earlier, is in the area of transmission line theory. Impedance mismatches may result in distortions in the waveforms representing the bits (!), resulting in errors in clock recovery, resulting in DAC jitter, resulting in harshness in the sound. These problems can be overcome by using a proper 75 ohm coax cable, sufficient bandwidth in the driver and receiver circuits, and DACs that use well-designed buffer logic to accept the data off the SPDIF interface and reclock it back out without jitter. This technology, found on $49 Discman players, provides "skip-free" operation, so the Discman keeps playing while you're walking, running, bumping into things, etc. Relative to TOSlink, the issue is the bandwidth of the electronics driving the optical transmitters and receivers. If the BW is too narrow, then distortion of the digital waveform ensues, leading to jitter if proper buffering and reclocking circuitry is not used. I agree with previous posters: once you have a reasonably well made optical or 75 ohm coax cable, there is NO value in multi-hundred dollar esoteric audiophile digital or optical cables.
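The "bits is bits" point can be made concrete. In 16-bit PCM, each sample's bits encode the signal's value at one instant, so corrupting a bit damages one moment in time, not one frequency band. The sample value below is an arbitrary made-up number:

```python
# Why no subset of bits carries "only the treble": flipping one bit
# changes one sample value by a power of two.

sample = 100                   # an arbitrary 16-bit PCM sample value
corrupted = sample ^ (1 << 3)  # a transmission error flips bit 3 (+8)
print(sample, corrupted)       # 100 108
# The damage is +8 LSBs at a single instant -- an impulse in the time
# domain, whose energy is spread across ALL frequencies in the output.
```

This is why a cable cannot selectively "roll off" high-frequency musical content at the bit level: bit errors produce broadband clicks or noise, not tonal shifts.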
It all boils down to this: all cables distort/degrade/alter the original signal. The best cable is no cable. The best cables, regardless of whether they are cheap or expensive, are the ones that transport the signal with the least distortion. It seems silly to argue that digital cables have no effect on sound; as CornFedBoy states above, when switching between various cables, differences are heard because each cable changes the original signal by adding various distortions. Sam
Drubin, you claim to be open minded. I say not open minded enough. Wrap your open minded brain around some of these observations and tell me what you come up with: In none of the posts concerning this subject (with the exception of mine) is there a single direct reference to a musical observation as it relates to cable effects. I say, do a lot more listening, educate your ears, worry less about understanding the technical issues and more about understanding the beauty and meaning of music, recorded or not. That is what opens one's mind to better understanding the unexplainable, and much of what we are talking about here is unexplainable in my opinion. Concerning what set off your BS detector: I have recently tried various digital cables between my EAD transport and DAC. I heard without a doubt a very similar change in the sound of my system when I used the Kimber silver digital cable as I do when I use KCAG between pre and amp or between DAC and pre. I don't claim to be able to tell you why that is, only that I heard it. To me it is plausible that there is some connection here that we simply can't explain based on our understanding at the present. Obviously the manufacturer has certain design philosophies that carry over from one line of cable (analog) to another (digital); perhaps the materials used (silver) are the issue. I am comfortable with the idea that there is something going on here that we just can't document yet. You see, my brand of logic tells me that something that is so often bandied about in so flip and simplistic a manner, such as the idea that "bits is bits", can't possibly do justice to the wonderful complexity and subtlety of music. Like Karl I feel that the more I know the less I understand. Just friendly comments and hopefully food for thought.
Dekay, I believe they are one and the same; Illuminati changed its name to Illuminations for some reason, and I believe the cables are now less expensive.
Thanks Pops. David.
Fair enough, Frogman. Good post. I am mostly a subjectivist, have a lot of money invested in cables and hear a lot of differences between things that many argue must sound the same. I think being open minded works both ways. In this particular case, the idea of family resemblances flies so strongly in the face of common sense (to me, at any rate), that I feel we must temper our observations with an open mind to the possibility that what we observe may be influenced by what we expect to hear.
I just ordered the Mapleshade Double Helix Digital Cable. It is not a coaxial design. Since I have never used a digital cable before I thought that I would start with the unknown. I also enjoy talking with Pierre at Mapleshade who has freely shared some good basic advice on my setup. When my system settles down again I will try some other options from The Cable Company, but for right now there are too many changes going on (in my system) to perform sound auditions. When the time comes I will post to this or a new thread as there is not a lot on this subject at the site right now.
Megasam: yes, all cables have some effect (i.e., error) on the signal waveforms passed through them. However, the beauty of digital communications is that, with proper system design, these analog errors can be ignored or removed at the receiving end. That is, errors are NOT necessarily additive in a chain of digital components, as they are in analog communications (i.e., in analog, the SNR must get worse with each additional component in the signal chain). Examples of how errors may be removed in the digital domain include error control coding and reclocking of buffered data. A favorite question of mine is "how can this be so?". When faced with claims of audible differences between digital cables, some possible explanations that come to mind are a) placebo effect, b) a substandard cable has been replaced with one of proper bandwidth and impedance (and once it's "right" it can't get "better" in a digital system), or c) some sort of weird equalization of the digital waveform is going on, where the distortions of a particular cable are being used to compensate for distortions of poorly designed digital data transmitters and receivers in the transport and DAC. In a digital communications system, these kinds of effects are sometimes called ISI, or intersymbol interference. To me, case c) is not an acceptable state of affairs, and cable and audio equipment manufacturers should be better serving us.
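The "error control coding" idea mentioned above can be sketched with a toy 3x repetition code. Real systems use much stronger codes (the CD format itself uses Reed-Solomon CIRC on disc), but the principle is the same: redundancy lets the receiver repair errors rather than accumulate them:

```python
# Toy repetition code: send each bit three times, decode by majority vote.

def encode(bits):
    """Repeat each bit 3 times before transmission."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of 3 received bits."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
sent = encode(data)
sent[4] ^= 1          # the channel flips one bit in transit
print(decode(sent))   # [1, 0, 1, 1] -- the error is corrected
```

This is the sense in which digital errors need not be additive: as long as errors stay within the code's correcting power, the output is bit-identical to the input no matter how many links are chained.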
First - the digital datastream carried on a coax connection is an analogue signal, albeit one used to represent 1's and 0's. Second - I have never seen a signal coming out of one of these cables on a scope that is a perfect square wave. Third - lack of perfection in the square wave means jitter. Fourth - jitter produces harmonic distortion in the output of the DAC, with different forms of jitter distortion producing different harmonic signatures - some sounding soft, some sounding harsh. Fifth - I have never heard or measured a reclocking device (including the Genesis Digital Lens) that does not reveal some of the jitter distortion created by upstream cables and components. And what is more important is that digital cables do sound different, provided of course you have a high resolution system and sensitive ears. I am intrigued however by the observed phenomenon of a cable's sonic signature, when used as an analogue interconnect, being present when used as a digital cable. I have heard this too, and with cables other than Kimber, and I reject the placebo argument in the context of how I test components. I find this one harder to explain and can only surmise that we cannot look at interfaces between components as separate systems, and that each interface may leak artifacts of itself into other parts of the total system. The active devices that buffer interfaces are meant to deal with this, but perhaps no real world electronic part works exactly how it is designed to work?
Even with perfect square waves there will be jitter in a clock recovery circuit, due to the stochastic nature of the bit stream. Buffering the data and reclocking with a nice stable clock avoids the jitter problem, at the expense of some relatively long term drift in average sample rates to accommodate changes in the data transmission clock rate. If done properly (i.e., large data buffer, low loop bandwidths), then the time constant would be on the order of seconds. A $49 Discman CD player with "skip free" circuitry does this. So does my Levinson 360S. Now we just have to get audio manufacturers to work on the price points in between. :-)
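The buffer-and-reclock scheme described above can be sketched numerically. The 100 ppm clock offset and half-second buffer below are illustrative assumptions, not specs of any player in this thread:

```python
# Sketch of buffered reclocking: samples arrive at the transport's
# clock rate and leave at the DAC's own stable local clock rate.

in_rate  = 44100.0           # transport sample clock (Hz)
out_rate = 44100.0 * 1.0001  # DAC clock, assumed 100 ppm fast
buffer_fill = 22050.0        # buffer starts half full: 0.5 s of audio

# With no rate tracking at all, how long until the buffer underruns?
seconds_until_empty = buffer_fill / (out_rate - in_rate)
print(f"{seconds_until_empty:.0f} s")  # 5000 s
```

Even a worst-case crystal mismatch drains a half-second buffer only after well over an hour, which is why a very slow control loop (time constant of seconds, as the post says) is enough to track the average rate while completely decoupling the DAC clock from incoming jitter.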
For some reason all of these explanations are never good enough for some people. This topic inevitably leads to a long debate. Science tries to explain an experience and often falls short. All I need to know is that I tried 5 different coax cables and they all sounded different. One was consistently better than the others, and I bought it. Anyone who can't hear the difference needs to work on their listening skills.
My experience is that there are definite differences between digital cables. I suspect it has more to do with the interface of the plugs and sockets (particularly w/ RCAs) than it does with the cable itself.
Another point: it could be that different cables have differing degrees of shielding effectiveness, and therefore, generate different levels of RFI which is coupled back into the analog electronics. P.S. Don't knock science and engineering. That's what gave you something to listen to. Empiricism has its place, but progress depends on interpreting those results in the framework of physical laws and analytical principles -- or else we'd still be riding ox carts with wooden disc wheels over stone bridges.
1439, I agree with you as well, but I think there are enough possible explanations available to accept that there are differences between cables, and get on with picking the best one for their system, instead of continuing to deny that reality.
I'm not denying reality, just looking for plausible explanations of it, and pointing out that existence of a plausible explanation (other than placebo effect) means that our beloved audio equipment has design deficiencies. Consider the case of digital cable interconnects between computer equipment: either they work right or they don't. One does not swap printer cables, for example, hoping to increase the resolution of the printed page! If it is indeed the case that digital cables have effect on the sound, why is this an acceptable situation? The whole point of digital technology is to avoid these degradations entirely. I expect more for my dollars, and want to spend more on source material, and less on "work-arounds" (e.g., esoteric cables needed to compensate for performance shortfalls in the equipment design).
1439, I was speaking generally and not personally. It seems to me that some high end companies like VAC and Mark Levinson are addressing a similar issue by manufacturing integrated amps (no need for pre-amp to amp interconnects). I am sure that one day cables will be eliminated altogether. 1439, I'm with you on wanting to spend more on source gear, but even with well designed equipment the cabling still affects sound. To get the best almost always takes more money.
1439 - I agree with your point 100% and have made it myself many times - if differences are heard using different cables, the bits must be being altered (a bad thing), and if they are, then we as consumers should be demanding a better technology for the interface, not spending a bunch of money on cables and transports. As you say, you can transfer bits perfectly in the computing world with good quality but relatively cheap cabling and absurdly inexpensive hardware. If a DAC with an ethernet input interface was available, it would be easy to set up a whole-house music distribution system that performs as well as (or better than, if transport technology is as spotty as it would appear) the best transports. Hopefully such an interface isn't too far off in the future.
You make a mistake if you think that just transmitting the bits accurately is all that is required. Jitter (or time-based distortion) is irrelevant when there is no need for time synchronous transmission - i.e., computer communications. But in audio or video you must deal/live with time-based distortion. Don't fall for the marketing BS that says a Levinson DAC or a Discman eliminates time-based issues through buffering. And by the way, don't believe that doing away with cables eliminates the problem either - otherwise we would have stuck with the three-in-one music centres of the 60's - Bmpnyc, your dream came true forty years ago.
bmpnyc... no offense taken. If there's a physical effect, then there's a physical cause. If there's no physical cause, there can be no physical effect. Those of us with the scientific mindset seek to comprehend the linkage. Understanding the linkage is every bit as enjoyable to me as other intangibles such as "beautiful design" and "great build quality".
Redkiwi - you're right that having time synchronization requirements makes the environment more demanding. However, as long as you have 1) a redundancy scheme and 2) sufficient resources above and beyond the demands of the basic application to support the redundancy scheme, then you can effectively eliminate the time synchronous demands. The Levinson DAC / Discman buffering doesn't eliminate it because there's still no redundancy - if they send the data and it's not received correctly, there's no recovering the lost data. But if I have a 100Mbit ethernet connection and have to keep up with only the bandwidth necessary for CD playback, I can send / resend the data dozens of times if need be and still keep up. If I can transfer files across a LAN perfectly accurately at 10Mbit/sec, I should be able to transfer music "files" perfectly at a rate of 1.5Mbit/sec. If current transport/DAC interconnect technology can't perform this same feat, we should demand better.
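The bandwidth arithmetic behind this post, as a quick sketch. The CD data rate follows directly from the format; the 10 Mbit/s figure is the LAN speed mentioned above:

```python
# CD-DA payload rate vs. a 10 Mbit/s Ethernet link: how much headroom
# is available for retransmitting lost frames?

bits_per_sample = 16
channels = 2
sample_rate = 44100  # Hz

cd_rate = bits_per_sample * channels * sample_rate  # bits per second
lan_rate = 10_000_000                               # 10 Mbit/s

print(cd_rate)             # 1411200
print(lan_rate / cd_rate)  # about 7.1x headroom
```

Roughly 7x headroom means every audio frame could be sent several times over (or retransmitted on error, as TCP does) while still keeping up with real-time playback, which is the redundancy the S/PDIF interface lacks.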
First of all, the different digital interfaces have different bandwidths. Within a single interface type, even slight imperfections can cause signal loss. Only the AT&T optical interface has enough bandwidth to handle all the data correctly. Yes, this is lame by today's standards, but it was leading edge 20 years ago.
I would like to reply to your question. No, No, No, No, No, No!!!!! There is simply no way that any digital cable can color or change the sound of a system. While there are physical reasons why analog cables (interconnects, speaker cables, etc.) can have an effect on sound quality, it is beyond the laws of physical possibility for a digital cable to have an effect. The D/A in your equipment couldn't care less if the 0111001111 bit stream came from a piece of copper or fiberoptic material. I understand that audio is very subjective, but one must be careful when opening one's mouth. Consider the ramifications of someone taking what you are saying as gospel and purchasing an expensive cable when there is no physical possibility the cable will affect the sound one bit.
Bruce, cut the crap! We are not talking about frequencies in the GHz region. These are simple logic levels moving at less than a MHz. I have been a digital designer for 20 years and have never heard such garbage. If there is any effect at all, it would result from delay caused by capacitance in the cable, not by excessive standing waves. If I knew what the impedance of the DAC was, I bet we would see very little return loss if we swept the cable on a network analyzer, even up to 50 MHz. Sorry to burst your bubble but you are way off base.
With all due respect, and I respect your 20 years' experience as a digital designer, my 25 years as a professional musician tell me very loudly and clearly that I hear these differences. What's more, they aren't all that subtle in many cases. My musical colleagues (those that care about these things) also hear them. Is it not more productive and potentially enlightening to consider as plausible what the ears of those who use them for a living hear? I hate to burst your bubble, but I assure you that the subtleties (subtle variations in timbre, pitch, time etc.) that a musician has to be sensitive to playing in say a Brahms clarinet trio are far more subtle in absolute terms than the oftentimes obvious differences that are heard between cables, including digital. By the way, digitally recorded music to many colleagues of mine still doesn't "swing" the way it should, and certainly not as well as good analogue. The groove or "fun factor" is diminished; not catastrophically, but diminished none the less. I would like to respectfully encourage all of us to do more listening without focusing on the technical aspects of the sound. "Hearing" is not only what takes place in our ears, but letting that go on to touch us emotionally. Then that in turn frees us to "hear" more, and the cycle continues. There is infinitely more to hear/experience in most good music than most think. I remember that years ago when I first started reading the mags, a couple of reviewers were fond of pointing out in their description of the prowess (or lack thereof) of various very expensive components, that these components were somehow to be praised for allowing the listener to "hear that the instrument being played was an English horn and not an oboe". This is almost laughable; I assure you that the difference in timbre between these two instruments is so obvious that they can be easily heard over the lamest grocery store sound system. Then why bother? Because there is so much more than most imagine.
I point all of this out only to encourage the cynics to consider the possibility that they are missing out on a whole lot of fun in their listening by letting technical issues dictate what and how much they should be able to hear. Happy listening.
Hey Gmkowal, digital cables do sound different! From a theoretical standpoint, there does not seem to be a basis for "different sounding digital cables". As an electrical engineer myself, who represents a high-end manufacturer of A/D and D/A converters, I absolutely agree with your statements. I, unfortunately, like so many, cannot come up with a good explanation for why different cables yield different sounds. Perhaps it is not the transmission medium per se, but the interaction at either or both ends, or some combination. I wish I could find the reason, because it exists! Sometimes the sonic differences can be as large as changing interconnects, really! Each D/A converter differs in how large the change may be. (I am currently on my ninth digital front end, and my sixth transport, so I have quite a bit of experience here.) If you have already tried to hear the differences (for giggles, since it can't possibly be true), and you don't hear any, then the resolution of your speakers, amplifiers or other components is not allowing you to experience the differences. Some of the major sonic differences between cables involve the harmonic structure of the music, and the soundstage width or depth. Unfortunately, this is also the smallest detail to preserve all the way to the speakers. Please try to open your mind on this one; it took me 3 years of preaching "there is no way there can be a difference, I do this for a living!", then (for giggles) I tried to prove my point. Boy, was that an embarrassing moment. As a digital designer, you should consider that maybe there is something we are not considering as trained and schooled "experts". In the end, maybe you can become the hero that comes up with a logical answer to "why do digital cables sound different?". I stopped trying, and just listen to music through my best sounding digital interconnect.
Why is it that when refuting the argument that there is no technical reason for a digital cable to impart a sonic difference, those who hear a difference resort to the, "well, your system must not be resolving enough and/or your hearing not good enough or well-trained enough" line? Why not, "you're claiming I'm pre-disposed to hear a difference, while I think you're pre-disposed to NOT hear a difference". I would guess that if you took all the systems owned by those who say that they don't hear a difference and compared them to the systems of those that do, the quality and resolving capabilities would be quite similar. In any case, that point of view always detracts from the discussion in general, especially when it's layered on top of, "I don't know why it sounds better, it just does". If the system is indeed more resolving, offer up a hypothesis on how a system on which such differences can be heard effects this improvement so we can all learn from it.
I have a small confession: I enjoyed choosing the cables in my system. This enabled me to "customize" an otherwise mediocre system and approach the high end without spending too much money. I know "too much" is relative. My cabling is valued at around 35% of the cost of the hardware, but thanks to Audiogon I was able to spend around 20%.
Response to Kthomas about not offering up a hypothesis: I refuse to offer up a purely speculative hypothesis about an audio issue that simply has no presently known electrical-engineering basis. I don't know how long you have been involved with high-end audio, but let me tell you what happens with far-reaching hypotheses regarding electronic issues that don't seem to make any sense: THEY BECOME THE BASIS FOR MANY SNAKE OIL COMPANIES, THAT PURPORT TO BE HIGH END AUDIO MANUFACTURERS, TO INCORPORATE INTO THEIR PRODUCT LINES! I have seen this happen again and again. I remember when Class A amplifiers were considered the "best" sounding amplifiers, so therefore "Class A amplifiers are the best amplifier designs". The same thing happened with zero-negative-feedback designs, and more recently, SET amplification. Unfortunately, these so-called "facts" are very misunderstood, and EXTREME generalizations at best. Each "fact" has set the high-end electronics industry back more than you can imagine. As audiophiles buy into these facts, they push manufacturers to build products that incorporate these topologies, EVEN IF THIS IS NOT THE BEST WAY TO GET GREAT SOUND! The best example I can think of regards speaker cable design. It has been hypothesized that a great sounding cable needs a ton of current-carrying capability. So in turn, the majority of the high-end cable companies have produced these huge thick cables for us audiophiles. Indeed, many manufacturers don't even consider that thinner-gauge designs might produce a much better sound. After all, the more current-carrying capability, the better the bass and dynamics, right? This is no longer a hypothesis, but a fact, right? Well, this is just not the case. I personally know three speaker cable manufacturers that are pulling their hair out, trying to design cables that represent what the audiophile community thinks is better (read: big and heavy)...
BUT... this is just not true of many better sounding, properly engineered configurations. Other cable companies (that I personally know... read: big, well-known names) have figured out how to make each of their larger cable offerings sound better as they incorporate more conductors, making each cable more expensive in the process. More importantly, they can then charge an ear, an arm and a leg for these huge cables. These guys (who know the real truth) are laughing all the way to the bank! The other cable manufacturers, the ones that design huge cables, are not necessarily trying to rip us off. They have just not discovered the ultimate truth about cable configurations and conductor size. (It amazes me how a "fact" can be created overnight in the audio engineering community.) This is why I refuse to offer up speculative, potentially nonsense hypotheses that have no engineering basis. When I have a legitimate hypothesis that has some fundamental electronic explanation, I will always offer it up as a hypothesis (with hope that it will not become a "fact" overnight).
I think I am repeating myself, but continue to get replies which imply that the only reason why digital cables could sound different is bit errors. This is not the reason at all - the reason is jitter - noise-based and time-based distortion. The problem is not about distortion causing a DAC to read a 0 as a 1, or a 1 as a 0. It is about the fact that we are talking about real-time transmission and that a DAC produces harmonic distortion at its output when the arrival times of the 0s and the 1s are not perfectly, regularly spaced. I really am having trouble saying this in as many different ways as I can.

It is not about redundancy so that when an error occurs the data can be resent - we are not talking about data packet transmission here. Bandwidth capability is in fact an issue here. Even though the bandwidth for data transmission is low by most standards, if the cable was only just able to transfer the data accurately then the square waves would be very rounded indeed and jitter errors at the DAC would be enormous. Higher bandwidth cables allow sharper corners to the square wave with less undershoot or overshoot. Optical cables are also free from earth noise adding to the signal.

It is not about bit errors, it is about timing based distortions. I work with loads of PhD telecommunications engineers but their grasp of these concepts is slight at best, because it is irrelevant for the audio fidelity needs of telephony and irrelevant for data packet transmission. But the best of them acknowledge that their training is insufficient for high quality audio.
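[Editor's note: the size of the jitter effect can be sketched numerically. The rule of thumb for a sine of frequency f converted with rms clock jitter σ is SNR ≈ -20·log10(2π·f·σ). The Monte-Carlo below uses an assumed, simplified model of the conversion: an ideal sine evaluated at slightly wrong instants, which is only a sketch of what a real DAC does.]

```python
import math
import random

# Simplified model of jitter-induced distortion: reconstruct an ideal sine
# at sample instants perturbed by Gaussian timing error, then compare the
# error power against the signal power.

def jitter_snr_db(f=10_000.0, fs=44_100.0, sigma=1e-9, n=200_000, seed=1):
    """SNR (dB) of a sine at f Hz sampled at fs with rms jitter sigma seconds."""
    rng = random.Random(seed)
    sig_pow = err_pow = 0.0
    for k in range(n):
        t = k / fs
        ideal = math.sin(2 * math.pi * f * t)
        jittered = math.sin(2 * math.pi * f * (t + rng.gauss(0.0, sigma)))
        sig_pow += ideal * ideal
        err_pow += (jittered - ideal) ** 2
    return 10 * math.log10(sig_pow / err_pow)

# Rule-of-thumb prediction for a 10 kHz tone with 1 ns rms jitter (~84 dB).
theory = -20 * math.log10(2 * math.pi * 10_000 * 1e-9)
print(f"simulated: {jitter_snr_db():.1f} dB, theory: {theory:.1f} dB")
```

The point the simulation illustrates: nanosecond-scale timing errors, far too small to flip any bit, still put the distortion floor within reach of audibility claims, which is why the debate centres on jitter rather than bit errors.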
Redkiwi: I get it. It is not only that you say yes and no, but "how" you say it as well. As far as I know nothing is perfect and there are always variances to be had. The atomic clock at NIST in Boulder is pretty accurate, but I would assume that DACs and digital transmission lines are not even in the same ballpark. Perhaps when we get organic-based DACs all cables will sound the same?
Redkiwi, the overshoot and undershoot you speak of are caused by capacitance in the cable. While overshoot and undershoot themselves may not necessarily affect the DAC output signal, the capacitance in the cable may affect the data pulses' risetime. This effect, I would assume, may be audible. While jitter is a degrading factor in a system of this type, the transmission cable is not likely to increase or add jitter to the bitstream. I feel capacitance is the real culprit here. The less capacitance in the cable, the better.