Yes, same for AES/EBU.
I would not be surprised if cable designers do in fact lack knowledge. There’s one well-known cable company that sells a digital 110 ohm cable terminated in RCAs, which is way off base: the S/PDIF protocol calls for a 75 ohm cable, while a 110 ohm cable is an AES/EBU cable, and there you use XLRs.
I had another cable drop-shipped to me direct from the manufacturer that simply did not work: no continuity in the center pin.
Another cable - an XLR, from yet another manufacturer, developed an intermittent short. Upon closer inspection, it was clear the workmanship in connecting the XLR was quite substandard.
I’ve experienced RCAs without any real strain relief that eventually led to a broken connection.
I would rather consider the opinion of an EE doing digital work than that of someone who buys reels of Mogami or whatever cable, puts on some Techflex, gets some heat shrink with a company name printed on it, and does in-house terminations, because I’ve come to think many cable companies are no more than just that: repackagers. At least Blue Jeans Cable is utterly transparent about that.
I would not be surprised if cable designers do in fact lack knowledge. There’s one well known cable company that sells a digital 110 ohm cable, terminated in RCAs, which is way off base because SPDIF protocol calls for a 75 ohm cable while a 110 ohm cable is an AES/EBU cable and there you use XLRs.

Yes, that could very well reflect lack of knowledge, but another possibility that wouldn’t surprise me is that it was done intentionally, to make that cable sound as different as possible from the competition. My perception has been that it is not uncommon among audiophiles for "different" to be perceived as "better," at least in the short term, even if it isn’t. And as Steve wrote in this paper regarding jitter, which would presumably be the main consequence of this kind of impedance mismatch:
I remember the original threads and article claiming this and had bounced it off a very sharp engineer friend of mine at the time and several others. In short, I don't believe it is true; there are more than a few high-quality SPDIF and AES cables on the market which test out perfectly at 1.0 meter.
It depends on the transition time. 1m might be perfectly fine with transports that have 15ns transition times. In addition, reflections appear only at impedance boundaries; with a perfect match (rare) there will be no reflections. Very short cables, under a foot, will also work fine. The >1.5m rule is just for the general case of a typical transport.
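The dependence on transition time can be sketched numerically. This is only an illustration: it assumes roughly 5 ns/m propagation delay in the cable (a common ballpark for coax, not a measured value) and a receiver threshold near mid-transition.

```python
# Illustrative sketch: whether the first reflection arrives safely after the
# threshold point depends on the transition time. Assumes ~5 ns/m propagation
# delay (a rough ballpark for typical coax, not a measured value).
PROP_NS_PER_M = 5.0

def reflection_clears_threshold(length_m: float, transition_ns: float) -> bool:
    """True if the first reflection arrives after the mid-transition
    threshold point. Round trip = 2 * length * delay per meter."""
    arrival_ns = 2 * length_m * PROP_NS_PER_M
    return arrival_ns > transition_ns / 2

# With a fast 15 ns transition, even a 1 m cable clears the threshold:
print(reflection_clears_threshold(1.0, 15.0))   # True
# With a slower 25 ns transition, 1 m does not, while 1.5 m does:
print(reflection_clears_threshold(1.0, 25.0))   # False
print(reflection_clears_threshold(1.5, 25.0))   # True
```

Which is consistent with the point above: a fast transport can be fine at 1m, while the 1.5m rule hedges for the slower, typical case.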
Jitter is basically added noise. Music free of noise might sound clinical and sterile at first. One person even posted, after listening to a jitter-suppressing DAC, that he preferred all instruments "together" (a sound blob) instead of hearing individual instruments. Also, music with added noise can sound more dynamic, like a distorted guitar vs. a clean jazz guitar. I believe that jitter destroys everything, but we tolerate it since we got used to it.
I remember the original threads and article claiming this and had bounced it off a very sharp engineer friend of mine at the time and several others. In short, I don’t believe it is true; there are more than a few high-quality SPDIF and AES cables on the market which test out perfectly at 1.0 meter.

As is explained in Steve’s S/PDIF article to which Zavato provided a link, what length will be optimal depends on the risetimes and falltimes of the signal provided by the component which drives the cable. (Risetimes and falltimes refer essentially to the amount of time required for the signal to transition between its lower and higher voltage states, and vice versa.) Those parameters are rarely if ever indicated in the specifications of digital audio components.
Several other factors are also involved, including the happenstance of how well the impedances of the cable and the two interconnected components match, the propagation velocity of the particular cable, the susceptibility of the particular components to ground loop issues, the jitter susceptibility of the particular DAC, etc.
So there is obviously some unpredictability that is involved. But as someone who has also on occasion been alleged to be a sharp engineer :-) I would consider Steve’s recommendation of 1.5 meters to at the very least provide the best odds of being optimal.
This assumes, btw, that a very short length, such as say 6 inches, is not practicable. In circumstances where it is practicable, I suspect it is likely to be an even better choice than 1.5 meters.
Edit: This post was written before seeing Kijanki’s post just above, with which of course I agree.
Speaking of digital, several years ago a buddy came over to hear my system, listened a while, then got up and pulled the power cord from my internet modem/router. The sound went from very good to live music!
He also put a heavy blanket over my TV, which was not near the system. Not as big a deal, but a slight difference.
I too have noticed this phenomenon, when I lived in my apartment. The differences were NOT subtle. All the electronics, including my alarm system and wifi were all on the same branch.
In my new home, I don't have a dedicated line per se, but nothing extraneous is shared on the electrical branch, which I think is a good thing. I haven't yet checked whether unplugging the modem and wifi downstairs, on a different circuit, makes a difference in the house. But you are not alone. It was almost mandatory for serious listening in my apartment.
Are you all saying any coax digital cable that is not 1.5m long should be tossed out?

Kalali, please read the posts by me and by Kijanki dated 1-25-2016. As you’ll see, there are numerous technical variables affecting what length will be optimal in a given application. Some of those variables are almost never specified (e.g., the risetimes and falltimes of the signal provided by the source), and some have little if any predictability (e.g., susceptibility of the two connected components to ground loop effects, which can contribute to jitter at the point of D/A conversion).
So as I said in concluding my post:
I would consider Steve’s recommendation of 1.5 meters to at the very least provide the best odds of being optimal.

Also, past threads here have provided anecdotal evidence that 1.5 meters is not always the best choice. Some members have reported making direct comparisons of 1.5 and 1 meter cables that are otherwise identical, and preferring the 1 meter length.
Also, some comments on the Lessloss writeup that Steakster linked to, which I disagree with to some extent:
It makes no mention of the effects of length on the **timing** with which signal reflections arrive at the destination component (i.e., the DAC), and instead focuses mainly on the amplitude with which those reflections arrive. But in a home system application what is **far** more likely to be significant (for a given set of component and cable impedance values, within their respective +/- tolerances) is arrival time, as explained in Steve Nugent’s paper that was linked to earlier in the thread. Not arrival amplitude, which won’t differ greatly as a function of cable length, in home system applications.
In fact the Lessloss paper itself states that it provides a "reflection-attenuation network, built into the very cable itself, ... [which reduces] the level of the first reflection by 5.6 dB. This is equivalent to a silver digital line of this type of 117 meters in length."
5.6 dB is a reduction of only about a factor of 2 in terms of voltage, which by their own statement would occur without the special built-in network only if the cable were 117 meters long!
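As a quick sanity check on that factor-of-2 figure, here is the standard dB-to-voltage-ratio conversion (nothing specific to the Lessloss cable, just the usual 20·log10 convention):

```python
# Sanity check: convert a dB attenuation figure to a voltage ratio using the
# standard 20*log10 convention for voltage (as opposed to 10*log10 for power).
def db_to_voltage_ratio(db: float) -> float:
    return 10 ** (db / 20)

ratio = db_to_voltage_ratio(5.6)
print(f"5.6 dB is a voltage ratio of {ratio:.2f}")  # ~1.91, i.e. roughly 2x
```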
The reason timing is what matters is that what the DAC detects are the **transitions** between the high voltage and low voltage states (and vice versa) of the signal it receives. If the reflections arrive at times in between those transitions, or at times during those transitions that are not close to their mid-point, they will be ignored.
That said, I have no specific knowledge of how the reflection-attenuation network of the Lessloss cables may be designed, and no experience with or knowledge of the sonics Lessloss digital cables may provide in typical applications. In any event, though, it sounds like the inclusion of the special network in their cables means that the rationale for the usual 1.5 meter recommendation is inapplicable in their case.
Al, thanks as always for the thorough explanation. I saw the references to the comparisons between 1.0m and 1.5m cable lengths but was left with the impression that longer lengths were not desirable, all things considered and being equal. To your point, I find it difficult to believe a single length could be optimal in all circumstances given all the other variables, including the cable material and construction techniques.
Interesting... testing 1m, 1.5m and 2m AES/EBU cables that test out as absolutely compliant with the 110-ohm spec, there is no perceivable (i.e. audible) difference in sound quality, pacing, image placement, etc. with AES/EBU cable lengths from 1 meter on up. I cannot speak for 0.5m or 0.75m AES/EBU cables, or for those I've not tested (with music, where it matters) that are not of 110-ohm characteristic impedance, but for those I have tested on a high-resolving system that shows me everything out of place, there does not seem to be any difference.
Contrast that with the fairy tale often propagated that for 50-ohm master clocking applications there is no difference between use of a proper 50-ohm and a proper 75-ohm cable. This could not be further from the truth in that case, as evidenced by some testing that several of us plus a well-known cable manufacturer are going through. In this case it does very much matter, i.e. the cable must be of the correct impedance from end to end. In that camp, several manufacturers (component as well as cable) are propagating the theory that as long as the cable lengths are not long enough for the cable to become a transmission line, having 50-ohm versus 75-ohm does not matter. Our testing/listening has proven (we think) otherwise. The tests were done with cable anonymity (i.e. a label on them, nothing more) and no foreknowledge of whether a cable was 50-ohm or 75-ohm. All cables were 1.5 meters in length.
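The size of the mismatch in that 50-ohm vs 75-ohm case can be put in numbers with the textbook transmission-line reflection coefficient (a standard formula, not something taken from the testing described above):

```python
# Standard reflection coefficient at an impedance boundary:
#   gamma = (Z_load - Z_0) / (Z_load + Z_0)
# where Z_0 is the cable's characteristic impedance and Z_load is the
# impedance it drives.
def reflection_coefficient(z_load_ohms: float, z0_ohms: float) -> float:
    return (z_load_ohms - z0_ohms) / (z_load_ohms + z0_ohms)

# A 75-ohm cable driving a 50-ohm clock input reflects 20% of the voltage
# (the negative sign indicates an inverted reflection):
print(reflection_coefficient(50, 75))   # -0.2
# A properly matched 50-ohm cable into a 50-ohm input reflects (ideally) nothing:
print(reflection_coefficient(50, 50))   # 0.0
```

So "short enough not to be a transmission line" or not, a 75-ohm cable into a 50-ohm clock input is a 20% voltage mismatch at each end.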
Is it possible that the most important part of the equation (and I've read the Positive Feedback posting) is whether the cable, be it S/PDIF, AES/EBU, etc., is properly built to the letter of the spec and tests out completely compliant at the target characteristic impedance, rather than the length of the cable (1 meter and over)?
Assuming a 25ns transition time (typical transport) and about 5ns/m signal speed (typical cable), the reflection in a 1.5m cable will come 15ns later (after the beginning of the transition), just missing the threshold point around 12.5ns (so it deforms the transition by adding to the original signal above the threshold point). Engineers tried to predict it, even making special tools for that (Bergeron diagrams), but different transports have different (unknown) slew rates, while the speed of the signal in the cable is dielectric dependent. It is all a guessing game. A system with very short transitions (a few ns) will be less sensitive to the first reflection but will produce much stronger reflections at smaller impedance boundaries. We can only say that there is a chance that a 1.5m or 2m cable might be better than a 1m cable (exactly the opposite of ICs or speaker cables).
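The arithmetic above can be worked through explicitly. The 25ns transition time and 5ns/m propagation delay are the same illustrative "typical" values assumed in the post, not measurements of any particular transport or cable:

```python
# Working through the numbers above: assumed 25 ns transition time and
# 5 ns/m propagation delay (illustrative typical values, not measurements).
TRANSITION_NS = 25.0        # transport transition (rise/fall) time
PROP_NS_PER_M = 5.0         # signal propagation delay per meter of cable

def first_reflection_ns(length_m: float) -> float:
    """Arrival of the first reflection after the start of the transition.

    The reflection travels to the far end and back, hence the factor of 2.
    """
    return 2 * length_m * PROP_NS_PER_M

threshold_ns = TRANSITION_NS / 2  # receiver threshold near mid-transition

for length_m in (1.0, 1.5, 2.0):
    arrival = first_reflection_ns(length_m)
    verdict = "after threshold" if arrival > threshold_ns else "at/before threshold"
    print(f"{length_m} m cable: reflection at {arrival:.1f} ns -> {verdict}")
```

With these numbers a 1m cable's reflection lands at 10ns, before the ~12.5ns threshold crossing, while 1.5m and 2m land at 15ns and 20ns, safely after it, which is the basis of the 1.5m recommendation.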