Digital cable optimal length?

Last time I asked about optimal phono cable length, I got mostly answers like 1.5 m or less. I have experimented since then, using a 6 ft RCA/RCA phono cable and comparing it against a 1.5 m RCA/RCA cable of the same make, over a long period with all three of my TT setups, and the result was the same:

The sound quality DETERIORATES drastically as the cable gets longer (I also tried a 4 ft length).

Now the question about digital cable.
Would a 3 m BALANCED/BALANCED digital cable give similar results?
Have you tried?

Plenty of reading material and links if you do a search for "length" in the Digital forum over at Audio Asylum. You can also search the Digital forum on Audiogon. I've never had the opportunity to compare different lengths of digital cables. My understanding is that the shorter 1/2 meter cables are not the best way to go. The 1 1/2 meter length appears to be the recommended length for regular SPDIF. I thought that I read somewhere that Kimber states that the Orchid AES/EBU performs best at a 1 1/2 meter length also. I do not know if that would apply to all AES/EBU cables. Apparently the length affects how resistant the cable is to "back reflections" of the digital signal that can cause added jitter.
I agree with MrDerrick. See the post by Tobias, and the subsequent comments by me and others, in this thread:

The technical factors for digital audio transmission are completely different from those for a phono cable, due mainly to the much higher-frequency spectral components present in digital audio signals, as well as cartridge sensitivity to cable capacitance (for moving magnet cartridges), very different signal amplitudes, etc.

-- Al
1.42 meters..Tom
what sonic differences can one expect to hear when using a cable too short or too long?
Too short means confusion and lack of coherence. Musical phrasing is smeared, timing seems subtly off, the image is out of focus. Instruments are harder to place and lack definition. Soundstage is vague. It's easy to spot the difference if you compare two lengths of the same cable.

I've never heard too long but I imagine the effect must be about the same, since the problem--timing of internal reflections--is the same.
Thanks Tobias, Almarg. I did search under 'length' and found more results. I still don't quite understand the technical rationale, other than maintaining a certain impedance. I do believe it would be unwise for me to buy a longer cable just to try out; chances are it would be a waste of money.

I am not sure whether buying 'cheap' 1.5 m and 3 m digital cables from RadioShack would give me a conclusive result. But if it does not cost too much, I might try this.

Theaudiotweak's answer of 1.42 m is rather curious. Would you care to explain, Tom?
I owned both .5 m and 1 m Kimber Orchids for use between my CAL Delta and either the Sigma or Alpha DAC (back when). The .5 m was consistently terrible (in a number of audible ways) compared to the 1 m (which was wonderful). Rat Shack cables (like everything else from them) would be a total waste of money. These are inexpensive but highly regarded by everyone that has auditioned them: ( They won't break the bank at any length you choose, either. NO- I've no "connection" with the company. =8^)
I still don't quite understand the technical rationale, other than maintaining a certain impedance

At high frequencies (much higher than audio frequencies), what are known as transmission line effects come into play, for electrical signals travelling through cables. One of those effects is that if the impedance of the cable, the connector, and the load (destination) device are not precisely the same (and they never are), some fraction (usually a small fraction) of the incoming energy will be reflected back toward the source (instead of being absorbed by the load).

When that reflection arrives at the source, it will again encounter an imperfect impedance match, and so some small fraction of it will be re-reflected back to the original destination.

The length of the cable affects the amount of time that is required for that two-way round-trip. When that re-reflection arrives at the load, it (or most of it, the part that is not re-reflected once again) will sum together with the original waveform, resulting in small but significant distortion of the original waveform.
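As a rough illustration of the point about imperfect impedance matches, the reflected fraction of the voltage at a discontinuity is given by the standard reflection coefficient. The impedance values below are made-up but plausible numbers, not measurements of any actual gear:

```python
def reflection_coefficient(z_load, z_line):
    """Fraction of the incident voltage reflected where a line of
    impedance z_line meets a load of impedance z_load."""
    return (z_load - z_line) / (z_load + z_line)

# Hypothetical example: a nominal 75-ohm SPDIF cable feeding a DAC
# input that is actually 80 ohms -- a small real-world mismatch.
gamma = reflection_coefficient(80.0, 75.0)
print(f"Reflected fraction: {gamma:.3f}")  # a few percent reflects back

# A perfect match reflects nothing:
print(reflection_coefficient(75.0, 75.0))
```

As the formula shows, only an exact match gives zero reflection, which is why some re-reflection is always present in practice.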

With a digital signal that is used for clocking as well as to convey data, what is important is that whichever edge of the signal the destination device uses for clocking be as "clean" and undistorted as possible (by "edge" I mean a transition from either low to high or high to low, i.e., 0 to 1 or 1 to 0; some applications actually use both edges). Otherwise jitter results (meaning small fluctuations in the timing of the clock period). Typically the middle area of a transition edge is what the destination device responds to, so the cable length should be such that the re-reflection does not arrive at that time. That time, in turn, will depend on the risetime (or falltime) of the edge (the time it requires to transition from high to low or low to high). Quoting from myself in the thread I linked to above:

If the input impedance of the dac and the impedance of the cable don't match precisely, a portion of the incident signal would be reflected back to the transport. A portion of that reflection would then re-reflect from the transport to the dac. The two-way reflection path, assuming propagation time of roughly 2 nanoseconds per foot, would be 12ns for the 1m cable, and 18ns for the 1.5m cable.

I don't know what the typical risetimes/edge rates are for transport outputs, but it does seem very conceivable that the extra 6ns could move the arrival time of the re-reflection sufficiently away from the middle area of the edge of the original incident waveform so that it would not be responded to by the digital receiver at the dac input.
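For what it's worth, that round-trip timing is easy to tabulate. This sketch uses the same ~2 ns/ft propagation assumption but an exact feet-per-metre conversion, which is why it gives roughly 13 ns and 20 ns rather than the rounded 12 ns and 18 ns figures quoted for a metre treated as about 3 ft:

```python
def round_trip_delay_ns(length_m, ns_per_ft=2.0):
    """Two-way (source -> load -> source -> load) reflection delay
    through a cable, assuming ~2 ns/ft propagation speed."""
    length_ft = length_m / 0.3048  # metres to feet, exact conversion
    return 2 * length_ft * ns_per_ft

for length in (1.0, 1.5):
    print(f"{length} m cable: {round_trip_delay_ns(length):.0f} ns round trip")
```

The half-metre difference between the two cables thus shifts the re-reflection's arrival by roughly 6-7 ns, which is the margin being discussed.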

Hope that clarifies more than it confuses!

-- Al
Just curious -- though I haven't read anything to this effect -- would this 1.5 m rule apply to standard interconnects as well? Or, in general, is shorter just better?
No, all of this has no relevance whatsoever to interconnects carrying analog audio signals, or any other kinds of signals of comparably low frequency.

And in the case of digital interfaces, the 1.5 meter figure would only be applicable to situations where the signal risetimes/falltimes are similar to the values for typical transport-to-dac digital interfaces. Similar principles would apply to other digital and high speed interfaces, but the numbers would work out differently.

-- Al
Thanks Al. Appreciate your knowledgeable posts.
Yes, indeed, Al. Thanks for the elaborate clarification. It is much clearer now.
Thanks Roadmann99999 for the link. I had Canare once. $32 for a 3 m digital cable is surely a bargain. Maybe for me the solution would be to bring my other transport closer to my common DAC.

I am still itching to try out though. If it works, it will save me a lot of trouble. The real estate between two transports, a Pre, three TTs, two phonos is really crowded. All of my components have separate power supplies too, and that makes things even more jumbled.

The Canare idea sounds real good. I will report back. Later..
Nilthepill -- A further thought. While I haven't researched or particularly seen listener comments on going longer than 1.5m, it seems to me that you should be able to have a significantly longer run than that without running into the problem we have been discussing (that arises from going shorter than 1.5m).

Especially if you are using only redbook cd data rates (44.1kHz sampling, 16bit data for each of 2 channels). But perhaps even at 96kHz/24bit/2 channels, or more.

Given that the two-way propagation delay of a 1.5 meter cable gets the re-reflection past the middle area of the leading edge of the original waveform, adding the additional delay of a longer cable will not become a problem until it is large enough to place the re-reflection on the NEXT edge, the next edge being of the opposite polarity (e.g., negative-going instead of positive-going). (Even if only the positive-going edges of the waveform are used by the dac, distortion in the middle of a negative-going edge could conceivably cause it to be seen as a positive-going edge).

The clock rate is 2 times the bit rate for SPDIF and AES/EBU:

So for redbook data the clock rate is around 2.8MHz, and a half-period is around 180ns. At a propagation speed of 2ns/foot (roughly typical for electrical cabling), or 4ns/foot once the round trip is factored in, that corresponds to a length of 45 feet, or about 14 meters. To allow some tolerance on what part of the edge is actually responded to by the dac, we should reduce that somewhat, say to 10 meters.

At the other extreme, if you were transmitting 192kHz 24bit samples, you are increasing the data rate by a factor of about 6.5 compared to redbook (192/44.1 x 24/16), so the 10 meters would be reduced to approximately the 1.5 meters we have been talking about. For 96kHz 24bit samples, the corresponding answer is right at the 3 meters you were hoping to use.
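That back-of-envelope reasoning can be turned into a small sketch. The function name, the explicit 10/14 derating factor, and the channel-count parameter are my own framing of the numbers above, not anything taken from a standard:

```python
def max_safe_length_m(sample_rate_hz, bits_per_sample, channels=2,
                      ns_per_ft=2.0, derating=10.0 / 14.0):
    """Rough maximum SPDIF/AES-EBU cable length before the re-reflection
    could land on the NEXT signal edge, per the reasoning above."""
    bit_rate = sample_rate_hz * bits_per_sample * channels  # raw data rate
    clock_hz = 2 * bit_rate                # clock runs at 2x the bit rate
    half_period_ns = 1e9 / (2 * clock_hz)  # time between adjacent edges
    # The round trip doubles the effective delay per foot (2 -> 4 ns/ft).
    max_length_ft = half_period_ns / (2 * ns_per_ft)
    return max_length_ft * 0.3048 * derating  # feet -> metres, with margin

for rate, bits in ((44100, 16), (96000, 24), (192000, 24)):
    print(f"{rate / 1000:g} kHz / {bits}-bit: "
          f"~{max_safe_length_m(rate, bits):.1f} m max")
```

Run as written, this reproduces the figures in the posts above: roughly 10 m for redbook, about 3 m for 96/24, and about 1.5 m for 192/24.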

So the bottom line, it seems to me, is that if you will not be dealing with data rates that are above 96/24, you could very well see no degradation from a 3 meter cable. That assumes, of course, that the cable is good quality, so that other possible forms of degradation don't arise.

Hope that helps,
-- Al
Excellent info Al. Now it is much clearer what is happening. So then, is it possible to precisely calculate the optimum (high success rate) length for 44.1kHz/16 bit data transmittal? I am getting greedy here. Is 4 m better than 3 m? I am sure I won't need longer than 4 m. Is AES/EBU automatically better at 3 m or 4 m, or does SPDIF exceed AES/EBU at these lengths?


Nil -- If you are just using 44.1/16, based on my analysis above you should be equally good at 4m or 3m. It would probably be a good idea to try to cross-check that against actual user experiences, if you can find relevant information via search (here or via Google).

AES/EBU and SPDIF do not differ significantly in terms of protocol, bit rates, or clock rates, and therefore do not differ significantly in terms of the reflection effects we have been discussing. They do differ in terms of amplitude, use of balanced interfaces, etc., which would seem to favor AES/EBU (balanced connections, higher amplitude) for longer runs. Just as any longer run will benefit, in terms of noise immunity, etc., from being balanced and having higher amplitude (everything else being equal). But at only 4m, for a digital signal, I would suspect that the difference would not be particularly significant.

See the following, re the differences between AES/EBU and SPDIF:

-- Al
Al, do BNC or RCA type plugs exist that have a true 75 ohms impedance? And what kind of mechanical joint would you apply between the conductor and the plug (to minimize impedance mismatch): soldering or crimping?


BNC's: Yes.

RCA's: Generally no; in a few cases maybe/approximately/sort of. :)

See the last few posts in this thread:

As for crimping vs. soldering, I don't know; sorry.

-- Al
YES- 75 ohm BNC and RCA connectors most certainly DO exist: Refer to the site URL that I posted previously in this thread.
Rodman, I have to admit to a little skepticism about that Canare RCA connector. In the page you link to, a special joining method is cited as part of the 75-ohm connection. The RCA plug itself is not claimed to be a 75-ohm device.

Elsewhere I believe I have seen a Canare RCA plug which the company identified as "true 75-ohm", but this claim is in conflict with statements from other sources concerning the impedance inherent to the RCA design.

I certainly would like there to be a 75-ohm RCA plug and would be happy to be shown one whose impedance figure could be trusted beyond the shadow of a doubt.
Tobias- Read the Jan 2, 1997 entry in its entirety: ( You would think that the people marketing them, or the thousands that have purchased Canare 75 ohm RCAs over the years, would have noticed by now if they did not deliver a true 75 ohms. ( "Beyond a shadow of a doubt"? Buy a pair (they'll cost you under $10), attach them to some 75 ohm cable, and use an ohmmeter on the assembly. I'm certain thousands of others have tested the RP-C4 RCAs and LV-61S cables to remove their doubts too. ( It seems some companies are buying Canare products, disguising/relabeling them, and selling them as their own (interesting)! =8^)
The link Rodman provided to Markertek's catalog entry for the Canare connector is fairly persuasive, I think ("200MHz performance/vswr less than 1.1"), assuming it is factual.

Also, note that the connector is designed to be crimped, which perhaps answers Dazzdax's question.

I should point out, though, that an ohmmeter will be of no help in determining the impedance of a connector (or a cable, for that matter). It will indicate an open circuit (infinity ohms), because impedance and vswr (reflection) effects only come into play at high frequencies. Specialized test equipment that feeds a high (e.g., rf) frequency signal into the device under test would be required to make a meaningful measurement.

-- Al
How come NONE of you have referred to this article which seems to give some reasons for using 1.5m length for digital cables???
Bombaywalla, you caught me out. Steve Nugent of Empirical Audio with his paper in Positive Feedback is at the bottom of the whole affair and should get credit for it. I did refer to this paper in at least one of the posts to which Almarg has linked or referred, above.

I would like to add that IMVHO Almarg does the best job I have yet read of explaining the technology involved.