USB cable hype


Can someone explain the need for expensive USB cables for short runs? The only parameter of concern is impedance. I have personally verified error-free transmission in the Gbps range regardless of cable make/model, as long as the cable length is short. There is no magic. It is just about impedance control to minimize loss and jitter, which is inexpensive in the MHz range. I will pay more for a cable that is well built. I will not pay more for hocus pocus.
axle

Showing 1 response by almarg

03-09-15: Axle
My question wasn't about the cable's effects on SQ. It was about failure: at what distance does a vanilla USB cable start to fail? I'm sure the answer varies with drivers and data rate. For example, we know that PCM should travel farther than DSD.
Five meters is the maximum cable length specified for USB 2.0. However, the rationale for that limit, as stated in the following excerpt from Wikipedia's writeup on USB, does not seem applicable to situations that don't involve multiple hubs:
USB 2.0 provides for a maximum cable length of 5 meters for devices running at Hi Speed (480 Mbit/s). The primary reason for this limit is the maximum allowed round-trip delay of about 1.5 μs. If USB host commands are unanswered by the USB device within the allowed time, the host considers the command lost. When adding USB device response time, delays from the maximum number of hubs added to the delays from connecting cables, the maximum acceptable delay per cable amounts to 26 ns.[80] The USB 2.0 specification requires that cable delay be less than 5.2 ns per meter (192 000 km/s, which is close to the maximum achievable transmission speed for standard copper wire).
So 5 meters would seem safe in terms of error-free transmission (though perhaps not necessarily with respect to sound quality), but I don't know what limit can be expected beyond that distance.
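The numbers in the quoted passage can be sanity-checked with a bit of arithmetic; here is a quick sketch using only the two figures the spec excerpt gives (26 ns per-cable delay budget and 5.2 ns/m maximum cable delay):

```python
# Back-of-envelope check of the USB 2.0 delay budget quoted above.
MAX_CABLE_DELAY_NS = 26.0    # maximum acceptable delay per cable, in ns
SPEC_DELAY_NS_PER_M = 5.2    # USB 2.0 spec: cable delay must be < 5.2 ns/m

# Maximum cable length implied by the delay budget:
max_length_m = MAX_CABLE_DELAY_NS / SPEC_DELAY_NS_PER_M
print(max_length_m)  # 5.0 meters

# The 5.2 ns/m figure corresponds to a propagation speed of:
speed_km_per_s = 1.0 / (SPEC_DELAY_NS_PER_M * 1e-9) / 1000.0
print(round(speed_km_per_s))  # about 192308 km/s, matching the quoted 192 000 km/s
```

In other words, the 5-meter limit falls directly out of dividing the 26 ns budget by the 5.2 ns/m spec figure, which is why the limit is about timing margin rather than signal degradation per se.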

As for the original question, "Can someone explain the need for expensive USB cables for short runs?": I can't explain the need for **highly** expensive cables, if any, but I believe I can explain some of the reasons why many people report sonic differences between USB cables, and why investing in something better than a generic cable often figures to be worthwhile.

As you alluded to in the OP, the reasons relate primarily to jitter, and possibly also to the coupling of noise into analog circuits. Cable differences will affect signal risetimes and falltimes, waveform distortion resulting from impedance mismatches, susceptibility to ground-loop-related noise (which depends in part on the resistance and inductance of the ground conductor in the cable), and the transmission of noise that is superimposed on the signal by the computer.

Depending on the quality and characteristics of the DAC's design, all of those factors will to some degree affect the amount and the frequency characteristics of noise that may end up at the point of D/A conversion, thereby affecting jitter. In some designs it seems conceivable that analog circuits further downstream may also be affected. Paths by which noise can potentially couple from the USB port to the point of D/A conversion and/or analog circuits include grounds, power supplies, stray capacitances, radiation through the air, and other such paths that may bypass some of the intended signal path.

I should emphasize, though, that none of this provides a basis to assume a high degree of correlation between cable performance and cable price. And none of this provides a basis to assume that a comparison between a variety of specific cables will yield results that are consistent among different computers, DACs, and systems. And none of this provides a basis for attributing specific sonic characteristics to specific USB cables, as audiophiles often seem to do.

Regards,
-- Al