Quality of digital cable from source to DAC?


Hoping just to gain some understanding.

With respect to the transfer of digital audio data over metal cable, the common view I’ve managed to discern so far is this: there is likely no SQ difference to be gained by upgrading the Cat 6 ethernet cable linking my router (ATT Optical feeding an Apple Airport Extreme) to my streamer (50’ run of off-the-shelf Cat 6 feeding a Lumin D1). Yet there seem to be many reviews claiming significant SQ improvements from ‘higher quality’ digital cables linking the source (streamer or CD transport) to a DAC (Qutest). Why would this be? Is the digital data going from the router to the streamer somehow different, or less susceptible to error, than the digital data going from the streamer to the DAC?

Any insight would be greatly appreciated, even if nothing more than passing along a related link.

Thanks,
Todd
ecolnago

Showing 2 responses by kijanki

@ecolnago
All digital cables affect the sound of the DAC differently. To receive just the right amount of data, the DAC either synchronizes its internal D/A conversion rate to the incoming data (S/PDIF), or receives data at its own rate but signals back to increase or reduce the flow of incoming data (async USB, Ethernet).

With async USB or Ethernet, the D/A conversion clock is independent of the incoming data timing, but it can still be affected indirectly by electrical noise the cable injects into the DAC.

S/PDIF delivers data in real time, and the average rate of that data sets the rate of D/A conversion. An S/PDIF cable can affect this rate in two ways. Coax S/PDIF transitions are fast, hence susceptible to electrical reflections in the cable; these reflections add to and reshape the transitions, affecting their timing. The DAC corrects most of this by locking to the average frequency, but the correction is not perfect. In addition, the cable injects picked-up electrical noise.

Optical cable is different. It injects no electrical noise and creates no ground loops (as coax can), but its transitions are very slow, and slow transitions in the presence of electrical noise shift the exact moment of level recognition (the threshold crossing). It all comes down to keeping the internal D/A conversion clock stable in time: a jittery conversion clock adds noise to the music.
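A quick Python sketch of the "locking to the average frequency" idea above. This is not the Lumin's or Qutest's actual circuitry; it just models the S/PDIF receiver's clock recovery as a simple first-order low-pass (a stand-in for a PLL) tracking jittery incoming sample periods, with the loop bandwidth and jitter magnitude as made-up illustrative values:

```python
import random

# Hypothetical model, not real DAC internals: the recovered conversion
# clock tracks the *average* incoming S/PDIF rate, smoothing short-term
# timing errors but never removing them completely.
random.seed(0)
nominal = 1 / 44100            # ideal sample period, seconds
jitter = 5e-9                  # assumed 5 ns random timing error per period

incoming = [nominal + random.uniform(-jitter, jitter) for _ in range(10000)]

alpha = 0.001                  # assumed loop bandwidth parameter
recovered, r = [], nominal
for p in incoming:
    r += alpha * (p - r)       # first-order low-pass on the period
    recovered.append(r)

def rms_dev(periods):
    """RMS deviation of the periods from the nominal period."""
    return (sum((p - nominal) ** 2 for p in periods) / len(periods)) ** 0.5

print(f"incoming jitter (RMS):  {rms_dev(incoming):.2e} s")
print(f"recovered jitter (RMS): {rms_dev(recovered):.2e} s")
```

The averaging reduces the jitter by a large factor, but some residue always leaks through to the conversion clock, which is why the cable's timing behavior can matter at all for S/PDIF.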

In the case of Ethernet or USB, I would pay attention only to the quality of the shielding, and route the cable away from other cables. The same goes for coax S/PDIF, but there matching the cable's characteristic impedance (to avoid reflections) is also very important. It is also desirable to keep a coax cable either very short, less than a foot (so reflections merge with the transition itself), or longer than 1.5 m (so the first reflection arrives after the transition). With optical cable, the quality of the fiber (clarity, etc.) plays a role, but the most important thing is to keep system electrical noise low, by using power conditioners for the signal source and the DAC.
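A back-of-envelope sketch of the cable-length advice above. All numbers here are illustrative assumptions, not measured specs: roughly 5 ns/m propagation delay in coax, a signal transition (edge) lasting on the order of 12 ns, and a 3 ns cutoff for "practically coincident with the edge":

```python
ns_per_m = 5.0          # assumed propagation delay in coax
edge_ns = 12.0          # assumed duration of one S/PDIF transition
merge_ns = 3.0          # assumed window where the echo merges into the edge

for length_m in (0.25, 0.5, 1.0, 1.5, 2.0):
    # A reflection from an impedance mismatch at the DAC end travels back
    # to the source and returns after one full round trip of the cable.
    round_trip_ns = 2 * length_m * ns_per_m
    if round_trip_ns < merge_ns:
        where = "merges with the edge (very short cable)"
    elif round_trip_ns < edge_ns:
        where = "lands on the edge: can shift the threshold crossing"
    else:
        where = "arrives after the edge is complete"
    print(f"{length_m:4.2f} m: echo back at {round_trip_ns:4.1f} ns, {where}")
```

Under these assumed numbers, mid-length cables park the first reflection right on the critical part of the transition, while very short or 1.5 m-plus runs keep it away from the threshold crossing.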


Todd.  Let me try to explain jitter.  Imagine you play a 1kHz sine wave recorded on your CD.  Digital words of changing amplitude, representing the sine wave, are converted at even intervals into analog values by the D/A converter.  You get an analog 1kHz sine wave.
Now imagine that these time intervals are not exactly even, but get shorter and longer 50 times a second.  Now you get not only the 1kHz sine wave but also other frequencies, mainly 950Hz and 1050Hz, called "sidebands".  Their distance from the main (root) frequency depends on the frequency of the interval change (the jitter), while their amplitude is proportional to the amount of interval change.  These sidebands have very small amplitude, but they are not harmonically related to the root frequency (1kHz), and that keeps them audible.
With the many frequencies in music there will be many sidebands: practically a noise added to the music.  The sidebands' amplitude is small and proportional to the amplitude of the signal, so this noise stops (is not detectable) when the music stops playing.  You hear it only as a lack of clarity in the music (since something was added).
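The sideband claim above is easy to check numerically. A Python sketch: sample a 1kHz sine at 44.1kHz, but wobble the sampling instants sinusoidally at 50Hz. The 2 microseconds of peak timing error is grossly exaggerated (real clock jitter is orders of magnitude smaller) just so the sidebands stand out in a coarse DFT:

```python
import cmath, math

fs = 44100                     # sample rate, Hz
n = 44100                      # one second of samples (1 Hz bin spacing)
f0 = 1000.0                    # root tone, Hz
fj = 50.0                      # jitter (wobble) frequency, Hz
amp_j = 2e-6                   # peak timing error, seconds (exaggerated)

# Sample the sine at jittered instants t + amp_j * sin(2*pi*fj*t).
x = [math.sin(2 * math.pi * f0 *
              (i / fs + amp_j * math.sin(2 * math.pi * fj * i / fs)))
     for i in range(n)]

def dft_mag(signal, freq):
    """Magnitude of a single DFT bin at `freq` Hz."""
    s = sum(v * cmath.exp(-2j * math.pi * freq * i / fs)
            for i, v in enumerate(signal))
    return abs(s) / len(signal)

for f in (950.0, 1000.0, 1050.0):
    print(f"{f:6.0f} Hz: {dft_mag(x, f):.6f}")
```

The output shows the strong root tone at 1000Hz plus two small but nonzero components at exactly 950Hz and 1050Hz, i.e. root plus/minus the 50Hz jitter frequency, which did not exist in the original signal.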

Many factors can produce D/A conversion clock jitter.  The cable can be one of them; injected electrical noise can be another.  With Ethernet or USB the cable cannot affect timing directly, since samples are placed in a buffer ahead of time, but the cable can still inject electrical noise into the DAC.

Jitter can be induced by a single noise frequency (correlated jitter), like the 50Hz in our example, or by multi-frequency electrical noise (uncorrelated jitter).  The effect of jitter is less audible when the jitter frequency is low, because the sidebands sit closer to the root frequency and hide better.  Random (uncorrelated) jitter should also be less audible, since its energy is spread out rather than concentrated at discrete tones.

(As for the 90%-of-the-speed-of-light figure: it is more like 60-70%, depending on the dielectric.  I assume about 5ns/m.)