Certainly, most of what I have read stresses the importance of jitter
rejection/immunity at the DAC end, presumably because changing
interconnect cable lengths might work on some equipment but not others.
In S/PDIF, timing is set by the transport's clock, as you know. That means that
timing gets screwed up (i.e., jitter gets worse) if the timing information is not
received correctly by the DAC; the errors are carried through to conversion.
One way for the DAC to deal with this is to present a constant 75-ohm
impedance at its input. Read Steve Nugent's paper to find out at least one
reason why this doesn't always happen, though.
Furthermore, whether or not the DAC handles impedance correctly in its own
signal path, the RCA connector, when used, invariably introduces a mismatch;
there is no such thing as a true 75-ohm RCA connector.
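To put a rough number on that mismatch: the fraction of the signal reflected
at an impedance discontinuity follows the standard reflection-coefficient
formula. The 35-ohm connector impedance below is an assumed illustrative
figure for an RCA, not a measured spec:

```python
# Reflection at an impedance discontinuity:
# gamma = (Z_load - Z_line) / (Z_load + Z_line)
def reflection_coefficient(z_line: float, z_load: float) -> float:
    """Fraction of the incident voltage reflected at the junction."""
    return (z_load - z_line) / (z_load + z_line)

# Illustrative: 75-ohm coax into an RCA connector assumed to present
# roughly 35 ohms (hypothetical ballpark, not a datasheet value).
gamma = reflection_coefficient(75.0, 35.0)
print(f"Reflected fraction: {gamma:.2f}")  # about -0.36
```

A third of the edge energy bouncing back toward the transport is not a
subtle effect, which is why the connector matters even with a good cable.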
These are two causes of jitter in setups with an external DAC and digital
interconnect. The extra propagation delay of a 1.5m cable pushes reflections
from these mismatches past the signal transition, diminishing the jitter
they cause. Almarg's limpid post makes it easy to see why.
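The cable-length argument can be sketched with back-of-envelope arithmetic.
The velocity factor, rise time, and mid-edge decision threshold below are
assumed typical values, not measurements of any particular cable or
transport:

```python
# Back-of-envelope: does a reflection get back to the receiver before
# or after it has latched the signal transition?
# All numbers below are assumed typical values, not measurements.
C = 3.0e8                 # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66    # typical coax dielectric (assumed)
RISE_TIME_NS = 25.0       # slow S/PDIF driver edge (assumed)

def round_trip_ns(length_m: float) -> float:
    """Time for a reflection to travel down the cable and back, in ns."""
    return 2.0 * length_m / (C * VELOCITY_FACTOR) * 1e9

# The receiver's threshold is crossed roughly mid-edge; a reflection
# arriving after that point cannot shift the detected timing.
threshold_ns = RISE_TIME_NS / 2.0
for length in (0.5, 1.0, 1.5):
    rt = round_trip_ns(length)
    when = "after" if rt > threshold_ns else "during"
    print(f"{length} m: reflection returns at {rt:.1f} ns "
          f"({when} the decision point)")
```

With these assumed numbers, 1.5m is roughly the shortest length whose
round-trip delay clears the mid-edge decision point, while shorter cables
drop the reflection right on top of the transition.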
Shadorne, you may have heard a lot about jitter rejection/immunity at the
DAC because of the weakness of the S/PDIF clocking system, which uses the
transport's clock, not the DAC's, to establish system timing. (You can read
more about this weakness at the LessLoss site.) Double-clocking DACs like
the Apogee Mini-DAC are one approach (in my view, quite successful) to
solving this problem. Another is slaving the transport to the DAC's clock, as
done by Linn (Karik/Numerik), Cambridge (DiscMagic/DACmagic) and
LessLoss. Also, do read Steve Nugent's paper for more on this.
No matter what the DAC's internal design, though, the presence of an RCA
connector suggests that a 1.5m cable may be optimal with that unit.