Implications of Esoteric G-0Rb atomic clock


The latest TAS (March 2008) has an excellent piece by Robert Harley: a review of the Esoteric G-0Rb Master Clock Generator, with sidebars on the history and significance of jitter. This Esoteric unit employs an atomic clock (using rubidium) to take timing precision to a new level, at least for consumer gear. It's a good read; I recommend it.

If I am reading all of this correctly, I reach the following conclusions:

(1) Jitter is more important sonically than we might have thought

(2) Better jitter reduction at the A-D side of things will yield significant benefits, which means we can look forward to another round of remasters (of analog tapes) once atomic clock solutions make it into mastering labs

(3) All of the Superclocks, claims of vanishingly low jitter, reclocking DACs -- all of this stuff that's out there now, while probably heading in the right direction, still falls far short of what's possible and needed if we are to get the best out of digital and fully realize its promise.

(4) We can expect to see atomic clocks in our future DACs and CDPs. Really?

Am I drawing the right conclusions?
drubin
I'm having a little trouble following this, but I wonder if "frequency accuracy" is not being used in the same way by all of the posters above. Just to be clear, we are not talking about audio frequency response, at least as I understand what RH is talking about. Dgarretson's summary is excellent.
Jitter occurs all over the place. It is of no significance except at the real-time A/D process or the real-time D/A process, which is when the distortion takes place.

Drubin: The point that Serus and I are both making is that frequency accuracy means nothing for audio quality. It is used by some marketers of digital audio technologies to introduce impressive-looking new numbers (they look really good in the ads). But jitter is not about whether the frequency is accurate; it is about whether the SAMPLES are accurate. That's what I am trying to show in my example: the samples can be way off, causing massive distortion at the D/A or A/D process, while the frequency remains right on target. Frequency is the total number of oscillations per unit time. Jitter is how far each individual sample lands off its time target, every single time. With clocks, this can happen 33 million times a second. So potentially, a clock can make 33 million little mistakes a second and still be accurate to a fraction of a second over years and years of running.
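The sample-accuracy point above can be sketched with a quick simulation. This is a hypothetical illustration, not from any poster, and the 5 ns RMS jitter figure is grossly exaggerated (real clocks are in the picosecond range) so the effect shows up clearly:

```python
import math
import random

fs = 44_100           # sample rate (Hz)
f = 1_000             # test tone (Hz)
n = 44_100            # one second of samples
rms_jitter = 5e-9     # 5 ns RMS clock jitter -- exaggerated for illustration

random.seed(0)
err_sq = 0.0
for i in range(n):
    t_ideal = i / fs
    # each tick lands slightly off its time target...
    t_actual = t_ideal + random.gauss(0, rms_jitter)
    # ...so the converter latches the signal value at the wrong instant
    sample = math.sin(2 * math.pi * f * t_actual)
    ideal = math.sin(2 * math.pi * f * t_ideal)
    err_sq += (sample - ideal) ** 2

rms_error = math.sqrt(err_sq / n)
print(f"RMS sample error: {rms_error:.2e} of full scale")
```

Even though every sample is latched at slightly the wrong instant, the clock still completes 44,100 cycles per second on average, so its "frequency accuracy" is unimpeachable while the samples themselves carry an error on the order of -90 dBFS.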

These two things must be differentiated. And it is important to understand that Bach sounds great whether the music is tuned to A=440 Hz or A=440.2 Hz. Nobody, not even Bach himself, would ever notice the difference. But I think it's safe to say he wouldn't have liked jitter.

Liudas
JH's point regarding frequency is that a crystal clock is subject to transient fluctuations in output frequency caused by power supply variations and ripple, whereas the output frequency of a rubidium clock is inherently stable. He states that a transient variation in clock output frequency is "the very definition of jitter." Timing errors in the clock translate directly into a misshapen waveform and amplitude errors in the reconstructed analog signal, as well as the introduction of spurious sidebands harmonically unrelated to the original signal.
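The translation of timing error into amplitude error can be estimated from the sine's maximum slew rate: a full-scale sine at frequency f changes at up to 2*pi*f*A per second, so a timing error of dt seconds produces an amplitude error of up to 2*pi*f*dt of full scale. A back-of-the-envelope calculator (hypothetical numbers, assuming a single full-scale tone):

```python
import math

def worst_case_error_db(f_signal, jitter_s):
    """Peak amplitude error (dB re full scale) caused by a timing
    error of jitter_s seconds when sampling a full-scale sine at
    f_signal Hz. Derived from the sine's maximum slew rate, 2*pi*f*A."""
    err = 2 * math.pi * f_signal * jitter_s   # fraction of full scale
    return 20 * math.log10(err)

# e.g. 1 ns of jitter on a 20 kHz tone:
print(f"{worst_case_error_db(20_000, 1e-9):.1f} dBFS")   # about -78 dBFS
```

One nanosecond of jitter on a 20 kHz tone already limits the error floor to around -78 dBFS, well short of the roughly -96 dB floor 16-bit audio is capable of, which is why converter clocks need to be good to picoseconds.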

While I accept the benefits of ultra low-jitter clocks, I wonder whether some jitter in clock frequency is reintroduced in the clock-link cables that connect the G-0Rb to the transport & DAC.
Do you really think jitter is the only reason for awful digital sound? The P-01 transport has very low jitter but still benefits from the rubidium clock. There are other transports with more jitter that sound just as musical and resolve just as much detail as the P-01.

Chris