Implications of Esoteric G-0Rb atomic clock


The latest TAS (March 2008) has an excellent piece by Robert Harley: a review of the Esoteric G-0Rb Master Clock Generator, with sidebars on the history and significance of jitter. This Esoteric unit employs an atomic clock (using rubidium) to take timing precision to a new level, at least for consumer gear. It's a good read; I recommend it.

If I am reading all of this correctly, I reach the following conclusions:

(1) Jitter is more important sonically than we might have thought

(2) Better jitter reduction on the A-D side of things will yield significant benefits, which means we can look forward to another round of remasters (of analog tapes) once atomic clock solutions make it into mastering labs

(3) All of the Superclocks, claims of vanishingly low jitter, reclocking DACs -- all of this stuff that's out there now, while probably heading in the right direction, still falls far short of what's possible and needed if we are to get the best out of digital and fully realize its promise.

(4) We can expect to see atomic clocks in our future DACs and CDPs. Really?

Am I drawing the right conclusions?
drubin
WARNING: There is a lot of marketing going on here. Please keep in mind that frequency stability is not the same thing as Jitter! Jitter, and the way it distorts audio in the A/D process, is a problem of each sample arriving at slightly the wrong time. Frequency stability has nothing to do with this. A small example:

The numbers 3,3,3,3,3,3,3,3 average out as the number 3.

The numbers 2,4,2,4,2,4,2,4 average out as the number 3. (This is as simple as 24 / 8 = 3).

The numbers 1,5,1,5,1,5,1,5 also average out as the number 3. (Again, 24 / 8 = 3).

However, if these numbers represented Jitter deviations from perfect timing (fluctuations around the ideal value, which in this example was 3), the audio would get worse and worse going down the line.

The marketers of all three pieces of equipment in the above example could easily all have said the same thing without lying: "Our Clock is so stable that you can play it for 3 million years and it will never be off by more than a fraction of a second".
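The number example above can be sketched in a few lines of code (a hypothetical illustration using only the numbers from the example, not real clock data): three "clocks" whose tick intervals have the same average, and are therefore equally "accurate" in frequency, but whose tick-to-tick deviation, the jitter, differs wildly.

```python
# Three hypothetical clocks from the example above: tick intervals in
# arbitrary units. All three average to exactly 3.
intervals_a = [3, 3, 3, 3, 3, 3, 3, 3]
intervals_b = [2, 4, 2, 4, 2, 4, 2, 4]
intervals_c = [1, 5, 1, 5, 1, 5, 1, 5]

def mean(xs):
    return sum(xs) / len(xs)

def rms_jitter(xs):
    # root-mean-square deviation of each interval from the average interval
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

for name, xs in [("A", intervals_a), ("B", intervals_b), ("C", intervals_c)]:
    print(f"clock {name}: mean interval {mean(xs)}, RMS jitter {rms_jitter(xs)}")
# All three means are 3.0 (identical long-term frequency accuracy),
# but the RMS jitter is 0.0, 1.0, and 2.0 respectively.
```

Same "frequency stability" marketing claim for all three, very different sample timing.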

Please notice the significance of this. Be aware of the logical trap that this "frequency stability" terminology is putting people into!

Liudas
Rubidium is not necessary for timing accuracy, but it makes sense that a commercial rubidium reference oscillator would pay special attention to close-in phase noise in the vicinity of the main carrier, which is what's important in regard to jitter.
To put the frequency accuracy issue to rest, here are the facts. Standard (i.e. cheap) crystals are guaranteed within 100 PPM (parts per million) in the proper circuit. Add another 100 PPM for extreme temperature/humidity and circuit voltage variations. The total is 200 PPM, or 0.02% frequency error. A 1 kHz signal would be off by just 0.2 Hz. Even a dog with acute hearing wouldn't be able to tell the difference...
It's a joke when some manufacturers offer an "upgrade" to a TCXO option. How about upgrade to a low-noise oscillator? I'll take that option any day!
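The worst-case arithmetic above can be checked directly (a sketch assuming the quoted 100 + 100 PPM figures):

```python
# Worst-case crystal frequency error, using the figures quoted above.
crystal_tolerance_ppm = 100   # guaranteed tolerance in the proper circuit
environment_drift_ppm = 100   # extreme temperature/humidity/voltage variation
total_ppm = crystal_tolerance_ppm + environment_drift_ppm  # 200 PPM

fraction = total_ppm / 1_000_000       # 200 PPM as a fraction: 0.0002
error_at_1khz = 1000.0 * fraction      # error in Hz on a 1 kHz tone

print(f"{fraction:.2%} error, i.e. {error_at_1khz} Hz off at 1 kHz")
# prints: 0.02% error, i.e. 0.2 Hz off at 1 kHz
```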
I'm having a little trouble following this, but I wonder if "frequency accuracy" is not being used in the same way by all of the posters above. Just to be clear, we are not talking about audio frequency response, at least as I understand what RH is talking about. Dgarretson's summary is excellent.
Jitter occurs all over the place. It is of no significance except at the real-time A/D process or the real-time D/A process, which is when the distortion takes place.

Drubin: The point that Serus and I are both making is that frequency accuracy means nothing regarding audio quality. It is used by some marketers of digital audio technologies to introduce impressive new numbers (they look really good in the ads). However, the point with Jitter is not whether the frequency is accurate; the point is whether the SAMPLES are accurately timed. And that's what I am trying to show in my example: samples can be way off and cause massive distortion in the D/A or A/D process while the frequency is right on target. Frequency is a total number of oscillations per unit of time. Jitter is how far each sample is off its time target, each and every time. And with clocks, this can happen 33 million times a second. So potentially, a clock can make 33 million little mistakes a second and still be accurate to a fraction of a second over years and years of running.
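To see how per-sample timing errors turn into amplitude errors even when the average clock rate is exact, here is a minimal sketch (the 1 kHz tone, 44.1 kHz rate, and grossly exaggerated 10 microsecond jitter are illustrative assumptions, not measurements of any real converter):

```python
import math

fs = 44_100.0     # nominal sample rate in Hz (assumed for illustration)
tone = 1_000.0    # test tone frequency in Hz
jitter_s = 10e-6  # exaggerated 10 us timing error, to make the effect visible

def sample(n, timing_error):
    # Value a converter captures at (ideal instant + timing error).
    t = n / fs + timing_error
    return math.sin(2 * math.pi * tone * t)

# Alternate early/late by the same amount: the *average* timing error is
# exactly zero (the clock keeps perfect long-term time), yet individual
# sample values are wrong -- that is the distortion jitter causes.
amplitude_errors = [
    abs(sample(n, jitter_s if n % 2 else -jitter_s) - sample(n, 0.0))
    for n in range(100)
]
print(f"worst amplitude error: {max(amplitude_errors):.4f} (full scale = 1.0)")
```

The worst error lands near the signal's zero crossings, where the waveform changes fastest, which is why jitter distortion depends on signal slope, not on long-term frequency accuracy.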

These two things must be differentiated. And it is important to understand that Bach sounds great, whether the music is tuned to A=440 Hz or A=440.2 Hz. Nobody, not even Bach himself, would ever notice the difference. But I think it's safe to say he wouldn't have liked Jitter.

Liudas