Anyone Ever Tried This...?


During a turntable-based listening session I had an uneasy feeling that the T/T might have been running fast...

Rather than dig out the strobe disc + accessories, for fun I decided on an alternative way of verifying the speed long-term. I thought: why not run an LP and a CD in parallel and see which one wins the “race”, and by how much?
(First time I’ve ever tried this experiment...) 

One of the few good things about CD is that the program length will always be effectively identical to the original source. The “timeline” will also be better than that of any species of turntable, so it can be considered a “true reference” in that regard.

The method was very simple. The CD was fed through a TV's own internal speakers from a Blu-ray player, while the analogue source was replayed via the main system within the same room.
Bearing in mind the possibility of eccentricities in LP geometry/drilling skewing the result, which LP to choose for the comparison?
The chosen LP track was Mike Oldfield’s “Tubular Bells” Part 1. The beauty of this track is that the piece runs for 25-26 minutes uninterrupted by track breaks thereby presenting an ideal opportunity to estimate not only the speed error but any tendency to manufacturing irregularity. (In other words, perhaps an indicator of how disruptive the LP manufacturing process might turn out to be in the regular course of playing an album.)
BTW, the turntable under test is of the massive, unsuspended, belt-driven variety.

It took a good few attempts to get the two sources synchronised to the point where the difference was negligible. (Needledropping the track first definitely helped.)
...and finally, they were off!

So who won?

Well, in the end, the turntable “won the race” by a time difference I estimated to be less than 100 ms. (Put that estimate down to decades of digital HT experience.)
So the average speed error could be of the order of +0.006% or less (by no means a “badge of distinction”, but not exactly poor either. I suspect that state-of-the-art DDs might fare slightly better, along with mega-expensive belt drives).
The end result still surprised me and was way better than the “broadcasting standard” of 3%. ;)
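For anyone who wants to sanity-check the arithmetic, here is a minimal sketch (Python) of the calculation behind that figure. The offset and track length used are simply the numbers quoted above, treated as assumptions rather than fresh measurements:

```python
# Rough speed-error estimate from the LP-vs-CD "race" described above.
# The offset and track length are the figures quoted in the post, used as assumptions.

def speed_error_percent(offset_seconds, track_seconds):
    """Percentage by which the turntable ran fast (+) or slow (-) relative to the CD."""
    return 100.0 * offset_seconds / track_seconds

offset = 0.100       # LP finished roughly 100 ms ahead of the CD
track = 25.5 * 60    # Tubular Bells Part 1 runs about 25-26 minutes
print(f"Estimated speed error: +{speed_error_percent(offset, track):.4f}%")
# Prints about +0.0065%, consistent with the "+0.006% or less" estimate above.
```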

If you’d like to try this experiment yourself, Tubular Bells Part 1 is particularly effective for judging the delay because it presents a large number of isolated “pulses” of guitar sound at the very end, so you’ll get any number of practice tries before making a final judgement or measurement.

I’m sure those with audio analysis software will be able to take a shortcut and give more precise figures... ;)
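As a sketch of what that shortcut might look like (and only a sketch: the file names, the tail length and the single shared sample rate are my assumptions, not anything from the original test), one could cross-correlate the end of a needledrop against the same passage from the CD rip and read the offset off the correlation peak:

```python
# Cross-correlate the tails of a needledrop capture and the CD rip to measure the lag.
# File names and window length are hypothetical; both captures are assumed to share
# one sample rate and to have been trimmed from the same synchronised start point.

import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def load_mono(path):
    rate, data = wavfile.read(path)
    if data.ndim > 1:                 # fold stereo down to mono
        data = data.mean(axis=1)
    return rate, data.astype(np.float64)

rate_lp, lp = load_mono("needledrop_tail.wav")   # last ~30 s of the LP transfer
rate_cd, cd = load_mono("cd_rip_tail.wav")       # same passage from the CD rip
assert rate_lp == rate_cd, "both captures must share one sample rate"

# The cross-correlation peak gives the relative shift between the two captures.
xcorr = correlate(lp, cd, mode="full")
lag_samples = np.argmax(xcorr) - (len(cd) - 1)
lag_ms = 1000.0 * lag_samples / rate_lp
print(f"Offset between LP and CD captures at the end of the track: {lag_ms:+.1f} ms")
# The sign only means something if both files start at the synchronised moment;
# otherwise it is the magnitude of the drift that matters.
```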

moonglum
No, it was a TW Acustic Raven One...

Interestingly, the album that triggered the whole exercise by sounding “too pacey” (Camel’s “Mirage”) was later checked against the CD version and was found to be indistinguishable in that regard. LOL!
I am not sure I agree with the basic premise that CD playback is free of any timing error. The digital encoding may be, but during playback you have a rapidly spinning disc being scanned by a moving laser beam. Those two mechanical devices have to operate perfectly and in concert with each other if the timing is to be "perfect". Since we live in the real world, I would bet that doesn't happen.

Second, you neglect the fact that a human engineer was involved in generating the two different forms of software from, presumably, a tape or a digital master. Said engineer may well have fiddled with the content such that the two versions of the same piece differ in duration from one another.

Finally, I don't believe we can reliably detect a time interval as short as 100 ms. Did you test yourself for your reaction time? Did you repeat the experiment at least 3 times? That would help to compensate for errors at the start point, if you averaged several results. And finally, if you're going to check timing in this fashion, why not in fact use a strobe, as Chayro suggests?