Differences with CD ripping speeds: audio folklore?


I have often wondered why people claim that lower CD ripping speeds produce a higher quality resulting WAV file. After all, wouldn't people avoid using CD-ROM drives that routinely produce errors? Computer data demands high accuracy, or else programs may not work correctly or data may be inaccurate. In addition, CDs are encoded with redundant data that allows the drive to automatically correct many errors and to detect those it cannot correct. So why should reading an audio CD be any different?

So I conducted a test this morning. On one of my old machines, I used an older CD ripping program that allowed me to choose the speed of the rip, and I ripped a track at 1x. On my newer machine, I used MusicMatch Jukebox to rip the same track, which averaged about 25x. I transferred both files over to my Unix machine and did a bitwise comparison on them. As expected, they are IDENTICAL.
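For anyone who wants to repeat the experiment, here is a minimal sketch of that kind of byte-for-byte comparison, using Python's standard library (the filenames rip_1x.wav and rip_25x.wav are hypothetical; on Unix, `cmp` does the same job):

```python
import filecmp

# Compare the two rips byte for byte; shallow=False forces a full
# content comparison instead of just checking sizes and timestamps.
identical = filecmp.cmp("rip_1x.wav", "rip_25x.wav", shallow=False)
print("IDENTICAL" if identical else "files differ")
```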

So could the theory that lower CD ripping speeds sound better be yet another example of audio folklore?

Michael

Response by kthomas:

I've always been convinced that there is a lot of folklore around the handling of digital data, including the burning of CDRs at different speeds. There's plenty I quite possibly don't understand or know, but I've always been satisfied that I can prove I get bit-perfect copies, something you can do with freeware or shareware programs.
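One way to make that proof concrete, sketched here with Python's standard library rather than any particular freeware tool (the filenames original.wav and copy.wav are hypothetical): hash each file and compare the digests.

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Matching digests mean the copy is bit-perfect.
if file_digest("original.wav") == file_digest("copy.wav"):
    print("bit-perfect copy")
else:
    print("copies differ")
```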

I'm also convinced that if we can't handle digital music "perfectly", the audio industry is dropping the ball. We move much higher data rates essentially "perfectly" in any number of other applications, so if we can't do it in the audio environment, we're not trying hard enough.