Ripping CDs - Bypassing the Computer CD Player


At the risk of sounding stupid, could someone point me in the right/best direction of how I can rip my CDs to a hard drive while maintaining fidelity? Hold on, I know how to do it with my computer and I know the difference between lossless and lossy files. My concern is that the CD drives in computers are not of sufficient quality to do a really good job. I've tried to find the best CD drive for my computer, but I know it's not nearly the quality of my stereo componentry. My thought is to use my "audiophile" quality CD player(s) to rip to a storage medium. Is there a component that I can attach to one of my current CD players that would seamlessly back up the CDs, and/or a combination CD player/hard drive that would do the same thing?
nab2

Showing 8 responses by dtc

The transport in a CD player has to deliver the bits in real time, so it cannot retry reads when it hits data errors. The drive in a computer is not under any real-time constraints, so it can re-read a sector as many times as necessary to get an exact copy, as long as the CD is not really damaged. In fact, the computer drive should do a better job of ensuring that all the bits are correct.

That said, there used to be regular reviews of drives for ripping. Not sure if those reviews are still being done or not. In the early days, everyone swore by Plextor drives, but they have not been made since 2005 or so. Later Plextor drives were just re-badged units from large suppliers.

Using a ripper like dBpoweramp, you should have no trouble with pretty much any drive in your computer. dBpoweramp will also check your rip against other people's rips, which adds a level of assurance.
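That cross-checking works by comparing a checksum of your rip against checksums other people got from the same pressing (the AccurateRip database). A minimal sketch of the idea in Python - note that AccurateRip uses its own CRC scheme rather than SHA-256, and the file names and "two rips" setup here are just for illustration:

```python
import hashlib
import os
import tempfile

def track_checksum(path, chunk_size=1 << 20):
    """Hash a ripped track in chunks so large files never load fully into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate two independent rips of the same track on different drives.
with tempfile.TemporaryDirectory() as d:
    rip_a = os.path.join(d, "rip_a.wav")   # hypothetical rip from drive A
    rip_b = os.path.join(d, "rip_b.wav")   # hypothetical rip from drive B
    audio = bytes(range(256)) * 1000       # stand-in for identical PCM data
    for p in (rip_a, rip_b):
        with open(p, "wb") as f:
            f.write(audio)
    match = track_checksum(rip_a) == track_checksum(rip_b)
    print("rips match:", match)
```

If two drives on opposite sides of the world produce the same checksum, a read error on either is extremely unlikely.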
NAB2 - those discussions never reach a conclusion. There are people who insist that two bit-for-bit identical files sound different. There are people who think that flac always sounds worse than wav. There are people who think that a wav file that has been converted to flac and back to wav will sound different. There was even a controversial four-part article in The Absolute Sound that found sound differences in pretty much every conceivable way of producing bit-for-bit identical files.

Who knows what is going on, but there is a small group of people who are absolutely convinced of these differences, while most people cannot hear them on their systems. It's kind of like expensive power cables, little silver cups, myrtle wood blocks, etc. You can spend your life looking for these differences or you can listen to the music. It just depends on which part of the hobby most interests you.

You might want to spend some time experimenting. Most of us have. I must say getting a new Chord Hugo DAC swamped any possible improvements in wav versus flac, etc.
Kijanki - I assume bit-perfect software/hardware delivery is a given. I am talking about different sound from bit-identical files. There is no timing information in a digital file, so the jitter comes on playback. There are lots of sources of it, as you say. But how do you get different jitter from two identical files played back on the same system? It is theoretically possible, for example, if one file is contiguous and the other is badly fragmented and your computer and disk drive are really noisy. But if two bit-identical files are contiguous and on the same platter of the same drive, some people will still say they sound different. That is the part I just cannot hear. Can you?

Let's not take this thread down the road of debating all those issues, unless the OP wants to. There are certainly endless threads on that topic. I just wanted to explain to him some of the issues that are so often debated.
Some people claim that two bit-for-bit identical wav files will sound different if one was ripped directly to wav and the other ripped to flac and converted to wav. Same if a wav file is converted to flac and back to wav. Makes no sense to me, but some people claim they sound different.

Comparing flac to wav, some people claim the wav sounds better because the flac has to be decompressed, and the extra CPU cycles needed to do that produce electrical noise that degrades the quality of the converter or DAC connected to the USB port. On my system, the CPU runs at well under 5% while decoding flac. Hard for me to understand how that changes a galvanically isolated DAC. I can understand that there will be a difference if the computer is controlling the timing, but in most cases today the external device controls the timing. The computer just needs to keep the buffer full.

Different file layouts should not change the sound, unless you believe that the minute differences in how the CPU processes them cause a change in the connected equipment. The bits delivered to the buffer are identical for flac versus wav. Again, I am assuming the converter or DAC is controlling the timing.

Personally, I do not hear these differences. I just wanted to help the OP understand the issue.

There are also now devices on the market that try to completely isolate the USB signal lines and the ground from the PC. They also provide a separate 5V supply, independent of the computer. These devices may help if you have a particularly noisy PC or a poorly implemented DAC or converter. Some people swear by these devices, others ignore them.
The timing in most USB connections today is controlled by the converter or the DAC, not by the PC. So there should be no timing issues specific to the PC. All it has to do is keep the buffer full and deliver the data when asked. I guess it is possible that the USB on the PC does not handle the interrupt request from the converter/DAC correctly, but I think that is a remote possibility. In any case, that should be independent of the CPU and disk usage, unless the USB implementation is really bad.
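The "PC keeps the buffer full, DAC pulls data on its own clock" model can be sketched as a producer/consumer queue. This is a toy illustration of the timing relationship, not real USB audio code; the buffer size, frame count, and sleep interval are arbitrary stand-ins:

```python
import queue
import threading
import time

buf = queue.Queue(maxsize=8)        # the DAC-side buffer

def pc_feeder(frames):
    """PC side: just keep the buffer full; it has no timing responsibility."""
    for frame in frames:
        buf.put(frame)              # blocks whenever the buffer is already full
    buf.put(None)                   # end-of-stream marker

def dac_clock(out):
    """DAC side: pulls one frame per tick of ITS OWN clock."""
    while True:
        frame = buf.get()
        if frame is None:
            break
        out.append(frame)
        time.sleep(0.001)           # stand-in for the DAC's fixed sample clock

received = []
t = threading.Thread(target=pc_feeder, args=(range(50),))
t.start()
dac_clock(received)
t.join()
print(received == list(range(50)))  # True: every frame arrives, paced by the DAC
```

Notice the feeder simply blocks when the buffer is full; how fast or how irregularly it runs does not change when each frame leaves the queue, because only the consumer's clock sets that pace.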

People are concerned about electrical noise on the USB ground and on the signal lines, as well as RF noise from the PC that can affect the converter or DAC. That is why the devices I mentioned earlier are available. More and more equipment has galvanic isolation on the data lines, usually in the form of small transformers. Expensive cables can be used to try to eliminate RF problems on the cable and connectors.

Using a separate +5V power supply certainly makes sense if the USB device is powered by the +5V on the USB cable. Certainly, a $500 linear power supply will be much cleaner and more stable than the 5V coming from a computer's USB port. However, that has nothing to do with file formats, HD usage, CPU usage, etc.

As to bit-identical files in the same format sounding different, that makes absolutely no sense to me. In the Absolute Sound article I mentioned, the authors said the sound got worse the more times the files were converted, even if they ended up bit-identical. I will say, they did get a lot of pushback on that topic. However, others do report similar experiences. But hey, if people believe they hear differences, that is up to them.

If people think they hear differences from all these things, who am I to say? For me, these effects are interesting to think about, but I don't hear most of the differences that these people report.

I did not mean to get this thread off topic, just wanted to summarize some of the issues that people discuss.
"When rip wav->flac->wav, depending on the software, possible loss of precision?"

That is not likely. The data is in integer format, not floating point. And you can compare the first and last wav files and they will be bit for bit the same. People do binary compares, find the files identical, and report they sound different. At one point I spent a lot of time doing binary compares and looking at file formats. I taught computer science for years, so I am pretty familiar with the issues here. I never found any differences in the files people claimed sounded different. It makes no sense to me that those two wav files would sound different, but some people have reported it.
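For anyone who wants to run the same kind of binary compare, the standard tools work fine (`cmp` on Unix, `fc /b` on Windows), or it is a few lines of Python. The file names below are hypothetical, and the two files are created identical just to show the mechanics:

```python
import filecmp
import os
import tempfile

def identical(path_a, path_b):
    """Byte-for-byte comparison; shallow=False forces a full read of both files."""
    return filecmp.cmp(path_a, path_b, shallow=False)

with tempfile.TemporaryDirectory() as d:
    a = os.path.join(d, "ripped.wav")      # hypothetical: wav ripped directly
    b = os.path.join(d, "converted.wav")   # hypothetical: wav via a flac round trip
    data = os.urandom(65536)               # stand-in for identical audio content
    for p in (a, b):
        with open(p, "wb") as f:
            f.write(data)
    same = identical(a, b)
    print("bit for bit identical:", same)
```

If this returns True for two files, every byte - header and audio data alike - is the same, which is exactly the situation where reported sound differences make no sense to me.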

"If file is already wav, why rip again to wav?"

Some people have ripped to flac and then decided to convert to wav because they think it sounds better. Somebody then pipes in and says they should re-rip everything, since the converted wav file will not sound the same as the ripped wav file, even though they are bit for bit identical.

Let's not get too deep into this. There are lots of discussions on AA and other places about this and nothing ever gets resolved.
Loss of precision is often a result of floating-point calculations. That is why I mentioned it. The calculations used to compress and decompress these files are an entirely integer process. The program may make mistakes, but the fact that you can convert wav to flac and back to wav and get a bit-identical file says that the conversions are being done properly. There is no evidence of any program error or loss of precision. The files before and after conversion are the same based on binary compares, not just checksums.
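The integer-exact round trip is easy to demonstrate. The sketch below uses zlib as a stand-in for the FLAC codec - an assumption for the sake of a self-contained example, justified because both are lossless compressors that restore every byte exactly:

```python
import zlib

# 16-bit PCM samples as raw bytes (a stand-in for the data chunk of a wav file).
original = bytes(i % 256 for i in range(44100 * 2))

# "wav -> flac": lossless compression (zlib here stands in for the FLAC codec).
compressed = zlib.compress(original, level=9)

# "flac -> wav": decompression restores every byte exactly.
restored = zlib.decompress(compressed)

print(restored == original)             # True: nothing rounded, nothing lost
print(len(compressed) < len(original))  # True: smaller on disk, same bits back
```

No rounding ever occurs because every step is integer arithmetic; the compressed file is smaller, but decompression is an exact inverse.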

As to bytes versus integers, in a typical 16/44 file each data point is a 16-bit number, which cannot be represented by a single byte. Hence my reference to integers rather than bytes. But the real point is that the calculations do not result in any loss of precision.
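The two-bytes-per-sample point can be shown with Python's struct module; wav files store PCM as little-endian signed 16-bit integers, which is what the "<h" format code below means:

```python
import struct

sample = -12345                     # one 16-bit PCM sample (range -32768..32767)
raw = struct.pack("<h", sample)     # little-endian signed 16-bit, as in wav files
print(len(raw))                     # 2: two bytes per sample
print(struct.unpack("<h", raw)[0] == sample)  # True: exact integer round trip
```

Packing and unpacking are exact inverses on integers in range, which is why format conversions between wav and flac have no opportunity to lose precision.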

The wav to flac to wav tests are just that - tests. They try to see if the conversions influence the sound. People have done this because of the reasons I stated. Some people believe the original and final wav files do sound different, even though the original and final wav file are bit for bit identical. You can say STOP, but people report that they can sound different. I am just summarizing what some people report.
If you want to read more about the controversial Absolute Sound article where much of the above is discussed...

Absolute Sound Forum