Ripping CD's - Bypassing Computer CD Player


At the risk of sounding stupid, could someone point me in the right/best direction on how I can rip my CDs to a hard drive while maintaining fidelity? Hold on, I know how to do it with my computer and I know the difference between lossless and lossy files. My concern is that the CD drives in computers are not of sufficient quality to do a really good job. I've tried to find the best CD drive for my computer, but I know it's not nearly the quality of my stereo componentry. My thought is to use my "audiophile" quality CD player(s) to rip to a storage medium. Is there a component that I can attach to one of my current CD players that would seamlessly back up the CDs, and/or a combination CD player/hard drive that would do the same thing?
nab2
Some people claim that two bit-for-bit identical wav files will sound different if one was ripped directly to wav and the other was ripped to flac and then converted to wav. The same goes if a wav file is converted to flac and back to wav. It makes no sense to me, but some people claim they sound different.

Comparing flac to wav, some people claim the wav sounds better because the flac has to be decompressed, and the extra CPU cycles needed to do that produce electrical noise that degrades the quality of the converter or DAC connected to the USB. On my system, the CPU runs at well less than 5% while decoding flac. It is hard for me to understand how that changes a galvanically isolated DAC. I can understand that there would be a difference if the computer were controlling the timing. But in most cases today the external device is controlling the timing. The computer just needs to keep the buffer full.

Different file layouts should not change the sound, unless you believe that the minute differences in how the CPU processes them causes a change in the connected equipment. The bits delivered to the buffer are identical for flac versus wav. Again, I am assuming the converter or DAC is controlling the timing.

Personally, I do not hear these differences. I just wanted to help the OP understand the issue.

There are also now devices on the market that try to completely isolate the USB signal lines and the ground from the PC. They also provide a separate 5V supply, independent of the computer. These devices may help if you have a particularly noisy PC or a poorly implemented DAC or converter. Some people swear by these devices, others ignore them.
Knghifi, I cannot hear the difference between different formats or sources, but it might be related to the jitter suppression in my DAC. Jitter might be related to the amount of electrical noise in the system, which makes comparisons difficult since this noise is changing. Radio stations have to cut power at certain times, possibly at 6PM by FCC rule (since propagation at night is much better). Testing one file against an identical file before and after 6PM could result in different sound.

Dtc, it is possible that decompressing or compressing a file that is being played somehow affects the timing, but once the files being played have the same checksum, they have to sound the same no matter how many times they were converted before. A badly fragmented HD wouldn't change the timing, since the timing is not attached yet (it is just data) and the HD is at least 1000x faster than necessary to deliver this data (while the data goes through buffers), but it might possibly change the amount of electrical noise the drive produces. It is far fetched, but I've learned not to question what other people can or cannot hear, especially when they are younger and/or musicians (trained ears).
"Some people claim that 2 bit for bit identical wav files will sound different if one was ripped directly to wav and the other ripped to flac and converted to wav. Same if a wav file is converted to flac and back to wav. Makes no sense to me, but some people claim they sound different."
If the file is already wav, why rip it again to wav? When ripping wav->flac->wav, depending on the software, is there a possible loss of precision?

To confirm the integrity of 2 wav files, do a binary compare. Google "binary compare" ... free tools on all platforms.
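For anyone who wants to skip a GUI tool, a whole-file compare is only a few lines of Python. This is just a sketch (the function names are my own); hashing each file is simply a convenient way to check that every byte matches:

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """SHA-256 of a file's contents, read in chunks so large rips fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def bit_identical(path_a, path_b):
    """True only if the two files contain exactly the same bytes."""
    return file_digest(path_a) == file_digest(path_b)
```

If `bit_identical("ripped.wav", "converted.wav")` returns True, the two files are byte-for-byte the same, no matter how they were produced.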

Ripping basically extracts the audio data and wraps it into another file format.

"Comparing flac to wav, some people claim the wav sounds better because the flac has to be decompressed and the extra CPU cycles needed to do that produce electrical noise that degrades the quality of the converter or DAC connected to the USB. On my system, the CPU runs at well less than 5% while decoding flac ..."
Processing a file is not CPU intensive. What matters most is the timing and the logic (software) used in processing the different file formats.

"Some people claim ..."
Some people claim they can hear speakers, DACs ... breaking in after 2000+ hours. Depending on my mood, how much wine I had with dinner, the time of day ... my system sounds different. Is this the system breaking in, or just a product of the environment?
The timing in most USB connections today is controlled by the converter or the DAC, not by the PC. So there should be no timing issues specific to the PC. All it has to do is keep the buffer full and deliver the data when asked. I guess it is possible that the USB on the PC does not handle the interrupt request from the converter/DAC correctly, but I think that is a remote possibility. In any case, that should be independent of the CPU and disk usage, unless the USB implementation is really bad.

People are concerned about electrical noise on the USB ground and on the signal lines, as well as RF noise from the PC, that can affect the converter or DAC. That is why the devices I mentioned earlier are available. More and more equipment has galvanic isolation on the data lines, usually in the form of small transformers. Expensive cables can be used to try to eliminate RF problems on the cable and connectors.

Using a separate +5V power supply certainly makes sense if the USB device is powered by the +5V on the USB cable. Certainly, a $500 linear power supply will be much cleaner and more stable than the 5V on a USB cable. However, that has nothing to do with file formats, HD usage, CPU usage, etc.

As to bit-identical files in the same format sounding different, that makes absolutely no sense to me. In the Absolute Sound article I mentioned, the authors said the sound got worse the more times the files were converted, even if they ended up bit identical. I will say, they did get a lot of push back on that topic. However, others do report similar experiences. But, hey, if people believe they hear differences, that is up to them.

If people think they hear differences from all these things, who am I to say? For me, these effects are interesting to think about, but I don't hear most of the differences that these people report.

I did not mean to get this thread off topic, just wanted to summarize some of the issues that people discuss.
" When rip wav->flac->wav, depending on the software, possible loss of precision?"

That is not likely. The data is in integer, not floating point, format. And you can compare the first and last wav files and they will be bit for bit the same. People do binary compares, find the files identical, and still report that they sound different. At one point I spent a lot of time doing binary compares and looking at file formats. I taught computer science for years, so I am pretty familiar with the issues here. I never found any differences in the files people claimed sounded different. It makes no sense to me that those 2 wav files would sound different, but some people have reported it.
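One wrinkle worth knowing when doing those compares: wav is a RIFF container, and a file can carry optional metadata chunks, so two files with identical audio are not always byte-identical as whole files. A sketch using Python's standard wave module to hash only the decoded PCM frames (the function name is my own):

```python
import hashlib
import wave

def pcm_digest(path):
    """SHA-256 of only the PCM audio frames, ignoring header and metadata chunks."""
    with wave.open(path, "rb") as w:
        h = hashlib.sha256()
        h.update(w.readframes(w.getnframes()))
        return h.hexdigest()
```

If `pcm_digest` matches for two files but a whole-file binary compare does not, the difference is in the headers or tags, not in the audio samples.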

"If file is already wav, why rip again to wav?"

Some people have ripped to flac and then decided to convert to wav because they think it sounds better. Somebody then pipes in and says they should re-rip everything, since the converted wav file will not sound the same as the ripped wav file, even though they are bit for bit identical.

Let's not get too deep into this. There are lots of discussions on AA and other places about this and nothing ever gets resolved.