CD Copies...why do they sound worse?


I have a theory, which I haven't discarded yet, that not all CD blanks are equal in terms of composition. Pressed CDs are aluminum and polycarbonate; when you burn a CD-R, the laser marks a dye recording layer rather than stamping physical pits. There is the Red Book standard that must be adhered to, but as with anything else, I'm sure there are better and worse grades of dye and polycarbonate available; you get what you pay for. Since the laser reads the digital stream by optically scanning the surface of the disc and interpreting each mark as a one or a zero, you'd think it would be a go/no-go operation. Yet the original and the copies do not sound the same, even to an uncritical ear.

I thought for a while it might have something to do with the relative quality of the CD blanks I was using to copy, in other words, that the pressing plants simply use a better grade of media. My friend has a contact, and we were able to acquire bulk blanks from Saturn Disc, a company that manufactures CDs. No difference; the copies still aren't right. I guess we can eliminate the blanks for now.

Here's where things get a little outside normal thinking, in my twisted logic: we know there are error detection and correction (EDAC) schemes used in interpreting the data on the CD, employed when the bit being read isn't immediately recognizable to the player. Is it possible the home-made copy, burned on a cheap consumer-grade burner, contains more errors? Are the marks burned into the CD irregular in shape or depth? Do the lasers in these consumer-grade burners introduce errors? If so, the EDAC is kept pretty busy and doesn't always get it right, which would explain a general lack of quality: delays in the data stream while the EDAC does its work, and misread ones and zeros along the way; there is no 100% accurate EDAC. To me, this is a good place to start in understanding the obvious differences in sound quality.
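To make the "EDAC doesn't always get it right" idea concrete, here's a toy Python sketch of the fallback a player has when a sample can't be corrected: it interpolates between the neighboring samples. This is purely illustrative; real players use a CIRC decoder, and the function and sample values here are made up.

```python
# Toy model of audio error concealment: when a sample can't be
# corrected, a player conceals it by interpolating between its
# neighbours rather than outputting garbage. Illustrative only;
# this is not the actual CIRC decoder used in CD players.

def conceal_errors(samples, bad_indices):
    """Replace unreadable samples with the average of their neighbours."""
    out = list(samples)
    for i in bad_indices:
        left = out[i - 1] if i > 0 else 0
        right = out[i + 1] if i < len(out) - 1 else 0
        out[i] = (left + right) // 2  # simple linear interpolation
    return out

# A short run of 16-bit samples with one unreadable position (index 3).
samples = [100, 220, 310, 0, 305, 210, 95]
print(conceal_errors(samples, bad_indices=[3]))
# -> [100, 220, 310, 307, 305, 210, 95]; the concealed value is a
#    guess, which is why heavy interpolation can be audible.
```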
jeffloistarca

Showing 2 responses by 1439bhr

Here's a case where it's not necessarily so that "bits is bits." It is true that the transcription (the CD-R) cannot be more accurate than the original. But even if the transcription is bit-identical to the original, the playback machine may read the pressed aluminum pits more or less accurately than it reads the dye marks on a CD-R. Compounding that is the likelihood that the CD-R contains transcription errors, and that all CD players have built-in error-correction processing used to detect errors and, when it can't correct them, interpolate through them. Too bad CD players don't have a little instrumentation readout to give some idea of the bit error rate (BER) being experienced; comparing the BER for a stamped CD against its CD-R transcription would give some quantitative clues about what's really going on.
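Absent a BER readout on the player, one rough home-brew check is to rip the same track from the pressed CD and from the CD-R and count the bits that differ. A Python sketch, with hypothetical file names; note this measures post-correction discrepancies between the decoded rips, not the raw channel error rate on the disc itself:

```python
# Rough bit-discrepancy check between two rips of the same track.
# File names are hypothetical; both rips must be the same length
# and sample-aligned. This compares decoded PCM after the drive's
# error correction, not the raw BER coming off the disc.
import wave

def bit_discrepancy(path_a, path_b):
    with wave.open(path_a, "rb") as a, wave.open(path_b, "rb") as b:
        data_a = a.readframes(a.getnframes())
        data_b = b.readframes(b.getnframes())
    n = min(len(data_a), len(data_b))
    diff_bits = sum(bin(x ^ y).count("1")
                    for x, y in zip(data_a[:n], data_b[:n]))
    return diff_bits / (n * 8)  # fraction of bits that differ

# Hypothetical rips of the same track from the pressed CD and the CD-R.
print(bit_discrepancy("original_rip.wav", "cdr_rip.wav"))
```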
Here's what's puzzling me: doesn't a CD-ROM burner have perfect, bit-for-bit, error-free capability when recording a computer data or program file? Think about it: if essentially perfect (i.e., vanishingly small error rate) transcription and playback weren't achieved, recordable CD-ROMs wouldn't be acceptable as reliable media for computers. Lesser performance from audio CDs and componentry shouldn't be acceptable either, since the personal computer industry has demonstrated that low-cost equipment can achieve virtual perfection in writing and reading digital data on optical media.
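That bit-for-bit claim for data discs is easy to verify at home: hash the source file, burn it, and hash the copy read back from the disc. A minimal Python sketch, with hypothetical paths:

```python
# Verify a data burn by hashing the source file and the copy read
# back from the disc. Paths are hypothetical; matching digests mean
# the transcription was bit-perfect.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

original = sha256_of("backup.iso")            # source image
copy = sha256_of("/media/cdrom/backup.iso")   # read back from the burn
print("bit-perfect copy" if original == copy else "transcription errors")
```

One design point worth keeping in mind when comparing the two cases: data discs get this reliability partly because CD-ROM Mode 1 sectors carry an extra layer of error-correction data on top of the audio CD's CIRC, and a computer drive can re-read a marginal sector, while an audio player has to keep streaming in real time and falls back on interpolation instead.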