Using BLER to determine audio quality of CDs?


Not sure if this post goes under Music or Digital, but here goes.

Has anyone used a CD/DVD-ROM disc scanner to get block error rate (BLER) data for their audio discs? I was ripping some of my audio CDs, and I have a software utility for my Plextor CDROM drive that lets me scan CDs/DVDs for BLER errors. It reports both C1 and C2 error rates. Usually this is used to check burned CDs and DVDs for errors, but I thought I would try it on my commercially produced audio CDs.

I have some older discs from the early 80s, and I was shocked to find some very high error rates. For example, an early-80s American pressing of Pink Floyd's Dark Side of the Moon produced C1 AND C2 error rates in the thousands! The Red Book spec is a max of 220 for C1. Animals and Wish You Were Here were just as bad.

Maybe this is evidence of "CD rot", although I have old Polygram German pressings from the early 80s (Camel's Snow Goose and Moonmadness) that show C1 error rates of 20 max and zero C2 errors. Also, my John Klemmer Touch CD (early 80s) shows C1 rates of about 25 max with no C2 errors, and it's an MCA release.

Thought this might be a good method for checking older CDs for increased errors, or even for screening new pressings to weed out potentially bad-sounding CDs.
dhl93449
I usually let my ears tell me what something sounds like. Maybe I'm just living in the past. Hey, isn't that a song title? :)
Tpreaves:

Of course, but my ears tell me something is not quite right with those PF albums. I won't say it's a one-for-one correlation, especially since I do not have another copy/recording for comparison. But I do find that the CDs with the lower rates sound good.

It stands to reason, because high C2 error counts mean more error-correction interpolation is occurring. Maybe this is audible, maybe not. But it's like having more distortion. Sometimes an amp with higher distortion is audibly worse than one with lower distortion, sometimes not. But if you have a choice, many opt for the lower distortion.

I would think that any disc having low to zero C1/C2 errors could not sound worse than one with thousands of errors, all other things being equal.

If you do a search for CD block error rates, you will find some sources (companies that produce CD masters) that spec under 2 C1 errors per second and zero C2 errors for the highest-quality masters.
Dhl93449, CD read errors are corrected up to a certain degree and interpolated above it. For instance, scratches along the disc shorter than 4mm will be corrected, but between 4 and 8mm they will be interpolated. Above 8mm, a gap/pop is produced.
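
To put that rule of thumb in code form, here is a toy sketch (my own illustration; the millimeter figures are the ones mentioned above, and a real decoder obviously does not work from scratch lengths):

```python
# Toy classification of a radial scratch by length, per the rule of thumb above.
def classify_scratch(length_mm: float) -> str:
    if length_mm < 4.0:
        return "corrected"      # CIRC recovers the exact data
    elif length_mm <= 8.0:
        return "interpolated"   # player conceals the damage by interpolating samples
    else:
        return "gap/pop"        # too long to conceal; audible dropout

for mm in (1.5, 6.0, 10.0):
    print(mm, "mm ->", classify_scratch(mm))
```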
I found this in a search. Maybe it helps to explain what is going on:

"CD-ROMS and CD-R discs are encoded with Cross-Interleaved Reed-Solomon Code (CIRC). This code uses interleaving to distribute errors and parity to correct them. With a bit rate of over 4.3 million bits per second, the need for robust error correction is obvious. The error rates in the low-level decoding strategy are monitored over two levels -- referred to by most hardware manufacturers as C1 and C2. A third level of "Extended" Error Correction (ECD/ECC) is used in many (but not all) CD-ROM formats.

A disc's "Block Error Rate" (BLER) is the sum of corrections and passes made in the C1 decoder. The C1 decoder is designed to locate and correct up to two bytes of information on a CD block. If more than two bytes are detected, the entire block is passed to the de-interleaving stage and the C2 decoder.

In most cases, only a small amount of BLER represents uncorrectable blocks. The Red Book allows for a raw error rate in the C1 decoder of up to 3 percent of the possible blocks in errors per second over a ten-second range.

BRST (Burst Error) is a localized group of missing data, perhaps caused by a speck of dust or a scratch -- a burst of errors in one spot. It is the same data as that tested for BLER, but unscrambled (de-interleaved) before it is checked. Interleaving is aimed at correcting BRST. It is easier to correct one bit out of 10 bytes than 10 bits out of one 16-bit word, which is why the data is encoded or interleaved across an entire block.

Often referred to as E32 or E42 errors, a disc's uncorrectable error count represents the number of blocks that could not be corrected by the de-interleaving and C2 decoding stage. The block errors corrected or passed through the C2 decoder by and large tend to represent non-random or physical flaws, which cause the most concern in CD-R testing.[These are the "C2" errors] While CD-R discs frequently have lower BLER rates than pressed discs, they far exceed their replicated brethren in E32s and other uncorrectables, since by definition the Red Book specification does not allow any errors to pass the C2 decoder."
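
The 3 percent figure in that quote lines up with the 220-per-second number you see elsewhere. A quick back-of-envelope check (my own arithmetic, assuming the standard Red Book rate of 75 blocks per second and 98 frames per block):

```python
# Rough arithmetic (mine, not from the quoted source) behind the 220/sec BLER limit.
SECTORS_PER_SECOND = 75      # Red Book audio is read at 75 blocks (sectors) per second
FRAMES_PER_SECTOR = 98       # each block is built from 98 CIRC frames
frames_per_second = SECTORS_PER_SECOND * FRAMES_PER_SECTOR   # 7350 frames/s

max_bler = 0.03 * frames_per_second   # "up to 3 percent ... in errors per second"
print(max_bler)                       # 220.5 -> the familiar 220 C1 errors/second figure
```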

Another quote from a CD mastering source:

"C1 Errors.
C1 Errors refer to the block error rate (BLER), which consists of bit errors at the lowest level. C1 errors are always expressed
in errors per second. All CDs and CDRs contain C1 errors. They are a normal result of the write process. However, the maximum C1 error rate for a quality recording is an average of 220 errors per second based on 10 second samples.

C2 Errors.
C2 Errors refer to bytes in a frame (24 bytes per frame, 98 frames per block) and is an indication of a CD player's attempt to use error correction to recover lost data. C2 errors can be serious. A CD player may correct them, then again, it may not.
C2 errors are usually an indication of poor media quality, or a CD writer's failure to produce a quality burn (see conclusion).

CU Errors.
CU Errors refer to uncorrectable errors that are present after C2 error correction. No CU errors are allowed in a recorded disc. Generally, discs with CU errors cannot be played at all because they contain data that cannot be recovered.

Conclusion.
CD replicators consider a disc with an average of 220 C1 errors per second, "a good quality disc." Typically, our masters average less than 1 C1 error per second with absolutely no C2 or CU errors. We have our own standard which states that in addition to no C2 or CU errors, we will not ship any disc that averages more than 2 C1 errors per second."

So, if you have a disc with thousands of C1 and C2 errors, the chances are pretty good that the results will be audible.

If you have one with 20 or so, chances are the sound will be mostly unaffected by error correction in the CD player.

I am using the CDROM scanner to selectively weed out CDs that may need replacement, or ones that I would not use as a ripping source for archiving.

One could use one's ears solely for this, but I prefer some quantitative data or measurements to support the process.
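
For what it's worth, here is roughly the screening rule I have in mind, sketched in Python. The thresholds come straight from the quotes above (220 average C1/sec for an acceptable disc; under 2 C1/sec with zero C2 or CU for a master-grade disc); the scan-data format is just a guess at what a drive utility might export, and I average over the whole scan for simplicity rather than the 10-second samples the spec uses.

```python
# Minimal screening sketch. Input is a list of per-second C1 counts from a scan,
# plus total C2 and CU (uncorrectable) counts. Thresholds are from the quoted sources.
from statistics import mean

def grade_disc(c1_per_second, c2_total, cu_total):
    avg_c1 = mean(c1_per_second)
    if c2_total == 0 and cu_total == 0 and avg_c1 < 2:
        return "master quality"
    if c2_total == 0 and cu_total == 0 and avg_c1 <= 220:
        return "acceptable (within Red Book limits)"
    return "suspect - candidate for replacement, or re-scan at a lower speed"

print(grade_disc([0, 1, 0, 2, 1], c2_total=0, cu_total=0))    # -> master quality
print(grade_disc([800, 1500, 950], c2_total=40, cu_total=0))  # -> suspect
```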
Are these numbers just something that you are curious about? A recording either sounds good to you or it doesn't. You can crunch numbers all day long but the outcome is still the same.
Well, here is an update. As with a number of things, a measurement may not always be the measurement you think it is...

I am curious about the numbers as a potential screening method, particularly since I have a large number of 80s-era production CDs, and I suspect some of them may be degrading audibly. So they might go on a potential "replacement" list if they have a lot of errors. I am also curious to see if a method to actually measure "CD rot" can be found this way.

But to get back to the measurement: I have now found that the BLER error rates are read-speed dependent. If measured at 48X, I get a large number of errors (thousands of both C1 and C2 errors). It freaked me out when I found this on a recently obtained MFSL CD. But slowing the read speed down to 4X (the lowest the drive will go) produced C1 errors on the same MFSL disc in the single digits or low teens, with 0 C2 errors. Now that is more like it. Since CD players run at 1X, error rates during playback may be even lower.
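
Here is the kind of sanity check I am doing now, sketched in Python. The counts are roughly what I saw on that MFSL disc, and the 10x cutoff is an arbitrary choice of mine, not anything from a spec:

```python
# If the high-speed scan shows far more C1 errors than the low-speed scan,
# treat the difference as speed-induced rather than a real physical defect.
def looks_speed_induced(c1_fast: int, c1_slow: int, ratio: float = 10.0) -> bool:
    return c1_slow == 0 or (c1_fast / c1_slow) > ratio

c1_at_48x, c1_at_4x = 3000, 12   # thousands at 48X, low teens at 4X
if looks_speed_induced(c1_at_48x, c1_at_4x):
    print("Errors look speed-induced; trust the 4X (or slower) scan.")
```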

This speed dependence also raises a question about the newer players that use cheap computer CDROM drives and "read ahead", storing data in a buffer for playback. Yes, they have read/re-read error correction circuitry, but do you really want that circuitry working on errors that would not actually be there (as real physical defects) if the drive were reading at low speed, i.e., false errors generated just by the high read speed?