How is more than 16 bits better for CD?


A response to Roadcykler's question made me wonder about a related topic... If the data on standard CDs is encoded as 16-bit, how can an 18, 20 or 24 bit DAC improve things? That is, if the waveform of 16-bit audio is made up using 65,536 levels, where do these extra bits come in? Does the DAC 'guess' the extra bits?
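(A quick sanity check of the level counts behind the question: an n-bit sample can take 2^n discrete amplitude values, which is where the 65,536 figure comes from. A minimal Python sketch, just to show the arithmetic:)

```python
# Number of discrete amplitude levels for each bit depth:
# an n-bit sample can represent 2**n distinct values.
for bits in (16, 18, 20, 24):
    print(f"{bits}-bit: {2**bits:,} levels")
# 16-bit gives 65,536 levels; 24-bit gives 16,777,216.
```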
carl109

Showing 1 response by nonoise

Transnova is correct. I have the Consonance CD Linear and it is a 16-bit, non-oversampling/non-upsampling player (unless you punch in the 88.2 option) with no op-amps and, most importantly, no digital filter to mess up the sound.
I know this will sound lame, but it seems to me that when you increase the bits, it's like tracing over and over on an original drawing.
Sure, it will be 'like' the original drawing, but now you have a bolder line that creates its own version of the original, while at the same time masking/destroying the subtle clues and character of the original. Hence, a 'bolder' bass, 'sparkling' highs, and a somewhat bland middle as it 'guesses' where the lines should be drawn.
I was told that when you go up to (what was it?) 20 or 24 bits, it's akin to having a reference point every nine inches from LA to NY.
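(For what it's worth, that analogy roughly checks out for 24 bits. A back-of-the-envelope Python sketch, assuming a figure of about 2,800 miles between LA and NY, which is my assumption and not from the original claim:)

```python
# Back-of-the-envelope check of the "reference point every nine inches" analogy.
# Assumption: roughly 2,800 miles between LA and NY.
miles = 2800
total_inches = miles * 5280 * 12  # miles -> feet -> inches

for bits in (16, 24):
    spacing = total_inches / 2**bits  # inches between evenly spaced points
    print(f"{bits}-bit: one reference point every {spacing:,.1f} inches")
# 24 bits works out to roughly 10.6 inches between points;
# 16 bits to roughly 2,707 inches (about 225 feet).
```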
Sounds like overkill to me when all one had to do was remove the digital filter and properly implement 16 bits as intended.
Don't want to start a firestorm here, but it's my take and I love the sound.