How is more than 16 bits better for CD?


A response to Roadcykler's question made me wonder about a related topic... If the data on standard CDs is encoded as 16-bit, how can an 18-, 20-, or 24-bit DAC improve things? That is, if the waveform of 16-bit audio is made up of 65,536 levels, where do these extra bits come in? Does the DAC 'guess' the extra bits?
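For reference, here is the arithmetic behind those 65,536 levels, sketched in Python. The ~6 dB-per-bit figure is the standard theoretical rule of thumb, not a measurement of any particular DAC:

```python
import math

# An N-bit sample can take 2**N discrete levels, and each extra bit adds
# about 6.02 dB (20*log10(2)) of theoretical dynamic range.
for bits in (16, 18, 20, 24):
    levels = 2 ** bits
    dynamic_range_db = 20 * math.log10(levels)
    print(f"{bits}-bit: {levels:>10,} levels, ~{dynamic_range_db:.0f} dB dynamic range")

# 16-bit:     65,536 levels, ~96 dB dynamic range
# 24-bit: 16,777,216 levels, ~144 dB dynamic range
```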
carl109
Shadorne wrote:

" So the extra bits of a 24 bit DAC playing 16 bit CD data only help improve performance when PROCESSING the signal, such as upsampling and EQ adjustments. It does not improve the dynamic range of the original 16 bit CD data. A higher accuracy (24 bit DAC versus 16 bit DAC) will reduce artifacts from "rounding/truncation" in the processing. (Upsampling 44.1 Khz data being a processing step)."

I never thought about the truncation that must occur when a recording originally made at 24-bit/96 kHz is written to CD at 16-bit/44.1 kHz (sketched below). It seems the increased word length and upsampling are trying to 'simulate' the original signal that was truncated. Very interesting!

Am I interpreting correctly?
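Here is a minimal sketch of that word-length reduction, assuming plain truncation (real mastering normally adds dither before dropping bits, but the lost resolution is the same, and the 96 kHz to 44.1 kHz sample-rate conversion is a separate step not shown). The sample value is arbitrary:

```python
master_24 = 0x3A7C5F             # a hypothetical 24-bit sample from a 24/96 master

cd_16 = master_24 >> 8           # keep the top 16 bits for the CD
discarded = master_24 & 0xFF     # the bottom 8 bits are simply thrown away

print(f"24-bit master  : {master_24:#08x}")
print(f"16-bit for CD  : {cd_16:#06x} (worth {cd_16 << 8:#08x} when scaled back up)")
print(f"discarded bits : {discarded:#04x}")
```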
Do any of the chip manufacturers currently even make 16-bit chips for audio use?
My understanding is that the extra bits are to increase the accuracy of the most significant bit since a 16-bit converter cannot, in practice, do a 100% linear conversion.
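As a rough sketch of what that linearity problem looks like: in a ladder-style converter each bit contributes a weight, and the most significant bit's weight has to be accurate to well under one part in 65,536 or the converter misbehaves around mid-scale. The 0.01% mismatch below is an arbitrary illustrative figure, not a spec for any real chip:

```python
BITS = 16
msb_ideal = 2 ** (BITS - 1)            # ideal MSB weight = 32768 LSBs
msb_actual = msb_ideal * 1.0001        # assume the MSB weight is off by 0.01%

def dac_out(code):
    """Ideal bit weights everywhere except the slightly heavy MSB."""
    out = 0.0
    for bit in range(BITS):
        weight = msb_actual if bit == BITS - 1 else 2 ** bit
        if code & (1 << bit):
            out += weight
    return out

# Stepping across the major carry (0x7FFF -> 0x8000) should raise the output
# by exactly 1 LSB; with the 0.01% MSB error it jumps by about 4.3 LSBs.
step = dac_out(0x8000) - dac_out(0x7FFF)
print(f"mid-scale step: {step:.2f} LSBs (ideal is 1.00)")
```

In that sense a converter needs accuracy beyond its nominal word length just to reproduce 16-bit data linearly.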
I believe that the Consonance "Linear" CD players use a 16-bit, non-oversampling DAC.