Please explain encoding/decoding and what is best?

Often I read spec sheets for CD players and they say they
have 1-bit, 16-bit, 24-bit, single-bit, dual-bit converters, etc., etc., and I am confused as to what that means in terms of what provides the best sound quality. I always thought that more bits were better, but I could use some clarification.
Hi Steve; I'll take a stab at this. First, you are just talking about standard CDs here (Red Book CD with 16-bit PCM encoding, not SACD or DVD-A). The CD player's digital-to-analog converter (DAC) does the decoding. One clarification on terms: PCM (pulse code modulation) is how the data is stored on the disc itself. The "single bit" converters that Sony, Philips, and some others use are delta-sigma (sometimes called "bitstream") designs, and they are easily competitive with traditional multi-bit 16-bit DAC chips.
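To make "encoding" a little more concrete, here is a minimal sketch (plain Python, illustrative names, not from any real player's firmware) of what 16-bit PCM encoding means: the waveform is sampled 44,100 times per second and each sample is rounded to one of 65,536 integer levels.

```python
import math

# Illustrative sketch: encode a 1 kHz sine wave as 16-bit PCM, the way
# Red Book CD audio stores it (44,100 samples/s, values -32768..32767).
SAMPLE_RATE = 44100
FULL_SCALE = 32767  # largest positive 16-bit sample value

def encode_16bit(freq_hz: float, n_samples: int) -> list[int]:
    samples = []
    for n in range(n_samples):
        x = math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
        # Quantization: rounding to an integer is the "16-bit" step,
        # and the rounding error is where quantization noise comes from.
        samples.append(round(x * FULL_SCALE))
    return samples

pcm = encode_16bit(1000.0, 44)  # roughly one cycle of a 1 kHz tone
```

The DAC's job ("decoding") is simply the reverse: turning that stream of integers back into a smooth voltage.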

Assuming equal cost, a 24-bit DAC chip should be better than a 20-, 18-, or 16-bit one. But to some extent (and I think most would agree) you get what you pay for in audio, so a high-quality 16-bit converter could well out-perform a cheaper player that advertises higher-bit conversion (decoding).

But in practice, 16-bit DACs seldom achieve a true 16-bit conversion -- they typically manage about 14 bits of effective resolution at best. A higher-bit DAC, with resolution to spare, can actually deliver a true 16 bits. That is why most manufacturers of high-quality CD players and DACs use 20- or 24-bit chips: to actually achieve a true 16-bit conversion. Remember, you can never get more than 16 bits of information off the disc; noise shaping, dither, and interpolation (as in up/oversampling) just process those same 16 bits before decoding.
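The gap between "advertised bits" and "effective bits" can be put in numbers. A standard textbook approximation (ideal quantizer, full-scale sine wave) gives the theoretical dynamic range of an N-bit converter as about 6.02*N + 1.76 dB; real DACs fall short of their ideal figure, which is why a "16-bit" chip that only reaches ~86 dB is effectively a 14-bit converter. A quick sketch:

```python
# Theoretical dynamic range of an ideal N-bit PCM converter, using the
# standard approximation SNR ~= 6.02*N + 1.76 dB. Real-world converters
# measure below these figures.
def dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (14, 16, 20, 24):
    print(f"{bits}-bit ideal: {dynamic_range_db(bits):.1f} dB")
```

So a 24-bit chip's ~146 dB theoretical ceiling leaves plenty of headroom to actually realize the ~98 dB that 16-bit CD data can hold.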

Listen to CD players, or transports and DACs, in your price range and purchase the one that sounds best to you. And as noted above, you usually get higher quality sound as you spend more, but the law of diminishing returns kicks in at some point. I believe in as high a quality digital front-end as I can afford, but like most I'm budget constrained too. Good luck. Craig