Take a more holistic approach to audio. Narrowing it down as you do will not guarantee success in the frivolous pursuit that is audio. The same broad approach works equally well for life in general.
There was a time when transistors were replacing tubes, and table radios and portables were advertised as having some particular number of transistors. The more the better, supposedly. Of course manufacturers quickly picked up on this and started populating their circuits with numerous transistors that basically did nothing. The same may be true for bits.
When CD players first came out, I bought a Mission player simply because it sounded much better than the Sonys, which I had gone out to the store to buy. Only after I bought it did I discover that it had two 14-bit D/As, whereas the Sonys had one multiplexed (shared) 16-bit D/A. The Mission, like Philips, used oversampling to get effective 16-bit resolution. The reason for the 14-bit design was that the last two bits of then-available 16-bit D/As were just meaningless noise, whereas all the bits were valid in the 14-bit units. I wonder how many people bought the Sony because it claimed two more bits?
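For anyone curious how 14 bits can behave like 16, the arithmetic can be sketched. A minimal Python sketch, assuming plain oversampling only (each doubling of the sample rate spreads quantization noise over twice the bandwidth, worth about 3 dB, i.e. half a bit, of in-band SNR); the Philips-style design also used noise shaping, which recovers more than this formula shows:

```python
import math

def effective_bits(native_bits, oversampling_ratio):
    """Effective in-band resolution from plain oversampling (no noise shaping).

    Each doubling of the sample rate improves in-band SNR by ~3 dB,
    which is roughly half a bit of resolution.
    """
    return native_bits + 0.5 * math.log2(oversampling_ratio)

# A 14-bit DAC at 4x oversampling gains ~1 bit from oversampling alone.
# First-order noise shaping (as in the early Philips players) pushed
# more of the noise out of band, closing the rest of the gap to ~16 bits.
print(effective_bits(14, 4))  # -> 15.0
```

This is only the textbook idealization; real converters of that era fell short of it in various ways, which is exactly why "all the bits were valid" mattered.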
All things being equal, more bits should give better resolution, but all things are not always equal. Also, there are diminishing returns as you go over 16 bits. Trust your ears.
The statement that "One CD player has 24 bits, the other 18" makes no sense. Players do not "have" bits. The CDs have the bits, and they all have the same number--16 per sample, and 44,100 samples per second. No CD player in the world can give you more resolution than that (with an asterisk for HDCD).
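The point that the disc, not the player, fixes the resolution is easy to check numerically. A quick Python sketch using the standard Red Book parameters and the usual ideal-quantization SNR formula (6.02N + 1.76 dB):

```python
# Red Book CD parameters -- fixed by the format, not by any player.
BITS_PER_SAMPLE = 16
SAMPLE_RATE_HZ = 44_100
CHANNELS = 2

# Raw audio data rate coming off the disc.
data_rate_bps = BITS_PER_SAMPLE * SAMPLE_RATE_HZ * CHANNELS
print(data_rate_bps)  # -> 1411200 bits/s

# Theoretical dynamic range of ideal 16-bit quantization: 6.02*N + 1.76 dB.
dynamic_range_db = 6.02 * BITS_PER_SAMPLE + 1.76
print(round(dynamic_range_db, 1))  # -> 98.1
```

Whatever bit count a player advertises internally, the information it receives is bounded by these numbers.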
These "specs" are concocted in marketing departments, and there is no meaningful way to compare the marketing inventions of one manufacturer to another--let alone any reason to. It's quite possible that "18 bits" and "24 bits" are not even referring to the same thing. (I suspect the latter may refer to 64x oversampling, but that's just a guess. And I doubt the "18-bit" machine is doing only 4x oversampling.)
Tell us what the two units are, and you will get some opinions.
As a practical matter, there are virtually no commercial recordings that utilize more than 20 bits. In fact, within the recording industry, the last 4 bits of a 24-bit system are referred to as "marketing bits". Therefore, for the foreseeable future, any machine that provides true 20-bit playback is about as good as it gets, and 20-bit reproduction will be -- at best -- a marginal improvement over 18-bit. (These statements are not mine personally -- they were remarks from several of the top recording industry pros, several of whom attended the last meeting of our Northwest audio club.)
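The "marketing bits" remark has a simple numeric basis: each bit is worth about 6 dB of theoretical dynamic range, and the bottom bits of a 24-bit chain sit below the noise floor of any real analog electronics. A hedged sketch of that arithmetic (the ~120 dB analog SNR ceiling is a rough rule of thumb, not a spec):

```python
def ideal_dynamic_range_db(bits):
    """Theoretical SNR of ideal N-bit quantization: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 18, 20, 24):
    print(bits, round(ideal_dynamic_range_db(bits), 1))

# 16 -> ~98 dB, 18 -> ~110 dB, 20 -> ~122 dB, 24 -> ~146 dB.
# Real analog output stages rarely exceed roughly 120 dB SNR, so the
# last ~4 bits of a 24-bit system fall below the analog noise floor --
# hence the industry nickname "marketing bits".
```

By this yardstick, true 20-bit playback already brushes against what analog electronics can deliver.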
Over the years I've owned 11 different CD players and 8 different D/A converters. Each has represented the latest pinnacle of technology for its time. When the 24 bit chipsets became available, they were the single largest improvement in my front end. Even my first generation 24 bit dac sounded much more like an ultra high end turntable, or reel to reel master recording (which are my references for what makes a great digital playback front end).
I realize that the "extra" 4 bits should be technical overkill since the recording only starts with 16 bits. But how else can I explain the sonic improvement with the 24 bit upgrade? I have a couple dacs that have identical power supplies, output stages and clocks, and the 24 bit versions are much more analog sounding. I should also note that my last two 20 bit D/A converters were no slouches either! They both contained Burr Brown's absolute best 20 bit "K select" dacs! Yet my next D/A, with only cheap Crystal Delta Sigma 24 bit dacs, was MUCH better sounding in terms of depth, soundstage size and preservation of space!
Personally, I'm glad that many manufacturers chose to go with the "24 bit overkill" chipsets. Until 24 bit chipsets became available, I never could get into music with the same lust and passion as with my analog rig!
Note: It is important to realize that it takes MUCH more than just a 24 bit converter chipset to make a killer sounding CD player or stand alone D/A converter. Many CD player and D/A manufacturers blow their latest 24 bit designs by using poor sounding output devices, cheap clocks and/or crappy power supplies. These unfortunate design decisions make their "latest generation" CD players/DACs sound worse than a very well designed older generation 16, 18 or 20 bit digital front end. That being said, I still FIRMLY believe that 24 bit chipsets are ABSOLUTELY needed to produce the "best of the best" from digital playback. If you want to get the closest to the master tape with your digital set-up, it MUST contain a 24 bit chipset IMHO. Think of the 24 bit chip as a 12 cylinder engine. You can build a powerful V-8, but starting with a 12 cylinder engine always gives you more to work with (even if it seems like overkill). 24 bit chipsets are NOT marketing gimmicks when done right! They're the real deal!!