It's a different thing, but Honeywell computers used to use what was called a bit and a half to a byte. The bytes were 12 bits, so they could transfer information at a higher rate, if I remember correctly. But it's been a long time since I played with 1's and 0's.
- 10 posts total
The SPDIF protocol provides locations in each subframe for 24 bits. They might simply (and misleadingly) be basing their statement on that, setting the 8 least significant bits to 0, or they might really be using some or all of those bits. There's no way to tell without more information than is provided in the manual (which I took a look at, btw).
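A quick sketch of the point above (my own illustration, not from any manual): when a 16-bit source is sent over the 24-bit audio word of an S/PDIF subframe, it is typically left-justified (MSB-aligned), which leaves the 8 least significant bits at zero. A device can therefore claim "24-bit" output even when the bottom byte carries no information.

```python
def pack_16_in_24(sample16):
    """Left-justify a signed 16-bit sample in a 24-bit S/PDIF audio word.

    Masking to 16 bits keeps the two's-complement pattern; shifting by 8
    puts the MSB at the top of the 24-bit field and zero-pads the LSBs.
    """
    return (sample16 & 0xFFFF) << 8

word = pack_16_in_24(-12345)
print(f"{word:06X}")   # top 16 bits carry the sample: CFC700
print(word & 0xFF)     # -> 0: the bottom 8 bits are just padding
```

Checking those padding bits against zero is one crude way to guess whether a "24-bit" stream really uses its extra resolution.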
Interesting that you mentioned the Monarchy DAC. Here is a review from Lynn Olson that explains some of the benefits of using 20 bit versus 16 bit, at least in this implementation:
Some interesting facts about bit resolution.
Dseanm - It has more to do with the type of converter than the number of bits. The DAC in your Monarchy (PCM63 - now discontinued) is a traditional DAC with a laser-trimmed resistor divider, while most 24/192 DACs (if not all) are Delta-Sigma. There is nothing wrong with either approach - just a different sound. Some people believe that Delta-Sigma is bad for audio, and you can even find a statement Burr-Brown placed in the PCM63 datasheet saying that Delta-Sigma converters are so noisy they cannot even resolve the lowest three bits. A short time later the same company made the PCM1794, which uses a traditional DAC for the 6 highest bits and Delta-Sigma for the 18 lowest. It is funny that they don't use the words Delta-Sigma but "Advanced Segmented" instead. There is nothing wrong with Delta-Sigma - even SACD is a byproduct of the Delta-Sigma modulator output before filtering. Same for DSD recording. Like everything else it is subjective, and in your case you're just not a fan of Delta-Sigma technology (or high oversampling, or digital filtering).
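To make the ladder-vs-Delta-Sigma distinction concrete, here is a minimal sketch (my own toy illustration, not the PCM1794's actual architecture, which is a higher-order, multi-bit segmented design): a first-order Delta-Sigma modulator turns a multi-bit sample into a fast 1-bit stream whose average tracks the input, which is also the basic idea behind DSD/SACD.

```python
def delta_sigma_1bit(samples):
    """First-order Delta-Sigma modulator: values in [-1, 1] -> +/-1 bit stream."""
    integrator = 0.0
    feedback = 0.0
    bits = []
    for x in samples:
        integrator += x - feedback               # accumulate the quantization error
        y = 1.0 if integrator >= 0 else -1.0     # 1-bit quantizer decision
        bits.append(y)
        feedback = y                             # feed the decision back
    return bits

# The density of +1 bits encodes the signal: for a constant input of 0.25,
# the long-run average of the stream converges toward 0.25.
stream = delta_sigma_1bit([0.25] * 1000)
print(sum(stream) / len(stream))
```

A resistor-ladder DAC like the PCM63 instead weights each of the bits directly with matched (laser-trimmed) resistors in a single conversion step, which is why the two approaches can sound different even at the same nominal bit depth.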