Analogue clipping from digital sources


Given the high output levels of modern DACs (typically >2 V single-ended via RCA, >4 V balanced) there is in my experience a significant risk of overloading both the analogue stage of the DAC and any pre- or power amp downstream. Fighting that with low volume settings on the attenuator only aggravates the issue further. In my case I have to run the Innuos Zenith Mk3's output at 85% to avoid audible overloading and distortion in the DAC and amp. Anyone with similar experience?

antigrunge2

@ltmandella: care to elaborate? Or point to relevant sources? Please don't make unsubstantiated assertions when posting. Thanks.

And for everyone else: you can research "intersample peaks" and intersample overs. It is absolutely a known phenomenon among mastering engineers, discussed regularly and demonstrated in various tests.

The consensus on audibility is that it is hardware (DAC) dependent. It can be very objectionable on highly accurate hardware but is generally not audible through lossy or low-end reproduction chains. Avoiding it requires headroom in the encoding, which the loudness wars often don't allow.
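For anyone who wants to see the effect rather than take it on faith, it's easy to demonstrate numerically: a sine at exactly fs/4 with a 45-degree phase offset has every sample at or below 0 dBFS, yet the reconstructed waveform between the samples reaches about +3 dB over full scale. A minimal sketch in Python/NumPy (4x FFT oversampling stands in for the DAC's reconstruction filter; the numbers are illustrative, not from any specific DAC):

```python
import numpy as np

N, up = 64, 4                      # samples, oversampling factor
n = np.arange(N)

# Sine at fs/4 with a 45-degree phase offset: every sample lands
# exactly on +/-1.0, so the *sample* peak reads 0 dBFS.
x = np.sqrt(2) * np.sin(2 * np.pi * (N // 4) * n / N + np.pi / 4)
sample_peak = np.max(np.abs(x))    # ~1.0

# Band-limited reconstruction via FFT zero-padding (what an
# oversampling DAC's interpolation filter effectively does).
X = np.fft.rfft(x)
Xup = np.zeros(N * up // 2 + 1, dtype=complex)
Xup[: len(X)] = X
y = np.fft.irfft(Xup, n=N * up) * up

true_peak = np.max(np.abs(y))      # ~1.414, i.e. about +3 dBTP

print(f"sample peak: {sample_peak:.3f}, true peak: {true_peak:.3f}")
```

The reconstructed waveform clips in any fixed-point stage (or analogue stage) that has no headroom above 0 dBFS, even though no individual sample ever exceeded full scale.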

Probably why I prefer DSD. I am unfortunately very annoyed by any high-frequency or peak glitches in digital. Makes me want to immediately throw the offending component right out the window...


@ltmandella If I am not mistaken, you are referring to a problem resulting from 'capping' peaks during digital mastering, leading to distortion during the DA conversion, i.e. not clipping per se, although it sounds similar. This seems to be yet another casualty of the loudness wars; thanks for pointing it out.

That is a mastering problem which in my mind cannot be addressed by audiophile consumers through any means (unless they decide to use low-resolution equipment). So I don't yet understand what your point is in the context of this discussion.