To avoid this both channels have to have enough bandwidth that this is off the table.
I guess I'm doomed to not understanding this. I thought a phase shift like the one shown in the plot, where it dips, is a negative shift, which means a lag in the signal.

No, a negative "-phase angle" is to do with the speaker's bass loading, which is totally different; that's the dotted line ("-phase angle") in this Tannoy speaker review.
This is "phase shift" on this Class-D https://ibb.co/jfd6tqy (no - negative wording at all) Like a sub woofer being 180 degrees out of phase to the mains, easily heard when it back in phase
And no, you wouldn't hear it if the whole audio band were 70 degrees out of phase, from 20 Hz bass to the highest 20 kHz. But here only the upper mids and highs are 70 degrees out of phase relative to the mids, lower mids and bass, and this is what many listeners object to when they hear Class-D. Coincidence? I think not. And it's why many say Class-Ds make great bass amps; I agree on that.
The only fix is to raise the switching frequency 3x higher; then the output filter, and the point where the 70 degrees of phase shift occurs, move up accordingly 3x higher, to around 80-100 kHz. It's as simple as that.
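To put rough numbers on that claim, here's a sketch (Python with SciPy) of the phase lag of a 2nd-order Butterworth low-pass, a common idealization of a Class-D LC output filter. The 26 kHz and 78 kHz corner frequencies are my own illustrative picks, not figures from this thread:

```python
import numpy as np
from scipy import signal

def phase_deg(f_hz, fc_hz, order=2):
    """Phase (degrees) of an analog Butterworth low-pass at f_hz."""
    b, a = signal.butter(order, 2 * np.pi * fc_hz, btype="low", analog=True)
    _, h = signal.freqs(b, a, worN=[2 * np.pi * f_hz])
    return float(np.degrees(np.angle(h[0])))

# corner just above the audio band: large lag at 20 kHz
print(phase_deg(20_000, 26_000))   # roughly -69 degrees
# move the corner ~3x higher and the 20 kHz lag shrinks accordingly
print(phase_deg(20_000, 78_000))   # roughly -21 degrees
```

Moving the corner 3x higher cuts the in-band phase lag by roughly a factor of three, which matches the argument above.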
As atmasphere correctly said on another amp thread
Bruno Putzeys on phase shift in his Purifi module.
The way I understand it, if the phase shift is different between the two channels in a stereo amp, then one signal would be slightly behind the other, which would sound like distortion?
@atmasphere is saying "to avoid this both channels have to have enough bandwidth that this is off the table."
Or am I still lost??
For the most part you are probably not lost, djones51, which is why I brought up the square wave question, which once again was ignored. If your square wave is "square", then you don't have phase "issues". There is dogma, and then there is science.
atmasphere's comment is related to a comment I made w.r.t. consistent phase shift of both channels. He is positing, which is true but also implementation dependent, that having high bandwidth ensures reduced phase shift in the audio band, which in turn ensures there is not much differential phase shift between the channels. However, depending on the reason for the phase shift, the shift may by design be consistent between the channels without requiring extended bandwidth. A difference in phase shift between two channels would not be distortion. If the difference is linear in phase, it is the same as moving one speaker a small amount. If the phase shift is predominantly not linear, it could contribute to "smearing" of the sound stage, as apparent positions would vary with frequency. Keep in mind that most of the brain's processing of position happens below 10 kHz, as evidenced by measuring the timing ability of subjects with reduced hearing bandwidth (mainly from age). Timing here means differential timing between the ears, not any absolute timing.
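The "moving one speaker a small amount" equivalence is easy to quantify: an interchannel phase difference that is linear in frequency is a pure time delay, and that delay maps to an air-path distance. A small Python sketch; the 70-degrees-at-20-kHz figure is just the number being discussed in this thread, applied hypothetically to one channel only:

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air, approximately

def linear_phase_as_offset(phase_deg: float, f_hz: float):
    """Treat a phase shift linear in frequency as a pure delay, and
    express it as an equivalent speaker offset in millimetres."""
    delay_s = (phase_deg / 360.0) / f_hz            # cycles of lag -> seconds
    return delay_s, delay_s * SPEED_OF_SOUND_M_S * 1000.0

delay_s, offset_mm = linear_phase_as_offset(70.0, 20_000.0)
print(f"{delay_s * 1e6:.1f} us  ->  {offset_mm:.1f} mm")  # ~9.7 us -> ~3.3 mm
```

A few millimetres of equivalent offset, which is why a *linear* interchannel phase difference is benign; it's the frequency-dependent (non-linear) case that can smear imaging.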
Back to linear phase, which is just a time shift (and the great thing about digital filters is that you can do complex filtering and maintain linear phase, or not). Of course, there is nothing stopping a competent analog designer from designing a front end that intentionally adds more phase shift at low frequency versus high frequency to compensate for the amplifier's characteristic phase shift. In an all-digital amplifier, i.e. one with a digital input, the phase shift of the amplification section is nearly meaningless, as you can just run a digital filter that compensates for the shift and makes the overall response linear in phase.
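A frequency-domain sketch of that last point, assuming the amplification chain's phase behaves like an ordinary 2nd-order low-pass (a stand-in of my choosing, not a model of any particular module): cascade it with a unit-magnitude filter carrying the conjugate phase and the combined response is phase-flat. In practice you'd design a FIR approximation of the compensator, but the principle is just this:

```python
import numpy as np
from scipy import signal

fs = 96_000
# stand-in for the amplification section's phase response
b, a = signal.butter(2, 20_000, fs=fs)
w, h = signal.freqz(b, a, worN=1024, fs=fs)

# ideal compensator: magnitude 1 everywhere, phase equal and opposite
comp = np.exp(-1j * np.angle(h))

residual = np.angle(h * comp)    # phase of the combined response
print(np.max(np.abs(residual)))  # ~0: the cascade has flat (zero) phase
```

The compensator leaves the magnitude response untouched and only straightens the phase, which is exactly what a digital front end can do ahead of the power stage.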
Thanks everyone for explaining. I read this as well, which to me shows what roberttdid is saying about the square wave. If the phase shift were distorting, it would look like figure 3 in the link.
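On the square-wave point: pushing a square wave through a low-pass with in-band phase lag rounds and slows the edges, which is what figure-3-style plots show. A hedged sketch (the 1 kHz square wave and 26 kHz 2nd-order Butterworth are illustrative choices of mine, not from the linked article):

```python
import numpy as np
from scipy import signal

fs = 96_000
t = np.arange(0, 0.01, 1 / fs)              # 10 ms of signal
x = signal.square(2 * np.pi * 1_000 * t)    # 1 kHz square wave

b, a = signal.butter(2, 26_000, fs=fs)      # stand-in output filter
y = signal.lfilter(b, a, x)

# the filter slows the edges: the steepest sample-to-sample jump shrinks
print(np.max(np.abs(np.diff(x))), np.max(np.abs(np.diff(y))))
```

The input transitions in a single sample, while the filtered edges take several samples, so a 'scope shot of the output no longer looks perfectly "square".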