GaN-based Class D power amps

GaN-based power transistor technology is now emerging in Class D audio power amplifiers, so it seems appropriate to devote a forum thread to the topic. At least three companies have commercial Class D amps in their lineups:

Merrill Audio, with their model Element 118 ($36k per monoblock, 400 W into 8 ohms, 800W into 4 ohms), Element 116 ($22k per monoblock, 300 W into 8 ohms, 600W into 4 ohms) and Element 114 (coming soon).

Review of Element 118 at this link:

ADG Productions, with their Vivace Class D amp ($15k per monoblock pair, 100W into 4 ohms). (The designer emailed me indicating he has another product in the pipeline.)
Review of the Vivace Class D monoblocks at this link (warning: link might not work (1/11/2019)):

Technics SE-R1 Class D stereo amp ($17k per stereo amp, 150WPC into 8 ohms, 300WPC into 4 ohms)
Preliminary review of the Technics SE-R1 at this link:
Technics also has a lower priced GaN-based class D integrated amp in their catalog:

Anyone listened to or own any of these amps?

I guess I’m doomed to not understand this. I thought a phase shift like the one shown in the plot, where it dips, is a negative shift, which means a lag in the signal.
No, a negative "-phase angle" is to do with the speaker’s bass loading, which is totally different; it’s the dotted line ("-phase angle") in this Tannoy speaker review.

This is "phase shift" on this Class-D (no negative wording at all), like a subwoofer being 180 degrees out of phase with the mains, which is easily heard when it’s put back in phase.
And no, you wouldn’t hear it if the whole audio band, from 20Hz bass to the highest 20kHz, were 70 degrees out of phase. But here only the upper mids and highs are 70 degrees out of phase relative to the mids, lower mids and bass, and this is what many listeners object to when they hear Class-D. Coincidence? I think not. And it’s why many say Class-Ds make great bass amps; I agree on that.

The only fix is to raise the switching frequency 3x higher; then the output filter corner and the point where that 70 degrees of phase shift occurs move up accordingly, 3x higher, to around 80-100kHz. It’s as simple as that.
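To put rough numbers on George’s point (a sketch only, assuming the output filter behaves like a textbook 2nd-order Butterworth low-pass; the 25kHz corner is an illustrative assumption, not a measured figure for any of these amps):

```python
import math

def lp2_phase_deg(f, fc):
    """Phase lag in degrees of a 2nd-order Butterworth low-pass at
    frequency f with corner fc: H(s) = 1 / (s^2 + sqrt(2)s + 1),
    s normalized to the corner frequency."""
    w = f / fc
    return math.degrees(math.atan2(math.sqrt(2) * w, 1.0 - w * w))

# Phase lag at 20 kHz with an assumed 25 kHz output-filter corner,
# and again after tripling the corner to 75 kHz:
low_corner  = lp2_phase_deg(20e3, 25e3)   # roughly the 70-degree figure
high_corner = lp2_phase_deg(20e3, 75e3)   # drops to roughly 22 degrees
```

With a corner near 25kHz the lag at 20kHz works out to about 72 degrees, close to the 70 degrees discussed above; moving the corner 3x higher cuts it to about 22 degrees.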

As atmasphere correctly said on another amp thread
"If you really want to get the soundstage right, the amp needs to have minimal phase shift in the audio regions so it will need bandwidth past 80KHz."

Cheers George
Bruno Putzeys on phase shift in his Purifi module.

Class D has achieved very low levels of distortion, but is it possible for class D amplifiers to continue their evolution into something close to a straight wire with gain, i.e. minimal phase shift in the audio band? (A similar question from maty).


Bruno: The 1ET400 module has the frequency and phase response of a 2nd order Butterworth filter cornering at 60kHz. If you look at the phase shift of that, it’s very nearly “linear phase” in the audio band. To take some rough numbers, if you have a circuit that has a 0.2 degree phase shift at 200Hz, 2 degrees at 2kHz and 20 degrees at 20kHz, that’s the same as saying it has “0.001 degree per Hertz” phase shift. That’s another way of saying that the whole signal is simply delayed by 2.8 microseconds. If you plot phase shift on a linear frequency scale that’s immediately obvious because you get a straight line. Of course a simple delay doesn’t change the sound. It’s literally the same as starting your music a few microseconds later.
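Bruno’s rough numbers are easy to check: a phase shift that is proportional to frequency converts to a single fixed time delay, the same at every frequency.

```python
def linear_phase_delay_s(phase_deg, f_hz):
    """Convert a phase lag at one frequency to the equivalent time delay.
    delay = phase / (360 * f), since 360 degrees = one full period."""
    return phase_deg / (360.0 * f_hz)

# Bruno's three data points: 0.2 deg @ 200 Hz, 2 deg @ 2 kHz, 20 deg @ 20 kHz
d1 = linear_phase_delay_s(0.2, 200.0)
d2 = linear_phase_delay_s(2.0, 2e3)
d3 = linear_phase_delay_s(20.0, 20e3)
# all three give the same delay, about 2.8 microseconds
```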


Lars: My dad used to say that if you left a CD in its case without playing it back, it’d just sit there accumulating massive amounts of phase shift as time went by.


Bruno: What matters to sound is how much the phase shift differs from a pure delay. Anyone who’s ever done phase measurements on speakers will remember that you have to remove the time-of-flight delay from the data, for instance by marking the leading edge of the impulse response. Otherwise the linear phase shift corresponding to the distance between the speaker and the mic completely clouds the picture. In the case of the 1ET400 module it’s just under 1 degree at 20kHz. There never was a phase shift problem in class D; it’s simply a trick of the light that happens when you plot the phase response on a log scale without removing the fixed delay.
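The “just under 1 degree” figure can be reproduced from the textbook 2nd-order Butterworth response with a 60kHz corner (a sketch; the module’s real filter may deviate slightly from the ideal curve):

```python
import math

FC = 60e3  # stated 2nd-order Butterworth corner of the 1ET400

# Low-frequency group delay of the normalized Butterworth
# H(s) = 1/(s^2 + sqrt(2)s + 1) is sqrt(2)/wc, about 3.75 us here.
GROUP_DELAY = math.sqrt(2) / (2 * math.pi * FC)

def phase_deg(f):
    """Total phase lag in degrees at frequency f."""
    w = f / FC
    return math.degrees(math.atan2(math.sqrt(2) * w, 1.0 - w * w))

f = 20e3
actual = phase_deg(f)                  # total lag at 20 kHz, ~28 degrees
pure_delay = 360.0 * f * GROUP_DELAY   # the part that is just a fixed delay
deviation = actual - pure_delay        # what remains after removing the delay
```

The leftover deviation at 20kHz comes out around 0.9 degrees, matching Bruno’s statement once the fixed delay is subtracted.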

The way I understand it, if the phase shift is different between the two channels in a stereo amp, then one signal would be slightly behind the other, which would sound like distortion?

@atmasphere is saying "to avoid this both channels have to have enough bandwidth that this is off the table."
Or am I still lost?
For the most part you are probably not lost, djones51, which is why I brought up the square wave question, which once again was ignored. If your square wave is "square", then you don’t have phase "issues". There is dogma, and then there is science.

atmasphere’s comment is related to a comment I made w.r.t. consistent phase shift of both channels. He is positing, which is true but also implementation dependent, that having high bandwidth ensures reduced phase shift in the audio band, which in turn ensures there is not much differential phase shift between the channels. However, depending on the reason for the phase shift, the shift may by design be consistent between the channels without requiring extended bandwidth. A difference in phase shift between two channels would not be distortion. If the difference is linear in phase, it is the same as moving one speaker a small amount. If the phase shift is predominantly not linear, it could contribute to "smearing" of the sound stage, as positions would vary with frequency. Keep in mind, most of that processing of position happens at <10kHz, as evidenced by measuring the timing ability of subjects with reduced hearing bandwidth (mainly from age). Timing here means differential timing between the ears, not any absolute timing.
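As a worked example of "moving one speaker a small amount": an inter-channel phase difference that is linear in frequency is just an inter-channel delay, which maps onto a physical speaker offset (the 1-degree-at-1kHz figure below is hypothetical, purely for illustration):

```python
SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature

def equivalent_offset_mm(delta_phase_deg, f_hz):
    """Distance one speaker would have to move to produce the same
    inter-channel delay as the given linear phase difference."""
    delay_s = delta_phase_deg / (360.0 * f_hz)   # phase -> time
    return delay_s * SPEED_OF_SOUND * 1000.0     # time -> distance in mm

# A hypothetical 1-degree inter-channel difference at 1 kHz:
offset = equivalent_offset_mm(1.0, 1e3)   # under a millimeter
```

For a linear phase difference the offset is the same at every frequency, which is why it shifts the image slightly rather than smearing it.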

Back to linear phase, which is just a time shift (and a great thing about using digital filters is that you can do complex filtering and maintain linear phase (or not)). Of course, there is nothing stopping a competent analog designer from designing a front end that intentionally adds more phase shift at low frequencies versus high frequencies to compensate for the amplifier’s characteristic phase shift. In an all-digital amplifier, i.e. one with digital input, the phase shift of the amplification section is near meaningless, as you can just run a digital filter that compensates for the shift and makes it linear in phase.
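A minimal sketch of that compensation idea, modelling the amplifier as a phase-only 2nd-order Butterworth response at 60kHz and undoing it in the frequency domain (a toy model, not any product’s actual filter; magnitude is left flat so only the phase effect is shown):

```python
import numpy as np

fs, n = 192_000, 4096
t = np.arange(n) / fs
# Test tones placed exactly on FFT bins so the round trip is clean
f1, f2 = 21 * fs / n, 149 * fs / n   # roughly 1 kHz and 7 kHz
x = np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

# Phase-only model of a 2nd-order Butterworth low-pass cornering at 60 kHz
f = np.fft.rfftfreq(n, 1 / fs)
w = f / 60e3
H = np.exp(-1j * np.arctan2(np.sqrt(2) * w, 1 - w * w))

y = np.fft.irfft(np.fft.rfft(x) * H, n)           # "amplifier" adds phase shift
z = np.fft.irfft(np.fft.rfft(y) * np.conj(H), n)  # digital filter undoes it

err = np.max(np.abs(z - x))  # compensation restores the original waveform
```

Since the compensating filter applies exactly the conjugate phase, the cascade has zero phase shift; a practical design would instead target linear phase (a pure delay) so the compensator stays causal.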
Thanks everyone for explaining. I read this as well, which to me is showing what roberttdid is saying about the square wave. If the phase shift is distorting, it would look like Figure 3 in the link.