GaN-based Class D power amps

GaN-based power transistor technology is now emerging in Class D audio power amplifiers, so it seems appropriate to devote a forum thread to the topic. At least three companies have commercial GaN-based Class D amps in their catalogs:

Merrill Audio, with their model Element 118 ($36k per monoblock, 400 W into 8 ohms, 800 W into 4 ohms), Element 116 ($22k per monoblock, 300 W into 8 ohms, 600 W into 4 ohms) and Element 114 (coming soon).

Review of Element 118 at this link:

ADG Productions, with their Vivace Class D amp ($15k per monoblock pair, 100 W into 4 ohms). (The designer emailed me indicating he has another product in the pipeline.)
Review of the Vivace Class D monoblocks at this link (warning: link might not work (1/11/2019)):

Technics SE-R1 Class D stereo amp ($17k per stereo amp, 150 WPC into 8 ohms, 300 WPC into 4 ohms)
Preliminary review of the Technics SE-R1 at this link:
Technics also has a lower-priced GaN-based Class D integrated amp in their catalog:

Anyone listened to or own any of these amps?

Bruno Putzeys on phase shift in his Purifi module.

Class D has achieved very low levels of distortion, but is it possible for class D amplifiers to continue their evolution into something close to a straight wire with gain, i.e. minimal phase shift in the audio band? (A similar question from maty).


Bruno: The 1ET400 module has the frequency and phase response of a 2nd order Butterworth filter cornering at 60kHz. If you look at the phase shift of that, it’s very nearly “linear phase” in the audio band. To take some rough numbers, if you have a circuit that has a 0.2 degree phase shift at 200Hz, 2 degrees at 2kHz and 20 degrees at 20kHz, that’s the same as saying it has “0.001 degree per Hertz” phase shift. That’s another way of saying that the whole signal is simply delayed by 2.8 microseconds. If you plot phase shift on a linear frequency scale that’s immediately obvious because you get a straight line. Of course a simple delay doesn’t change the sound. It’s literally the same as starting your music a few microseconds later.
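Bruno's back-of-the-envelope conversion from phase slope to delay can be checked in a couple of lines. This just reproduces his rough "0.001 degree per Hertz" figure, nothing more:

```python
# Bruno's rough figure: ~20 degrees of phase shift at 20 kHz
slope_deg_per_hz = 0.001

# For a pure delay tau, phase (degrees) = 360 * f * tau, so tau = slope / 360
tau_us = slope_deg_per_hz / 360.0 * 1e6   # convert seconds to microseconds
print(tau_us)                             # ~2.8 microseconds, as he says
```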


Lars: My dad used to say that if you left a CD in its case without playing it back, it’d just sit there accumulating massive amounts of phase shift as time went by.


Bruno: What matters to sound is how much the phase shift differs from a pure delay. Anyone who’s ever done phase measurements on speakers will remember that you have to remove the time-of-flight delay from the data, for instance by marking the leading edge of the impulse response. Otherwise the linear phase shift corresponding to the distance between the speaker and the mic completely clouds the picture. In the case of the 1ET400 module it’s just under 1 degree at 20kHz. There never was a phase shift problem in class D, it’s simply a trick of the light that happens when you plot the phase response on a log scale without removing the fixed delay.
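Bruno's "deviation from a pure delay" point can be sketched numerically from the idealized transfer function of a 2nd-order Butterworth low-pass cornering at 60 kHz. To be clear, the numbers below come from that textbook filter model, not from measurements of the actual 1ET400 module:

```python
import numpy as np

fc = 60e3                                  # Butterworth corner frequency, Hz
f = np.array([200.0, 2e3, 20e3])           # spot frequencies in the audio band
x = f / fc

# Phase of H(s) = 1 / ((s/wc)^2 + sqrt(2)*(s/wc) + 1) evaluated at s = j*2*pi*f
phase_deg = -np.degrees(np.arctan2(np.sqrt(2) * x, 1.0 - x**2))

# Best-fit pure delay = the filter's group delay at DC = sqrt(2) / (2*pi*fc)
tau = np.sqrt(2) / (2.0 * np.pi * fc)      # ~3.75 microseconds
delay_deg = -360.0 * f * tau               # phase of that pure delay

print(phase_deg - delay_deg)               # deviation from pure delay, degrees
```

The deviation at 200 Hz and 2 kHz is negligible, and at 20 kHz it comes out just under 1 degree, consistent with Bruno's figure; everything else in the phase plot is the fixed delay he describes.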

The way I understand it, if the phase shift is different between the two channels in a stereo amp, then one signal would be slightly behind the other, which would sound like distortion?

 @atmasphere  Is saying "to avoid this both channels have to have enough bandwidth that this is off the table."
Or am I still lost??
For the most part you are probably not lost, djones51, which is why I brought up the square wave question, which once again was ignored. If your square wave is "square", then you don’t have phase "issues". There is dogma, and then there is science.
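The square-wave point can be illustrated numerically: build a 1 kHz square wave from its odd harmonics, then shift each harmonic's phase. A shift that is linear in frequency (a pure delay) leaves the wave perfectly square, while a nonlinear shift visibly mangles it. This is purely a synthetic sketch, not a measurement of any amp discussed in this thread:

```python
import numpy as np

fs, f0, n = 192000, 1000.0, 192        # one full period of a 1 kHz square at 192 kHz
t = np.arange(n) / fs
harmonics = np.arange(1, 40, 2)        # odd harmonics, up to ~39 kHz

def square(phase_fn):
    """Sum odd harmonics, shifting each by phase_fn(freq) degrees."""
    return sum(np.sin(2*np.pi*k*f0*t + np.radians(phase_fn(k*f0))) / k
               for k in harmonics)

ref    = square(lambda f: 0.0)                        # reference square wave
linear = square(lambda f: -360.0 * f * 2 / fs)        # linear phase: 2-sample delay
warped = square(lambda f: -90.0 if f > 5e3 else 0.0)  # nonlinear phase shift

print(np.max(np.abs(linear - np.roll(ref, 2))))       # ~0: still a square wave
print(np.max(np.abs(warped - ref)))                   # large: no longer square
```

The linearly shifted wave is the reference shifted by exactly two samples, i.e. still "square"; the nonlinearly shifted one is not, which is why a square wave makes phase problems easy to spot.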

atmasphere’s comment is related to a comment I made w.r.t. consistent phase shift between the two channels. He is positing (which is true, but also implementation-dependent) that high bandwidth ensures reduced phase shift in the audio band, which in turn ensures there is not much differential phase shift between the channels. However, depending on the reason for the phase shift, the shift may by design be consistent between the channels without requiring extended bandwidth. A difference in phase shift between two channels would not be distortion. If the difference is linear in phase, it is the same as moving one speaker a small amount. If the phase shift is predominantly not linear, it could contribute to "smearing" of the sound stage, as perceived positions would vary with frequency. Keep in mind that most of the brain's processing of position happens below 10 kHz, as evidenced by measurements of timing ability in subjects with reduced hearing bandwidth (mainly from age). Timing here means differential timing between the ears, not any absolute timing.

Back to linear phase, which is just a time shift (and a great thing about digital filters is that you can do complex filtering and still maintain linear phase, or not, as you choose). Of course, there is nothing stopping a competent analog designer from designing a front end that intentionally adds more phase shift at low frequency versus high frequency to compensate for the amplifier's characteristic phase shift. In an all-digital amplifier, i.e. one with a digital input, the phase shift of the amplification section is nearly meaningless, as you can simply run a digital filter that compensates for the shift and makes the overall response linear in phase.
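The "digital filters can maintain linear phase" point follows from symmetry: an FIR filter with symmetric taps acts as its magnitude shaping plus an exact pure delay of (N-1)/2 samples (provided its real amplitude response doesn't change sign, which it doesn't for the taps below). A minimal sketch with made-up example taps:

```python
import numpy as np

h = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # symmetric FIR taps (arbitrary values)
nfft = 1024
H = np.fft.rfft(h, nfft)                  # frequency response on [0, pi]
w = 2 * np.pi * np.arange(len(H)) / nfft  # frequency, radians/sample

phase = np.unwrap(np.angle(H))
expected = -w * (len(h) - 1) / 2          # pure delay of (N-1)/2 = 2 samples

print(np.max(np.abs(phase - expected)))   # ~0: phase is exactly linear
```

The filter still shapes the magnitude response, but its phase is a straight line, i.e. a constant group delay at every frequency, which is exactly the "straight wire plus delay" behavior discussed above.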
Thanks everyone for explaining. I read this as well, which to me shows what roberttdid is saying about the square wave. If the phase shift is distorting, it would look like Figure 3 in the link.
Again, who cares who is right? Let George have the last word or this will go on forever.  Some people want to learn, some want to argue.  Decide not to argue.  Let it go.  Peace.