Does Class A/B switching cause sonic problems?


I have heard different thoughts on this subject and was looking for some other opinions.
Spoke with Mike Creek at HE 2002, and he did not feel there was any degradation when a power amp switches between class A (for the first watt) and class B (for power over 1 watt). He thought class A's inefficiency and the heat it generates were not worth whatever benefits there may be. Maybe modern A/B designs don't have the problems older designs did.
Then I saw an article somewhere on the web (darned if I can find it now) where another power amp designer was strongly against switching power supplies.
Is there any agreement on this subject as to who is right?
cdc

Showing 2 responses by marakanetz

Onhwy61, for an amp to operate in different classes, the DC power on the active elements must be distributed differently, so that the bias (offset) voltage sits at the very "bottom" of the input characteristic curves of the transistor or valve. Thus a switch in the power supply's operating point also takes place.

From my point of view, it's better to have either pure class A or pure class B.
Class B theoretically claims lower harmonic distortion, but in reality there is no ideal bias point: the element handling the positive half-wave will also pass part of the negative half-wave, which then gets added to the negative element's output (and vice versa), creating unnecessary phase shifts as well, none of which are present in class A.
The only points in class B's favor are its efficiency and its ability to fit into small-budget amps.
A/B amps, because of the DC supply switching, sort of combine the cons of both A and B.
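The crossover effect described above can be sketched numerically (a toy model, not any particular amp; the 0.6 V turn-on threshold is an assumed figure for a silicon output device): class B is modeled as a push-pull pair where each device is dead until the signal exceeds its threshold, while class A passes the whole waveform.

```python
import math

VBE_ON = 0.6  # assumed turn-on threshold of each output device, in volts

def class_a(v):
    """Idealized class A stage: the single biased device passes the whole waveform."""
    return v

def class_b(v):
    """Toy class B push-pull stage: each device stays off until |v| exceeds
    VBE_ON, producing the classic crossover notch around zero."""
    if v > VBE_ON:
        return v - VBE_ON
    if v < -VBE_ON:
        return v + VBE_ON
    return 0.0

# Feed both stages a 1 V peak sine and look near the zero crossing.
samples = [math.sin(2 * math.pi * n / 100) for n in range(100)]
a_out = [class_a(v) for v in samples]
b_out = [class_b(v) for v in samples]

# For small inputs around the crossing, class A tracks the signal while
# class B outputs nothing (the crossover notch).
print(a_out[1], b_out[1])
```

Real class B/AB stages apply bias and feedback to shrink this notch, but the sketch shows why an imperfect bias point distorts the region where one half-wave hands off to the other.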
I can answer that as well.
The 1 watt of output power is measured at a certain frequency, probably 1,000 Hz. At bass frequencies the amp would switch anyway, since higher current is drawn by the load, greatly increasing the slew rate...
There are A/B amps where the switch to B occurs around 25-30 W, so you normally listen to them in class A.
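For a rough sense of where that switch-to-B point lands, here is a back-of-envelope sketch (the bias currents and the 8-ohm load are assumed example values, not specs of any real amp): a push-pull stage biased at quiescent current Iq stays in class A until the peak load current exceeds 2*Iq, which into load R corresponds to a sine power of 2 * Iq^2 * R.

```python
def class_a_limit_watts(i_quiescent, load_ohms):
    """Sine output power (watts) up to which a push-pull stage biased at
    i_quiescent amps remains in class A.
    Peak load current in class A is limited to 2*Iq, and for a sine wave
    P = Ipk^2 * R / 2, so P = 2 * Iq^2 * R."""
    i_peak = 2 * i_quiescent
    return (i_peak ** 2) * load_ohms / 2

# Assumed examples into an 8 ohm load:
print(class_a_limit_watts(1.25, 8))  # 25.0 W -- the 25-30 W region mentioned above
print(class_a_limit_watts(0.25, 8))  # 1.0 W -- the "first watt" class A region
```

This also shows why the transition point moves with load impedance: halve the load to 4 ohms and the same bias current keeps the amp in class A only to half the power.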