What determines good distortion?


I have a friend using an Audio Research CA 50 integrated amp with 45 watts/channel into Vandersteen 2ce sig II. I use a 50 watt YBA integrated into the same speakers. We both listen at sane levels in small rooms 8 x 12. He thinks that it's better to use a 50 watt tube amp rather than a 50 watt SS amp because tubes when they distort sound more pleasant. I'm thinking that if you drive the amp into clipping it's bad with either a SS or tube amp because clipping distortion is bad whether or not you can tolerate it. Am I wrong?
digepix

Showing 2 responses by atmasphere

The odd-ordered harmonics (above the 3rd) are used by the human ear/brain system to ascertain the volume of the sound being heard. This is pretty important to know if you want your system to sound like real music rather than a hifi.

If an amplifier gets this wrong- if it adds odd-ordered harmonics of its own- there will be two results: it will sound louder than it really is, and it will sound brighter than the music really is. All human ears are very sensitive to this!

The ear hears harmonic distortion as tonality. Electronics can have the fault of being overly 'warm' in sound, which is caused by the 2nd, 3rd and 4th harmonics. Quite often tubes get taken to task over this, but that really has more to do with the topology than with whether the circuit is tube or transistor. If, for example, the tube circuit is fully differential in design, there will be no even-ordered harmonics, so the circuit will have a more neutral presentation.

Conversely you can give transistor circuits a richer sound by building them single-ended- this will result in more of the lower-ordered harmonics.
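The symmetry point in the last two paragraphs can be sketched numerically. This is just a toy transfer-function demo, not a circuit simulation (it assumes Python with numpy, neither of which comes from the thread): an odd-symmetric transfer curve- standing in for a fully differential stage- produces no even harmonics at all, while adding even a small asymmetry- standing in for a single-ended stage- brings the 2nd harmonic back.

```python
import numpy as np

N = 4096
t = np.arange(N) / N
x = np.sin(2 * np.pi * 8 * t)          # 8 cycles of a pure tone

# Symmetric ("differential-like") transfer curve: an odd function
# treats positive and negative swings identically.
sym = np.tanh(2 * x)

# Asymmetric ("single-ended-like") transfer curve: a small square-law
# term makes the two halves of the waveform behave differently.
asym = np.tanh(2 * x) + 0.1 * x ** 2

def harmonic_level(y, n):
    """Magnitude of the nth harmonic of the 8-cycle fundamental."""
    spectrum = np.abs(np.fft.rfft(y)) / N
    return spectrum[8 * n]

print(harmonic_level(sym, 2))   # ~0: even harmonics cancel exactly
print(harmonic_level(asym, 2))  # nonzero: the 2nd harmonic appears
print(harmonic_level(sym, 3))   # odd harmonics are present either way
```

The odd harmonics (3rd and up) are present in both cases; only the even-ordered products depend on the symmetry of the transfer curve, which is the point about topology mattering more than the device type.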

Linearity of the devices themselves has a big effect on the sound of the circuit as well. The simple fact of the matter is that triode vacuum tubes are the most linear form of amplification known. Additionally, it is easily demonstrated that even the most pedestrian tube amplifier will make fewer odd-ordered harmonics than any transistor amplifier; a sine wave generator and an oscilloscope are all that's needed to demonstrate this.

This is why tubes are still around half a century after being declared obsolete.
Mapman, the test is simple- put the amp on a speaker load (nearly any load will do) and run the amp up to clipping with the sine wave, and observe the results on a 'scope.

Of course others have already done that. Go to Google, click 'Images', then enter:

tube clipping characteristic

The first two images will show you the difference- the transistor characteristic being the first image. Note the squared-off waveform of the transistor test- as if someone cut off the tops and bottoms of the waveform with a knife. These sharp corners are evidence of odd ordered harmonics. This test is common to all transistor amps.

The tube amp has rounded corners- less odd-ordered and lower orders as well. This characteristic is common to all tube amps.

Now, this test is at clipping, but this difference between tubes and transistors has been quite well documented over the last 40 years and really isn't a topic of debate, as it is so readily measured and heard. It is why nearly all guitar players use tube amps, BTW.

However, almost any audiophile will correctly argue that we don't listen to amplifiers at clipping. And I agree with your surmise in the second-to-last paragraph of your post: a cheap tube amp with lousy transformers and the like is still a crappy amp.

However, the fact of the odd-ordered harmonic distortion issue will not go away despite all this. All it says, though, is that a transistor amplifier will sound harsh compared to a tube amp **generally speaking** (and with rare exception...).

A lot of people are OK with that, thinking that they can use a synergistic approach to deal with the harshness. But look back at my original post here- I specified the difference between sounding like *music* and sounding like a *good hifi*. It's that nuance, that nth degree, that I am talking about- not the ability to weld with the amp or the like.