Output watts per tube


Yesterday I tested the dozen EL34 Ruby Tubes out of my Cary V12, with interesting results. Despite having 600-800 hours on them, ten of the twelve tested between 100 and 110 on a tube tester, and the other two came out between 85 and 100. This was with the sensitivity set at 53, per that tester's standard for the tube type.

Then we put a sine wave on a scope to check for distortion (there was none), and ran two tubes at a time in a push-pull test amp to measure output watts on a meter. The result: any combination of two of the twelve sustained a maximum output of 21 watts, even the two that tested a little low.

So here are my questions: Is 21 watts from two tubes the equivalent of 10.5 output watts per tube? If my amp takes twelve tubes and makes 50 watts in triode, am I in fact only asking for about 4.17 watts from each tube at any given time? And does that equate to relatively low stress on the tubes and longer tube life?
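Here is the back-of-the-envelope arithmetic I'm doing, which just divides measured output evenly among the tubes and says nothing about plate dissipation or bias:

```python
# Rough arithmetic only: split measured output power evenly across the tubes.
bench_pair_watts = 21.0                 # two tubes push-pull in the test amp
per_tube_bench = bench_pair_watts / 2
print(f"Test amp: {per_tube_bench:.1f} W per tube")            # 10.5 W

v12_watts = 50.0                        # triode-mode figure, taken here as a total
tube_count = 12
per_tube_v12 = v12_watts / tube_count
print(f"V12 (if 50 W total): {per_tube_v12:.2f} W per tube")   # ~4.17 W
```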

And, before anyone says it, I know the Rubies are not the greatest tubes. I'm actually running the amp on J&Js, which sound great.

Any thoughts?
grimace
Does the "test amp" run at the same plate voltage and bias current as your Cary V12? If it does not then you are comparing apples to oranges and do not have valid data to extrapolate from, though I would ,most likely, agree that in the test amp those power numbers may be correct.
OK. That's a good point, and I don't know. It was an old Heathkit that had been modified to serve as a test bed.
Isn't your amp putting out 50 watts PER channel?

That would mean six tubes per channel at about 8.3 watts each.
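To put rough numbers on that, and on the "stress" part of the original question: the plate voltage and bias current below are hypothetical placeholders, not measured V12 values, and 25 W is the EL34's published maximum plate dissipation.

```python
# Per-channel correction: 50 W per channel shared by 6 tubes, not 50 W by 12.
per_channel_watts = 50.0
tubes_per_channel = 6
print(f"{per_channel_watts / tubes_per_channel:.1f} W of audio per tube")  # ~8.3 W

# Tube stress and life track plate dissipation more than audio output.
# These operating-point numbers are made up for illustration only.
plate_volts = 430.0       # hypothetical B+ at the plate
bias_ma = 35.0            # hypothetical idle current per tube
idle_dissipation = plate_volts * bias_ma / 1000    # watts burned at idle
el34_max_dissipation = 25.0                        # EL34 rated plate dissipation
print(f"Idle dissipation ~{idle_dissipation:.1f} W "
      f"({idle_dissipation / el34_max_dissipation:.0%} of the 25 W rating)")
```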
The load impedance the tubes are driving has to be taken into account, as well as the B+ voltage... so far nothing here can be considered conclusive, other than the fact that you have some tubes and, apparently, two amps that use them.
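As a very rough illustration of why those two things matter, here is an idealized class-B push-pull ceiling; it ignores triode-mode saturation and transformer losses, and the B+ and plate-to-plate load values are hypothetical:

```python
# Idealized class-B push-pull estimate: P ~ 2 * (B+ - V_sat)^2 / R_aa,
# where R_aa is the plate-to-plate load. Real triode-mode numbers will be
# lower, but the point is that B+ and load impedance set the ceiling.
def max_output_watts(b_plus, v_sat, r_aa):
    swing = b_plus - v_sat            # usable peak plate swing, volts
    return 2 * swing**2 / r_aa

# Hypothetical operating points, just to show the sensitivity:
print(max_output_watts(b_plus=420, v_sat=60, r_aa=5000))    # ~52 W
print(max_output_watts(b_plus=350, v_sat=60, r_aa=10000))   # ~17 W
```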