Output watts per tube


Yesterday I tested the dozen EL34 Ruby Tubes out of my Cary V12, with interesting results. Despite having 600-800 hours on them, ten of the twelve tested between 100 and 110 on a tube tester, and the other two came out between 85 and 100. This was with the sensitivity set at 53, per that tester's standard for the tube type.

Then we used a sine wave on a scope to check for distortion (none), and ran two tubes in a test amp in push-pull to measure output watts on a meter. The result was that those two tubes - any combination of two of the twelve - produced a sustained maximum output of 21 watts, even the two that were a little low on the tester.

So here are my questions: Is the 21 watts for two tubes the equivalent of 10.5 output watts per tube? If my amp takes twelve tubes and makes 50 watts in triode, am I in fact only asking about 4.17 watts from each tube at any given time? And does that equate to relatively low stress on the tubes and longer tube life?
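To show the arithmetic I'm doing, here's a rough sketch in Python (it assumes the 50-watt triode figure is the total across all twelve tubes, which may not be how the spec is meant):

# Back-of-envelope per-tube arithmetic
pair_output_watts = 21.0                         # sustained output measured from two tubes in push-pull
per_tube_capability = pair_output_watts / 2      # 10.5 W per tube

amp_output_watts = 50.0                          # V12 triode output (assumed here to be the total)
tube_count = 12
per_tube_demand = amp_output_watts / tube_count  # ~4.17 W per tube

print(f"Each tube can deliver about {per_tube_capability:.1f} W")
print(f"Each tube is asked for about {per_tube_demand:.2f} W")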

And, before anyone says it, I know the Rubies are not the greatest tubes. I'm actually running the amp on J&Js, which sound great.

Any thoughts?
grimace
Isn't your amp putting out 50 watts PER channel?

That would mean 6 left tubes and 6 right tubes at about 8.3 watts each.
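Rough numbers, assuming the 50-watt spec is per channel and the twelve tubes split evenly between channels:

# Per-channel view: 6 output tubes sharing 50 W on each side
watts_per_channel = 50.0
tubes_per_channel = 6
print(f"{watts_per_channel / tubes_per_channel:.1f} W per tube")  # ~8.3 W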
The load impedance the tubes are driving has to be taken into account, as well as the B+ voltage... so far there is nothing here that could be considered conclusive, other than the fact that you have some tubes and, apparently, two amps that use them.