Help me understand how screen voltage plays into it - KT120 content
For those who don't want to read my lengthy story: KT120 push-pull pair, cathode biased, 475V on the plates, 450V plate-to-cathode, 25V drop across a 130Ω shared cathode resistor. An 0C3 tube drops the screen voltage to 370V ... OR ... an 0D3 tube drops the screen voltage to 315V. What would be the difference, and what are the reasons to go with the 0D3 and lower screen voltage vs. the 0C3 and higher screen voltage? I'm not trying to squeeze out every watt; I just want a bit more clean headroom, and I'm hoping for cooler, longer-lasting output tubes vs. the 6550.
Here's the longer version if you feel like reading:
So I know just enough to be dangerous but still don't totally grasp all the subtleties of tube amplifiers. I have rebuilt several Leslie speaker cabinets in my time: very simple amps, a cathode-biased 6550 push-pull pair delivering only 30-40 watts.
Tube-rectified versions generally run about 370V plate-to-cathode, drop 25V across the 150Ω cathode bias resistor (bypassed with a 200µF cap), and put 290V on the screens.
Diode-rectified versions run about 395V plate-to-cathode, drop 25V across the 150Ω cathode bias resistor, and put 315V on the screens. They use an 0C3 tube to drop 105V for the screen supply.
OK, that's just to show what the normal voltages are for the design: they keep the screens about 100V below the plates.
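For reference, here's the arithmetic behind those bias readings as a small Python sketch. The 2-tube split and the 6550 figures come from the numbers above; treating all the cathode current as idle current per tube is a simplifying assumption, since a real cathode reading also includes some screen current.

```python
# Idle current from the drop across a shared cathode-bias resistor.
# Ohm's law: I_total = V_drop / R_k; a push-pull pair shares the resistor,
# so each tube idles at roughly half the total (screen current ignored).

def cathode_current_per_tube(v_drop, r_k, n_tubes=2):
    """Idle cathode current per tube, in amps."""
    return v_drop / r_k / n_tubes

# Stock Leslie 6550 numbers from above: 25 V across 150 ohms, two tubes.
i_6550 = cathode_current_per_tube(25.0, 150.0)
print(f"6550 idle: {i_6550 * 1000:.1f} mA per tube")  # ~83.3 mA
```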
Now I have one oddball version with a way-oversized power supply: it once drove two extra 6V6 power tubes for a second channel (removed long ago), and it also powered a field-coil woofer that is no longer used. The field coil was never part of the power supply (it's not used as a choke, etc.), but it did pull the voltage down a bit. The amp uses two 5U4GB rectifiers, and between higher mains voltages these days and the removal of all those extra loads, the voltages with a stock 0C3 tube are all quite a bit higher, though still within spec for a 6550. I've been running KT88s and they've worked really well, and I was curious how a KT120 would do (not to blow my output transformer, but to run the tubes cool, thinking they might last a long time). I'm not using the amp for a Leslie speaker at all, and if I end up destroying the OT I'm prepared to fit a higher-wattage OT with taps for different speaker impedances (the Leslie OT is 16Ω only). But I'm not intending to blow the OT; I just wanted to see if the KT120 will run with more headroom (and they do sound fabulous so far).
So with KT120 tubes, I see about 450V plate-to-cathode and a 25.7V drop across the 130Ω bias resistor, which I reckon works out to about 93.4mA per tube. That is with me running an 0D3 instead of an 0C3, which keeps the screen voltage (and the voltage to the 12AU7 driver) about where it is on a normal Leslie amp (though in my case the screen is 150V below the plate instead of 105V below).
If I swap the 0C3 back in, my plate-to-cathode voltage drops to 435V, the drop across the 130Ω bias resistor rises to 28V, and I calculate 101.8mA per tube.
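To compare the two operating points, here's a rough sketch of idle plate dissipation for each VR tube. Assumptions to note: it counts all cathode current as plate current, so the per-tube mA come out a few mA above the quoted figures (which presumably net out screen current) and the wattages run a little high; the 60 W plate rating is from the Tung-Sol KT120 datasheet.

```python
# Rough idle-dissipation comparison for the two VR-tube options.
# Upper-bound estimate: treats all cathode current as plate current.

KT120_PLATE_RATING_W = 60.0  # Tung-Sol KT120 max plate dissipation

def idle_point(name, v_plate_cathode, v_drop, r_k=130.0, n_tubes=2):
    """Print and return an upper-bound idle plate dissipation in watts."""
    i_per_tube = v_drop / r_k / n_tubes      # amps, per tube
    p_plate = v_plate_cathode * i_per_tube   # watts, per tube (upper bound)
    pct = 100.0 * p_plate / KT120_PLATE_RATING_W
    print(f"{name}: {i_per_tube * 1000:.1f} mA/tube, "
          f"~{p_plate:.1f} W plate, {pct:.0f}% of 60 W rating")
    return p_plate

idle_point("0D3 (screens ~315 V)", 450.0, 25.7)
idle_point("0C3 (screens ~370 V)", 435.0, 28.0)
```

Either way the tubes idle well inside the KT120's plate rating; the sketch just makes the gap between the two bias points concrete.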
So help me understand the relationship between screen voltage and plate voltage (in a Fender amp the screen voltage might actually be higher than the plate voltage, but Leslie amps have always kept the screens 105V below the plates). Given the amp in question and these voltages, would you go with the 0D3 to keep the screens around 315V, or the 0C3 and have screens around 370V?