Influence of DC offset on bias

Hi everyone. I'm usually an observer here, hopefully learning something along the way. Can someone shed some light on my question of what influence or effect, if any, DC offset has on the bias of an amp? I know how to check both, and I've never come across an amp with identical offset on both sides; well within acceptable levels on both sides, but never the same. It's also impossible to adjust them to the same level. Does this render identical bias on both sides incorrect in terms of channel balance as it affects your ears? I would greatly appreciate some insight into this.

Thank you
I'm guessing you're referring to the DC voltage bias measurement used by a lot of amps. The DC voltage is most likely the difference (drop) across a resistor tied to the tube's plate. This is an easy way to measure the plate current, in milliamps, loading the tube. Different amps use different resistor values, so the voltage reading may not be the same from one amp design to another. If they all used a *one* ohm resistor, for example, a 50 milliamp (mA) load would read 50 millivolts (mV). Or if they used a 10 ohm resistor on the tube, a 50 milliamp plate load would read 0.5 volts. So a different resistor value wouldn't translate into the same (mA) reading for the same voltage.
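The arithmetic above is just Ohm's law. Here's a small sketch in Python (using the same resistor values as the examples above) converting a measured sense-resistor drop to plate current:

```python
def bias_current_ma(measured_mv: float, sense_ohms: float) -> float:
    """Convert the voltage drop across a bias sense resistor to current.

    I = V / R; with millivolts in and ohms given, the result is milliamps.
    """
    return measured_mv / sense_ohms

# A 1-ohm sense resistor reading 50 mV corresponds to 50 mA of plate current:
print(bias_current_ma(50.0, 1.0))    # 50.0 mA
# The same 50 mA through a 10-ohm resistor reads 500 mV (0.5 V):
print(bias_current_ma(500.0, 10.0))  # 50.0 mA
```

This is why a bias voltage spec only makes sense alongside the sense-resistor value the designer chose.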

The hard part is your question. I think it was said at one time (years back) that you could have a 5% difference between the left and right channel bias and not notice it. Don't hold me to this 5%, but I don't seem to notice a 5% difference myself. Someone may come along with a different value.

If your amp has adjustable pots for the bias, you should be able to get them fairly close. On a stereo amp, changing one side (channel) will cause the other side (channel) to vary some. Also, if it has several adjustable pots, changing one may cause any of the others to vary. Adjust the bias cold at first. You have to keep going back and forth: let the amp run 10 minutes, check again. Check after half an hour, adjust if needed, then in an hour check and adjust if necessary.
My above post would be for tube amps in general. I just noticed a past thread of yours referring to SS amps.
I'm referring strictly to SS. The amp has both offset and bias pots. I understand anything up to 50 mV offset on each channel is acceptable, 0 being ideal. However, the best I can get is 16 mV on the left, 26 mV on the right. Bias is set at 150 mA on each channel. My question is, does the stilted offset skew the bias adjustment? I could just change the left channel offset up to match the right, but not all SS amps have offset pots. I have others with DC servo control and still others with discrete components for the offset, but they are never identical on both channels either. So I guess what I want to know is, should there be a bias compensation on either channel to accommodate the difference in offset?
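For a sense of scale, here's a quick sketch in Python (assuming a nominal 8-ohm load, which is my assumption, not a spec from the thread) of the static power that offsets of this size push into a speaker:

```python
def offset_power_uw(offset_mv: float, load_ohms: float = 8.0) -> float:
    """Static power (in microwatts) a DC offset dissipates in the voice coil.

    P = V^2 / R, with millivolts in and microwatts out.
    """
    volts = offset_mv / 1000.0
    return (volts * volts / load_ohms) * 1e6

# The two measured offsets from above, into an assumed 8-ohm load:
print(round(offset_power_uw(26.0), 1))  # ~84.5 microwatts
print(round(offset_power_uw(16.0), 1))  # ~32.0 microwatts
```

Both numbers are tiny fractions of a milliwatt, which suggests the audible concern, if any, is not raw level imbalance from the offset itself.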
Are you sure that you can play around with bias adjustment in an SS amp?
In most SS cases I would strongly recommend not adjusting.
The pots are there for that very purpose. Channel balance depends on quiescent current being equal in both channels. But my question addresses exactly that issue. Must it be equal or adjusted relative to offset for each channel in order to achieve an equal end result at the speakers?
I rescind my post. It's in error for the purpose of this thread.
In transistors it's the same story; just make the plate equivalent to the collector (assuming a common-emitter connection). The purpose of adjusting the offset is so the output measures the same from collector to ground on both transistors of the complementary PNP/NPN pair (I assume it's a class B amplifier, but I might be wrong), and likewise for the other channel. So the offset might differ in transistors (they're much harder to match than tubes and have wider parameter tolerances), but the bias current will be the same and will 'sound' the same.
Can you tell what kind of amplifier we're talking about?
Brand? Class of operation? Must be a very vintage one. Most of the current ones would have an auto-bias which is a piece of cake to implement.
The amps in question are a pair of LSR&D Superamp monos with no offset pots, an Ampzilla II, which is a DC-servo amp, a Son of Ampzilla with both offset and bias pots, some Bedinis, also with no offset pots, and a pair of early Boothroyd Stuart Meridian 105 monos with no offset adjustment. I recently sold a Meridian 559 with auto-bias because, IMO, it had no high-end characteristics at all. The 105s were fabulous, so I thought it would be more of the same, but it sounded very average with no real redeeming qualities; not particularly fast, no great extension or linearity, imaging, soundstage, etc. I wondered if that was because it is not a fixed-bias amp? When offset and/or bias is off, these are exactly the characteristics an amp displays. I guess my question is, how much tolerance is engineered into auto-bias?
An auto-bias is simply based on the reverse conductivity of a diode. Depending on how large your bias current should be, you choose a diode with the matching reverse conductivity current. It's all in the parameter list. There are certainly more sophisticated and advanced auto-bias supplies that can compensate for up to 10% of parameter tolerance.
Thank you. That explains why I was disappointed with the 559. 10% is huge. I can easily detect anything that's not identical. Maybe I should be called "golden ears." Pinpoint imaging, transient response, and therefore detail absolutely depend on a perfect bias adjustment with each channel set the same.
Karakanetz, I appreciate your knowledge on this subject and I don't mean to pester you about this but does offset influence speaker sensitivity?
Oops. Sorry, Marakanetz.
Speaker sensitivity only shows up as a volume-level difference between channels when you listen really quietly. There are also other factors to mention, such as the stereo volume pot. Other than that, offset has nothing to do with speaker sensitivity.
Of your amplifiers, I'm only familiar with the circuit in the GAS amps . . . and both are fairly representative of a common-practice solid-state amp with an emitter-follower output. The complementary diff-amp is kinda fancy but by no means unique, and the Ampzilla's bootstrapped outputs are typical of many higher-powered amps of the era.

So to answer your question, small amounts of Vos are really only a possible concern for your woofers, in terms of a static displacement of the voice-coil position or eventual loss of magnetic flux. While there is indeed a small difference in current between the halves of the output stage as a result of a DC offset, the optimum bias point in a Class B amp depends on the voltage between the output transistors' emitters, not the current through the output stage. And yes, they are sometimes the same thing in practice, but if you were to change the value of the emitter resistors, the optimum bias current would also change. And since the bias generator sets the voltage between the two halves of the output stage (not relative to ground), the presence of an offset doesn't change the transfer function of the output stage.
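To put a rough number on the point about emitter resistors, here's a sketch in Python. The ~26 mV target drop per emitter resistor (often attributed to Oliver's criterion for optimally biased Class B emitter-follower stages) and the resistor values are illustrative assumptions, not specs of any amp in this thread:

```python
TARGET_MV = 26.0  # illustrative optimum drop across each emitter resistor,
                  # roughly the thermal voltage at room temperature

def optimum_bias_ma(emitter_ohms: float, target_mv: float = TARGET_MV) -> float:
    """Bias current (mA) that puts target_mv across one emitter resistor."""
    return target_mv / emitter_ohms

# Same target *voltage*, different emitter resistors -> different current:
print(round(optimum_bias_ma(0.22)))  # ~118 mA with 0.22-ohm resistors
print(round(optimum_bias_ma(0.47)))  # ~55 mA with 0.47-ohm resistors
```

This illustrates why the optimum is naturally stated as a voltage: the "correct" bias current is just a consequence of that voltage and whatever emitter resistors the designer used.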

The main issues with output-stage biasing are how well it's maintained with variations in temperature and line voltage, and how much an increase in signal current affects linearity in the crossover region. And these issues span everything from circuit design and the characteristics of the transistors themselves to the layout of the wiring and circuit board and the physical packaging and thermal design . . . there are no universal solutions.
Okay, Kirkus, you've clearly discerned that my issue is about what ends up reaching your ears. So is this why, after I achieve identical specs with my multimeter, I still hear an audible imbalance and have to complete the procedure by adjusting the bias while listening to music?
An SS amp without auto-bias circuitry or stabilization of quiescent current faces the following problems, also previously mentioned by Kirkus:
1. With age, the parameter mismatch percentage increases.
2. With temperature and other room conditions, the mismatch increases, unlike with vacuum tubes.

Quite a large number of quiescent-current stabilization circuits have been designed so far, each with its own advantages and disadvantages, but the best ones, IMHO, are implemented in Bryston amplifiers.

Until this post I believed that every SS amp built after the mid-'70s would have it as a matter of course, and quite frankly, I'm shocked at what so-called 'purists' are able to build these days. If you can somehow shoot me the circuit diagram, I'd like to take a peek at it.

As to the Meridian 559, I'm surprised. It should be something else in the signal path, such as a loose wire (interconnect?), solder joint, preamp volume pot, etc.
I have no schematics, but the LSR&D amp was designed by Marshall Leach at Georgia Tech and is the well-known Leach Low TIM amp. Few were commercially produced, and it subsequently became a DIY project for which Dr. Leach provided parts and instructions via mail order. It's easy to find schematics for this amp just by googling. There was nothing wrong with the 559 as far as I could tell. It functioned as it should. It was just unimpressive. All of the other amps I've mentioned sound better, the Leach being the best of the lot. The Ampzillas are a close second, the Bedinis third, and finally the 105s, but only because they begin rolling off right at 20 Hz. It's probably why they sound so good otherwise.
"and finally the 105's but only because they begin rolling off right at 20hz."

Are you really phenomenal, or have you just measured SPL to reach that conclusion?

I still don't get which amp you've been trying to adjust bias on. I've looked at the schematics of the Superamp, and it has quiescent-current stabilization (an auto-bias), but the offset is set by Q23 in diode connection and R27, which I assume was adjusted at the factory only once and then sealed.
I don't know, maybe, but the 105s keep you on the edge of your seat waiting for it to happen, and it never does; no sub-audible bottom end at all. I adjust bias on all my amps. I have no auto-bias amps. Some don't have adjustable offset. The LSR&D Superamp monos I have definitely have bias pots; there are no other pots in the circuit. The Leach Low TIM DIY circuit has had numerous upgrades over the years, so you may have found an up-to-date version, but the factory amps were few in number, produced back in the late '70s to early '80s. Unfortunately, and sad to say, Dr. Leach passed away this past November. He always returned his e-mails and was available to deal personally with any question you might have.
Hi Csontos . . . are you experiencing differences in timbre or level between channels on some of your amps? If so, given the age of the GAS stuff, the first thing I would look for is dried-out electrolytics used as coupling caps, especially at the input and in the feedback ladder.
Kirkus, I'm not experiencing a problem, just searching for an explanation of why, after setting bias identically, I then have to pinpoint the adjustment while listening to program material. I believe Marakanetz answered my question. All of my amps have been gone over, recapped, and upgraded throughout. However, resistors are usually left alone unless they're blown, so I'm assuming it stands to reason the adjustment is much more critical; or is it? You tell me. I thought a stilted offset might require a stilted bias, but apparently that's not the case. The bottom line is, whatever the reason, on every one of my amps perfect channel balance results in a stilted condition. Very close, but stilted. The channel with more offset always has a higher bias as well.
Well, bias of course affects distortion performance, and I have seen many, many amplifiers where the final quiescent bias level when warmed up varies with how much load there is on the amplifier while it's warming up, and ends up being very different from the value and conditions specified by the manufacturer in the biasing procedure. These thermal subtleties seem to be lost on a great many engineers, and a manufacturer's audiophile "brand reputation" seems to have very little correlation with competency in this area.

As far as offset goes, most modern amps *shouldn't* exhibit a change in distortion performance as a result of an offset adjustment. But many early solid-state designs, including the GAS amps, don't have a current mirror on the input diff-amp, which means that the balance of quiescent current between the diff-amp pair is highly dependent on component tolerances . . . and offset adjustment. And even-order distortion products rise pretty quickly as the diff-amp becomes unbalanced.
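For a rough feel of how fast that unbalancing happens, here's a sketch in Python of the textbook BJT long-tailed-pair current split versus differential input voltage; the 2 mA tail current is an illustrative assumption:

```python
import math

VT = 0.02585  # thermal voltage at ~300 K, in volts

def diff_pair_split(tail_ma: float, dv_mv: float) -> tuple[float, float]:
    """Collector currents (mA) of an ideal BJT long-tailed pair.

    Standard large-signal result: I1 - I2 = I_tail * tanh(dV / (2*Vt)),
    with I1 + I2 = I_tail.
    """
    delta = tail_ma * math.tanh((dv_mv / 1000.0) / (2.0 * VT))
    return (tail_ma + delta) / 2.0, (tail_ma - delta) / 2.0

# Perfectly balanced: 0 mV differential input splits the tail evenly.
print(diff_pair_split(2.0, 0.0))
# Just a few millivolts of standing offset at the pair's input already
# skews the split by several percent, which is where the rising
# even-order distortion comes from.
print(diff_pair_split(2.0, 5.0))
```

Without a current mirror forcing the two collector currents equal, that standing imbalance is set by component tolerances and the offset trim, just as described above.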

With a distortion analyzer, oscilloscope, and a bit of experience, one can easily see the different effects of different distortion mechanisms on the amplifier as a whole. But without this, you're pretty much just "stirring the soup" of whatever flaws the amplifier has, and getting whatever floats to the top.
But if my procedure addresses or "corrects" the situation, is that not a best-case scenario? Are you speaking of inherent characteristics of a given topology not equipped to deal with certain potential drawbacks, or are you referring to an unseen but needed repair? What's the alternative to stirring the soup?
Excellent read. I have an '80s integrated here that is very clean and shows no issues, but the left channel's DC offset at turn-on goes to over 200 mV and then eases its way back down to around 10 mV like the other channel. Thing is, it takes about half an hour to do so. Any ideas where to start?
Thanks in advance.
I don't even bother measuring bias and offset for at least an hour. BJTs wander all over the place as they warm up; FETs much less so. If it's settling into the 10 mV region after just sitting half an hour, I'd call that fairly normal. I'd keep an occasional eye on it, but not worry. '80s-era gear is getting to the point where it needs a recap anyway.