Musical Fidelity A3.2 to A5.5 Integrated Amp Volume


I'm just curious if anyone can tell me why I'm experiencing lower volume output after moving from the A3.2, which is rated at about 120 watts into 8 ohms, to the A5.5, which is rated at 250 watts into 8 ohms.

I moved on to the A5.5 integrated after finding a good price on a used one, thinking I'd use the USB DAC, and a large power increase is usually a nice added benefit. I'm using Totem Hawks, which are 6 ohms with roughly 88 dB sensitivity. With the A3.2 I usually sat just below half on the volume knob for the loudest level I was comfortable with; now with the A5.5 I have to crank it well past the halfway mark to get similar volume...

Is my mind playing tricks on me, or is there something I'm missing here? I expected the volume to be much greater, considering my Hawks are 6-ohm speakers, into which this amp probably delivers somewhere around 300 watts. It sounds nice regardless, but I can't help but wonder why this is the case... or was the A3.2 really that good??
The only thing that changed in my system was the integrated amplifier (between the two models mentioned), which is why I find it so interesting that such small variations can produce quite a difference.

With all that said, and comparing new MF to old MF gear, it comes as no surprise that the made-in-England gear produced today sits only at the extreme high end of the range... I think I'm going to stay with the older, less powerful A3.2 integrated now! Cheers, Zd542!
"So just for the sake of understanding, the 30mV difference between the two is causing the change? That seems awfully marginal to account for the LACK of volume output the a5.5 has versus the a3.2"

That's not the only thing, but yes, the difference can account for a volume change. You can have the same thing happen with sources. For example, two CD players whose output stages put out different signal levels will have a direct effect on where your volume knob needs to sit to get the same output level.
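For a rough sense of scale, here is a minimal sketch converting that sensitivity difference to decibels. The 330 mV and 300 mV figures are the specs quoted in this thread; everything else is illustrative:

```python
import math

# Input sensitivity: the input voltage needed to drive each amp to rated output.
a55_sensitivity_mv = 330.0  # A5.5, as quoted in this thread
a32_sensitivity_mv = 300.0  # A3.2, as quoted in this thread

# A voltage ratio converts to decibels as 20 * log10(ratio).
diff_db = 20 * math.log10(a55_sensitivity_mv / a32_sensitivity_mv)
print(f"The A5.5 needs {diff_db:.2f} dB more input voltage for rated output")
# -> The A5.5 needs 0.83 dB more input voltage for rated output
```

On its own, well under 1 dB is barely audible, which supports the point that sensitivity isn't the only factor; the taper of each amp's volume pot and its overall gain structure have at least as much to do with where the knob ends up.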

"Side note: interesting to note how much lower the distortion is on the a3.2 versus the newer a5.5!"

The older MF gear, made in the UK, sounds quite a bit different from the newer stuff made overseas.
A5.5 input sensitivity: 330 mV
A3.2 input sensitivity: 300 mV

So just for the sake of understanding, the 30 mV difference between the two is causing the change? That seems awfully marginal to account for the LACK of volume output the A5.5 has versus the A3.2.

Side note: interesting how much lower the distortion spec is on the A3.2 versus the newer A5.5!
"Does the whole "made in England" vs "Taiwan" thing have anything to do with it perhaps?"

No, it has nothing to do with that. If you look at the specs on both amps, you'll find that some of them are different, like input sensitivity.

Think of it this way: your source has a small power amp inside to get the signal to the preamp. It's just like a regular power amp, only much weaker. When you change components, the small amp in the new source is usually a bit more or less powerful than the one it replaced, and that difference in output level is what shifts where the volume knob sits. It's just like switching to a different pair of speakers: more efficient speakers keep the volume knob at a lower setting, while less efficient speakers force you to raise it to match the level of the efficient pair. It's perfectly normal.
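To put numbers on the speaker-efficiency half of that, here is a minimal sketch. The 88 dB figure is the Hawks' sensitivity as quoted above; the 91 dB comparison speaker and the 100 dB target level are hypothetical:

```python
import math

def watts_for_spl(sensitivity_db, target_spl_db):
    """Power needed to reach a target SPL at 1 m, given a 1 W / 1 m sensitivity rating."""
    return 10 ** ((target_spl_db - sensitivity_db) / 10)

target = 100.0  # arbitrary listening level in dB SPL at 1 m

for label, sens in (("88 dB speaker (the Hawks' quoted figure)", 88.0),
                    ("91 dB speaker (hypothetical)", 91.0)):
    print(f"{label}: {watts_for_spl(sens, target):.1f} W to reach {target:.0f} dB")
# 88 dB speaker (the Hawks' quoted figure): 15.8 W to reach 100 dB
# 91 dB speaker (hypothetical): 7.9 W to reach 100 dB
```

Every 3 dB of extra sensitivity halves the power needed for the same level, which is exactly why the more efficient pair leaves the knob lower.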
I forgot to mention that I'm asking because I'm debating whether to keep the 3.2 or the 5.5, as they sound similar in quality (in fact, the 3.2 has better specs, wattage notwithstanding).
Oh, I understand it's probably nothing to worry about, but can anyone explain why this is occurring? One would expect a boost in volume... especially considering it's basically twice the power.
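For what "twice the power" actually buys, here is a minimal sketch using the rated figures quoted in this thread:

```python
import math

a32_watts = 120.0  # A3.2 rated output into 8 ohms, as quoted in this thread
a55_watts = 250.0  # A5.5 rated output into 8 ohms, as quoted in this thread

# A power ratio converts to decibels as 10 * log10(ratio).
headroom_db = 10 * math.log10(a55_watts / a32_watts)
print(f"Extra headroom: {headroom_db:.1f} dB")  # -> Extra headroom: 3.2 dB
```

Roughly doubling the power buys only about 3 dB of maximum level, which is audible but nowhere near "twice as loud", and it doesn't change how loud the amp plays at a given knob position anyway; that's set by the amp's gain and volume-control taper, not by its rated power.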

Does the whole "made in England" vs. "made in Taiwan" thing have anything to do with it, perhaps?