The amplifier's design determines the maximum voltage it can put out. The load (speaker) impedance, 8 ohms, 4 ohms, etc., determines how much current will flow at that voltage, assuming the amplifier can actually deliver that much current. It would be possible to design an amplifier that does not put out a high voltage but can maintain that voltage into a low-impedance load that draws a lot of current. The power rating of an amp is specified at a particular load impedance; if a lower-impedance speaker is connected, more power will be delivered, again provided the amp can supply the current.
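To put some illustrative numbers on that (the 70 W / 8 ohm figure below is made up for the example, not the spec of any particular amplifier), here is a quick Python sketch of what a fixed output voltage does into different loads:

import math

# Illustrative sketch only: the rating used here is an example figure,
# not any real amplifier's spec.

def output_into_load(v_rms, impedance):
    """Current and power that a fixed RMS voltage drives into a load."""
    current = v_rms / impedance        # I = V / R
    power = v_rms ** 2 / impedance     # P = V^2 / R, i.e. V x I
    return current, power

# Work backward from a 70 W into 8 ohm rating to the voltage swing:
v_max = math.sqrt(70 * 8)              # about 23.7 V RMS at full output

for ohms in (8, 4, 2):
    i, p = output_into_load(v_max, ohms)
    print(f"{ohms} ohm load: {i:.1f} A, {p:.0f} W")

# Prints roughly:
#   8 ohm load: 3.0 A, 70 W
#   4 ohm load: 5.9 A, 140 W
#   2 ohm load: 11.8 A, 280 W

The point of the example: "high current" describes the amp's ability to keep delivering current as the load impedance drops, not an unusually high wattage rating into the nominal 8 ohm load.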
"High Current"
I listen with my ears, and I don't often care much about the mathematical conclusions, but I have a friend who argued with me that current cannot increase without wattage increasing as a result. I understand the basic formula is voltage x current = wattage, or something to that effect; it's been a while since I opened a book.
How, then, can an amplifier from a company like SimAudio, whose i-5 integrated is notoriously high-current, be rated at only 70 watts per channel?
Is it the difference in how current, voltage, and wattage are measured that makes the overall impact, or can you really have an ultra-high-current amp at a very modest wattage output?