"High Current"


I listen with my ears, and I don't often care about the mathematical conclusions, but I have a friend who argued with me that current cannot increase without wattage increasing as a result. I understand the simple formula is Voltage x Current = Wattage, or something to that effect; it's been a while since I opened a book.

How then can an amplifier from, say, a company like SimAudio, whose i-5 integrated is notoriously high-current, be rated at only 70 watts per channel?

Is it the way the current, voltage, and wattage are measured that makes the overall impact, or can you really have an ultra-high-current amp at a very modest wattage output?
lush

Showing 1 response by jeffreybehr

All good answers so far, IMO. Do understand that solid-state amps, because of their VERY low output impedance, are 'constant-voltage' devices, varying current delivery as load impedance changes. 'ANY' SS amp can double power as impedance halves, again and again...if one starts at a LOW-enough point. True high-current amps can do that at least once while starting at rated ('maximum') power. The 20-year-old class-A-biased Lazarus hybrid power amp is rated at 50Wpc into 8 Ohms, 100Wpc into 4, and 200Wpc into 2 Ohms. THAT is 'high current' indeed. IMO, any amp that won't double its 8-Ohm rated power at least once (into 4 Ohms) is NOT 'high-current'.
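To make the arithmetic behind that doubling concrete, here is a minimal sketch of the constant-voltage idea. It assumes an ideal amp whose output voltage stays fixed regardless of load, so power follows P = V²/R; the 20 V RMS figure is an assumption chosen only so the numbers line up with the Lazarus ratings quoted above.

```python
# Sketch: an ideal 'constant-voltage' solid-state amp holds its output
# voltage fixed, so power doubles each time load impedance halves
# (P = V^2 / R). The 20 V RMS value is an illustrative assumption.

def power_into_load(v_rms, r_ohms):
    """Power (watts) delivered at a fixed RMS voltage into a resistive load."""
    return v_rms ** 2 / r_ohms

v = 20.0  # RMS volts -> 50 W into 8 Ohms, matching the Lazarus example
for r in (8, 4, 2):
    print(f"{r} Ohms: {power_into_load(v, r):.0f} W")
# 8 Ohms: 50 W, 4 Ohms: 100 W, 2 Ohms: 200 W
```

Real amps fall short of this ideal once the power supply or output devices can't source the extra current, which is exactly what separates a true high-current design from the rest.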

BTW, it takes lots of expensive engineering to build a HC amp. The relative sizes of the power transformer(s), rectifier, wire, circuit-board traces, driver and output transistors, etc. all affect the amp's ability to deliver high current flows.

To answer your middle-paragraph question: an amp's rated power is generally limited by its ability to deliver VOLTAGE into the test impedance. Eventually, ANY amp will clip its output waveform at some voltage, and its rated power is calculated at some point below that clipping voltage.
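The same P = V²/R relation shows how a clipping voltage turns into a power rating. The clipping voltage and the 10% rating margin below are purely hypothetical values, not specs for any real amp:

```python
# Sketch: rated power sits below the clipping point. Assume a hypothetical
# amp that clips at 26 V RMS and is rated with its voltage 10% below
# clipping (both numbers are illustrative assumptions).

V_CLIP = 26.0           # RMS volts at the onset of clipping (assumed)
V_RATED = V_CLIP * 0.9  # rating taken 10% below clipping (assumed margin)

def rated_power(r_ohms):
    """Rated power (watts) into a resistive load at the rated voltage."""
    return V_RATED ** 2 / r_ohms

for r in (8, 4):
    print(f"rated into {r} Ohms: {rated_power(r):.0f} W")
```

This is why a modest wattage number says little by itself: the rating reflects the voltage the amp can swing cleanly, while 'high current' is about whether it can keep that voltage up as the load impedance drops.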

On your paragraph-3 questions, I have no idea about the 1st half, but my example of the Lazarus answers the 2nd part.