Volts x Amps = Watts, and from a purely mathematical standpoint, 100 volts at 5 amps is 500 watts - as is 5 volts at 100 amps as mentioned by Saki70.

The problem is that you can't just arbitrarily mix volts and amps and expect the same results from your amp. The speaker's impedance dictates how much current will flow in the circuit. An amplifier does not push current into the speaker; the speaker draws current from the amp. This all follows from the basic principles of Ohm's Law (E = I x R).

An amplifier tries to operate as a voltage source/voltage amp. This means the amplifier will try to deliver the same output voltage (volume position notwithstanding) even as the speaker impedance decreases (lower impedance = heavier load = more current required).

In other words, if an amp is designed to deliver 100wpc @ 8 ohms, it will need to put out roughly 28 volts. That 28 volts into an 8 ohm impedance will cause roughly 3.5 amps of current to flow. If the voltage and impedance do not change, then 3.5A is all that will ever flow in this circuit. The amplifier cannot and will not force more current into the circuit than Ohm's Law dictates.
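A quick back-of-the-envelope check of those numbers (just a sketch; the function name is mine, not from any real tool):

```python
import math

def volts_and_amps(power_w: float, impedance_ohms: float):
    """Voltage and current needed to deliver power_w into impedance_ohms.
    From P = V^2 / R:  V = sqrt(P * R), then I = V / R (Ohm's Law)."""
    volts = math.sqrt(power_w * impedance_ohms)
    amps = volts / impedance_ohms
    return volts, amps

v, i = volts_and_amps(100, 8)
print(f"{v:.1f} V, {i:.2f} A")  # about 28.3 V and 3.54 A
```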

Since output power at 8 ohms is the typical baseline rating, the amplifier mfg has designed the power supply voltage rails to support 100w at 8 ohms (28V and at least 3.5A). At 8 ohms, the power output is typically limited by the available voltage from the power supply rails, not the current.

When a 4 ohm speaker is substituted for the 8 ohm speaker, the "100w" amplifier in our example still tries to deliver 28V to the 4 ohm load. If the power supply is robust enough to fully deliver the higher current (about 7 amps) demanded by the 4 ohm load, the power delivered would be 200w - double the 8 ohm rating. This is the "doubling down" effect often mentioned in amplifier discussions.
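You can see the doubling fall straight out of the arithmetic when the voltage is held constant (a sketch, assuming an ideal power supply with unlimited current):

```python
# Same ~28 V output, into an 8 ohm load and then a 4 ohm load.
volts = 28.28  # roughly what a 100W-at-8-ohm amp puts out

for load_ohms in (8, 4):
    amps = volts / load_ohms        # Ohm's Law: I = V / R
    watts = volts * amps            # equivalently V**2 / R
    print(f"{load_ohms} ohms: {amps:.2f} A, {watts:.0f} W")
# 8 ohms: ~3.54 A, ~100 W
# 4 ohms: ~7.07 A, ~200 W  (the "doubling down")
```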

However, if the amplifier's power supply cannot fully satisfy the current demand of the lower impedance load, its output voltage will sag. That inability to deliver enough current, and the resulting sag of the output voltage under load, is why most amps and receivers don't double down as the impedance drops.

Also, in my opinion the "all channels driven" rating for an HT receiver is a bit of a red herring. Sure, it likely indicates a more robust power supply and amplifier, but it's no guarantee of superior sound quality.

Further, it would be VERY rare, if it ever happens, for any HT source material to require maximum output on all channels at the same time. Of course, if you play continuous test tones at excruciating levels through your system, that's a different story.