Difference between 8 ohms and 4 ohms, and voltage converters


I have an amp that is rated at 160 Wpc into 8 ohms and 320 Wpc into 4 ohms. I am just wondering how I can tell which it is actually running at. I am moving to New Zealand, where the mains voltage is different (240 V rather than 110 V), so I need to get a converter that can handle that wattage. Would a 550 W converter be able to handle the amp? Any advice would be appreciated.
ice2000
If you don't know the power consumption of the amplifier, then the output power should be a third or less of the converter's wattage rating. In your case, 320 watts of output calls for a converter rated at least 1,000 watts.

Ideally, any power product serving an amplifier (power conditioner, converter, transformer, etc.) should be rated at the volt-ampere capacity of the circuit, i.e. 1,800 VA (watts) for a standard 15 A, 120 V circuit.
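To put numbers on that rule of thumb, here is a minimal Python sketch; the 3x headroom factor and the 320 W / 1,000 W figures come from the reply above, and the function name is just for illustration.

```python
def min_converter_watts(output_watts: float, headroom_factor: float = 3.0) -> float:
    """Smallest converter rating (W / VA) that keeps the amp's output at a third or less."""
    return output_watts * headroom_factor

print(min_converter_watts(320))  # 960.0 -> round up to a ~1,000 W converter
print(min_converter_watts(160))  # 480.0 -> a 550 W unit leaves little margin
```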
I understand that 8 or 4 ohms makes no difference for the voltage, but it may matter for the converter and then ultimately for the amp. A 550 W converter can handle 160 Wpc but not 320 Wpc, and if the converter cannot handle it, it will fry the amp. Anyway, I will check my speakers to see. Thank you.
You need to look at the voltage data on the rear of the amp or in the manual and see what the maximum wattage consumption/draw is.
It should say something like 120 V, 60 Hz, 950 watts. The 950 is the maximum draw in my example.
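If the nameplate figure is available, sizing from it might look like the sketch below; the 950 W number is the example from the reply above, while the 25% safety margin is my own assumption, not something stated in the thread.

```python
def converter_for_nameplate(max_draw_watts: float, margin: float = 0.25) -> float:
    """Converter rating needed to cover the amp's labelled maximum mains draw."""
    # margin is an assumed safety factor, not a figure from this thread
    return max_draw_watts * (1.0 + margin)

print(converter_for_nameplate(950))  # 1187.5 -> look for a 1,200 W or larger converter
```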
Voltage makes no difference here. The 8 vs. 4 ohms is a reference to your speakers' impedance, which you can check in your speakers' spec sheet. Typically they're rated at 8 ohms, often 4, and it's not uncommon for them to be rated at 6 or some other value. Keep in mind this is a nominal rating and the true impedance varies with frequency.
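For what it's worth, the two power ratings follow from the speaker impedance: at a roughly constant output voltage, power is V^2/Z, so halving the impedance ideally doubles the power. A small sketch using the 160 W / 8 ohm rating from the original post:

```python
def output_power(volts_rms: float, impedance_ohms: float) -> float:
    """Ideal power into a resistive load at a given RMS output voltage."""
    return volts_rms ** 2 / impedance_ohms

v = (160 * 8) ** 0.5          # ~35.8 V RMS produces 160 W into 8 ohms
print(output_power(v, 8.0))   # 160.0
print(output_power(v, 4.0))   # 320.0 (same voltage, half the impedance)
```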

In any event, it's not something you need to worry about.