Which sounds best, 120 volts or 240 volts?


I read all these reviews and comments that say a new power cord can make a system sound better! Or different. Just a tiny adjustment, a few feet of wire, making audible differences. I'm not disputing that, but it raises the question: can you hear the difference between a system powered by 120 volts here in the US and a system in Europe that runs on 240? Surely that's a huge difference that should be immediately noticeable. However, it is hard to get a side-by-side comparison, since most components need internal adjustments to convert from one to the other, and of course the input current changes too. Different frequency as well! From 60Hz here to 50Hz there, these have got to be major differences compared to swapping a cable? Any of you jet setters out there have an opinion on this? Do you prefer the sound on one side of the pond to the other? 
alpha_gt
Theoretically, there should be little to no difference between the two, but many 'philes say 220V sounds better. I don't know personally, but the main difference is that a 220V device uses half the current a similar 110V one would. This is the main reason your home's electric stove, dryer, and A/C unit use 220.
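To put rough numbers on that (the 600W draw below is a made-up figure, purely for illustration):

```python
# For a fixed power draw, current scales inversely with voltage: I = P / V.
POWER_W = 600  # hypothetical amplifier draw, illustration only

for volts in (120, 240):
    amps = POWER_W / volts
    print(f"{volts}V supply -> {amps:.1f}A")
# 120V supply -> 5.0A
# 240V supply -> 2.5A
```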
Thank you, Joey, for responding to my post! But I must confess it is not as innocent a question as it may seem. What I'm really trying to say is that if high-end power cords make such huge differences in sound when they only replace the last 5 feet of miles of wire, then there should be clearly humongous differences between devices that actually run on different kinds of current. It should be immediately noticeable to anyone who's ever heard both. I mean, one is 60Hz and one is 50Hz; that should be a million times larger than any geometry or metallurgy difference. But I've never heard anyone say they even noticed. You say you've heard people say they prefer 220, though, so perhaps it does sound different? And, as I mentioned, it's hard to make an A/B comparison when you've got to change countries to do it.

Personally, I invested in some high-end power cords, keeping an open mind, and I gotta say, if they made a difference I didn't notice it. Certainly nothing I could detect in a blind test. But there are so many who swear by them that I try to keep an open mind, even while I remain skeptical. 

joeylawn36111
... the main difference is that a 220V device uses half the current a similar 110V one would.
Sorry, but this is mistaken. You're confusing current with voltage. All other things being equal, a given amplifier will draw the same amount of current whether using a 240V supply or a 120V supply.

@cleeds I presumed @joeylawn36111 meant the case of a component designed for dual voltage operation (or switchable between the two) in which case for a given power draw the current will halve as the voltage doubles. Of course, if you plug a 110V-only device into a 220V outlet, then all sorts of fireworks might ensue, as it will draw far more current than it was designed for.
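A quick sketch of why, treating the 110V-only device as a roughly fixed impedance (a simplification, with a made-up power rating):

```python
# Ohm's law (I = V / R) on a fixed impedance: doubling the voltage doubles
# the current, and the dissipated power (P = V * I) quadruples.
RATED_V = 110
RATED_W = 550                  # hypothetical rating, illustration only
R_OHMS = RATED_V**2 / RATED_W  # effective impedance at the rated load

for volts in (110, 220):
    amps = volts / R_OHMS
    watts = volts * amps
    print(f"{volts}V -> {amps:.1f}A, {watts:.0f}W")
# 110V -> 5.0A, 550W
# 220V -> 10.0A, 2200W
```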

@joeylawn36111 you may also enquire as to whether the sorts of plugs used in European mains setups are also superior - frankly, US-based mains plugs are some of the flimsiest around (I’m originally from the U.K. so am used to the super beefy British plugs)

PS: pretty good summary of this topic here

folkfreak2
@cleeds I presumed @joeylawn36111 meant the case of a component designed for dual voltage operation (or switchable between the two) in which case for a given power draw the current will halve as the voltage doubles.
Sorry, you are mistaken. At a given power draw, the amperage will halve as the voltage doubles. The current remains the same.
Amperage and current are the same thing. Ohm's law says current equals voltage divided by resistance (I = V/R), and power is voltage times current. The unit's needs have not changed (its power draw is fixed); only the voltage is higher, so the amperage, or current if you prefer, is lower.

The 50 or 60Hz is the frequency of the alternating current, which doesn't really come into the Ohm's law equation, although it does have an effect on impedance, which is what they call AC resistance.

Most pieces of stereo gear have a multi-tap transformer inside, made with different windings for the several possible voltages around the world. The 110 tap may be attached right in the middle of the 220 winding, so the number of coils has been cut in half. Or the other way round? So the transformer is actually acting as a whole new device, with a different ratio of windings producing the same output for the device to consume. Switching from one to the other could in fact add 100 feet of coiled wire to the input, between the wall and the actual circuit. These are big differences! A change in impedance, adding or subtracting a very long length of wire in the transformer; one would think the piece of gear would definitely sound different?
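For what it's worth, here's the tap arithmetic with ideal-transformer math (every winding count below is invented, purely to illustrate):

```python
# Multi-tap primary, ideal-transformer math: Vs = Vp * (Ns / Np).
# Tapping half the 240V winding for 120V use keeps the volts-per-turn,
# and therefore the secondary voltage, unchanged.
N_SECONDARY = 50       # hypothetical secondary turns
N_PRIMARY_240 = 1000   # hypothetical full primary winding
N_PRIMARY_120 = 500    # tap at the midpoint, half the coils

for v_in, n_primary in ((240, N_PRIMARY_240), (120, N_PRIMARY_120)):
    v_out = v_in * N_SECONDARY / n_primary
    print(f"{v_in}V across {n_primary} turns -> {v_out:.0f}V secondary")
# 240V across 1000 turns -> 12V secondary
# 120V across 500 turns -> 12V secondary
```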
Would not an amplifier sold in both the US and UK have exactly the same circuit boards inside and differ only in transformer / rectifier? And does this not make the question moot?
The point was, if a power cord can make a difference in sound, would not operating on a different frequency, transformer, etc. make a huge difference in sound? And I'm afraid I do know what I'm talking about; I've got a college degree in electronics and computer science. Current is measured in amps, voltage in volts, and resistance in ohms. Power is measured in watts, or amps times volts. Alternating current going from positive to negative and back again is one cycle; if it made one cycle in one second, that would be one hertz. So a 50Hz supply completes 50 cycles a second, changing polarity 100 times, and a 60Hz supply completes 60 cycles, changing polarity 120 times. But none of that is the point of the post, just the fact that these differences are HUGE compared to swapping a power cord; they should be readily audible. 
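Spelling that arithmetic out (nothing here beyond the definitions above):

```python
import math

# One cycle is positive-to-negative-and-back; at f Hz a cycle lasts
# 1/f seconds and the polarity flips twice per cycle.
for freq in (50, 60):
    print(f"{freq}Hz: cycle = {1000 / freq:.2f}ms, "
          f"{2 * freq} polarity changes per second")

# Instantaneous mains voltage is v(t) = Vpeak * sin(2*pi*f*t),
# with Vpeak = Vrms * sqrt(2) -- about 170V peak on a 120V RMS supply.
print(f"120V RMS -> {120 * math.sqrt(2):.0f}V peak")
```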
That is the only point I'm trying to make.