Thanks to everyone. Some very helpful comments.
I understand that the DC power provided by a component’s power supply is ultimately the same power that constitutes the component’s output signal, and that noise or distortion on the AC line, if insufficiently filtered by the component's power supply, can therefore end up superimposed on the signal.
What I’m unclear about is how SPECIFIC audible characteristics correlate with SPECIFIC AC/DC powerline anomalies. Put simply, HOW does bad power result in bad timbre, or bad imaging, or less resolution, etc.?
04-24-12: Almarg
…any and all of those numerous frequency components could, to some small extent, intermodulate with the audio signal, resulting in new spectral components at frequencies equal to both the sum of and the difference between the frequencies of any or all of the spectral components of the music and the frequencies of any or all of the spectral components of the noise or distortion.
This was extremely helpful, Al. I wasn’t really thinking in terms of frequency intermodulation, but once I do, it becomes easier to understand how bad power results in less realistic instrument timbres. It's something like...
AC power frequency anomalies -> DC power anomalies -> INTERMODULATION of DC power and signal -> distortion of harmonic content -> less realistic instrument timbres
Because accurate harmonic content is essential to realistic instrument timbres, anything that distorts harmonic content, like the intermodulation of DC power and signal, will make instrument timbres less realistic. Sounds plausible to me.
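To check that I'm following the intermodulation point, I tried a toy numerical sketch (Python/NumPy, with the frequencies and ripple level made up purely for illustration, not a model of any real component). If a 1 kHz tone is amplified by a gain that wobbles slightly because of 120 Hz ripple left on the DC rail, the output picks up new components at 880 Hz and 1120 Hz, i.e. the difference and sum frequencies Al described:

```python
# Toy sketch, not a model of any real circuit: a 1 kHz "music" tone
# passing through a gain that wobbles with 120 Hz ripple on the DC rail.
# Since sin(a)*sin(b) = 0.5*[cos(a-b) - cos(a+b)], the multiplication
# creates sidebands at 1000 - 120 = 880 Hz and 1000 + 120 = 1120 Hz.
import numpy as np

fs = 48_000                                   # sample rate, Hz
t = np.arange(fs) / fs                        # one second of samples
music = np.sin(2 * np.pi * 1000 * t)          # 1 kHz "music" tone
ripple = 0.01 * np.sin(2 * np.pi * 120 * t)   # 1% leftover rail ripple
output = (1.0 + ripple) * music               # gain modulated by the rail

spectrum = np.abs(np.fft.rfft(output)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)       # 1 Hz resolution here
for f in (880, 1000, 1120):
    print(f"{f:>5} Hz: {spectrum[np.argmin(np.abs(freqs - f))]:.4f}")
# Prints a large line at 1000 Hz and two small sidebands at 880/1120 Hz.
```

Those sidebands aren't harmonically related to the music, which I take to be the "distortion of harmonic content" step in the chain above.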
So, filling in the question marks for #3 in the OP:
3. DC power/signal frequency intermodulation = less realistic instrument timbre
Assuming all this is correct, I’m still unclear about the explanation at the level of voltage and current, and in particular about the concept of "frequency intermodulation" as it applies to DC power. Some dumb questions...
--Does "frequency intermodulation" basically mean that there are FLUCTUATIONS to DC voltage/current that are UNRELATED to the signal?
--Why are DC fluctuations described in terms of "frequencies" at all? Is it simply because the fluctuations occur at a certain rate per second? Or does the use of "frequency" to describe fluctuations in DC voltage/current also imply that DC can be understood as a WAVE, just like AC?
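To make the second question concrete, here's how I'm currently picturing it (another toy Python sketch, with the voltage and frequencies invented for illustration). A nominally steady DC rail that has small leftover fluctuations can be treated as a constant plus sinusoidal components, and a Fourier transform of the rail voltage shows the steady part at 0 Hz and each fluctuation as a line at its own repetition rate, which I assume is why "frequency" language gets applied to DC anomalies. Is that the right way to think about it?

```python
# Toy sketch: a nominally 12 V DC rail with small leftover fluctuations.
# A Fourier transform shows the steady part as a component at 0 Hz and
# each fluctuation as a spectral line at its own repetition rate.
import numpy as np

fs = 10_000                                     # sample rate, Hz
t = np.arange(fs) / fs                          # one second of samples
rail = (12.0                                    # the intended DC level
        + 0.05 * np.sin(2 * np.pi * 120 * t)    # rectifier ripple
        + 0.02 * np.sin(2 * np.pi * 2400 * t))  # higher-frequency hash

spectrum = np.abs(np.fft.rfft(rail)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f in (0, 120, 2400):
    print(f"{f:>5} Hz: {spectrum[np.argmin(np.abs(freqs - f))]:.3f} V")
# Prints ~12 V at 0 Hz (the DC itself) and small nonzero lines at 120
# and 2400 Hz (the fluctuations), so the rail really is describable as
# a DC component plus a set of frequency components.
```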
I have lots of additional thoughts/questions about resolution and imaging, but it would be helpful to stick to instrument timbres for the moment, or my head might explode.
Thanks,
Bryon