Has anyone tried this same setup? Is this a common thing to do?
While it may seem odd that inserting an extra length of wire improves things, I suspect a plausible explanation is that the extender improves the quality of the connection between the wall wart and the socket. Wall warts are often heavy and pull down on the wall socket; in turn they're likely to pick up vibration, further affecting the consistency of the connection. Using an extender lets you position the wart/transformer away from the wall socket in an ideal location, and hence deliver the improvements noted.
Interestingly, I had a similar experience with a short extender: I needed some more length on a wart-powered device and found that, far from making it sound worse, it sounded a tad better (and I used generic non-grounded extension cords). I'll look into making up some better short connectors so I can explore this further.
Thanks for the tip. So you basically daisy-chained the C7 power cord with the existing wall wart using various adapters?
Exactly. It goes: Wall Outlet -> Power Cable -> Adapters -> Wall Wart -> DAC.
Whatever magical properties a power cable imparts are now being applied to the wall wart power supply and the DAC.
As to how a power cable matters, that’s a whole ‘nother topic. There is a forum post called “Why Power Cables Affect Sound” that has over a dozen pages of discussion. One hypothesis that came up a few times is that a good power cable rejects AC noise and EMI. Could that account for the increased performance of the DAC? Perhaps.
You will be much better off if you get an LPS (linear power supply) with either a toroidal or R-core transformer of the proper wattage. There are many very nice ones out there on eBay, etc., at very reasonable prices ($50 - $100).
If you are more adventurous, you can go with a Raspberry Pi/HiFiBerry or Raspberry Pi/Allo Boss combo with amazing sound quality. All for less than $150.
“Fleabay”?? Don’t dismiss those that quickly.
These are very nice:
They use the same components you would get from Mouser or Digikey (Nichicon, Vishay, etc.).
Also, I use these kits with excellent and quiet results. They need a toroidal transformer and a case:
I use a gel-cell battery with a charger. Total cost <$100. Many different voltages are available; just divide the battery's amp-hour (Ah) rating by the current drain of the device and you'll have the hours of play before a charge. It's surprising that a small battery will power one of those devices for several hours. I bought a charger that works for many different voltages of lead-acid batteries. The connectors are easy to find.
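The runtime rule of thumb above can be sketched in a few lines. This is a rough estimate under assumed example numbers (the 7 Ah cell, 100 mA load, and 80% usable-capacity derating are my illustrations, not from the post); real runtime also depends on discharge rate and battery age.

```python
def runtime_hours(capacity_ah: float, load_ma: float,
                  usable_fraction: float = 0.8) -> float:
    """Hours of play: amp-hour capacity divided by load current.

    usable_fraction derates the battery, since a lead-acid cell
    shouldn't be discharged all the way to empty.
    """
    return capacity_ah * usable_fraction / (load_ma / 1000.0)

# Example: a 7 Ah 12 V gel cell feeding a phono preamp drawing 100 mA
# gives roughly 56 hours between charges.
print(round(runtime_hours(7.0, 100.0), 1))
```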
The noise floor then drops beyond anything you've ever heard. Nothing is cleaner than battery DC. I use it for wall-wart-powered phono preamps.
I have used the Chinese power supplies in my system; they are a great improvement over a wall wart. I was reading one of the hi-fi forums one night on power supplies, and someone was talking about using a power supply that the ham radio people use. So I thought I would try one. It was a huge improvement over the Chinese power supplies; now I have six of them. I highly recommend you try this.
Trouble with a pre-built power supply is that it’s a one-size-fits-all scenario. You would be paying for all the parts of a multi-voltage output design but only using one voltage. That wasted money could have gone into fortifying the design for your specific application. My builds cost me around $35 for a phono preamp power supply built from scratch. I could make three of my power supplies for the cost of the Chinese power supply, so going Chinese is not a good solution from either an economical or a performance standpoint.
Thanks for all the good input about power supplies. I have a wall wart on my DAC and on my turntable. I have always disliked them. They are always the first thing to fail. I unplug them when not in use. I was wondering whether these power supplies maintain a constant output voltage when the input voltage drops, or whether you need to go to a higher-dollar supply.
Voltage regulation comes with a set range of tolerances on input voltage and current draw. You can do a headroom calculation if you know the regulator circuit’s dropout voltage: add it to the design output voltage, then subtract that sum from the minimum input voltage to the regulator. If the power supply is designed correctly for the application and input voltage, you will get a positive number. That is the headroom on the secondary side of the power supply’s transformer. Finally, take the transformer’s output-to-input voltage ratio from its spec and divide the result by that ratio to determine the input-voltage headroom. That’s a fairly simplified overview, but close enough for purposes of estimation.
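The estimate described above can be put into numbers like so. This is only a sketch of the simplified method, and the example figures (a 12 V regulator with ~2 V dropout, 15 V minimum at its input, fed from a 120 V : 12 V toroid) are assumptions for illustration, not values from the thread.

```python
def secondary_headroom(v_in_min: float, v_out: float, v_dropout: float) -> float:
    """Headroom at the regulator input: minimum input voltage
    minus (design output voltage + regulator dropout)."""
    return v_in_min - (v_out + v_dropout)

def mains_headroom(headroom_sec: float, v_sec: float, v_pri: float) -> float:
    """Reflect secondary-side headroom back to the mains side by
    dividing by the transformer's output-to-input voltage ratio."""
    return headroom_sec / (v_sec / v_pri)

# Example: 15 V minimum into a 12 V regulator with 2 V dropout
# leaves 1 V of secondary headroom; through a 120:12 transformer
# that corresponds to about 10 V of tolerable mains sag.
h_sec = secondary_headroom(15.0, 12.0, 2.0)
h_mains = mains_headroom(h_sec, 12.0, 120.0)
print(h_sec, h_mains)
```

A positive result means the regulator stays regulating even at the low end of the mains tolerance; a negative one means the output will sag with the line.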