The "Snake Oil" Trope


Yeah, I know, a controversial topic, but after 30+ years of hearing both sides and watching how the argument has evolved, I want to say my piece.

First, I want to push back on ever using the term "Snake Oil," because it has been misappropriated and isn't being applied honestly. For a product to be "Snake Oil," it isn't simply a matter of "it doesn't do what it claims to do." It has to have a few more qualities. Chief among them: the materials or ingredients have to be fake, falsified, or non-existent. I have yet to encounter a single premium cable manufacturer who claimed to use copper or silver and turned out to be using something fake.

This would be an example of cable "Snake Oil" if it existed:

Company claim: "A 10 gauge speaker wire made of ten 9's pure silver, extracted from conflict-free mines, using NASA quality FEP dielectrics, braided in 24 strands of 17 gauge wire, all concealed in the newly developed element, Star-Spangled-Bannerite, that enhances and boosts all frequencies, repairing broken audio as it travels down the conductor."

Reality: Cutting open the wire, you find 3 strands of 14 gauge aluminum wire, wrapped in saran wrap, threaded through a 10 gauge rubber garden hose, covered in a fancy colored net.

My biggest problem with the nay-sayer community is the hypocrisy of accusing premium cables of being "Snake Oil" when their own charts, measurements, and tests deserve the same level of skepticism they claim to apply. Using "Snake Oil" to prove "Snake Oil"? Ask yourself the following questions the next time you see an online or vlog rant about how cables don't make a difference and the author has the measurements to prove it:

1) Did they actually connect the cables to speakers and listen?
2) If they made measurements, did they show you how those cables were connected when they conducted the tests?
3) If it is a vlog, did they show live footage of themselves conducting the test in the video, or is everything presented after the fact?
4) How does the test prove quality, and how does the author quantify "quality"?

99% of the time the answer is "no." You just see people posting pictures of charts that could have been made in any piece of software. Heck, I could make one in Photoshop that supports any conclusion I want. The truth is, there isn't a single piece of equipment or measurement software that tests the actual perceived quality or clarity of a signal.

For example, "that guy" from Audioholics posted a video bashing a $4000 Audioquest speaker cable.  He claims to have run it through tests and he posted pictures of graphs that he gave conclusions for.  Not once did he show how it was connected to the machines or equipment. More over, he claimed to have broken the cable, by easily snapping off the banana plug (made of pure copper coated in silver). Well, if that were true, then how could he have possibly connected the cable correctly to test it?  He also claimed the cable was on loan from Audioquest.  Red flag. Audioquest does not send out one speaker cable to test; they'd have sent out a pair.  He also wasn't at all concerned that he had broken a $4000 loaner cable.  Therefore, I suspect someone else broke their own cable and let "this guy" borrow it for a video. Lastly, he claims to test the effectiveness of the "DBS" system by showing you a digital read out on some other machine.  He claims to unplug the DBS system live...but...off screen, and the digital read out changes. That makes absolutely no sense, since the DBS system isn't tied to the actual conductors or connectors. It's a charged loop from end to end and only keeps the insulation's dielectric field charged. So unplugging it while a signal is being passed through the cable wouldn't change anything. Therefore,  the nay-sayer argument, in this instance, was nothing more than "Snake Oil" trying to prove "Snake Oil."

Another time, someone was given a premium XLR cable but had no idea what an XLR cable was. They didn't recognize the connector format; a red flag straight away! They then went on to describe all the different measurements they took from it and how it was no better than the free cables you get from manufacturers. Well, if that is true, how was this cable connected to the equipment? If they didn't know what the XLR format was, it stands to reason they didn't have an XLR input on the equipment they used to test it. So how in the world was this an equitable or viable test of quality if the cable's conductors weren't all being used correctly during the test? Not once did this person connect it to an audio system to say how it sounded. How do electrical measurements translate into sound quality if one refuses to listen?

My final argument against the nay-sayers is the one they have the most trouble with: they don't use the Scientific Method. For example, where's the control in these tests? What system or cable do they *ALL* universally agree is the reference to test against? The systems and cables always change and are never consistent. Why do they argue for an A/B test but refuse to set one up themselves, as if it's someone else's responsibility because they won't be accountable for their own conclusions? Why do they only test low-end or mid-grade cables, but never seem to run these tests on the highest tiers? Why do the majority of nay-sayers never purchase any of this equipment to find out for themselves?

What I have discovered after 30+ years of arguing this topic is that the nay-sayers just don't want to have to buy expensive cables. Instead, they seek out any form of cognitive bias they can find to justify not buying them. Then they suddenly concern themselves with other people's purchasing power and tell them not to buy such cables, as if it were their money being spent. Or they claim the money should have been spent on better equipment. Touché, but if they bought better equipment, they'd still buy premium cables to push that better equipment. That's like saving your money to buy a Lamborghini, then deciding on 15 inch steel rims with narrow tires because wheels are wheels...they bought a better vehicle, so they won't need premium tires...or premium gas, because the engine is superior. *eye roll* What it seems to boil down to is that they don't like the idea that buying premium cables alone can surpass a high-grade, well-engineered system. To borrow from my car analogy, buying premium tires for a 4-cylinder hatchback won't make it go any faster, but it will affect some aspects of performance, likely gas mileage and road grip. Using the same analogy, buying better cables is akin to adding a turbo kit, a better exhaust system, better suspension, better intake valves, better cold air filters, etc., to make that 4-cylinder hatchback perform nearly as well as a stock Lamborghini.

Final thoughts, "Snake Oil" salesmen back in the day weren't just interested in defrauding their customers, they wanted to do it with the least amount of effort. They didn't try to get authentic, high quality ingredients to make the oil look or taste better.  They used whatever was on-hand and as free as possible. Cable companies sure seem to go out of their way to acquire the best possible conductors and materials, and have R&D teams engineer complicated wire geometries and spend years finding ways to treat the cables, or develop active tech to impact the signal, just so they can make a few bucks. If the product had absolutely no impact on sound quality, at all,  it wouldn't take long for well-engineered systems to reveal their faults and the industry would tank, almost over night. Clearly, they haven't and it's because it isn't "Snake Oil" no matter how many times that old trope is trotted out.

One of the serious problems in this entire discussion is that the perception of "quality" is 100% subjective, depending on the listener, the state of the equipment, the room the audition is conducted in, and the health of the listener. After years of auditioning my system for people, I realized it isn't a simple matter of asking, "How did that sound to you?" You have to be very specific. Ask, "Did you hear that specific sound?" 9 times out of 10, they'll say they didn't hear it. So you play it again and point it out. Then they light up and realize that no matter how many times they've heard that song, they had never heard that particular sound. Then they go and compare it on the car radio or through their device's earbuds and realize they can't hear it, or can't hear it as clearly. Then they come to respect what you're trying to achieve.



