What do we hear when we change the direction of a wire?


Douglas Self wrote a devastating article about audio anomalies back in 1988. With all the necessary knowledge and measuring tools at his disposal, he did not detect any of the supposedly audible changes in the electrical signal. Self and his colleagues were sure they had proved the absence of anomalies in audio, but over the past 30 years audio anomalies have not gone away; at the same time, the authority of science in the field of audio has increasingly been questioned. It is hard to believe, but science still cannot clearly answer the questions of what electricity is and what sound is! (see the article by A.J.Essien).

For your information: to make sure that no potentially audible changes in the electrical signal occur when we apply any "audio magic" to our gear, no super equipment is needed. "The smallest step-change in amplitude that can be detected by ear is about 0.3dB for a pure tone. In more realistic situations it is 0.5 to 1.0dB" (Harris J.D.). That is roughly a 10% change in amplitude. At medium volume the voltage amplitude at the output of an amplifier is approximately 10 volts, which means the smallest audible difference would correspond to a change of about 1 volt at the output. An error that large would be impossible to miss even with an ordinary voltmeter, but Self and his colleagues performed far more accurate measurements, including ones made directly on a music signal using the Baxandall subtraction technique, and they found no error even at that much finer resolution.
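As a sanity check on the arithmetic above, here is a quick sketch (mine, not from the original post) converting dB steps to linear amplitude ratios; a 1 dB step at a 10 V output works out to roughly 1.2 V, in line with the "about 1 volt" figure:

```python
import math

def db_to_amplitude_ratio(db: float) -> float:
    """Convert a level change in dB to a linear amplitude ratio."""
    return 10 ** (db / 20)

# The ~0.3 dB pure-tone threshold and the more realistic
# 0.5-1.0 dB range quoted from Harris:
for db in (0.3, 0.5, 1.0):
    ratio = db_to_amplitude_ratio(db)
    print(f"{db} dB -> x{ratio:.4f} ({(ratio - 1) * 100:.1f}% change)")

# At a 10 V output amplitude, a 1 dB step corresponds to roughly:
delta_v = 10 * (db_to_amplitude_ratio(1.0) - 1)
print(f"1 dB step at 10 V: about {delta_v:.2f} V")
```

So 0.5 dB is about a 6% change and 1.0 dB about 12%, which is where the "roughly 10%" figure comes from.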

As a result, we are faced with an apparently unsolvable problem: those of us who do not hear the sound of wires, relying on the authority of scientists, claim that audio anomalies are BS. However, people who confidently perceive this component of the sound are forced to draw the only other conclusion possible: the electrical and acoustic signals contain some additional signal(s), still unknown to science, which we perceive with a kind of sixth sense.

If there are no electrical changes in the signal, then there are no acoustic changes either, and therefore hearing does not participate in the perception of anomalies. What other options can there be?

Regards.
anton_stepichev

Showing 16 responses by andy2

It is always interesting to see how people resort to simple and basic reasoning in trying to understand something as complicated as how humans perceive music.
Not sure why the OP made it more complicated than it has to be.  Here is the bottom line:

1. The DAC is a mixed-signal device: it has to operate in both the digital and analog domains.
2. The DAC needs a clock to clock out the data.  Ideally the clock is clean, but no real-world clock is; it is corrupted by noise from its own analog and digital circuitry.
3. If the clock is corrupted, then the signal being clocked out is also corrupted in the "time domain".  I don't mean the 1's and 0's are corrupted, I mean the timing edges of the output data are corrupted.

So it's as simple as that.  
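The three steps above can be sketched numerically. This is my own illustration, not from the thread: sampling a sine wave at jittered clock edges instead of ideal ones produces an amplitude error bounded by the signal's slew rate times the timing error (the numbers here, a 10 kHz tone, 192 kHz clock, 1 ns RMS jitter, are arbitrary assumptions):

```python
import math
import random

random.seed(0)
f_signal = 10_000.0   # Hz, test tone
f_clock = 192_000.0   # Hz, sample clock
jitter_rms = 1e-9     # seconds, RMS timing error per clock edge

def sample(t: float) -> float:
    """Ideal analog signal value at time t."""
    return math.sin(2 * math.pi * f_signal * t)

# Compare the value clocked out at the jittered edge with the value
# that would have been clocked out at the ideal edge.
errors = []
for n in range(1000):
    t_ideal = n / f_clock
    t_jittered = t_ideal + random.gauss(0, jitter_rms)
    errors.append(abs(sample(t_jittered) - sample(t_ideal)))

# The worst-case slope of a unit sine is 2*pi*f, so the error per
# nanosecond of jitter is on the order of 2*pi*f*jitter.
print(f"max amplitude error: {max(errors):.2e}")
print(f"slope-based scale:   {2 * math.pi * f_signal * jitter_rms:.2e}")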
I guess it depends on what you meant by "IDENTICAL".  

If two files are identical and use the same file format (e.g. WAV or FLAC), then for the most part they should sound the same.

That is, UNLESS each file comes from a different source: for example, one comes from a USB drive and one comes from a hard drive.  In that case the playback could sound different, because each source may leave a different interference signature on the DAC clock.

BUT if everything is the same: same file, same source, same equipment, then I guess it's hard to see how the sound could be different.

If you really want to go down that path ... to the nth degree, then anything is possible.
The preliminary analysis tells us that there is no physical or material cause-and-effect relationship in the situation with optimizing the sound of audio files
If you're talking about different file optimizations, then sure, they could have an effect on the sound.  


Of course if it is optimized then it may sound different.  During playback the audio file has to be "decompressed", or if you will "processed", by the CPU.  That processing has its own digital interference signature, which will affect the DAC clock.

File formats such as .flac are compressed (.wav, by contrast, ordinarily holds uncompressed PCM).  A compressed file has to be decoded by the CPU before the DAC can take the digital data and turn it into analog.
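For what it's worth, a quick check (mine, not from the thread) that WAV stores its PCM samples as-is: write raw sample bytes into an in-memory WAV with Python's stdlib `wave` module and read them back unchanged, showing there is no compression step to undo.

```python
import io
import wave

# Build a tiny mono, 16-bit, 44.1 kHz WAV entirely in memory.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)       # mono
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(44100)   # 44.1 kHz
    w.writeframes(b"\x00\x01\x02\x03")  # two raw PCM frames

# Read the frames back: identical bytes, because PCM in WAV is stored as-is.
buf.seek(0)
with wave.open(buf, "rb") as r:
    recovered = r.readframes(r.getnframes())

print(recovered == b"\x00\x01\x02\x03")  # True
```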
^^^ I don't think you know what you're talking about.  You seem to be contradicting what you wrote before.  I thought you were making sense in your previous post, but then you're saying something different, so it's hard to argue with someone whose position keeps changing.
First of all you need to calm down.

OK. Imagine that we have an original file (A), an optimized file (B) that is PHYSICALLY identical to A, and an ordinary digital copy of the file (C) that is PHYSICALLY identical to A, all located on the same disk.

Could you explain how B is identical to A, if B is an optimized version of A?  Seems like a contradictory statement.

B = A because they have the same checksum
4+7 = 3+8?  Don't look the same to me.
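The "4+7 = 3+8" jab actually lands on a real distinction, so here is a small sketch of my own: a naive additive checksum cannot tell those byte sequences apart, while a cryptographic hash or a byte-for-byte comparison can. "Same checksum" is only as strong a claim as the checksum itself.

```python
import hashlib

a = bytes([4, 7])
b = bytes([3, 8])

def naive_checksum(data: bytes) -> int:
    """A weak additive checksum: just the sum of the bytes, mod 256."""
    return sum(data) % 256

# The additive checksum collides: 4 + 7 == 3 + 8 == 11.
print(naive_checksum(a) == naive_checksum(b))  # True, yet the bytes differ

# A cryptographic hash distinguishes them (barring astronomically
# unlikely collisions):
print(hashlib.sha256(a).digest() == hashlib.sha256(b).digest())  # False

# Byte-for-byte equality is the ground truth:
print(a == b)  # False
```

If two files have equal SHA-256 digests, in practice their bytes are identical, which is the sense in which "B = A because they have the same checksum" is being used above.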

It doesn’t matter what seems different to you; it only matters what the computer considers to be the same.
I think your understanding of what is "the same" and what is "different" is too simplistic.  Your showing off of your "computer skills" seems a little too obvious.


As far as I know, there is no measurement out there that can tell the difference in sound between paper, ceramic, aluminum, and magnesium drivers.  But you can clearly hear the difference between driver materials.
“If it sounds good, it IS good.”
Or better ...

"If it sounds good and measures good, it IS good".

You at least want to make sure it measures good.  But that does not mean that if something measures good, it will sound good.  In the end you have to listen.


Music is not a single-tone sinewave.  Every musical instrument produces a range of frequencies from low to high.  A drum sound can extend all the way up into the upper treble.

Using a single-tone sinewave to understand how people perceive music is crazy.
Of course, this is crazy, but who here says otherwise?

"The smallest step-change in amplitude that can be detected by ear is about 0.3dB for a pure tone. In more realistic situations it is 0.5 to 1.0dB."