Why are digital streaming equipment manufacturers refusing to answer me?


I have performed double blind tests with the most highly regarded brands of streamers and some hifi switches. None have made any difference to my system on files saved locally. I have asked the following question to the makers of such systems and almost all have responded with marketing nonsense. 
My system uses fiber optic cables. These go all the way to the DAC (MSB), so no EMI or RFI is arriving at the DAC. On top of this, MSB allows me to check whether I am receiving bit-perfect files or not. I am. 
So I claim that: if your DAC receives a bit-perfect signal and is connected via fiber optic, anything prior to the conversion to fiber optic (streamers, switches, their power supplies, cables, etc.) makes absolutely no difference. Your signal can’t be improved by any of these expensive pieces of equipment. 
If anyone can help explain why this is incorrect I would greatly appreciate it. DAC makers mostly agree; makers of streamers have told me scientific things such as “our other customers can hear the difference” (after extensive double blind testing has resulted in no difference being perceived) and my favorite, “bit perfect doesn’t exist; when you hear our equipment you forget about electronics and love the music”!
mihalis
I can provide some clarification on what you are hearing. My experience comes from developing communications and signal-processing software, including custom Ethernet drivers for signal processing.

The following discussion is about the analog transmission of a digital signal (PCM/DSD), specifically over Ethernet cables and switches. Note it does not apply to the analog transmission of analog signals (i.e., interconnects and speaker cables).

An Ethernet frame is transmitted as a series of pulses. The transmitting Ethernet transceiver generates a pulse for each bit in the frame, and the receiving transceiver transforms each pulse back into a 1 or a 0. The bits are accumulated into a frame and the checksum is validated. If a pulse was decoded into the wrong bit value, the checksum fails, the frame is discarded, and a higher-layer protocol (such as TCP) retransmits it. If there are no errors in the frame, its contents are copied into some form of buffer data structure, and processing of those buffers is initiated by an interrupt or a polling algorithm. At this point you have an exact replica of the originally transmitted data.
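As a rough illustration of the receive path just described, here is a minimal Python sketch. Everything in it is a stand-in, not any real driver's API: the function name is made up, and CRC-32 stands in for Ethernet's frame check sequence (which is also a CRC-32, though computed over more fields than shown here).

```python
import zlib

def receive_frame(payload: bytes, received_crc: int):
    """Mimic what a receiver does: recompute the frame checksum and
    accept the payload only if it matches what was transmitted."""
    if zlib.crc32(payload) != received_crc:
        return None  # bad frame: discard, let a higher layer retransmit
    return payload   # bit-exact copy of what the sender put on the wire

# A frame that arrives intact is accepted...
data = b"PCM samples"
assert receive_frame(data, zlib.crc32(data)) == data

# ...while a frame with even one flipped bit fails the checksum.
corrupted = bytes([data[0] ^ 0x01]) + data[1:]
assert receive_frame(corrupted, zlib.crc32(data)) is None
```

The point of the sketch: by the time audio data reaches the buffer, it has either passed the checksum (and is bit-identical to what was sent) or it was discarded and sent again. There is no third state where "slightly degraded" bits reach the application.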
The website thewelltemperedcomputer.com has a graphic depicting a pulse signal. It does not depict the aberrations in rise time introduced by clocks/crystals. If any of the pulse problems (overshoot, ringing, droop, or undershoot) caused by EMI/RFI or clocking errors results in a bit error, the frame will fail its checksum and be retransmitted.

So you are correct that well-designed Ethernet cables and switches do not affect the quality of the sound. If the receiving device (DAC/streamer) allows electrical noise from the Ethernet cable to affect the sound, then you have a poorly designed DAC/streamer.
Also, one should not confuse Toslink with Ethernet fiber optic cables; they are completely different animals. Note that propagation speed is not the reason to choose fiber: an electrical signal on copper Ethernet and light in a fiber both travel at roughly two-thirds the speed of light in a vacuum. Fiber optic cables are used primarily for bandwidth over long runs, immunity to EMI, galvanic isolation, and system security. They will not make a difference in the sound quality.
@welcher This is a nice explanation of file data transfer. How about streaming, such as TIDAL, TV sound, etc.? You cannot go back and retransmit, so there must be some occasional loss or imperfection. For music I assume it is not as catastrophic as it is for data files.

Your statement on fibre not making any difference in sound quality is challenged by products like the Sonore Optical Rendu. Their claim to fame is that fibre cannot transmit the analog noise that exists at some level with Ethernet (and maybe also USB). Thus, without fibre, analog noise can get into the DAC.

@mihalis I was going to buy a Sonore Optical Rendu today but changed my mind and will instead get a DAC with built-in streaming for an occasionally used bedroom system (Black Friday sales). I am of the opinion that streaming with fibre levels the playing field. Using a noisy computer is also not a problem with fibre in the streaming chain. My current streamer, a microRendu, is more than good enough. So today, a second DAC will get me more short-term joy, but going fibre in the main rig is the long-term plan for that final streaming solution.
yyzsantabarbara,

Depending on the protocol/app, most audio over Ethernet will retry, and most apps buffer to give time for a retry. If you are streaming locally, odds are a retry is almost never needed.
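To see why buffering makes a retry inaudible, here is a toy Python model. The buffer depth and delay figures are made-up illustrative numbers, not measurements of any particular app:

```python
# Toy model: a streaming client pre-buffers several seconds of audio,
# so a packet that needs one TCP retransmit (arriving a few hundred
# milliseconds late) still lands long before the buffer drains.

BUFFER_SECONDS = 5.0     # assumed pre-buffer depth, typical of streaming apps
RETRANSMIT_DELAY = 0.3   # assumed worst-case retry latency over the internet

def playback_survives(buffer_s: float, worst_delay_s: float) -> bool:
    """Playback only glitches if data arrives after the buffer has
    fully drained, i.e. the delay exceeds the audio held in reserve."""
    return worst_delay_s < buffer_s

assert playback_survives(BUFFER_SECONDS, RETRANSMIT_DELAY)  # no dropout
assert not playback_survives(0.1, RETRANSMIT_DELAY)         # tiny buffer would glitch
```

So "you cannot go back and retransmit" only holds for truly real-time paths (live TV sound over UDP, for example); buffered music streaming has plenty of slack for retries.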
Audio2design, you're right - that's probably what he was talking about; otherwise it doesn't make any sense. He says "My system uses fiber optic cables. These go all the way to the DAC," hoping that the optical connection, by shielding the DAC from electrical noise, makes the incoming music always the same. That is not why it is the same. Streamed music doesn't come in real time; it is just plain data that is pre-buffered. As long as it is the same data in the same format, everything depends on the hardware that creates the S/PDIF stream and its timing (and I assume he uses different hardware for different providers). What makes different streaming providers sound the same is the fact that they send data only. Asking them why it sounds the same (with the same hardware and the same DAC) is like asking different bookstores why the same book looks the same.

It should sound the same, assuming the same internet provider and hardware, same data, same data format, same streaming hardware, same DAC, etc. "Should," because there might be something else that I'm forgetting.
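The "same data" half of that claim can be checked directly: if two delivery paths hand the player byte-identical data, a cryptographic hash will confirm it. A minimal Python sketch, where the byte strings are placeholders standing in for real downloaded tracks:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest: two byte-identical copies always hash the same."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical: the same track delivered by two different routes
# (e.g., local file vs. streamed and captured to disk).
copy_via_route_a = b"\x00\x01 FLAC frame data..."
copy_via_route_b = b"\x00\x01 FLAC frame data..."

# Matching digests mean the DAC is fed the same bits either way --
# the same book bought from two different bookstores.
assert fingerprint(copy_via_route_a) == fingerprint(copy_via_route_b)
```

This is essentially what a "bit perfect" indicator like MSB's does at the end of the chain, just expressed as a file-level check.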

I looked at Toshiba Toslink transmitter/receiver modules. There are some, perhaps older-generation, parts that go only to about 5 Mbit/s, but they have newer parts that go over 100 Mbit/s. Such Toslink devices should suffer less from system noise. Jitter induced by slow transition times and by noise at the receiver's (DAC's) end would be reduced as well.
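For context on why those Toslink bandwidth figures matter, a quick back-of-the-envelope in Python. This counts raw PCM payload only; the actual S/PDIF line rate is roughly double because of biphase-mark coding and subframe overhead:

```python
def pcm_bitrate(sample_rate_hz: int, bits_per_sample: int, channels: int) -> int:
    """Raw PCM payload rate in bits per second (no framing overhead)."""
    return sample_rate_hz * bits_per_sample * channels

# CD audio (16/44.1 stereo) fits easily in an older ~5 Mbit/s module...
assert pcm_bitrate(44_100, 16, 2) == 1_411_200      # ~1.4 Mbit/s

# ...but hi-res 24/192 stereo does not, which is why the newer,
# faster receiver modules matter for hi-res playback over Toslink.
assert pcm_bitrate(192_000, 24, 2) == 9_216_000     # ~9.2 Mbit/s
assert pcm_bitrate(192_000, 24, 2) > 5_000_000
```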
As the study of psychoacoustics evolves, it is becoming abundantly clear that A/B tests will become irrelevant.