ETHERNET CABLES


When using Ethernet to hook up streaming devices and DACs, what Cat level of Ethernet cable should be used? Is there any sonic improvement from going to a higher-dollar Cat 7 or Cat 8 cable?

samgar2

... confirmation bias is a huge issue for the untrained. 

Measurementalists enjoy no immunity to confirmation bias; it is no less an issue for them.

@fredrik222 ,

Just as I thought: a lazy authoritarian. The evidence is out there for anyone to see, yet you won't take a look. I've been around long enough to know that anything I post or link to will be rejected outright.

I've done it enough times here that I won't do it anymore. You'll just flat-out reject it due to your confirmation bias. Check out the search engine above and take a gander, but, as I already said, you're a lazy authoritarian.

Tell me, freddy, are you satisfied with your cable TV picture? It arrives in approximately the same way as your music does. Have you compared it to, say, the Blu-ray equivalent? Isn't the Blu-ray much, much better? (I hope and pray you have, or this is one big meaningless waste of time.)

Now, do you think that the signal you’re getting is as pristine as from a CDP? Do you think all the extra boxes and cables don’t add something to the mix? You really should look into some kind of deprogramming course.

All the best,
Nonoise

@cleeds 

Never said so. What I did say was that there is no theoretical improvement from going crazy and using fiber optic and other exotic things for a 10 ft run in a residential application. Simply put, if you hear an improvement in a residential application, you are making yourself hear it subconsciously.

It's different if you were to run 300 ft of Ethernet through a factory, for example. But if you do that and plug into your streamer, you have other issues: audible noise drowning out your music.

If you want to learn about how cables actually work in Ethernet:


https://www.cablinginstall.com/home/article/16467568/the-myths-and-realities-of-shielded-screened-cabling


Everyone keeps focusing on the cable and the noise that may couple into the signal on the cable, but the key here is what occurs after the signal leaves the cable, in the Ethernet receiver. In a 3- or even 10-foot run, a tiny amount of noise can couple into the signal. However, in the receiver, the received signal is compared to reference or threshold values, and based on this comparison an entirely new signal is created. The received signal (and any noise) is effectively discarded. The newly created signal is output at the correct voltage level, without the noise. This new signal can be transmitted for another 300 feet (or some other distance, based on data rate) or passed on to the internals of the streamer for further decoding. At each hop in the communication path, the signal is re-created and retimed, thereby producing an entirely new, clean signal.
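
To make the re-creation step concrete, here is a minimal sketch (Python; the two-level signal, 0.5 volt threshold, hop count, and noise amounts are all illustrative assumptions, not an actual Ethernet PHY):

```python
import random

def regenerate(received_volts, threshold=0.5, low=0.0, high=1.0):
    """Compare the noisy received voltage to a decision threshold and
    output a freshly created clean level; the noisy input is discarded."""
    return high if received_volts >= threshold else low

def hop(clean_volts, max_noise=0.2):
    """Model one hop in the path: couple some random noise into the
    signal on the cable, then regenerate it in the receiver."""
    noisy = clean_volts + random.uniform(-max_noise, max_noise)
    return regenerate(noisy)

signal = 1.0                 # the level that was transmitted
for _ in range(10):          # ten hops in the communication path
    signal = hop(signal)
print(signal)                # still exactly 1.0: the noise never accumulates
```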

While analogies are often imperfect (and I am told mine are often terrible), think of a blank piece of 8.5x11 inch paper sent through the mail. The post office may crumple or bend it such that when it arrives at your house, it is crumpled (noisy). However, the person receiving it can tell that it is a blank piece of 8.5x11 inch paper (albeit crumpled), so they get a brand new piece of paper out of the drawer and toss the crumpled one in the trash. The received paper was used only as a reference to know what size of new paper to pull from the drawer. Same with an Ethernet receiver: it compares the voltage value of the received signal to a threshold value to create an entirely new signal (without noise), and the received signal (with noise) is discarded. This is unique to the digital domain and does not occur in the analog domain.

In a PAM4 system, typical of Ethernet, the signal is transmitted at one of four voltage levels, such as 0, 1, 2, or 3 volts. The received signal will vary somewhat due to noise and the effects of the channel. For example, a signal originally at 2 volts that is transmitted over a long cable run could be received at 2.2 volts or even 1.7 volts. It would be compared to the four voltage levels, and an entirely new clean signal would be output at 2 volts, the closest voltage value. The original signal is effectively discarded. This is the benefit of digital communication over analog communication over long distances (and short ones).

Based on this method of operation, the tiny improvement a better cable provides will not yield a different outcome when the signal is re-created. A signal transmitted at 2 volts might be received with a shitty cable at 2.1 volts and with an amazing cable at 2.08 volts (a small improvement). Both will be compared to the threshold values (0, 1, 2, and 3 volts), and the receiver will output a clean 2 volt signal.
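
In the same spirit, a minimal PAM4 slicer sketch (Python; the 0/1/2/3 volt levels mirror the illustrative values above, not a real transceiver) shows both cables producing the identical clean output:

```python
# Nearest-level decision for a four-level (PAM4) signal.
PAM4_LEVELS = (0.0, 1.0, 2.0, 3.0)   # illustrative reference levels

def slice_pam4(received_volts):
    """Return the reference level nearest the received voltage: the
    'entirely new signal'. The noisy input plays no further role."""
    return min(PAM4_LEVELS, key=lambda level: abs(level - received_volts))

print(slice_pam4(2.1))    # 2.0 with the shitty cable
print(slice_pam4(2.08))   # 2.0 with the amazing cable; identical outcome
```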

Peace.


@nonoise  

My post was deleted when I asked you to put up any evidence for your point. Regardless, there is no evidence for this. I've posted several links showing why noise in Ethernet is a non-issue for residential applications.


And your analogy about cable TV vs. Blu-ray really shows your lack of understanding of anything relevant at all. The primary difference is bitrate for video and multichannel audio, and it has nothing to do with any type of noise at all. Cable TV also uses a lower bitrate and is typically compressed with lossy compression.


Here's one link that explains it to you: