Are my CAT5 and router my weak link?


I have paid a lot of money for my PS Audio PW DAC II with the bridge....as well as all of my interconnects, power cords and speaker cables. After all that, I have an inexpensive (relative to my system) wireless router that connects my computer to the PW DAC and CAT6 cables that are not too special. Are those components letting the signal come through fully? I am curious what others may have done.

Thanks
Jeff
jeffatus

Showing 11 responses by almarg

Any cable that is conducting high speed digital signals, such as the OP's CAT6 ethernet cable, firewire cables, USB cables, etc., and that is located in physical proximity to the audio system (e.g., in the same room), can radiate or couple RFI (radio frequency interference) into the audio system, with effects that, although unpredictable, could conceivably be both sonically significant and cable-dependent.

Cables that are conducting signals that are involved in the timing of D/A conversion, such as (apparently) the clock cables Jfrech is referring to, can of course be expected to be much more critical, as a result of noise pickup, ground loop, and impedance matching issues that can affect jitter.

If the CAT6 ethernet cable the OP referred to is in the same room as the audio system, he may wish to consider experimenting with inexpensive shielded ethernet cables, as member Bryoncunningham described doing in this thread. See the posts in that thread dated on and around 2-16-12. Inexpensive ethernet cables are commonly unshielded, but good quality shielded cables are also readily available at low prices.

Regards,
-- Al
Tubeking, I agree with everything you said. However, I question its relevance. The concern you appear to be addressing is successful communication of ethernet data, apparently over relatively long distances. The concern I was addressing is radiation FROM the cable into arbitrary circuit points in the audio system. Everything else being equal, that can be expected to be REDUCED if cable bandwidth is LOWER. Lower bandwidth = slower risetimes and falltimes = less RFI, everything else being equal.

Also, note that I did not say that results would NECESSARILY be better with the shielded cable. What I suggested is that it may be a worthwhile (and also very inexpensive) experiment. Which certainly turned out to be the case, per the thread I had linked to, in the system of someone who IMO is one of our most credible members.

Regards,
-- Al
Hi Kijanki,

Bryon's situation is indeed different in several significant ways. One is that there is no wireless link between the cable in question and the system. Another is that following conversion of the ethernet data to S/PDIF, which is performed by a Sonos, the S/PDIF data goes through a high quality re-clocker that also provides galvanic isolation, before it is input to the processor in which D/A conversion is performed.

I am in general agreement with all of the technical comments in your posts above, except that I would emphasize that matters of degree are involved. And the matters of degree have no clearly definable threshold separating what may be significant from what is insignificant. For instance, concerning your comment that:
This noise (whatever the source is) has to be strong and at least 30MHz to make effective (1/10 wave) receiving antenna of typical 1m interconnect.
While the 1/10th wavelength criterion is a reasonable guideline in many contexts, I would be hesitant to declare that, in the context of an audio system, an antenna that is less than 1/10th of a wavelength will be sufficiently INEFFECTIVE to reduce noise pickup to insignificance. Especially when sub-nanosecond jitter effects are presumably significant at the point where D/A conversion is performed. And given also that low-level RF may significantly affect the performance of analog circuitry. There is yet another thread that has recently appeared about someone hearing radio stations while listening to a phono source!
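As a quick sanity check of the figure in Kijanki's quote (my arithmetic, not from his post), a short script relating conductor length to wavelength, assuming free-space propagation; in a real cable the velocity factor would shorten the wavelength somewhat:

```python
# Sanity check of the 1/10-wavelength antenna guideline quoted above.
# Assumes free-space propagation; velocity factor is ignored.

C = 3.0e8  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in meters at a given frequency."""
    return C / freq_hz

def antenna_fraction(length_m: float, freq_hz: float) -> float:
    """A conductor's length expressed as a fraction of the wavelength."""
    return length_m / wavelength_m(freq_hz)

# A typical 1 m interconnect reaches 1/10 of a wavelength at 30 MHz,
# which is where Kijanki's figure comes from:
print(antenna_fraction(1.0, 30e6))  # 0.1
```

The point of contention in the post above is not this arithmetic, but whether a conductor shorter than 1/10th of a wavelength picks up little enough to be ignored.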

Also, in addition to the possible antenna effects of interconnects and speaker cables (with RFI picked up by speaker cables perhaps becoming audibly significant as a result of entering amplifier feedback loops), I would not rule out the possibility that RFI picked up in AC power wiring may find its way to circuit points in the components where it could have audible consequences.

Concerning the 1/10th wavelength guideline specifically, I'll mention that some of my antique AM radios, which are designed to work with external antennas and do not include built-in antennas, will receive non-local stations on even the lower part of the AM band, with good quality, using a piece of plain hookup wire just a few feet long as an antenna. In those cases the antenna is well under 1/100th of a wavelength. And some of those radios, which do not have well-shielded RF and IF sections, can pick up those stations with no antenna connected at all.

Concerning the RFI reduction resulting from twisting of the conductors carrying balanced signal pairs, while obviously that reduction will be very substantial, it too is a matter of degree, and will not be perfect. How imperfect will it be, at each of the many frequencies that may be involved, and how much imperfection has to be present before there may be audible consequences? Obviously I have no idea. But my point is simply that if the OP's ethernet cable is located in relative proximity to the audio system, the possibility that changing the cable to a shielded type could make a difference for the better does not seem to me to be beyond the bounds of plausibility.

Best regards,
-- Al
Do shielded cables require special connectors? I thought that plug has 8 pins (4 twisted pairs) with no room for the ground. Ground would have to somehow clip to chassis.
Hi Kijanki,

Excellent question. The connectors on all of the shielded ethernet cables that I have seen have metallic housings that are presumably connected to the shield. And when inserted into the ethernet connector on all of the computers I am familiar with, that housing will contact a metallic tab which is in turn grounded to the chassis.

Best regards,
-- Al
I just keep thinking about all of the money I have spent for high end gear and their fancy connectors, etc. and want to know if a $120 router will be my weak link.
I doubt it. The physical separation between the router and the system would seem to make it very unlikely that RFI generated by the router would have any audible effects on the system. It is conceivable to me that differences in the risetimes and falltimes of the signals generated by different routers could result in VERY minor differences in noise conditions within the DAC, but I doubt that those differences would have audible consequences. Even if they did, there would be no reason to expect a more expensive router to necessarily be better in that respect, and it very conceivably could be worse.

Regards,
-- Al
Jeff, you've got it right, except that it would probably be a good idea to upgrade the router-to-switch cable, in addition to the others. Although that cable will no longer be in the signal path between the computer and DAC, it could still conceivably radiate or couple digital noise into pathways that ultimately lead to the DAC, or even to other parts of the system.

Considering the low cost that is involved, Bryon's suggestion of the switch seems worth trying, although whether or not it will make a difference for the better is anyone's guess. Among many other things, it would depend on how the characteristics of digital noise generated by the switch may differ from those of the noise generated by the router; on the degree to which router-generated noise can propagate through the switch; on how the risetimes and falltimes of the output signals of the router and of the switch compare; and on the sensitivity of the DAC to all of these things, if indeed it has any sensitivity to them at all.

Regards,
-- Al
Hi Bryon,

Note, however, the statement in Jeff's most recent post that:
I need the wireless router on because my iPad controls the computer wirelessly. The computer, via PS Audio's eLyric, then sends the data to the Perfectwave DAC.
So if I understand correctly, in his particular situation the connection between the router and the computer needs to be maintained during music playback.

Best,
-- Al
08-14-12: Jeffatus
Well, I wonder if I make the computer connection to router wireless, then connect the computer direct to the DAC (bypassing the switch altogether).... Does anyone see anything I missed?
That's a good question, given that it sounds like the computer is either a laptop or a desktop with a wireless adapter. Here are a couple of points to consider, though:

1) In doing that, you would essentially be substituting one significant generator of digital noise (the computer) for another (the router). And quite possibly the computer is considerably worse in that respect than the router. Having a network switch between the computer and the DAC could therefore conceivably still provide significant noise reduction benefit, in this instance with respect to computer-generated noise.

2) Changing to a wireless connection between the router and the computer might slow down your internet connection speed, depending on the speeds of the internet connection and of the wireless link. If you frequently have occasion to download large files, that might be a significant consideration.

Regards,
-- Al
Eniac, thanks for providing the info about the opto-isolator, which certainly seems like something that could be beneficial in many setups, and that apparently has no downside aside from its cost.

However, while it figures to be something that would eliminate groundloop-related noise, without further technical information on it (which I couldn't find via a Google search) I'm uncertain as to the degree of effectiveness it would have with respect to source-generated noise that may be riding on its input signal. For instance, some amount of stray capacitance will exist between the electrical parts of the device, which may to some degree allow noise to bypass the opto-coupler. Also, I note that its intended purpose is described as surge protection.
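To illustrate why even a tiny stray capacitance matters at RF (the 1 pF figure below is a hypothetical value I've chosen for illustration, not a specification of the device), compare a capacitor's impedance magnitude at the power line frequency and at a representative digital-noise frequency:

```python
import math

def cap_impedance_ohms(c_farads: float, freq_hz: float) -> float:
    """Magnitude of a capacitor's impedance, |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2.0 * math.pi * freq_hz * c_farads)

# A hypothetical 1 pF of stray capacitance across an isolation barrier:
z_line = cap_impedance_ohms(1e-12, 50.0)   # ~3.2e9 ohms at 50 Hz
z_rf = cap_impedance_ohms(1e-12, 100e6)    # ~1592 ohms at 100 MHz

print(f"{z_rf:.0f} ohms")  # 1592 ohms
```

At the line frequency the stray capacitance is effectively an open circuit, but at 100 MHz it is only a kilohm or so, which is why an isolator that blocks ground loops may still pass some high frequency noise.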

Also, I would not necessarily conclude that it would make cable upgrade redundant, because the existing unshielded cables could conceivably radiate noise that would bypass the device. That could occur by radiation into the power wiring, or directly into system components or cabling. Bryon's experimental results would seem to support those possibilities.

Regards,
-- Al
Eniac, thanks for providing the link, which I read through. It doesn't seem particularly relevant, though, as what it is addressing is leakage current that occurs in response to the AC line voltage, at the AC line frequency. The concern here, of course, is mainly digital noise at very high frequencies. Also, since this device does not utilize any AC power, I suspect that its qualification against the EMDD is pretty much a formality, at least with respect to leakage considerations. Finally, I note the statement that:
Reducing leakage current within a power supply usually means eliminating or limiting the value of Class Y filter capacitors from live-to-earth and neutral-to-earth. It also demands that stray capacitance to earth is minimised through careful design. Unfortunately, the overall effect of these measures tends to compromise EMC performance, although minimising stray capacitance can reduce common mode noise.
So in the kinds of designs being discussed in the paper there can be a tradeoff between minimization of AC leakage and optimization of EMC (Electro-Magnetic Compatibility, referring to the effects of radiated interference).
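The leakage current behind that tradeoff is easy to estimate. As a hedged illustration (the capacitor and line values below are typical figures I've assumed, not numbers from the paper), the current through a line-to-earth Y capacitor is I = V * 2*pi*f*C:

```python
import math

def leakage_current_a(volts_rms: float, freq_hz: float, c_farads: float) -> float:
    """RMS leakage current through a line-to-earth capacitor: I = V * 2*pi*f*C."""
    return volts_rms * 2.0 * math.pi * freq_hz * c_farads

# A typical 2.2 nF Y capacitor on a 230 V / 50 Hz line:
i = leakage_current_a(230.0, 50.0, 2.2e-9)
print(f"{i * 1e6:.0f} uA")  # 159 uA
```

A larger Y capacitor filters more common mode noise but draws proportionally more leakage current to earth, which is exactly the tension the quoted passage describes.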

That said, as I indicated earlier the device does seem like something that can be beneficial in some systems, and that has no apparent downside apart from cost. Thanks again.

Regards,
-- Al