I don't understand the reason for your question. You're telling us, in no uncertain terms, what the answer is. Have you ever been to the website Hydrogen Audio? I think you'll get the answer that you're looking for over there.
There are two possibilities:
You have a synchronous USB DAC: timing is controlled by the computer and is so jittery that your cable cannot possibly make any difference.
You have an asynchronous USB DAC: timing is controlled by the DAC. Cable or computer jitter does not make any difference.
Impedance of the USB cable is irrelevant.
I never said cables don't matter. I said transmission should be error-free over a short USB cable. I would like to hear the reasons why you disagree. Your answer should be more substantiated than "I heard it was better". Not to say that your experience doesn't count. Of course, it does. But my question was "Can you explain it"?
"I never said cables don't matter."
You're right, you didn't. And I never said that you did.
"I said transmission should be error-free over a short USB cable. I would like to hear the reasons why you disagree."
I never said that I disagree with that either. Of course you want the cable to transfer the signal error free.
"Your answer should be more substantiated than "I heard it was better""
No it shouldn't. Those are the results I came up with after listening to the cables. Anything else wouldn't be truthful. Also, I never said one was better than another. I was very careful to say different. Better is a subjective term that will vary from person to person.
"But my question was "Can you explain it"?"
No I can't. I wouldn't even know where to begin. Maybe someone that has experience with cable design can give you a better answer as to the why of it.
I have listened to my DAC with the vanilla cable that came with the DAC and without cables (just a thumb drive). No difference. That is how Asynchronous USB DACs should behave by design.
My question wasn't about the cable's effects on SQ. It was about failure: At what distance does a vanilla USB cable start to fail? I'm sure the answer varies with drivers and data rate. For example, we know that PCM should travel farther than DSD.
The reason I am asking is that I am considering a music server plus DSD DAC (ExaSound E22 or PS Audio DirectStream) and would like to get a general idea of how far away I can place the server from the DAC.
I have asked manufacturers this question. Still, hearing your specific experiences may be helpful.
03-09-15: Axle
Five meters is the maximum cable length specified for USB 2.0. However, the rationale for that limit, as stated in the following excerpt from Wikipedia's writeup on USB, does not seem applicable to situations that don't involve multiple hubs:
USB 2.0 provides for a maximum cable length of 5 meters for devices running at Hi-Speed (480 Mbit/s). The primary reason for this limit is the maximum allowed round-trip delay of about 1.5 μs. If USB host commands are unanswered by the USB device within the allowed time, the host considers the command lost. When adding USB device response time, delays from the maximum number of hubs added to the delays from connecting cables, the maximum acceptable delay per cable amounts to 26 ns. The USB 2.0 specification requires that cable delay be less than 5.2 ns per meter (192,000 km/s, which is close to the maximum achievable transmission speed for standard copper wire).

So 5 meters would seem safe in terms of error-free transmission (though perhaps not necessarily with respect to sound quality), but I don't know what limit can be expected beyond that distance.
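The arithmetic behind that 5-meter figure can be sketched directly from the two numbers quoted in the excerpt (a back-of-the-envelope check, not a spec-compliance calculation):

```python
# Reproduce the 5-meter figure from the quoted USB 2.0 numbers.
max_delay_per_cable_ns = 26.0    # ns, acceptable delay budget per cable
cable_delay_per_meter = 5.2      # ns/m, required by the USB 2.0 spec

max_length_m = max_delay_per_cable_ns / cable_delay_per_meter
print(f"implied maximum cable length: {max_length_m:.0f} m")   # 5 m
```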
As for the original question, "Can someone explain the need for expensive USB cables for short runs?": I can't explain the need for **highly** expensive cables, if any, but I believe I can explain some of the reasons why many people report sonic differences between USB cables, and why investing in something better than a generic cable figures to often be worthwhile.
As you alluded to in the OP, the reasons relate primarily to jitter, and possibly also to the effects of coupling of noise into analog circuits. Cable differences will affect signal risetimes and falltimes, waveform distortion resulting from impedance mismatches, susceptibility to groundloop-related noise (that depending in part on the resistance and inductance of the ground conductor in the cable), and the transmission of noise that is superimposed on the signal by the computer.
Depending on the quality and characteristics of the design of the DAC, all of those factors will to some degree affect the amount and the frequency characteristics of noise that may end up at the point of D/A conversion, thereby affecting jitter. In some designs it seems conceivable that analog circuits further downstream may also be affected. Paths by which noise can potentially couple from the USB port to the point of D/A conversion and/or analog circuits include grounds, power supplies, stray capacitances, radiation through the air, and other such paths that may bypass some of the intended signal path.
I should emphasize, though, that none of this provides a basis to assume a high degree of correlation between cable performance and cable price. And none of this provides a basis to assume that a comparison between a variety of specific cables will yield results that are consistent among different computers, DACs, and systems. And none of this provides a basis for attributing specific sonic characteristics to specific USB cables, as audiophiles often seem to do.
No Hype, IMO. I have compared $2 Monoprice USB cables to the Wireworld Starlight Platinum and I can easily hear the difference. Much more detail whereas the $2 cable sounds colored and flat. Since the WW is really too expensive, I settled on the Cabledyne Silver Reference USB cable for about one third the price (sounds just as good, IMO).
I don't have tons of money to blow on expensive cables and I retain a healthy amount of skepticism when it comes to cable claims. That said, I was unprepared for the huge improvement in sonics when I went from the (4% silver) Pangea USB-PC to the (100% silver) Pangea USB-AG. I know digital is all about 1's and 0's but the improvement was immediately noticeable. I have no other explanation other than to flatly state the 100% silver cable was so much better.
"My question wasn't about the cable's effects on SQ. It was about failure: At what distance does a vanilla USB cable start to fail?"
You have an odd way of asking questions. Reading your original post, I would have never guessed the above quote is what you are asking for.
"I'm sure the answer varies with drivers and date rate. For example, we know that PCM should travel farther than DSD."
I'm not sure how you came up with that. PCM data can be transferred in more than one way: 75-ohm coax, Toslink, 110-ohm balanced, AT&T ST fiber optic, I2S, USB...
The only topic that's more pointless and inciting than 'do cables make a difference' is 'do digital cables make a difference'. You will get the same answers and rationale you always get from each camp. This is one you simply must find out for yourself.
That being said, your question came across as more of a statement.
My OP asked about need, not preference, for expensive USB cables. This could have been posted more clearly. Need is about functionality. Preference is about whatever floats your boat, including SQ, looks, durability, and reliability. But I am glad that we are talking about both, because if it comes down to bit errors (I think it does), what I first viewed as a preference may actually be a need. I can further elaborate on this by posing a question and then answering it.
If bits are either 1s or 0s, how can USB cables provide a gray scale in SQ?
Bit errors due to insufficient margins.
I believe digital audio has the misfortune of being a real-time application that uses designs intended for non-real-time applications. In real-time applications, there is one chance to get it right. Large margins are specified for all component designs in order to prevent bit errors. Typical USB applications (data transfer) are not real-time. They do not require large margins because bit errors are acceptable. You have the luxury to either retransmit until the received packet is error-free or you apply forward error correction to fix the error. We do not have this same luxury in audio. While music will flow with bit errors, those bit errors distort the analog wave shape (lower SQ).
Of course, isolation is also an issue. Al has an excellent post above that discusses isolation. But in this case, the cable solution is a bandage not the issue. Therefore, while I fully agree with you Al, I left this for another time.
My guess is that one USB cable may provide better SQ over another if it has better analog properties that prevent eye closure and transmit fewer errors. Having said that, what is good enough? Is it the cable that costs another $50 or $500? I think the cable industry intentionally does us a disservice of keeping us in the dark in order to sell into that ignorance. The test would be simple: Pick a nominal setup and test for data rate vs bit error rate. Then you pick an acceptable bit error rate (let's say your typical song is 4 minutes so you pick BER < 1 per 5 minutes) and select the least expensive cable that does not exceed that bit error rate.
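The suggested acceptance test is easy to put numbers on. A sketch assuming Red Book (CD) rates, which are my assumption; the 5-minute window is from the post above:

```python
# Target BER for "fewer than 1 bit error per 5 minutes" of CD audio.
# Red Book rates (stereo, 16-bit, 44.1 kHz) are assumed for illustration.
bits_per_second = 2 * 16 * 44_100     # 1,411,200 bit/s of audio data
window_seconds = 5 * 60               # the 5-minute window from the post
bits_in_window = bits_per_second * window_seconds

max_ber = 1 / bits_in_window
print(f"bits in window: {bits_in_window:,}")   # 423,360,000
print(f"target BER < {max_ber:.1e}")           # 2.4e-09
```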
Axle, an asynchronous DAC controls the timing. Data coming from the computer is placed in a buffer. Every frame, the computer adjusts the number of samples in the frame based on a buffer under/overflow signal from the DAC. The DAC takes data from the buffer and writes it into the D/A converter using a stable internal clock. Because of that, cable jitter does not even apply here. It is possible, though, that ambient or computer electrical noise can enter the DAC through the cable. USB cables carry power that is not needed and can be a source of such contamination. Ethernet is pretty much the same story: data comes without timing, in packets, so the cable should not matter, but people have reported improvement when moving to better shielded cables. I suspect the same thing takes place: the cable picks up ambient electrical noise and injects it into the DAC, affecting the internal clock and thus jitter. Jitter converts to noise in the frequency domain.
I don't have a USB DAC, so my observations are only theoretical. I assumed that the DAC is asynchronous. Synchronous DACs, where the computer controls the timing, are supposed to be pretty bad, since computer clocks are very jittery.
The main problem is jitter. Clock jitter is a form of signal modulation (similar to FSK) that produces sidebands. These sidebands are at a very low level but are also very audible, since they are not harmonically related to the root frequency. With many frequencies (music), jitter produces a lot of sidebands, resulting in noise that is proportional to the signal level and hence undetectable without signal.
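The sideband effect can be shown with a toy simulation (a sketch, not a model of any DAC): phase-modulating a 1 kHz tone with a small 100 Hz "jitter" term produces sidebands at 900 Hz and 1100 Hz, at roughly half the modulation depth relative to the carrier. All numbers here are invented for illustration.

```python
import math

# Phase-modulate a 1 kHz carrier with 100 Hz sinusoidal "jitter".
fs, n = 48_000, 48_000                 # sample rate, 1 second of samples
f0, fm, beta = 1000.0, 100.0, 0.02     # carrier, jitter rate, depth (rad)

x = [math.sin(2*math.pi*f0*t/fs + beta*math.sin(2*math.pi*fm*t/fs))
     for t in range(n)]

def tone_level(signal, f):
    """Correlate against cos/sin at f to estimate the amplitude there."""
    m = len(signal)
    re = sum(s * math.cos(2*math.pi*f*t/fs) for t, s in enumerate(signal))
    im = sum(s * math.sin(2*math.pi*f*t/fs) for t, s in enumerate(signal))
    return 2 * math.hypot(re, im) / m

carrier = tone_level(x, 1000.0)
sideband = tone_level(x, 1100.0)       # upper sideband at f0 + fm
print(f"sideband relative to carrier: "
      f"{20*math.log10(sideband/carrier):.0f} dB")   # about -40 dB
```

Note the sideband sits at 1100 Hz, which is not a harmonic of 1000 Hz, which is the post's point about why such products are audible despite their low level.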
"My OP asked about need not preference for expensive USB cables. This could have been posted more clearly. Need is about functionality."
"Having said that, what is good enough? Is it the cable that costs another $50 or $500? I think the cable industry intentionally does us a disservice of keeping us in the dark in order to sell into that ignorance."
If you're just concerned about the function of USB, and things like maximum cable lengths before the signal starts to degrade, there really isn't any industry conspiracy. USB specs are published like any other format. Now if a private company wants to go out and make high-end, expensive USB cables, who's to stop them? And why would someone stop them? (I'm assuming the people making these high-end cables are not breaking any laws, such as copyright infringement or a similar type of offense.) So, as long as you stick to whatever standards the format requires, you shouldn't have any problems from a functionality standpoint, regardless of cost. Even a cheap cable should be fine. I found this website for you to look at. It's a business that sells cables, but they go over the requirements for the different versions of USB cables (USB 1, USB 2, USB 3, etc.).
Thanks Sd542. I was originally thinking functionality only because bit errors should be zero in the audio range. But digging deeper, I realize that may not be true. Some USB cables may result in fewer errors than others. In which case, it's not only about functionality but also about SQ.
Cable manufacturers aren't doing anything illegal. But I would call it unethical to mislead the customer for profit.
Hi Kijanki, I agree with everything you said except for this one thing: "...jitter does not even apply here".
Just because the source clock isn't used inside the DAC doesn't mean that it has no effect. The DAC clocks data only after it reaches the DAC. The source clocks data up to that point. Source clock jitter combined with cable jitter may err a bit before it reaches the DAC.
For short cable runs, bit errors induced by the cable are unlikely. For long cable runs, bit errors are inevitable. Therefore, the longer the cable and more marginal the source, the more the cable matters for both function and SQ.
Of course, as you mention, a cable may also help SQ by isolating the signal from various noises.
I have two extra pair of Wireworld Starlight and one Pangea AG USB cable. I cannot tell the difference between these and the basic monoprice cables except that the more expensive ones seem more sturdy. Sound-wise, not a whit of difference. I wish I were able to so I didn't feel dumb spending money for what is basically a give away item.
Here is a link showing a good USB eye vs bad eye.
Here is a USB chip vendor's failing eye diagram.
One eye is open. The other two eyes are closed without a long cable. The good eye doesn't need any help. Any cable will do. The bad eyes need all the help they can get. The cable is critical. Reality is that you aren't searching for that magic touch (whatever that means). You are searching for the cable that does the least harm. Silver will 'sound' better because it is faster. But silver won't sound any better for the open eye.
Lse, thanks for the info, though sorry for your situation, having spent the extra money with no difference in sound. You have helped me, as I will not spend the extra because I already have the Monoprice USB. Also, FWIW, I think it sounds terrific, and I know there is a difference between cables, as I have a cable (unmarked) that is far better than the generic USB I used to use (more body, fuller bass). The Monoprice has terrific bass and smooth, detailed highs.
Axle, a cable bad enough to corrupt bits would be a disaster, since each frame contains a checksum and would be dropped. As I understand it, each frame is delivered every millisecond and starts with a unique bit sequence signifying the start of the frame (SOF), followed by the music samples (about 2x44 for Redbook playback) and then the checksum. When even one bit is wrong, the checksum won't match and the DAC will drop the whole frame (2x44 music samples), since it does not resend frames. On the other hand, I see the possibility of a vendor-specific design that resends frames, eliminating bit errors.
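The drop-on-bad-checksum behavior can be sketched in a few lines. This is a toy model: the framing is invented, and CRC32 stands in for the CRC16 that USB packets actually use.

```python
import zlib

# Toy frame: sample bytes followed by a 4-byte CRC32 (invented framing).
def make_frame(samples):
    return samples + zlib.crc32(samples).to_bytes(4, "big")

def receive_frame(frame):
    samples, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(samples) != crc:
        return None                  # no resend: the whole frame is dropped
    return samples

good = make_frame(b"\x01\x02\x03\x04")
bad = bytes([good[0] ^ 0x80]) + good[1:]   # one flipped bit in the samples

print(receive_frame(good))   # b'\x01\x02\x03\x04'
print(receive_frame(bad))    # None
```

One flipped bit costs the receiver the entire frame's worth of samples, which is why a cable marginal enough to corrupt bits would be audible as dropouts rather than subtle coloration.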
Kijanki, your point about checksums is interesting. Let's say that a packet is dropped. What would take the place of the dropped packet? Would the contents of the previous packet be duplicated? Dropped packets are problematic regardless of how they are managed. The only good solution is to resend the packet.
With a buffer large enough for several minutes of audio, there would be ample time to resend packets. Does USB audio comply fully with the USB standard specifications? The answer to that question would go a long way. Thanks.
Axle, I'm assuming that without resending there would be a 1 ms gap. Not bad if it happens once, but it can sound pretty bad if frames are dropped often. I found some info on an interesting computer audio FAQ site:
Thanks for the link. I like it.
There are four transfer modes: control, interrupt, bulk, and isochronous. It seems that bulk transfer (as long as the buffer is large) is the way to go.
Interrupt and bulk data transfers conclude with a handshake packet to provide confirmation that the data was received, or request that it be re-sent if it was not. Delivery of this data is therefore guaranteed, even if the time taken to deliver it is not.
But there is no guaranteed access to the bus in bulk transfer mode. If you want guaranteed access to the bus, then you must use isochronous mode.
"With isochronous data it is not possible to retry a failed transaction. Since only one slot is allocated to the pipe during each frame, resending the data would delay transmission of the succeeding data samples, upsetting the time element of the data delivery. Consequently no handshake packet is sent and the data must be accepted as is."
Bulk mode can be used for a dedicated music server because the bus is free. But isochronous mode is required for computers. Question is, do you have a choice? If not, you have to accept the default mode of the DAC, which is likely isochronous.
The answer to the question "are bit errors recovered?" (and therefore "do cables matter?") is: it depends on the transfer mode.
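The contrast between the two modes can be sketched as follows. This is a toy model, not a USB implementation: the error injection, retry count, and function names are all invented for illustration.

```python
import random

# Simulated link that loses a packet with some probability (invented).
def send(packet, error_rate):
    return None if random.random() < error_rate else packet

def bulk_transfer(packet, error_rate, max_tries=10):
    for _ in range(max_tries):          # failed handshake -> retry
        received = send(packet, error_rate)
        if received is not None:
            return received             # delivery guaranteed, timing is not
    raise IOError("bus never delivered the packet")

def isochronous_transfer(packet, error_rate):
    return send(packet, error_rate)     # no handshake, no retry

random.seed(0)
print(bulk_transfer(b"samples", error_rate=0.5))         # b'samples'
print(isochronous_transfer(b"samples", error_rate=0.5))  # b'samples' or None
```

Bulk mode trades bounded delivery time for guaranteed delivery; isochronous mode trades the reverse, which is why bit errors are unrecoverable there.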
Axle, Bulk Transfer looks good, but I wouldn't mind asynchronous with resend. As long as all the data gets to the buffer, jitter doesn't matter, since the buffer removes the timing. New timing is recreated with a stable internal clock. The problem is still computer noise coupled through the USB cable (to the DAC, or capacitively to other cables) or radiated by it. Because of the cable length limitation (5 m), the computer has to be close, and that is undesirable. I moved my computer across the room to an outlet on a different phase and plugged all my audio gear into a power conditioner (with filtering). I keep my cables as short as possible. ICs are 0.5 m XLR. Music is sent as data in packets over wireless. This data does not carry timing. It fills a large (few seconds) buffer in my Airport Express. The Airport Express is only decent (258 ps p-p) on jitter, but in combination with the jitter-suppressing Benchmark DAC1 it produces sound that is amazingly clean. With other DACs (like NOS), a reclocker could be used.
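The "buffer removes the timing" point can be illustrated with a toy model (invented numbers, not a model of any actual DAC): samples arrive with jittery source timing, but the DAC clocks them out of its buffer at a fixed rate, so output timing is independent of arrival timing.

```python
from collections import deque
import random

random.seed(1)
buffer = deque()

# Samples arrive at irregular, jittered intervals (nominal spacing 1.0)...
t = 0.0
arrival_times = []
for sample in range(8):
    t += 1.0 + random.uniform(-0.3, 0.3)
    arrival_times.append(round(t, 3))
    buffer.append(sample)

# ...but the DAC's internal clock reads the buffer at exactly 1.0 per sample.
playback_times = [10.0 + i * 1.0 for i in range(len(buffer))]

print("arrival: ", arrival_times)    # irregular spacing
print("playback:", playback_times)   # perfectly uniform spacing
```

As long as the buffer never runs empty, the output spacing depends only on the DAC's own clock, which is why only that clock's jitter matters.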
I am using the IFI usb power supply. That alone made a big difference. I then purchased a pure silver dual headed usb cable on Audiogon, (I forget the name but it is located outside the USA).
For the other connections, I purchased short runs of the Pangea pure silver AG usb cables.
It all mattered, and at each step the sound did improve with a purer sound.
I know the Zero's and Ones should all be the same, but these do sound different.
Kijanki, I like the Airport Express implementation. I tried AE with both optical and analog outputs. I'm pretty sure that my Meridian G68 processor uses a FIFO buffer with synchronous clocking. Therefore, the AE jitter is transferred. In your case, the clocking is asynchronous and jitter transfer is zero. But even in your case, AE noise can enter the DAC through the analog (but not the optical) cable.
Getting back to my OP about functionality vs SQ. I just received an email from George at ExaSound. Working with George is a pleasure. ExaSound DACs use error correction. Therefore, if it is functioning, it is optimal SQ. This is how it should be. DACs that don't implement error correction and use isochronous transfer mode, have to live with whatever transmission they receive. This is not how it should be. Hence my OP about functionality vs SQ. There are some gray areas, even for digital.
Noise is an additional factor. If a cable introduces source noise or EMI/RFI to the DAC, then SQ could be significantly impacted. However, getting from a bad cable to a good one in this respect is neither a steep nor an expensive proposition.
Optical is a good choice. Locate the source far away from the DAC, use an optical cable, and you have no noise. Use an asynchronous DAC with error correction and you have no jitter and no errors.
No noise, no jitter, no errors! What more could you ask for? Too bad optical is SPDIF. I guess there is one more thing to ask for: optical USB.
"No noise, no jitter, no errors! What more could you ask for?" While those are good goals to theoretically try to achieve, I'm sorry to say that in the real world you will never completely eliminate noise, jitter, or errors. You can minimize them by good engineering design. Sorry to be the bearer of bad news, but don't shoot the messenger!
Bill, I hear you. But I am optimistic that we are getting to that point where they won't matter.
Optical has dispersion and the O/E converter generates noise. Therefore, you are technically correct that you can't eliminate noise altogether. But optical completely isolates noise from the source, as well as EMI and RFI that are otherwise introduced through an electrical cable.
All digital devices generate their own internal jitter, including asynchronous DACs. Therefore, you are technically correct that you can't eliminate jitter altogether. But you can completely isolate source jitter.
Bit errors will always exist. But they can be corrected.
In summary, we have ways to correct bit errors 100%, to completely isolate source noise, and to render jitter irrelevant. What we don't have is all three in one design ... yet.
Of course, this is limited to the DAC. Even if the DAC signal is pristine it can be contaminated by EMI/RFI upon leaving the DAC.
But that's a different world we are talking about.