Jitter and 75ohm cable length

I have read a number of papers on how cable length plays a role in jitter between transport and DAC. After all of the dust settled I arrived at no sound conclusion, on paper, so I decided to use the ears of my 17 year old budding audiophile to settle this by LISTENING! My transport is a Wadia 171i (WAV/lossless files) and my DAC is a Cambridge Azur 840C. I had three cables in my test: my 1M Kimber D-60 Illuminations, a 3' HAVE/Canare and a 6' HAVE/Canare. All three cables sounded good, but in the end the victory landed on the 3' HAVE/Canare by a fair margin, followed by the Kimber and last the 6' HAVE/Canare. In my readings I came across a number of articles saying you should use at least 1.5M of cable to reduce reflections in the cable so as to not harm the clock signal, yet an RF engineer said this was a bunch of "Bunk" and 1M would be better, in fact he said the shorter the better. So, forgive my verbosity, what are your thoughts and experience in this area? My 3' $25 HAVE/Canare beat up my $390 Kimber, I believe due to proper, honest 75ohm terminations vs standard RCA connectors, and as far as length goes, at least in my system, 3' was by far the best. Thanks!
To really tell, you would have to have the same cable in each length. I have not found any particular length to sound better; I have tried .5M, .75M, 1M, 1.2M, 1.5M and 2M, but these were not all the same cable. The best one out of the 8 or so I currently have is a Clearer Audio pure silver with silver NextGen WBT RCAs; it is .5M. Would it sound better at 1M or 1.5M? Your guess is as good as mine. It sounds very good as it is.
My experiment was via two Kimber Orchids (.5m and 1.5m). There was no contest, as the longer cable was much more open and liquid.
I am using a pure silver Neotech coax which is 2 feet long and sounds great. Not sure if 4 feet would sound better or not, suppose I could convince myself of almost anything....
I compared a 3' and 6' length of the same brand of digital coax and liked the 3' better. The 6' was darker (mellower?) and might have been my choice if my system were on the bright side. They were extra cables I had lying around, and I was experimenting with the iPod/dock/DAC interface. They are copper and I ultimately went back to my 3' silver cable.

Like the fact that you used what you heard and not what you read to decide.
I came across a number of articles saying you should use at least 1.5M of cable to reduce reflections in the cable so as to not harm the clock signal, yet an RF engineer said this was a bunch of "Bunk" and 1M would be better, in fact he said the shorter the better.
I am an EE with multiple decades of experience designing high speed digital circuits, and also significant experience designing RF circuits (none of it for audio). I say it is definitely not "bunk." However, there are several factors that introduce a degree of system dependency and unpredictability into the issue, so 1.5 meters should be taken as a general guideline, which will not always be optimal. And in some cases the length won't make any difference.

Also, if a very short length is practicable, say 6 to 12 inches, that should be at least as good or even better. It is the intermediate lengths that are the concern.

The basis of the length concern is not to "reduce reflections." The magnitude of the reflections is determined mainly by the closeness of the impedance match between the cable and its connectors, the input impedance of the dac, and the output impedance of the transport.

The point of optimizing length involves the TIMING of the reflections - more precisely, the arrival time at the dac input of reflections of the original signal that have re-reflected from the transport output. What needs to be avoided is a re-reflection arriving at the dac input at a time that coincides with the mid-point area of the risetimes and falltimes of the original signal, which is where clocking occurs. If that were to occur, the resulting waveform distortion would be likely to cause a significant increase in jitter. All of that timing is directly dependent on the length of the cable.

Therefore the lengths that should be avoided are dependent on the risetime and falltime of the output signal of the transport, which are normally unspecified, and can be expected to vary significantly among different transports. This paper by Steve Nugent, which you've probably seen, is based on the assumption that those risetimes and falltimes are around 25 nanoseconds. I assume that is a good rule of thumb, but I would not expect it to be precisely consistent across different makes and models.

Also, the amount of time required for the signal to propagate from one end of the cable to the other will vary among different cable designs, because propagation velocity is dependent on the dielectric constant of the particular cable.
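To make the timing concrete, here is a minimal sketch assuming a 25 ns risetime and a propagation velocity of 0.66c. Both numbers are assumptions chosen for illustration, not figures for any specific transport or cable:

```python
# Rough sketch: when does a re-reflection land relative to the edge?
# Assumed numbers (not from any specific equipment): 25 ns risetime,
# propagation velocity of 0.66c (typical for solid-dielectric coax).

C = 3e8                 # speed of light, m/s
velocity = 0.66 * C     # assumed propagation velocity in the cable
risetime_ns = 25.0      # assumed transport output risetime

def round_trip_ns(length_m):
    """Time for a reflection to travel dac -> transport -> dac."""
    return 2 * length_m / velocity * 1e9

# Worst case is a re-reflection arriving near the midpoint of the
# edge, i.e. about risetime/2 after the original edge arrives.
worst_length_m = (risetime_ns / 2) * 1e-9 * velocity / 2

for L in (0.15, 0.5, 1.0, 1.5, 2.0):
    print(f"{L} m: re-reflection arrives {round_trip_ns(L):.1f} ns after the edge")
print(f"worst-case length ≈ {worst_length_m:.2f} m")
```

Under these assumptions the most harmful single length is around 1.2 m: very short runs put the re-reflection right at the start of the edge, while 1.5 m or more puts it safely past the midpoint where clocking occurs.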

Also, jitter can be contributed to by noise caused by ground loop effects between the transport and dac, which, if present to a significant degree, can be expected to worsen as cable length is increased. Conceivably that effect could outweigh the timing consideration in many setups.

Also, different dac designs differ widely in their jitter suppression capability, with some of them (such as the Benchmark) being nearly completely immune to jitter on the incoming signal.

Finally, if the degree of mismatch among all of the impedances that are involved is insignificant, the whole issue becomes moot.

So as I see it the recommendation to use either a very short length, or a length of 1.5 meters or a little longer, has a sound technical basis, and while not always applicable, can be expected to be applicable more often than not.

-- Al
A few examples of "in practice" don't discredit the validity of the "in theory," but this is why the theory should only be used as a guide.
I should note that I'm referring to a sound (pun alert) technical theory and not just a best guess.
Back again with a twist. I repeated my experiment with the three cables, however I introduced a "reference", by which I mean my CDP. To keep things on a fair playing field, I made a CD-R copy of a decent recording I own, Jazz at the Pawnshop, then copied the same original to my iPod in WAV format. This time I used my ears and listened at length to each cable, comparing the Wadia 171 iDock playback to playback of the copied disc on the CDP itself. First, it was quite eye opening how different the HAVE/Canare 3' cable compared to the 6' of the exact same wire; something is really going on here. Bottom line, comparing the sound of each cable to the CDP, the 3' HAVE/Canare sounded almost exactly like the CDP, with the exception of slightly more bass from the Wadia/iPod, and, I need to be careful here, the Wadia/iPod may actually have sounded slightly better than the copied CD. So, the 3' HAVE/Canare proved to produce the best sound to my ears, with the 6' version and Kimber D-60 trailing far behind. I wonder if it could be possible to have the D-60 reterminated with true 75ohm connectors, but for now the front seat goes to the $25 HAVE/Canare. Thanks! Looking forward to your thoughts!
Clever experiment, taking advantage of the fact that the CDP has digital inputs!

I have no knowledge as to how practical it might be to have the Kimber cable re-terminated with Canares, but why bother? If the 3' HAVE/Canare sounds just about identical to playback via the CDP's built-in transport (with no cable and no S/PDIF interface involved), that would seem likely to be as good as it gets!

-- Al
The best 75ohm cable I've ever heard has been a Moray James digital cable, and this designer swears by a 1.5m length. It clearly beat my half-meter Canare and Kimber Illuminati cables....in MY system. Here's the twist: the MJ cable was voiced with the same type of Meitner DAC I owned. I suspect part of the magic has to do with system synergy. However, I still use the Moray James cable for the streaming audio feed to my PS Audio PerfectWave DAC. It has the best bass and soundstage I've heard and recently beat an XLR Harmonic Tech Platinum digital cable. Your ears will guide you to the right destination. I believe finding the right cable/link is key to good digital sound reproduction.
Notes on the 840c

First, it has both balanced and unbalanced outputs. The balanced outputs make for quite an improvement. Choice of cable? Gigantic, as you well know. I'm not a big cable experimenter, but the 'as issued' unbalanced went to the trash and I bought some Mogami balanced.

Secondly, the digital inputs of the CA 840C are somewhat MORE prone to jitter effects from the source than many other players/DACs. To that end, CA has issued several software revisions, none of which they'll send you anymore, wanting to (either or both) keep people from bricking their players OR drive business back to the dealer. The last revision I had required XP, and the computer MUST have a serial port and you need a null-modem cable.
I was unable to get an Apple AirportExpress to play properly with the toslink input of the player. Simply too much jitter.

I'd recommend looking at the software rev of the player and examining the downstream cabling used, since that won't be a trivial effect.
Thank you Al and Vhiner. Canare has its own set of cable and terminations, whereas HAVE seems to use their choice of cable with the Canare terminations, hence HAVE/Canare. Specifically, HAVE uses Gepco International VSD2001 High Definition 75ohm Serial Digital Coax (this is printed on the cable). I'm not sure if Canare uses the same cable, but what I can say, since everybody has system dependent variables, is that in the end I agree with you (Vhiner) completely: follow your ears and don't drink by the label. As far as my 3' HAVE/Canare, it's the best cable I can not hear, and that sums it up for me! Thanks for the feedback!
Magfan, I had to send in my unit once for repair about a year or so ago when the transport had problems loading a disc. When I received the unit back they provided a note saying that in addition to the repair they had upgraded some power supply caps so the unit, as they put it, "reduced the in-your-face presentation" (whatever the heck that is supposed to mean). No mention was made of a firmware upgrade, but I am suspicious they did upgrade the firmware and wanted to keep it low-key. My current version is 01/069/1.1; is this the latest firmware you are aware of? I have all of the facilities to upgrade but have never received information that one was available. I did call CA and they told me there was only one upgrade since the CDP was released, and they claimed it did nothing to improve the sonic qualities of the unit; it was specifically targeted at the Apple AirPort Express, the jitter from which was claimed to be so severe at times that the 840C could not get a lock, so through the firmware update they somehow opened the window and made the unit more forgiving of received data. Is this how you understand the issue? Am I on base about this, or had I been fed a bunch of bunk? Lastly, do you know if the version I have is the latest, and if not, how do you find out and what is it supposed to improve? Thanks!
CA no longer sends out the file. It was 'zipped' and contained a readme and the installer and some other stuff.
It is on my laptop but I've never mustered the guts to DO it. My laptop is Windows 2000 or ME....dual boot. NOT XP, which is what they recommend.

You have more information than I. If they have only issued one software upgrade, then I've got it. I probably also have the 'old' caps.....

To find out you press some front panel buttons on the player......01/67/1.2 is in my player, found by repeatedly pressing the 'menu' button.
I don't know what the latest version is 'supposed' to be.

And yes, I think you are on base about the AE jitter issue. The Stereophile test showed it to be awful. The AE is unclocked, as I understand it. I also make sure my computer is doing nothing else when I stream music via iTunes.
"The Stereophile test showed it to be awful."

Not exactly. The high jitter appears on the analog outputs only. Digital is about 10x better (258ps vs 2400ps), and that's what Stereophile stated:

"The noise floor has dropped by 4–5dB, the word-clock jitter to a respectably low 258ps, which is actually better than the case with the standalone D/A processor driven directly by my PC's S/PDIF output (provided by an RME PCI card).

Considering that the AirPort Express's analog output is basically a freebie function added to a computer Wi-Fi hub, jitter aside, its measured performance is quite good. The beauty of this unassuming component, however, is its S/PDIF data output, which allows the AirPort Express to assume a respectable role in a true high-end audio system."
Kijanki, if one suspects they have a jitter problem with their system, how might it manifest its impact on the analog output? I had read an article indicating that high jitter would make vocals sound overly warm, which actually may be desirable? Is there anything else you can add about the impact on sound reproduction? Thanks!


Jitter is basically noise in the time domain. Applied to one frequency it creates sidebands at very low levels. In spite of the low levels (less than -65dB) the sidebands are audible, being not harmonically related to the root frequency. Now, take a whole bunch of frequencies (music) and you'll get a whole bunch of other frequencies at very low level - basically noise. The amplitude of this noise is directly proportional to the amplitude of the music, and without music (a gap) it is zero - therefore undetectable.
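The sideband mechanism described above can be sketched numerically. All numbers below are assumptions chosen only for visibility: a 10 kHz tone whose sample instants wobble sinusoidally at 3 kHz, which is phase modulation and therefore produces energy at 10 ± 3 kHz, not at harmonics:

```python
import numpy as np

# Illustrative sketch of jitter sidebands (assumed numbers throughout).
fs = 192_000                  # sample rate, Hz
n = 1 << 16                   # number of samples / FFT size
t = np.arange(n) / fs

f_tone = 10_000.0             # the "root" frequency
f_jit = 3_000.0               # jitter (wobble) frequency
jit_peak_s = 10e-9            # 10 ns peak jitter, deliberately large

# Jittered sampling instants == phase-modulated tone
phase = 2 * np.pi * f_tone * (t + jit_peak_s * np.sin(2 * np.pi * f_jit * t))
x = np.sin(phase)

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)

def level_dbc(f):
    """Spectrum level at frequency f, relative to the carrier peak."""
    return 20 * np.log10(spec[np.argmin(np.abs(freqs - f))] / spec.max())

for f in (f_tone - f_jit, f_tone, f_tone + f_jit):
    print(f"{f/1e3:5.1f} kHz: {level_dbc(f):6.1f} dBc")
```

With these assumed numbers the sidebands land near -70 dBc, in the ballpark of the "less than -65dB" levels mentioned above; smaller jitter pushes them lower, but they are never harmonically related to the tone.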

My first impression of the Benchmark DAC1, which suppresses jitter, was that the sound was too clean (some people call it sterile or analytical). I had the impression that some instruments had to be missing from the recording. I also understand that a noisier (or distorted) signal sounds more lively, the way a distorted guitar sounds more dynamic than a clean jazz guitar at the same volume. Other than that the sound is on the neutral side - I would not attribute any warmth or lack of it to jitter. Imaging is more focused but perhaps a little narrower.

I don't see warmth as a desirable quality. Benchmark technical director John Siau said that overly warm gear can negatively affect the sound of instruments with complex harmonic structure, like piano, making them sound almost out of tune. On the other hand, cold sounding (expanded odd harmonics) gear is much worse. I had a problem with brightness until I replaced my speakers, which had aluminum dome tweeters. The new soft dome Hyperion HPS-938 are wonderful - neutral and never bright on any CD. Sibilants are still very audible but always clean and natural.

I think that the main difference between oversampling or upsampling DACs and NOS DACs is not the sampling itself but the filtering. Traditional linear filtering adds pre-echo to an impulse. Our hearing is very sensitive to it, and getting rid of the filter altogether (NOS) or using an apodizing filter (extending the post-echo) might be a good thing. Here is some info on the subject:

I don't care about any jitter on the analogue out.......It is jitter on the digital signal which renders the 840c unusable. OR, is the 840c simply sensitive to even that moderate level of jitter....?

From the Stereophile test - is this analogue, not digital?
"The AirPort Express stumbled when it came to its measured jitter performance—hardly surprising, considering it has to derive its 44.1kHz word clock from an asynchronous, probably encrypted datastream"

Now, with the above stated jitter at 258ps, I'm back to ground zero as to why my 840c doesn't like the digital out of the AE.

other questions....
Would the ATV (Apple TV) be any better? Would the DacMagic be prone to the same problem as the 840?
If one suspects they have a jitter problem with their system, how might it manifest its impact on the analog output?
I second Kijanki's excellent comments, and I would add the following thoughts, which I composed before seeing his response:

IMO it's very unpredictable, and the symptoms will vary widely depending on the spectral characteristics of the jitter, which can be expected to be a very complex mix of discrete frequency components and broadband noise-like components. Some of those components will be correlated with the values of the 1's and 0's defining the music data (which is different than being correlated with the music itself), and some will not be.

Perhaps essentially all that can be said is that there will be a loss of clarity, and an increase in distortion.

The following papers may be helpful. The second one, although highly technical, conveys a sense of how complex it all is, and by implication (as I see it) that the effects of digital cables and digital interfaces should not be thought of in the same kinds of ways that we use in describing the effects of audio frequency analog cables, e.g., "overly warm" (notwithstanding the fact that a given cable may create that perception in a given specific system):



-- Al
Let me add to Al's great post. As he stated, a typical transport has transition times on the order of 25ns. The threshold most likely resides at half of that - 12.5ns - while the impedance boundary that causes the reflection will reside at the other end of the cable. The sharpest slew rate change, which causes the reflection, is usually at the very beginning (the knee). From that point the signal travels forth and back (the reflection) over a distance of 2 x 1.5m = 3m at about 60-70% of the speed of light - let's say 0.2m/ns. The reflection will return in 15ns, missing the threshold point time-wise. I would use a 1.5m-2m length, or less than a foot, where the transmission line effect is non-existent yet.

The rule of thumb says that we're dealing with a transmission line when the transition time is less than 8 times the propagation delay (one way). That would imply that for a typical 25ns transport the digital cable becomes a transmission line when the propagation is longer than about 3ns, equal to about 0.6m. It sounds strange, but a good cable should be very short, or 1.5m-2m.

When transitions are slow we don't have much of a reflection-induced jitter problem, but rather noise-induced jitter (noise affecting the threshold point). When the transport has fast transitions, noise-induced jitter is reduced, but reflection-induced jitter is dominant, requiring a very good cable. Long cables in addition pick up more noise, so the whole thing becomes system dependent. The same cable might sound great with one system but not so great with another. Many people report better results with Toslink, in spite of its slow transitions, perhaps because of a noisy environment or the ground loops that coax might create.
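As a quick numeric check of the figures in the post above (the 25 ns transition time and 0.2 m/ns propagation velocity are the assumed values from the text):

```python
# Sanity-checking the rule-of-thumb numbers above.
# Assumed values, taken from the post: 25 ns transition time,
# propagation velocity of 0.2 m/ns (about 2/3 the speed of light).

transition_ns = 25.0
v_m_per_ns = 0.2

# Transmission-line behavior matters once the one-way propagation
# delay exceeds 1/8 of the transition time.
critical_delay_ns = transition_ns / 8
critical_length_m = critical_delay_ns * v_m_per_ns

# Round-trip time of a reflection on a 1.5 m cable, to compare
# against the 12.5 ns threshold (mid-transition) point.
round_trip_ns = 2 * 1.5 / v_m_per_ns

print(f"transmission-line threshold length ≈ {critical_length_m:.2f} m")
print(f"1.5 m round trip = {round_trip_ns:.0f} ns (threshold at 12.5 ns)")
```

With these assumed numbers the reflection on a 1.5 m cable returns at 15 ns, safely after the 12.5 ns mid-transition point, and the transmission-line regime starts at roughly 0.6 m - matching the figures in the post.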

Magfan, I commented only on Stereophile's findings. The AE sounds very clean in my system, but I use it with a Benchmark DAC1, which is jitter suppressing. It is possible that your AE is bad, but it is also possible that the one sent for evaluation to Stereophile was extremely good (selected?). It is even possible that Stereophile measured wrong - who knows.
Apparently, the AE/840c jitter 'thing' was a known issue. CA issued a software update, which I've got here.......somewhere, but have been too Chicken to install.
It uses a null-modem cable and the requested OS is Windows XP, which my laptop doesn't have. The procedure sounds simple, but then again, we're dealing with confusers here.
I'm sure that CA would be thrilled if I 'bricked' my player. Also, CA no longer sends out the update to individuals, or so I've heard. That COULD be because too many people failed in the update and were really....angry.
I like the AE, even the analogue output. Using my iPod Touch as a remote is just icing on the cake.
You write about jitter induced at the impedance boundary, but what about a simply poorly clocked system? Shouldn't that be added in to the total system jitter? Or perhaps multiplied? If you are sending a poorly clocked signal into cabling with reflections, it sounds like you are compounding the problem.....
Magfan, the CD data stream is asynchronous. It is also jittery because of less than perfect CD pressing and reading, plus the quality of the transport and system noise. What is needed to reduce jitter is either to create a stable clock for the D/A converter based on the average datastream rate, locking the two with a PLL (Phase Locked Loop) - the solution used in most CDPs - or to ignore the datastream rate completely and reclock it with a fixed stable clock in an asynchronous sample rate converter (Benchmark DAC1).

We can add to this the jitter introduced in the A/D process, which cannot be removed no matter what you do. At the very beginning a lot of analog recordings got digitized with less than perfect (jittery) clocks, and the only way out is to digitize them again, if the analog master tapes still exist.

The cable length of 0.6m without transmission line effects that I calculated applies to a 25ns transition time, assuming that the driver delivers a constant slew rate. There are drivers that do that, but very often the leading "knee" has a higher slew rate. Because of that I would perhaps limit such a cable to half of that (0.3m). Above that, careful matching of the characteristic impedance is recommended. This characteristic impedance has a very strange definition: it is the impedance of an infinite cable, or of a finite cable terminated with its own characteristic impedance - which sounds a little like a Catch-22. For all practical purposes it is simply SQRT(L/C), implying a particular geometry.
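For illustration, plugging typical distributed constants into SQRT(L/C). The nH/m and pF/m figures below are ballpark values for an RG-59-style 75 ohm coax, not taken from any datasheet:

```python
import math

# Z0 = sqrt(L/C) for a lossless line. Assumed ballpark distributed
# constants for an RG-59-style 75 ohm coax: ~377 nH/m and ~67 pF/m.
L_per_m = 377e-9   # henries per meter (assumed)
C_per_m = 67e-12   # farads per meter (assumed)

Z0 = math.sqrt(L_per_m / C_per_m)
print(f"Z0 ≈ {Z0:.1f} ohms")

# Note: length appears nowhere here -- characteristic impedance is set
# by the cable's geometry and dielectric, not by how long it is.
```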

I'm not sure what happens with balanced cables. The impedance is 110 ohms and the voltage levels are much higher, but at the same time the slew rate is likely higher, with reflection-induced jitter taking over from noise-induced jitter. Maybe Al can help here?
I'm not sure what happens with balanced cables. The impedance is 110 ohms and the voltage levels are much higher, but at the same time the slew rate is likely higher, with reflection-induced jitter taking over from noise-induced jitter. Maybe Al can help here?
I'm not sure either, mainly because I don't have any specific knowledge of what risetimes/falltimes/slew rates tend to be for typical AES/EBU outputs. My suspicion is the same as yours, though, that those parameters are likely to be faster than for typical S/PDIF outputs.

Also, while on the one hand the higher AES/EBU voltage levels (assuming the particular equipment in fact conforms to the AES/EBU voltage standards) and the balanced operation would seem likely to help with respect to noise-induced jitter, on the other hand I would expect that in many or most cases balanced digital cables will provide less accurate control of characteristic impedance than a good 75 ohm coax will typically provide.

Excellent elaboration in your posts, btw, on the distinction between noise-induced jitter and reflection-induced jitter, and the competing tradeoffs that result.

Best regards,
-- Al
Thank you Al, and thanks for the links, especially the second one. I suspect that balanced cable might be superior in noise rejection simply because of common mode rejection on the receiver side, but also because the wires are twisted, which might be superior to any shielding. Twisting the wires exposes them evenly to interference (capacitive or electromagnetic), leading to cancellation, while shielding has serious limitations. I mentioned it before, but non-magnetic shielding does not protect against EMI (magnetic in nature); fortunately the induced noise travels (to ground) on the outside of the cable - the shield - because of skin effect. This beneficial skin effect is good at high frequencies but less than perfect at the lower frequencies where the cable is still long enough to become an effective antenna (an antenna is practically ineffective below 1/10 of the wavelength). Understanding this should lead to understanding that the shortest cable is the best cable. The statement that a digital cable should be at least 1.5m is not complete - it should state instead "as short as possible, but not shorter than 1.5m". In ICs (or speaker cables) twice shorter means twice better. Sales people often recommend a 1m IC vs a 0.5m IC because that's what they have in stock - absolutely no other reason.
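The λ/10 antenna rule of thumb in the post above can be turned into numbers. This is a rough sketch; free-space wavelength is assumed for the radiated interference:

```python
# A conductor is a practically ineffective antenna below ~1/10 of the
# wavelength, so each cable length has a frequency above which it
# starts to pick up interference efficiently.

C_LIGHT = 3e8  # m/s

def effective_above_mhz(length_m):
    """Frequency above which a cable of this length exceeds lambda/10."""
    return C_LIGHT / (10 * length_m) / 1e6

for L in (0.5, 1.0, 1.5, 2.0):
    print(f"{L} m cable: antenna-effective above ~{effective_above_mhz(L):.0f} MHz")
```

So halving the cable length doubles the frequency below which it remains an ineffective antenna, which is the sense in which "twice shorter means twice better" for noise pickup.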
Can someone here give me a hand with the math and its implications with respect to two different cable lengths from the exact same manufacturer? Much of the math in this forum is in meters, the typical standard. One of my cables is 3' in length and the other 6', and strangely they sound different. Specifically, the 3' cable sounds very fast, slightly bright (or hyper detailed), where the 6' cable sounds a bit warmer, maybe slightly rolling off the top, or maybe just cleaner at the top. I've talked about these cables earlier, choosing the 3' cable as the champ, but now I'm finding the 6' cable easier to listen to. How can this be explained, and might the 6' route look better both on paper and to the ears? Thanks!
Rpg, 1 meter is about 39.37 inches. And of course 12 inches = 1 foot.

Therefore 3' is about (12 x 3)/39.37 = 0.91 meters, which DOES NOT conform to the length recommendation.

6' is about (12 x 6)/39.37 = 1.83 meters, which DOES conform to the length recommendation.
I've talked about these cables earlier, choosing the 3' cable as the champ, but now I'm finding the 6' cable easier to listen to. How can this be explained, and might the 6' route look better both on paper and to the ears?
Chances are that less jitter is present with the 6' length, but as explained earlier that is not a certainty. Also, as explained in the section entitled "Jitter Correlation to Audibility" in the Steve Nugent paper I linked to a few posts above, depending on its spectral (frequency) characteristics jitter can sometimes be euphonic in character, and/or mask or compensate for inaccuracies elsewhere in the system.

So the bottom line obviously is to try to decide based on listening, but I would say that if the cables sound different but neither can be determined with confidence to be "better," chances are it would be best to go with the longer one.

-- Al
Al, thanks, I appreciate your work in helping me understand this. I think in the end, as I may have mentioned, the 6' cable is easier on the ears. With this cable manufacturer, I can order any length, is there an "ideal" length, or are there too many system dependencies to determine this?

With this cable manufacturer, I can order any length, is there an "ideal" length, or are there too many system dependencies to determine this?
Summarizing all that has been said, I would put it that a length of 1.5 meters (which is virtually the same as 5 feet, to within less than an inch) stands a better chance of being ideal than any other length. But as has been cited there are many system dependencies and cable type dependencies which conceivably could result in other lengths being better in some cases.

In this case, if you decide that the 6' cable you have is preferable to the 3' cable, my suspicion is that changing the 6' cable to a 5' cable of the same type would be unlikely to make a significant difference.

-- Al
Al, thank you very much - most appreciated!
Another twist. I noticed that my web of ICs (all unshielded Kimber silver SE and one balanced) could use some housework, so I spent quite some time rerouting and spacing the cables, resulting in what I would call "neat and tidy." Then it happened: the great sound I had achieved through many learnings on this forum was gone, my transport sounded lousy, and in the end I was back at the beginning! I discovered something about my digital IC: it does not like to be close to any AC lines, or any lines for that matter, and it does not like being forced into a configuration that does not follow its natural form as manufactured. Long story short, I ended up finding my 3' cable slightly outperforming my 6' line because I "think" I was able to dodge the bullets mentioned, even though the cable is double shielded (it was very close). Now I'm wondering, after all of this effort, might a Toslink cable be a better solution? I currently possess only one midgrade Toslink IC and it's in my AV set-up. I have heard that the process of converting the signal to light and then reconverting it back to an electrical signal at the DAC results in low bandwidth problems, but might it be immune to most of the potential problems one can encounter with wire? Interested in your thoughts. I'm heading off to my AV set-up to assess what kind of nightmare I'm looking at in retrieving an Acoustic Research Toslink cable. Thanks all!
In the end, the Toslink was a poor solution compared to any 75ohm cable I tried. I don't want to begin suggesting I can do the math on cable length and its contribution to jitter; I can only finish this by saying that my 36" length of HAVE/Canare 75ohm digital cable with Canare 75ohm terminations resulted in my Wadia 171i sounding as good as my CDP, with only the very slightest of differences, and that is as good, in my opinion, as one can get. Gone are the spinning discs and hello little music server! My thanks to all who have contributed to this thread!
One last Cliff Note to compress this entire thread. The bottom line is, if you're using a 75ohm coax between your transport and DAC, its length should be 12" or less, or at least 1.5M and above, but nothing between these two lengths. I tried everything with the exception of 12" or less. What I found was that my 1.75M cable could not compare in sound quality to my 36" cable from the same manufacturer. Now maybe my system is unique in some way. I would trust the words of those who have contributed to this topic, but in the end, trust your ears!

This is a huge thread lift, but I have searched almost the whole internet for reliable info on whether a 1.5 meter length for a word-clock BNC 75 ohm cable is the way to go to resist any reflections, or whether the shortest one is the better choice, like on a circuit board.
(I am using a dCS Puccini DAC + Puccini Master U-Clock (word clock) + the new dCS Network Bridge streamer)

(The EE at dCS responded "the shorter the better," and I asked whether you would not get reflection and phase lock problems, but his opinion remained that the shorter the word-clock sync BNC cable, the better.)

Please reply and shed some light with your thoughts. 🙏🏻🙏🏻

// Fredrik
Hello Beolab,

I would proceed based on the comment by the dCS person, which pertains to your specific equipment, rather than on the more general guideline of 1.5 meters.  As I said in an earlier post in this thread:
Summarizing all that has been said, I would put it that a length of 1.5 meters ... stands a better chance of being ideal than any other length. But as have been cited there are many system dependencies and cable type dependencies which conceivably could result in other lengths being better in some cases.  
Also, it's worth noting that the impedances of BNC connectors are more accurate than the impedances of RCA connectors.  And I would suspect that the impedances of your dCS equipment that the cable would be connecting are more accurate than the impedances of a lot of other digital equipment.  Both of those factors will minimize the reflection effects which are a key factor in the rationale for 1.5 meters.

-- Al
Thanks for your advice! 

What is your opinion on this interesting cable from Van den Hul?

Have a great Sunday

/ Fredrik
Or another alternative could be to use a true 75 ohm Canare BNC SDI 12 GHz UHD video cable - what are your thoughts about this?


Or would you choose a more specific word clock cable like the Laird with Neutrik BNCs, even if it has worse specs on paper?


Or the old but proven Apogee Wide Eye 75 Ohm Canare Cable: 

Thanks for some guidance before ordering.


Hello Fredrik,

As far as I can tell from the descriptions and specs all of those seem like reasonable choices. However the indication that the Van den Hul cable has a minimum safe bending radius of 70 mm strikes me as a bit worrisome. Also, I’m guessing that it costs a good deal more than the others, while not necessarily providing better performance in your application.

Perhaps a good way to proceed would be to order both of the cables that are sold by B&H, since they cost so little, and determine which one performs best in your system, if in fact there is any difference at all.

Good luck. Regards,
-- Al

Thanks for your support and advice. I think the most superior cable spec-wise is the Canare 12 GHz 4K cable, but the only thing that makes me hesitate is that the cable is optimized for 12 GHz frequencies when it comes to video signals.

I will probably buy all three and see, but I need three of each cable.