Question Effect of USB 3 cable length at high speed

Pimpom

Distinguished
I'm in a situation where, over the next 3-4 weeks, I have to use the rear USB 3 ports of a desktop with various external storage devices. The setup works fine when the devices are directly plugged in to the rear ports. I get up to 1100 MB/s on the USB 3.2 Gen2 ports and >500 MB/s with Gen1.

When I use a USB 3.0 extension cable, however, devices like thumbdrives (up to 250 MB/s) still work normally but SATA SSDs show reduced - sometimes erratic - performance. NVMe drives are barely detected and always hang the computer. This happens with both USB 3 Gen1 and Gen2 ports.

I did the tests with 1.5m (~5 ft) USB 3.0 extension cables as they're the only ones I have. Is a shorter cable likely to do better?
 
Is a shorter cable likely to do better?
It should, since the data would have to travel a shorter distance. The longer the cable, the higher the latency/resistance. Don't take my word for it: source (borrow, not buy) reliably built cables and test it out yourself.
 
Don't take my word for it: source (borrow, not buy) reliably built cables and test it out yourself.
I would already have done that if I knew of one I could borrow, but I have little hope of finding one. I've called a few friends but no one has one.

In fact, a USB 3.0 extension cable of a convenient length is quite rare in my country. They're mostly either too short (30 cm or less) or too long (1.5 m upwards). A 50 cm (~1.5 ft) cable would be ideal - if it works for the purpose.
 
For USB3 systems, the current naming is USB3.2 with a Gen suffix (Gen1, Gen2, Gen2x2):

Gen1 is the original, max data rate 5 Gb/s and max data cable length 3 m (~10 ft). It CAN work reliably with cables using the USB3 version of the old Type A connectors and with newer Type C connectors.

Gen2 is faster at up to 10 Gb/s, but passive cables are generally limited to about 1 m (~3 ft) at that speed. You REALLY should use only cables with Type C connectors on them (both ends). Cables with Type A connectors will work, BUT that may reduce their data speeds to the Gen1 level. On this point, many mobos include Gen2 ports with Type A sockets to make their use easy for users, but that is NOT what the specs require. USB3.2 Gen2 devices should be made only with Type C ports to ensure they can achieve their specified max data rate if you use the right cable.

Gen2x2 is up to 20 Gb/s and MUST use cables with only Type C connectors. These must not exceed 0.8 m (2.6 ft) to achieve that max speed.

USB4 (the newest iteration) uses the same cables as USB3.2 Gen2x2.
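To pull those figures together in one place, here is a rough Python summary. The cable lengths are practical guidance for passive cables, not hard limits from the USB-IF spec, and the helper function is only an illustration:

```python
# Rough summary of the figures above. Lengths are practical guidance for
# passive cables, not hard limits from the formal USB-IF spec; the ~1 m Gen2
# figure is the commonly quoted ceiling for full 10 Gb/s operation.
USB_GENERATIONS = {
    "USB 3.2 Gen1":   {"rate_gbps": 5,  "passive_cable_m": 3.0, "connector": "Type A or Type C"},
    "USB 3.2 Gen2":   {"rate_gbps": 10, "passive_cable_m": 1.0, "connector": "Type C preferred"},
    "USB 3.2 Gen2x2": {"rate_gbps": 20, "passive_cable_m": 0.8, "connector": "Type C only"},
    "USB4":           {"rate_gbps": 40, "passive_cable_m": 0.8, "connector": "Type C only"},
}

def cable_length_ok(generation: str, length_m: float) -> bool:
    """True if a passive cable of this length should still hit full speed."""
    return length_m <= USB_GENERATIONS[generation]["passive_cable_m"]

# The 1.5 m extension from the original post is fine for Gen1 but not Gen2:
print(cable_length_ok("USB 3.2 Gen1", 1.5))   # True
print(cable_length_ok("USB 3.2 Gen2", 1.5))   # False
```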

Users have also found (not too surprisingly) that, aside from the connectors, the internal cable construction has an impact, and this factor is NOT easy to see or to decipher from written descriptions. But users with slower-than-expected data transfer rates have often found that better-quality cables solved the issue.

If you must use cables longer than those specs allow, you have two options.

1. Use a USB3.2 Gen2 or better HUB with its own power supply module and with proper cables to break up the total cable length into two runs of acceptable length.

2. Buy a USB3.2 Gen2 or better Active Extension cable that has a small amplifier built in to boost the signal part way along. This draws a small amount of power for the amp from the 5 VDC power lines in the cable.
 
No, it's cable quality. C'mon, rig up a multidrop oscilloscope or at least a cable tester so you can get things going and keep them going.

That said, 1.5 m is about the limit without going to fiber optic for top speed, and maybe the mobo can expose some deep USB link-state info? Dig into the firmware tools if Gigabyte hasn't already shipped the best **** [Moderator edit to remove profanity.] tool for the job.
 
Last edited by a moderator:
Access to an oscilloscope and/or real cable testers (that go beyond continuity testing) is not something I would expect to be available to most end users.

Much easier and more straightforward to simply try other known-working (at speed) USB cables.

Overall, cable quality (for any cables) is becoming more of an issue I think.
 
A cable is effectively a long thin capacitor (with extra inductance from the length).

As you increase the switching frequency (i.e. the data rate), you lower the capacitive reactance. This is measured in ohms and acts effectively as a resistance between the plates/cores.

The formula is Xc = 1 / (2 x pi x frequency x capacitance).

Pi is a constant and the capacitance is fixed for a given cable, so the variable is frequency. Cable capacitance is quoted in nF per km, and although the capacitance is small, the frequency is large, as reflected in the data rate.

So at DC the cable acts as would be expected: the resistance of the cable is the attenuation factor.
At the operational frequency required for 20 Gb/s, the capacitive reactance tends towards a low value. This allows the sharp edges of a square wave to leak through the insulation/dielectric and degrade the signal. A square wave is the sum of the fundamental frequency and its odd harmonics. The higher-frequency components are lost first, and eventually the signal quality is compromised as the rise/fall time of the edges increases.
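To put that harmonic picture in numbers, here is a rough Python sketch (arbitrary 1 GHz fundamental, chosen only for illustration) showing that a square wave built from fewer odd harmonics has visibly slower edges:

```python
import numpy as np

# A square wave is the fundamental plus its odd harmonics:
#   sq(t) = (4/pi) * sum over odd k of sin(2*pi*k*f*t) / k
# Losing the higher harmonics (as a lossy cable does) rounds off the edges.
def truncated_square(t, f, n_terms):
    k = np.arange(1, 2 * n_terms, 2)                 # 1, 3, 5, ...
    return (4 / np.pi) * np.sum(np.sin(2 * np.pi * f * np.outer(t, k)) / k, axis=1)

t = np.linspace(0, 1e-9, 2001)                       # one period of a 1 GHz fundamental
crisp = truncated_square(t, 1e9, 50)                 # many harmonics -> sharp edges
soft  = truncated_square(t, 1e9, 3)                  # few harmonics  -> slow edges

def rise_time(t, y):
    """Rough 10%-90% rise time of the first edge."""
    return t[np.argmax(y > 0.9 * y.max())] - t[np.argmax(y > 0.1 * y.max())]

print(rise_time(t, crisp), rise_time(t, soft))       # the truncated wave rises more slowly
```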

Shorter cables have less capacitance and DC resistance. Lower capacitance increases the capacitive reactance, presenting a greater impediment to leakage at the signal frequency. Less leakage.
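As a back-of-the-envelope illustration of that, treating the cable as a single lumped capacitor (a simplification; real behaviour is distributed) and assuming a ballpark 100 pF per metre, which is a made-up round figure rather than a measured value:

```python
import math

# Capacitive reactance: Xc = 1 / (2 * pi * f * C).
# 100 pF per metre is a ballpark figure used purely for illustration;
# real values come from the cable's datasheet (often quoted in nF/km).
C_PER_METRE = 100e-12

def xc_ohms(length_m: float, freq_hz: float) -> float:
    return 1.0 / (2 * math.pi * freq_hz * C_PER_METRE * length_m)

f = 5e9   # on the order of the fundamental for a 10 Gb/s signal
print(f"0.5 m cable: {xc_ohms(0.5, f):.2f} ohm")   # higher reactance -> less leakage
print(f"1.5 m cable: {xc_ohms(1.5, f):.2f} ohm")   # lower reactance  -> more leakage
```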

Quality cables will be manufactured to a specification which includes DC resistance, capacitance per unit length and inductance per unit length.
 
Sorry about the delay in replying. Different time zones. Just came back from Sunday service.

Ralston18 is right that an oscilloscope and sophisticated test gear must be out of reach for most end users of computers. Advising someone of unknown background to use such equipment for a rather mundane purpose is presumptuous.

OTOH, as someone who's been in electronics for over 50 years, I do have multiple oscilloscopes. Still, I'd rather rely on the combined knowledge and practical experience of members here instead of setting up something to test picosecond delays, asynchronicities and amplitude and edge degradation of GHz pulse trains. In addition, measuring the lumped resistance, capacitance and inductance of a cable doesn't tell the whole story of how it behaves with complex waveforms. I'm not going to cut up a cable to measure distributed parasitic elements.

Another option is to cut an existing cable to, say, 50 cm, solder the ends back onto the plug, and see what happens. At my age, though, with eyesight that's been degrading over the past few years, that will be a last resort if I cannot find a ready-made cable of the desired length.
 
I'll add a little to what Stuff and Nonsense said above (all of that correct). Digital data signals travelling through a cable certainly are composed of a rapid sequence of square waves. Two factors affect the quality of the signals at the receiving end - amplitude and distortion. Simple resistance reduces amplitude, but that is negligible in a short cable and is not affected by frequency. However, there is another aspect of this that IS impacted by frequency, and that is the "conductivity" of the dielectric material between the wires.

I did a lot of research work decades ago on molecular motions, and the major tool we used was measurement of the effective capacitance and conductivity of dielectric materials in a capacitor as a function of the frequency of the signal. In fact, the "dielectric constant" of any material is NOT a constant, and this has been well known for maybe a century. At low frequencies it is very nearly constant. But as the signal frequency rises, the molecules of that dielectric begin to absorb energy from the alternating electric field because their normal rate of rotation (nearly) matches the rotation of the field. This makes those molecules slightly more excited, but the macroscopic impact on the circuit is that the effective capacitance of the dielectric is reduced and its conductivity increased. So the capacitor at such a high frequency acts as if its capacitance is lower than we thought and its conductance is higher.

The capacitance change affects the speed of the signal passing through the cable, while the conductance change reduces the signal amplitude. Since the square waves are actually composed of many harmonics of the fundamental signal frequency, the higher-frequency components of the wave are altered more than the lower-frequency ones, and the square wave is no longer square - it is distorted as well as reduced in amplitude. These factors grow as the signal frequency is raised, so the impact of signal distortion and reduction is much greater as the data transmission rate is raised. In the USB system, the data transmission rate was 0.48 Gb/s for USB2, and now we have 5, 10 and 20 Gb/s rates in USB3.2, and 40 Gb/s in USB4 and some Thunderbolt systems.

By the way, all of this certainly foretells limits on data rates on wires. In my research days we worked with pure sine waves up to 150 GHz because these molecular motion mechanisms become less important when the molecules simply are not moving that fast. But that also means that trying to send an electrical signal through a wire or even a metal waveguide tube at much higher frequencies becomes more difficult. And remember that 160 GHz is only the fourth harmonic of a signal at 40 GHz.

Back to digital signals. One huge advantage of such signal systems is that they are much less affected by waveform distortion than analogue systems that depend on clean sine waves. At the receiving end the distorted square wave does not need to be exactly right. The receiver "examines" the wave and decides between only two options: it is "ON" if the amplitude is over some value (say, 60% of max voltage), or it is "OFF" if the signal is less than, say, 40% of max. (Those two numbers are NOT real - just made up to make the point.) That does leave a small central "grey zone" where the decision is not clear; the receiver WILL still make a decision, but it might be wrong, and then the data is wrong. In the vast majority of cases, though, the data is received correctly. Further, the system also uses other error-checking techniques to see whether the data received makes sense. If it does not, it requests a re-send of the data to try again. This usually solves the random error problem, but it does slow down the overall data transmission rate. However, that whole process of square wave distortion (both wave shape and amplitude) gets worse at higher frequencies, so the demands on cable materials become more important.
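Conceptually, the receiver's decision plus the re-send path look something like this sketch. The 40%/60% thresholds are invented, exactly as in the text above, and real USB receivers use differential signalling, equalisation and CRC-driven retransmission, but the idea is the same:

```python
# Illustrative only: the 40% / 60% thresholds are made-up, just like in the
# text above. Real USB receivers work differently in detail, but the
# two-level decision plus re-send-on-error logic is conceptually similar.
def decode_bit(level: float, vmax: float = 1.0) -> int:
    if level >= 0.6 * vmax:
        return 1                      # clearly ON
    if level <= 0.4 * vmax:
        return 0                      # clearly OFF
    # Grey zone: the receiver still picks something, and may get it wrong.
    return 1 if level >= 0.5 * vmax else 0

def receive(levels, checksum_ok):
    bits = [decode_bit(v) for v in levels]
    if checksum_ok(bits):
        return bits
    return None   # caller asks the sender to retransmit, which costs throughput
```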

What factors are involved? The exact material used for the insulation on the wires is vital - certain types of polymers have more or less absorption of energy due to the motions of their molecules. The presence of small impurities in those materials has a big impact on their conductivity and high-frequency response. The thickness of the insulation on each wire and the resulting impact on spacing of the wires is important. For the average buyer NONE of those can be seen! So we are left with two options. We can rely on the maker or seller to tell us whether or not this cable will perform properly at the required data rate. Or we can look for user reviews and hope most such users are knowledgeable enough to use the cables as intended and give accurate comments. And then we have to hope that a given maker and seller keep on using the same materials and manufacturing processes so that user reviews still are relevant.
 
Regarding:

"We can rely on the maker or seller to tell us whether or not this cable will perform properly at the required data rate. Or we can look for user reviews and hope most such users are knowledgeable enough to use the cables as intended and give accurate comments. And rthen we have to hope that a given maker and seller keep on using the same materials and manufacturing processes so that user reviews still are relevant."

👍

Fully agree.

And also applies to all products and services - not just cables.
 
