News USB 4 Version 2.0 Announced With 80 Gbps of Bandwidth

Inthrutheoutdoor

Reputable
BANNED
Feb 17, 2019
I’m still waiting for “Super USB 3.2 Gen 2 Turbo Ultra Remix” in 2025.
Nope, that's coming out next month, so you might as well wait for the SupaDupa MegaMax Quadruple oversampling Hexagonal isolinear chip-powered petagigaplexed version, which is due out in mid-2023-ish... hehehehehe :eek:
 
Reactions: domih
Jul 29, 2022
X670 / X670E support USB 4.
Intel boards tend to go with Thunderbolt 4, which is inclusive of USB 4's capabilities and a little bit better in every way.
What's the difference between Thunderbolt and regular USB?
 

domih

Commendable
Jan 31, 2020
I think someone who used to work at Capcom got into the USB Promoter Group. I’m still waiting for “Super USB 3.2 Gen 2 Turbo Ultra Remix” in 2025.
An alternative is to replace SuperSpeed with UltraSpeed < HyperSpeed < UberSpeed < LudicrousSpeed for the incrementally higher speeds above 5 Gbps.

But I admit that "Turbo Ultra Remix" is a pretty good one :)
 

Eximo

Titan
Ambassador
What's the difference between Thunderbolt and regular USB?
Plenty of info about that out there; the same goes for USB 3 and Thunderbolt 3.

But the basics: Thunderbolt 4 and USB4 both support 40 Gbps, but Thunderbolt certification comes with a minimum speed of 32 Gbps, whereas you can label something USB4 as long as it meets the standard, which could be as low as 20 Gbps. There's also a slight difference in power requirements, with Thunderbolt devices supporting 15 W versus 7.5 W for USB4. Thunderbolt also supports better daisy-chaining of displays. Probably more, but suffice it to say that it doesn't matter to most end users.
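To make the certification-floor difference concrete, here's a rough sketch of best-case transfer times at each minimum bandwidth (protocol overhead ignored; the 100 GB file size is just an example):

```python
# Best-case transfer time for a large file at each link's bandwidth floor.
# Protocol overhead ignored; real-world numbers would be somewhat worse.

def transfer_seconds(size_gigabytes: float, link_gbps: float) -> float:
    """Seconds to move size_gigabytes over a link running at link_gbps."""
    return size_gigabytes * 8 / link_gbps  # bytes -> bits, then divide by rate

for label, gbps in [("USB4 floor", 20), ("Thunderbolt 4 floor", 32), ("Shared max", 40)]:
    print(f"{label:20s} {transfer_seconds(100, gbps):5.1f} s for 100 GB")
```

So the same 100 GB copy takes 40 s on a bare-minimum USB4 port versus 25 s on the slowest certifiable Thunderbolt 4 port.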
 
Reactions: LuxZg

psycher1

Distinguished
Mar 7, 2013
Imagine a future with no HDMI or DP cables. Only USB connections to monitors. I like it, sounds clean.

I'd still rather imagine VR workspaces, where you just pull up and move apps and documents around with your hands in a virtual space. But cleaner workspaces on traditional monitors are nice too.
 
Reactions: bolweval

truerock

Distinguished
Jul 28, 2006
I think the article is in error. Thunderbolt 4 runs bi-directional 40 Gbps (40 Gbps + 40 Gbps).
Thunderbolt 4 can run 80 Gbps in one direction to support DisplayPort 2.0, but it requires an active USB-C cable, just like USB4 Version 2.0.
Apparently, USB4 Version 2.0 and Thunderbolt 4 will be almost identical.
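For context, DisplayPort 2.0's fastest mode (UHBR20) is four lanes at 20 Gbps each, which is where the one-direction 80 Gbps figure lines up. A quick sanity check (the 128/132 factor is DP 2.0's line coding; other link overhead is ignored here):

```python
# DisplayPort 2.0 UHBR20: 4 lanes x 20 Gbps per lane = 80 Gbps raw,
# matching the one-direction figure discussed above.
lanes = 4
per_lane_gbps = 20
raw_gbps = lanes * per_lane_gbps

# DP 2.0 uses 128b/132b line coding, so ~97% of the raw rate is payload.
effective_gbps = raw_gbps * 128 / 132

print(raw_gbps, round(effective_gbps, 1))  # 80 77.6
```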
 
Reactions: Krotow

Kamen Rider Blade

Distinguished
Dec 2, 2013
It should've been called USB 5.0.

What's with this dumb naming scheme and going back to retroactively create "USB4 Version 1.0"?

Whoever thought this was a good naming scheme deserves repeated kicks in the crotch from everybody who hates this dumb naming system.
 

truerock

Distinguished
Jul 28, 2006
I was thinking USB 4.1 myself, since it's only a speed change, but I share your sentiment!
I think USB and Thunderbolt may be trying to conform to each other, such that USB4 and Thunderbolt 4 will be almost the same thing.

Maybe USB 5.0 and Thunderbolt 5.0 will be exactly the same.
 
Reactions: Krotow

InvalidError

Titan
Moderator
It's time to get rid of SATA and replace it with internal USB (Type-E) drives
I wouldn't recommend that, since USB carries lots of unnecessary overhead for internal devices (legacy USB, USB-PD signaling, alt-mode pairs, etc.), which is bound to cause issues with people plugging their front-panel cables into motherboard ports that don't support the extra stuff people may grow to expect them to support. A SATA successor only needs SATA4/PCIe 4.0 x1 and dumb 12 V 2 A power.

Another reason to give "SATA4" its own new connector is the 12VO future where USB's default 5V and PCIe/NVMe/SATA's 3.3V won't be "free" anymore. Everything will eventually require a 12V update with new connectors to prevent plugging 3.3V/5V devices into 12V ports.
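The bandwidth case for a "SATA4" built on one PCIe 4.0 lane is a back-of-envelope sketch (line-coding overhead only; real-world throughput would be lower):

```python
# Payload bandwidth after line coding, in MB/s.
sata3_mbps = 6e9 * (8 / 10) / 8 / 1e6         # SATA III: 6 Gbps raw, 8b/10b coding
pcie4_x1_mbps = 16e9 * (128 / 130) / 8 / 1e6  # PCIe 4.0 x1: 16 GT/s, 128b/130b coding

print(f"SATA III:    {sata3_mbps:.0f} MB/s")
print(f"PCIe 4.0 x1: {pcie4_x1_mbps:.0f} MB/s")  # roughly 3.3x SATA III
```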
 
Reactions: Thunder64

escksu

Reputable
BANNED
Aug 8, 2019
...Really. They couldn't just call this USB 5?
Well, the folks behind USB naming are either clueless or simply out to troll.

They created the incredibly confusing USB 3.1 Gen 1 and USB 3.1 Gen 2, which makes you wonder: isn't USB 3.0 the third generation of USB?

Wait, there's more. With the new 3.2, it gets even worse:

USB 3.2 Gen 1, USB 3.2 Gen 1x2, USB 3.2 Gen 2, USB 3.2 Gen 2x2

And yes, you still have the SuperSpeed and SuperSpeed+ terms...

Luckily, the logos are a lot easier to read. You now get additional numbers (5, 10, 20) so you know the speed. Ignore the rest...
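For what it's worth, the scheme does decode mechanically: "Gen" gives the per-lane speed and the "xM" suffix gives the lane count. A quick sketch:

```python
# The USB 3.2 rebranding, decoded: "Gen N" is the per-lane signaling
# generation (Gen 1 = 5 Gbps, Gen 2 = 10 Gbps); "xM" is the lane count.
GEN_GBPS = {1: 5, 2: 10}

def usb32_speed(gen: int, lanes: int = 1) -> int:
    """Total signaling rate in Gbps for a 'USB 3.2 Gen <gen>x<lanes>' port."""
    return GEN_GBPS[gen] * lanes

for gen, lanes in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    print(f"USB 3.2 Gen {gen}x{lanes}: {usb32_speed(gen, lanes)} Gbps")
```

Note the trap: Gen 1x2 and Gen 2 both come out to 10 Gbps, which is exactly why the logos with a plain number are the only sane way to shop.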
 
Reactions: shady28

escksu

Reputable
BANNED
Aug 8, 2019
What's the difference between Thunderbolt and regular USB?
They are entirely different interfaces, even though a Thunderbolt port will accept a regular USB device (not the other way round).

TB is way faster than USB 3.x: 40 Gbps compared to 20 Gbps for USB. TB can also be used to connect monitors, and it's also sitting on four lanes of PCIe 3.0. So yes, you can even connect a graphics card over TB; that's why there are graphics card docking stations for TB.

USB4 blurs the boundary between USB and TB. USB4 is now a lot like TB, also 40 Gbps, and there is a USB4 PCIe mode, so technically you can connect a GPU via USB4. AFAIK, they are also compatible with one another, so yes, you can plug a TB 3 device into a USB4 port and it should work.
 

InvalidError

Titan
Moderator
People have a hard time telling the diff between current SATA III and PCIe 4.0 SSDs.
Software needs to be rewritten for concurrent IO before they can fully leverage SSD bandwidth and access speed. If DirectStorage becomes the norm with future games, SATA will become a definitive gaming no-go. Bumping SATA to "SATA4"/PCIe 4.0x1 wouldn't be much of a stretch since both AMD and Intel chipsets are sharing FlexIO/HSIO lanes between PCIe 4.0 and SATA already.

The time to cook up new standards is before a killer app comes along and makes the whole industry realize it has slipped 2-3 generations behind where it needs to be, as server companies discovered once SSDs became cost-effective, which led to the current rapid succession of PCIe 4.0, 5.0 and 6.0 after nearly a decade of 3.0 being the norm.

Almost nobody needs 80 Gbps USB4.0-version2-gen1, yet here it is, 5-10 years ahead of when I may actually care.
 

escksu

Reputable
BANNED
Aug 8, 2019
It's time to get rid of SATA and replace it with internal USB (Type-E) drives
No, there's no point doing that. SATA provides more than simply a connector for drives.

SATA is also compatible with SAS, meaning you can plug SATA drives into SAS controllers (not the other way round).

The SATA controller is also provided by the chipset, so you don't need additional converters/controllers. You get many ports, too, with or without RAID.

SATA is also an extremely mature standard. If you look at all, yes, all the portable HDDs today, they are using SATA drives behind a SATA-to-USB bridge. No company is even trying to design a native USB controller for its HDDs.

If you look at portable SSDs, they are likewise using NVMe-to-USB bridges (some SATA-to-USB). Again, no native USB controllers.
 

escksu

Reputable
BANNED
Aug 8, 2019
Software needs to be rewritten for concurrent IO before they can fully leverage SSD bandwidth and access speed. If DirectStorage becomes the norm with future games, SATA will become a definitive gaming no-go. Bumping SATA to "SATA4"/PCIe 4.0x1 wouldn't be much of a stretch since both AMD and Intel chipsets are sharing FlexIO/HSIO lanes between PCIe 4.0 and SATA already.

The time to cook up new standards is before a killer app comes along and makes the whole industry realize it has slipped 2-3 generations behind where it needs to be, as server companies discovered once SSDs became cost-effective, which led to the current rapid succession of PCIe 4.0, 5.0 and 6.0 after nearly a decade of 3.0 being the norm.

Almost nobody needs 80 Gbps USB4.0-version2-gen1, yet here it is, 5-10 years ahead of when I may actually care.
As for DirectStorage, I don't see it becoming popular anytime soon. If ever.

The problem isn't DirectStorage itself; it's GPUs and the nature of games. Most graphics cards are limited to just 8 GB of RAM. Although there are 16 GB ones, those are mostly high-end models and very expensive.

Games today use a lot of memory, especially for high-resolution textures, which means there isn't much memory left over for DirectStorage. We have seen the performance penalty incurred when a graphics card's memory is insufficient for the game.

Another thing is that, with the multicore CPUs we have today, is there really a need to use the GPU for decompression? Also, it's very easy to add RAM to PCs, but impossible for graphics cards (ignoring hacking and soldering). So wouldn't it make more sense to use main memory to store textures?
 

JamesJones44

Prominent
Jan 22, 2021
Another thing is that, with the multicore CPUs we have today, is there really a need to use the GPU for decompression? Also, it's very easy to add RAM to PCs, but impossible for graphics cards (ignoring hacking and soldering). So wouldn't it make more sense to use main memory to store textures?
Do you mean use system RAM instead of GPUs having their own RAM?
 

InvalidError

Titan
Moderator
The problem isn't DirectStorage itself; it's GPUs and the nature of games. Most graphics cards are limited to just 8 GB of RAM.
That is a non-issue since the current version of DirectStorage always loads compressed assets to system memory first and then the GPU picks it up from there.

Another thing is that, with the multicore CPUs we have today, is there really a need to use the GPU for decompression?
GPU fetching compressed asset data over PCIe: ~200 ns of latency before the GPU can put the returned data to use.
GPU having to request decompression from the CPU: microseconds of latency between the IRQ being raised and the driver service routine running, possibly microseconds more waiting for a scheduler window, microseconds of context switches to the process handling the API call, microseconds doing the actual decompression job, and microseconds more telling the GPU where to find the decompressed data so it can load it over PCIe.

Having the OS and CPU with tens of microseconds of latency and uncertainty sit between the GPU and its compressed assets in system memory is a very bad idea if you like uniform frame times. In a scenario where the GPU is so low on VRAM that it needs to continuously fetch compressed assets from system memory, I bet it would cause a tragic performance loss and massive CPU load.
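The two paths can be summed up as a back-of-envelope latency budget. Every figure below is illustrative, picked only to match the orders of magnitude described above, not measured:

```python
# Illustrative latency budgets for the two paths, in nanoseconds.
gpu_direct = {"PCIe read of compressed asset": 200}

cpu_round_trip = {
    "IRQ raised -> driver service routine": 2_000,
    "waiting for a scheduler window":       3_000,
    "context switch to API handler":        2_000,
    "actual decompression on the CPU":      5_000,
    "tell GPU where result is + PCIe load": 2_000,
}

print(sum(gpu_direct.values()), "ns direct")          # hundreds of ns
print(sum(cpu_round_trip.values()), "ns via the CPU") # ~14 microseconds here
```

Even with these made-up numbers, the CPU round trip is nearly two orders of magnitude slower than the GPU fetching the data itself, and the scheduler-dependent terms are the ones that make frame times jittery.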
 
