
PCI Express 4.0 to Support 16 Gigatransfers a Second

PCIe 4.0? But will it run Crysis?
[citation][nom]dark_knight33[/nom]2010 called, even they don't want the joke back; they just want you to stop using it. Why is it that every article about a new piece of fast hardware has to have some idiot using the geek version of "Git'er done!"?[/citation]
It's not the joke itself that's funny; it's pissing off people like you that keeps it funny. But seriously, I wonder how Crysis will run on PCIe 4.0 cards?
 
[citation][nom]aaron88_7[/nom]It's not the joke itself that's funny; it's pissing off people like you that keeps it funny. But seriously, I wonder how Crysis will run on PCIe 4.0 cards?[/citation]

After careful extrapolation, the answer is 237.
 
[citation][nom]soldier37[/nom]Well, maybe by then EA will fix the issues BF3 currently has, and it's certainly not my PC: 2 GTX 580s in SLI (3GB versions), 8GB 1600, 2600K @ 4.5, 2560x1600 res, ultra settings maxed.[/citation]
Hey, how come you didn't mention anything about your liquid cooling and 5kW PSU? Maybe throw in the UPS too? 😀
 
[citation][nom]applefairyboy[/nom]What the hell is a Gigatransfer?[/citation]
1,073,741,824 data transfers across the PCIe 4 bus. What, were you trying to be witty or sarcastic or something? Or are you really so thick you couldn't put giga + transfer together on your own? Is it wrong that I expect more than this from a site specializing in hardware?
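For anyone who actually wants numbers: by convention, GT/s uses the decimal giga (10^9 transfers per second), and each transfer moves one raw bit per lane before encoding overhead. Here's a rough back-of-the-envelope sketch in Python, using the commonly quoted spec rates and line encodings (theoretical ceilings, not benchmarks):

[code]
# Back-of-the-envelope PCIe bandwidth per lane and per x16 slot.
# Rates and encodings are the publicly quoted spec figures; real-world
# throughput is lower once protocol overhead is counted.
GENS = {
    # gen: (gigatransfers per second, payload bits, bits on the wire)
    "PCIe 1.x": (2.5, 8, 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8, 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128, 130),  # 128b/130b encoding
    "PCIe 4.0": (16.0, 128, 130), # 128b/130b encoding
}

for gen, (gt_s, payload, wire) in GENS.items():
    # One transfer carries one raw bit per lane; encoding eats part of it.
    lane_gb_s = gt_s * (payload / wire) / 8  # GB/s per lane
    print(f"{gen}: ~{lane_gb_s:.2f} GB/s per lane, ~{lane_gb_s * 16:.1f} GB/s for x16")
[/code]

So PCIe 4.0 at 16 GT/s works out to roughly 2 GB/s per lane, or around 31.5 GB/s for an x16 slot, in each direction.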
 
PCI-E 3, 4, 5, etc...this means nothing. Look at the performance difference between 16x and 8x on PCI-E 2.0...there is none. All of the games are using the graphics cards' on board memory. No system RAM will ever be as fast as what's on your card. This is a bunch of worthless bullcrap.
 
[citation][nom]AbdullahG[/nom]PCI-E 4.0 has yet to be developed. PCI-E 3.0 already has. We are just waiting on GPUs to finally utilize it. The 16 GT/s bit rate has only been approved anyway, not developed...[/citation]

I understand that. The point I was making was: if they can do 16 GT/s with essentially the same power specs (and, I'm assuming, slot length etc.) as 3.0, why didn't they do that for the 3.0 spec? I mean, if it's the same and they knew and/or theorized they could do 16 GT/s, why not do that with 3.0 instead of waiting for 4.0?

It would be like seeing the Chevy Volt concept, knowing it goes 40 miles on a charge, and planning on buying one. Then, before the thing hits the market, they announce that in 2 or 3 years, with the same battery, they'll be able to go 80 miles on a charge. It just makes people want to wait for the new spec to upgrade. Though to be fair, I don't think people will be THAT concerned about whether their slot is 3.0 or 4.0 when we haven't even used all of 2.0's bandwidth, IIRC.
 
I seriously cannot begin to imagine why we would even need PCIe 4.0 at this point (for gaming). I really hope the next-gen consoles (excluding the Wii U, of course) will at least ATTEMPT to give it something to talk about. I wonder what kind of transfer speed would be needed for a video card equal to 4 GTX 580s. I'm kind of curious whether PCIe 3.0 would be sufficient for that. I really don't know.
 
I run BF3 on a Q6600 OC'd to 3.0GHz with 8GB of 667 (4x2) RAM and a Palit 2GB 460 336-core card without glitches. Maybe people need to clean up their systems and stop torrenting while playing? Or maybe they've ravaged the crap out of their registry with all the lamer hacks they install for bayware?

PCIe 3.0 on a newer system with a 560 Ti 2GB 384-core card, 6GB (3x2) 1066, and an i7-930 OC'd to 3.86, plus a PhysX 9800 GTX+ 512MB, runs even sweeter.

My friend got a $6500+ PC with 2x 580 3GB cards, yadda yadda, and I couldn't see a difference. Once you shoot into multiple 2560-wide monitors, I think you might, but barely. Even Rage was kickin' it on single-card systems built right.

PCIe 4 will be great for realtime DX11+ renders from 3ds and gaming :) 16 giga-whats?
 
There are some interesting comments on here that don't actually pertain to the matter of PCIe 4.0. It's not a huge deal for me yet; whenever I upgrade again I'll look out for it, but that probably won't be for at least 4 years from now. lol
 
[citation][nom]masterofevil22[/nom]PCI-E 3, 4, 5, etc...this means nothing. Look at the performance difference between 16x and 8x on PCI-E 2.0...there is none. All of the games are using the graphics cards' on board memory. No system RAM will ever be as fast as what's on your card. This is a bunch of worthless bullcrap.[/citation]
GPUs still need data from the CPU. Ever run a 6990 on an x1 PCIe connection?
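Since the x8 vs x16 question keeps coming up: theoretical PCIe bandwidth scales linearly with link width, so a rough sketch (ignoring protocol overhead) shows why x8 vs x16 rarely matters in games while an x1 link visibly starves a big card:

[code]
# Theoretical PCIe 2.0 bandwidth by link width.
# 5 GT/s with 8b/10b encoding leaves 4 Gbit/s of usable data per lane.
LANE_GB_S = 5.0 * (8 / 10) / 8   # ~0.5 GB/s per lane on PCIe 2.0

for width in (1, 4, 8, 16):
    print(f"PCIe 2.0 x{width:>2}: ~{LANE_GB_S * width:.1f} GB/s each direction")

# x8 vs x16 rarely shows up in games because most assets already sit in the
# card's own VRAM; an x1 link, though, caps host-to-GPU transfers at roughly
# 0.5 GB/s, which is why a 6990 on x1 starves whenever the CPU has to feed it.
[/code]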
 
[citation][nom]dreamer77dd[/nom]I think they should change the connection to fiber optics, as PCI Express 4.0 is not a big leap forward.[/citation]
[citation][nom]willard[/nom]1,073,741,824 data transfers across the PCIe 4 bus. What, were you trying to be witty or sarcastic or something? Or are you really so thick you couldn't put giga + transfer together on your own? Is it wrong that I expect more than this from a site specializing in hardware?[/citation]

If I had to guess, I'd say it was a BTTF reference [where Marty asks, "What the hell is a gigawatt?"].
 
To all who think this kind of bandwidth is just for graphics cards: that's just shortsighted!! This will be of great use to those who have tons of servers with 40Gb InfiniBand; you can easily use all of the bandwidth with one card, at least until PCIe 3.0 and 4.0. With WAN links getting faster, and with convergence in data centers, this will be very valuable tech. It's not just for games, open your mind!!!
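Rough numbers to back that up, assuming a single-port 40Gb QDR InfiniBand adapter (8b/10b encoding on the wire) sitting in an x8 slot, and ignoring PCIe protocol overhead:

[code]
# Can one 40Gb (QDR) InfiniBand adapter saturate its PCIe slot?
# QDR uses 8b/10b encoding, so 40 Gbit/s on the wire carries ~32 Gbit/s of data.
ib_gb_s = 40 * (8 / 10) / 8                # ~4.0 GB/s per direction

pcie_lane_gb_s = {
    "PCIe 2.0": 5.0 * (8 / 10) / 8,        # ~0.50 GB/s per lane
    "PCIe 3.0": 8.0 * (128 / 130) / 8,     # ~0.98 GB/s per lane
    "PCIe 4.0": 16.0 * (128 / 130) / 8,    # ~1.97 GB/s per lane
}

for gen, per_lane in pcie_lane_gb_s.items():
    x8 = per_lane * 8
    verdict = "barely keeps up" if x8 <= ib_gb_s * 1.1 else "plenty of headroom"
    print(f"{gen} x8: ~{x8:.1f} GB/s vs ~{ib_gb_s:.1f} GB/s of IB traffic -> {verdict}")
[/code]

On PCIe 2.0, an x8 slot is already at the limit for a single QDR port; dual-port or faster adapters are exactly where 3.0 and 4.0 earn their keep.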
 
With that bandwidth, build a PCIe card with a CPU, a GPU, shared GDDR5 RAM in the 2-3GB range, an SSD slot, two 8-pin power connectors, and an external power supply brick about the size of the Xbox 360's; then make it work in any PCIe 2.0/3.0/4.0 compliant system, and then figure out a use for it. I'd read about it... hehe... n3rd pr0n.
 
[citation][nom]rosen380[/nom]Why not take the thing you described, stick HDMI and USB ports on it, and call it a computer?[/citation]

That's boring. Having two computers, one inside the other, with the bandwidth of PCIe 4.0 to talk to each other...

I don't know you, so it may not apply, but it was a challenge to those folks who like to spend their days hacking HDCP and the like for the greater good of humanity, and for bragging rights.

Edit: Look, I know I didn't flesh out all the mundane stuff like HDMI/USB, because those are just assumed to exist, at least in my mind, since I listed a GPU. It's a news comment, not a thesis, so give me just a little credit... hehe... it's truncated. I know, well, let's just pop in a Tesla and use that to compute, or a standard high-end GPU for calcs, 'cuz those are expensive enough as it is. I just think that with everything shrinking by greater magnitudes, it's feasible to assume all of this hardware will fit on a card by the time PCIe 4.0 is ready for the masses. The other point is that sometimes it's just crazy fun to build something and let it loose on the masses to figure out how to use it to any significant effect.
 