ASUS Rampage IV Extreme PCI-E 3.0 Support

Nic88

Apr 23, 2012
Quick question: will the ASUS Rampage IV Extreme X79 fully support PCI-E 3.0?
I might upgrade to a 3-way SLI GTX 680 configuration in the future.

Also, for PCI-E 3.0, will there be a difference between Ivy Bridge and Sandy Bridge-E? If yes, what will it be?
 
Ivy Bridge has PCIe 3.0 natively, whereas SB-E does not, so with Ivy Bridge you get a bit of a performance boost.

Yes, it does have PCIe 3.0.

"Expansion Slots: 4 x PCIe 3.0/2.0 x16 (x16; x16/x16; x16/x8/x16)" - from the ROG website.
 


How much is a bit?

Apologies, I've explained myself badly. I understand that it supports PCI-E 3.0.
My questions are:

1) will it take full advantage of it?
2) is it going to support PCI-E 3.0 32x?

Thanks for your answers!
 
PCIe 3.0 x16 is basically like PCIe 2.0 x32, because PCIe 3.0 has double the bandwidth of PCIe 2.0. Yes, it will take full advantage as long as you get an Ivy Bridge CPU, but to be honest the performance difference isn't that big, and it's not worth it if it means you have to swap mobo/CPU.
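If you want rough numbers behind that "double" claim, here's a quick back-of-the-envelope sketch (nothing official, just the published per-lane rates: 5 GT/s with 8b/10b encoding for 2.0 vs 8 GT/s with 128b/130b for 3.0):

```python
# Approximate usable bandwidth per PCIe lane: transfer rate * encoding efficiency.
def lane_gbit_per_s(gt_per_s, payload_bits, total_bits):
    return gt_per_s * payload_bits / total_bits

pcie2_lane = lane_gbit_per_s(5.0, 8, 10)     # ~4.0 Gb/s (~0.5 GB/s) per lane
pcie3_lane = lane_gbit_per_s(8.0, 128, 130)  # ~7.9 Gb/s (~1.0 GB/s) per lane

for lanes in (8, 16, 32):
    print(f"x{lanes}: PCIe 2.0 ~{pcie2_lane * lanes / 8:.1f} GB/s, "
          f"PCIe 3.0 ~{pcie3_lane * lanes / 8:.1f} GB/s")

# x16 PCIe 3.0 (~15.8 GB/s) lands right next to x32 PCIe 2.0 (~16.0 GB/s),
# which is why people loosely call a 3.0 x16 slot "32x".
```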
 
I'm actually trying to build a future-proof gaming rig, so I need to consider all upcoming innovations.

Will there be a PCI-E 32x?
 
Just going to say this... there's no such thing as a future-proof gaming rig. It's best to invest 1k-1.5k every 1-2 years rather than spend 4k-5k etc. all in one big burst. But to answer your question... PCIe 3.0 x16 is "32x" compared to PCIe 2.0 x16.
 


Understood. I'm going to spend £2500 on a new PC because at the moment I haven't got one. I was thinking about an i7-3930K, 2x GTX 680 in SLI, and 16GB of 1866MHz RAM on an ASUS Rampage IV Extreme.
 
I'm still not clear on one thing. I understand that there are 2 versions of PCI-E 2.0; these should be x16 and x8. How many versions of PCI-E 3.0 will there be?
 
No point getting a 6-core CPU. Although you may feel good about buying the elite platform, it won't actually provide any benefit in gaming, so in essence you're throwing away 2-3 cores and $300 compared to an i5 2500K/3570K or i7 2600K/3770K.

^PCIe 3.0 will technically carry "32x/16x" worth of PCIe 2.0 bandwidth, but it is still labelled x16/x8. Don't overthink this; it only gets more confusing the more you think about it lol. PCIe 3.0 = PCIe 2.0 doubled in bandwidth, so SLI 680s at PCIe 3.0 x8/x8 will be equal to PCIe 2.0 x16/x16.
 
So will my ASUS Rampage IV Extreme be able to support PCI-E "32x"?
I get your point, but I was thinking of getting a Socket 2011 CPU because there are many 2011 mobos out there with four PCI-E 3.0 slots.
I can't find 1155 mobos with as many x16 PCI-E 3.0 slots.
 
Yes, yes, yes it will. As long as it supports PCIe 3.0... it supports "32x" lol. Mate, just think of that "32x" as exactly x16.

The ASUS Rampage V Extreme and Formula should be released soon; if you can't wait, get your X79, but if you can wait, go for one of those.
 

The R4E is PCIe 3.0 compatible. The original SB-E spec (was) PCIe 3.0 (8 GT/s), and the AMD 7000 series runs PCIe 3.0 on SB-E. Currently nVidia has disabled PCIe 3.0 on SB-E -- BUT there are registry modifications that will make it run PCIe 3.0.

The IB has 16 lanes of PCIe 3.0, vs SB-E's 32 lanes, which run PCIe 3.0 with AMD and PCIe 2.0 with nVidia. AFAIK the IB 'should' be validated for PCIe 3.0 by nVidia, and I hope to see the same on SB-E since (see below) it clearly can run PCIe 3.0.

This is a copy/paste from another similar post I replied to:
The issues:
1. Most LGA 2011 boards have some sort of PCIe 3.0 BIOS update that needs to be applied, e.g.:
X79 Extreme9 (BIOS version 2.0 is the latest, but no reference to a PCIe 3.0 update) - http://www.asrock.com/mb/download.asp?Model=X79%20Extreme9&o=BIOS
2. nVidia must validate PCIe 3.0 on SB-E. To date it's limited to PCIe 2.0:
"*GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is only currently supported up to 5GT/s (PCIE 2.0) bus speeds even though some motherboard manufacturers have enabled higher 8GT/s speeds."

The hack may or may not work...

Frankly, PCIe 2.0 vs PCIe 3.0 on today's GPUs does zip *, and 2-WAY PCIe 2.0 x16/x16 on LGA 2011 is the same bandwidth as 2-WAY PCIe 3.0 x8/x8 on IB + LGA 1155 with a GEN3 or Z77 board. PCIe 3.0 becomes relevant once 4K monitors become available to the 'consumer' and are in mass production, but by then it's going to be time to upgrade your PC & GPU(s).

PCIe 3.0 vs 2.0 - http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/49646-amd-radeon-hd-7970-3gb-review-21.html
* EXTREME PCIe 3.0 vs 2.0 - http://www.evga.com/forums/tm.aspx?m=1537816&mpage=1&print=true

Registry fix (hack) to force the GTX 680 to run in PCIe 3.0 mode; see - http://tinyurl.com/7awc9lv (RMPcieLinkSpeed, DWORD = 0004, etc).
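For reference, here's roughly what that hack boils down to as a minimal Python sketch (Windows only, run as Administrator; the display-adapter class GUID is standard, but the "0000" subkey is an assumption on my part -- confirm the exact key for your card from the linked thread before touching anything, and expect to reboot afterwards):

```python
import winreg

# Assumed location: first display adapter under the standard display class GUID.
# On multi-GPU systems the right subkey index will differ.
KEY_PATH = (r"SYSTEM\CurrentControlSet\Control\Class"
            r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    # RMPcieLinkSpeed = 4 requests the 8 GT/s (Gen3) link, per the post above.
    winreg.SetValueEx(key, "RMPcieLinkSpeed", 0, winreg.REG_DWORD, 4)
```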

The HD 7000 series does run in PCIe 3.0 mode on LGA 2011.

You may also want to read - http://www.techpowerup.com/162942/GeForce-GTX-680-Release-Driver-Limits-PCI-Express-to-Gen-2.0-on-X79-SNB-E-Systems.html
GeForce GTX 680 supports PCI Express 3.0. It operates properly within the SIG PCI Express Specification and has been validated on multiple upcoming PCI Express 3.0 platforms. Some motherboard manufacturers have released updated SBIOS to enable the Intel X79/SNB-E PCI Express 2.0 platform to run at up to 8GT/s bus speeds. NVIDIA is currently working to validate X79/SNB-E with GTX 680 at these speeds with the goal of enabling 8GT/s via a future software update. Until this validation is complete, the GTX 680 will operate at PCIE 2.0 speeds on X79/SNB-E-based motherboards with the latest web drivers.

Ivy Bridge and Z77 Chipset (Panther Point):
[image: z77-overview.png]

HD 7900 Series running PCIe 3.0 on SB-E:
[image: c5ca51e8_12751680.png]

GTX 600 Series running PCIe 3.0 on SB-E (fix applied):
[image: 006.jpg]
 
Unfortunately, that doesn't change the fact that nVidia is pissing me off. Most of my extreme client builds are off the SB-E/LGA 2011 platform. Since it's a PITA to get GTX 680s, and the 4GB GTX 680 isn't available (yet), I've been recommending the 3GB HD 7970.

I've asked Chris both to run a story on PCIe 3.0 vs PCIe 2.0 and to bring to light that the nVidia GTX 680 disables PCIe 3.0 on SB-E. Sometimes you need to spotlight a problem to invoke change; good or bad, it needs to be settled.
 
So you are basically saying that eventually SB-E will support Nvidia PCI-E 3? Will it be as good as IB? In fact, for a 680 3-SLI gaming rig would you suggest IB or SB-E?
What is the difference in FPS between PCI-E 2 and 3 for a game like BF3?
Are there different kinds of PCI-E 3 (i.e.: 8x, 16x, 32x)?
I got the below from an ASUS mobo's spec sheet: would you be able to tell me the difference between the 2 PCI-E 3.0 "versions"?

2 x PCIe 3.0/2.0 x16 (dual x16) *1
1 x PCIe 3.0/2.0 x16 (x8 mode) *1

Would such a mobo support 3-way SLI on PCI-E 3.0?
Sorry for the constant hammering and thank you for your patience.
 
First, 3-WAY SLI of GTX 680's on a single (1080p) monitor is IMO absurd and a total waste i.e. Big-time Overkill.

Therefore, if the 'plan' is a single HD (1080p) monitor, then IB + (1) GTX 680 and no worries for quite some time.

Q - How many Monitors??
Q - Is 3D Vision part of the plan?
 
Thanks for your assistance.

"First, 3-WAY SLI of GTX 680's on a single (1080p) monitor is IMO absurd and a total waste i.e. Big-time Overkill." - I totally get your point. I was thinking to play all games at Ultra, 1080p, 16x AF, 4x AA till 2014/2015 60+ FPS (I'm an AA maniac 😛).

"Therefore, if the 'plan' is a single HD (1080p) monitor IB + (1) GTX 680 and no worries for quite some time." - I already checked a lot of 680 GTX benchmarks, and I'm sure that I can't play BF3 at the above settings at 60 + FPS with one card.

"Q - How many Monitors??" - One at 1080p.

"Q - Is 3D Vision part of the plan?" - Nope.

Please advise and thanks again!
 
Moreover, is full-speed PCI-E 3.0 3-way SLI achievable with the below slots?

I got the below from an ASUS mobo's spec sheet: would you be able to tell me the difference between the 2 PCI-E 3.0 "versions"?

2 x PCIe 3.0/2.0 x16 (dual x16) *1
1 x PCIe 3.0/2.0 x16 (x8 mode) *1


 

Listen, most of the time I get accused of pushing too many GPU(s) or too high a GPU class in this forum. I 'get' higher AA values etc., so get/buy whatever you want.

Regarding:
2 x PCIe 3.0/2.0 x16 (dual x16) *1
1 x PCIe 3.0/2.0 x16 (x8 mode) *1

*1 - I assume the '*1' is referencing some footnote on the spec page.

2 x PCIe 3.0/2.0 x16 (dual x16) - I'm assuming this is an X79 spec; it means each slot carries the full bandwidth of 16 PCIe lanes. If BOTH the CPU and GPU are PCIe 3.0 compliant, then you effectively get double the bandwidth of PCIe 2.0. There's no GPU out there that can saturate PCIe 2.0 x16 lanes.

1 x PCIe 3.0/2.0 x16 (x8 mode) - that slot 'shares' lanes with one of the PCIe 3.0/2.0 x16 slots. In other words: 1. shared slots never carry the full x16 lanes in this case; 2. in most cases, once both are occupied, the pair of (shared) slots receives x8/x8 PCIe lanes, at PCIe 2.0 or PCIe 3.0 speed depending upon the lowest capability of either the CPU or GPU, i.e. the slowest wins.
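If it helps, here's a toy sketch of that "slowest wins" idea using the same rough per-lane numbers as earlier (the function and figures are just for illustration, not from any spec sheet):

```python
# Toy model of link negotiation: the generation is the lowest common one,
# and a shared slot pair drops to x8/x8 once both slots are populated.
PER_LANE_GB_S = {2: 0.5, 3: 0.985}  # approximate usable GB/s per lane

def negotiated_link(platform_gen, gpu_gen, lanes):
    gen = min(platform_gen, gpu_gen)  # slowest side wins
    return gen, lanes, round(lanes * PER_LANE_GB_S[gen], 1)

# SB-E + GTX 680 today: the card is Gen3, but nVidia treats the platform as Gen2.
print(negotiated_link(platform_gen=2, gpu_gen=3, lanes=16))  # (2, 16, 8.0)
# IB + GTX 680 with both shared slots populated: Gen3 at x8 each.
print(negotiated_link(platform_gen=3, gpu_gen=3, lanes=8))   # (3, 8, 7.9)
```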