News Gainward Lists Four Nvidia RTX 30-Series GPUs With Full, Detailed Spec Sheets

Considering the only consumer platform with PCIe Gen4 support is AMD right now, it's not going to be pretty. I did the 9900K vs. 3900X with a variety of GPUs, and even with Gen4 and Navi 1x, the 3900X trailed in just about every instance. Ampere will push the bottleneck even more to the CPU side of things. Hopefully Zen 3 will close the gaming gap with Intel before Rocket Lake arrives.

We don't have any GPU that can saturate 16 lanes of PCIe Gen 3 yet. Gen4 would help in x8 mode if you use SLI/CrossFire or use the other slot for SSDs.

I think we should compare Gen4 vs. Gen3 PCIe GPUs in CrossFire mode
 
Value is a completely personal thing. Many people will be perfectly fine with the price, which is why Nvidia will charge what they will charge. I'll wait for the reviews before determining whether it is worth it or not.

Nah, it's the lack of competition... AMD is way behind, and that's the true reason. I know some people will even game using Titan cards in SLI, but the majority of high-end gaming PCs won't waste money where it isn't justified.
 
We don't have any GPU that can saturate 16 lanes of PCIe Gen 3 yet.
Given the 4GB RX5500's scaling between 3.0x8 and 4.0x8 in cases where it is trailing the 8GB version, I'd say we do have a GPU that could benefit substantially from 4.0x16 but it has been gimped by a 4.0x8 interface.

The GTX1650S having 3.0x16 is likely the main reason why it holds up much better in scenarios that crush the 4GB RX5500.

So we do have at least two GPUs that benefit substantially from having 3.0x16 or equivalent bandwidth and would likely benefit from more. They may not be GPUs that enthusiasts are interested in but they are there.
 
Given the 4GB RX5500's scaling between 3.0x8 and 4.0x8 in cases where it is trailing the 8GB version, I'd say we do have a GPU that could benefit substantially from 4.0x16 but it has been gimped by a 4.0x8 interface.

The GTX1650S having 3.0x16 is likely the main reason why it holds up much better in scenarios that crush the 4GB RX5500.

So we do have at least two GPUs that benefit substantially from having 3.0x16 or equivalent bandwidth and would likely benefit from more. They may not be GPUs that enthusiasts are interested in but they are there.

Do you mean using the extra bandwidth to access system memory in case of a GPU memory shortage?
 
Given the 4GB RX5500's scaling between 3.0x8 and 4.0x8 in cases where it is trailing the 8GB version, I'd say we do have a GPU that could benefit substantially from 4.0x16 but it has been gimped by a 4.0x8 interface.

The GTX1650S having 3.0x16 is likely the main reason why it holds up much better in scenarios that crush the 4GB RX5500.

So we do have at least two GPUs that benefit substantially from having 3.0x16 or equivalent bandwidth and would likely benefit from more. They may not be GPUs that enthusiasts are interested in but they are there.
PCIe Gen3 x16 ~= PCIe Gen4 x8

I think there may be other factors with the RX 5500 XT 4GB, though it would make sense that a GPU with only 4GB VRAM is more likely to benefit from a faster PCIe interface than one with 8GB or more VRAM.
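For reference, the Gen3 x16 ≈ Gen4 x8 equivalence discussed above can be sanity-checked from the per-lane signaling rates and line encodings. This is a simplified sketch that ignores packet/protocol overhead; the rates and encodings are the standard PCI-SIG figures:

```python
# Approximate usable PCIe bandwidth from per-lane signaling rate and line encoding.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use the more efficient 128b/130b.
GT_PER_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}   # transfers per second per lane (GT/s)
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s for a given PCIe generation and lane count."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8  # divide by 8: bits -> bytes

print(f"Gen3 x16: {bandwidth_gbs(3, 16):.2f} GB/s")  # ~15.75 GB/s
print(f"Gen4 x8:  {bandwidth_gbs(4, 8):.2f} GB/s")   # ~15.75 GB/s (same)
print(f"Gen4 x16: {bandwidth_gbs(4, 16):.2f} GB/s")  # ~31.51 GB/s
```

Gen4 exactly doubles the per-lane rate while keeping the same encoding, which is why halving the lane count lands on the same number.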
 
Without context, that statement is pointless. You must include games, resolution and framerates.
I think there's no need for further information beyond what was mentioned. "Fine" means they don't have issues running a GTX 1080 Ti at whatever framerates and resolution they play at. "Fine" does not mean you need to run all games at >100 FPS or at 4K resolution. Likewise, my GTX 1080 Ti and GTX 1660 Super run all the games I play smoothly. When they start to lag, that may be the time I start considering a new graphics card.
 
If you want to buy a 3080, you should check the 2080 Ti and hope for a price drop after the reveal... Because 3080 = 2080 Ti =)
I feel the RTX 3080 may be faster and perhaps even cheaper, assuming there is no discount on the RTX 2080 Ti. Even at the same price, the RTX 3080 may still be the better deal. For the RTX 2080 Ti to be attractive, its price needs to adjust according to its relative performance against the RTX 3080, in my opinion.

Anyway, I do feel the RTX 3080 is quite power inefficient compared to the 3090. While losing close to 20% of its CUDA cores, and possibly some RT and Tensor cores, its TGP is only 30W less than the 3090's. So the RTX 3070 may be a better buy, though its specs are still unclear as of now.
 
Considering the only consumer platform with PCIe Gen4 support is AMD right now, it's not going to be pretty. I did the 9900K vs. 3900X with a variety of GPUs, and even with Gen4 and Navi 1x, the 3900X trailed in just about every instance. Ampere will push the bottleneck even more to the CPU side of things. Hopefully Zen 3 will close the gaming gap with Intel before Rocket Lake arrives.

Well said. Both CPU families are lagging quite a bit behind GPU advancement. Whoever gets their crap together will make a ton of money... and I personally hope it's AMD. Plus I own quite a bit of their stock lmao
 
Do you mean using the extra bandwidth to access system memory in case of a GPU memory shortage?
Yes. How fast the 4GB RX5500 can access system memory is the only thing that explains the massive performance uplift from 3.0x8 to 4.0x8 in the cases where it significantly lags behind the 8GB versions, which are themselves mostly indifferent to 3.0 vs 4.0.

PCIe Gen3 x16 ~= PCIe Gen4 x8
Hence the "or equivalent".
 
If you want to buy a 3080, you should check the 2080 Ti and hope for a price drop after the reveal... Because 3080 = 2080 Ti =)
Or you could wait until October and try to land a 3070, which is expected to equal the 2080 Ti in performance for half the price, plus significantly better ray tracing and DLSS 3.0. There is zero reason to buy a 2080 Ti at this point unless you get a ridiculous deal on the used market. The 3080 will be a tier above the 2080 Ti in performance.
 
Given the 4GB RX5500's scaling between 3.0x8 and 4.0x8 in cases where it is trailing the 8GB version, I'd say we do have a GPU that could benefit substantially from 4.0x16 but it has been gimped by a 4.0x8 interface.

Sure, the less onboard memory a card has, the more it will benefit from 4.0. The only problem is that the cheaper your video card is, the less likely you are to shell out for a 4.0 motherboard. The 3090, with 24GB of fast RAM and better compression from Nvidia, is going to see little to no benefit from 4.0.

[Chart: relative GPU performance at 3840x2160 — relative-performance_3840-2160.png]


At 4K, a 2080 Ti gains 2% going from 3.0 x8 to 3.0 x16, and only loses 6% dropping all the way to x4. What kind of miracle performance boost are we expecting from the 3000 series going from 3.0 to 4.0?
 
Reading all this FUD makes my head hurt.... Everything right now is pure speculation. What-ifs. Pie in the sky. Or, in plain ole 'merkin, <Mod Edit>. Tomorrow we get the reveal. Then we get a month or so of paper launches, followed by some serious price gouging. Then AMD launches their lineup. After which we get more FUD, tempered with a few independent benchmarks, and around Q1 or Q2 the dust settles and price vs. performance stabilizes.

I am building a new PC from the ground up. So I will be watching this play out very carefully, and will (I am sure) hear from my spouse more than once, "It is a dining table, not a workbench!" before this is all over.
 
Reading all this FUD makes my head hurt.... Everything right now is pure speculation. What-ifs. Pie in the sky. Or, in plain ole 'merkin, <Mod Edit>. Tomorrow we get the reveal. Then we get a month or so of paper launches, followed by some serious price gouging. Then AMD launches their lineup. After which we get more FUD, tempered with a few independent benchmarks, and around Q1 or Q2 the dust settles and price vs. performance stabilizes.

I am building a new PC from the ground up. So I will be watching this play out very carefully, and will (I am sure) hear from my spouse more than once, "It is a dining table, not a workbench!" before this is all over.
When Gainward 'accidentally' posts full specs a day before the official reveal, it's hardly FUD. We'll get more details tomorrow, sure, but I wager the 3090 and 3080 specs seen here are spot on.
 
Sure, the less onboard memory a card has, the more it will benefit from 4.0. The only problem is that the cheaper your video card is, the less likely you are to shell out for a 4.0 motherboard.
You need x16 today because the majority of systems these low-end GPUs will land in only have 3.0, and x16 gives low-VRAM GPUs all the system memory bandwidth the platform can provide.

For 4.0, it won't be long before it makes its way to cheaper boards. There is no real reason for motherboards that only support 4.0 from the CPU to cost significantly more: boards as far down the ladder as A320s were found capable of handling 4.0 before AMD locked it out with AGESA updates. Many higher-end boards failed to run 4.0 simply because they had 3.0 repeaters, analog switches, or other active components, instead of the dumb wires most mid-range boards use. All you really need is slightly tighter tolerances on PCB quality, which you already need for improved memory compatibility and stability anyway.

The motherboard price premium "for 4.0" from the CPU is almost entirely arbitrary. I wouldn't be surprised if the real primary reason AMD blocked it was to prevent decent lower-end boards from making high-end boards look bad by beating them to supporting 4.0 and doing so for free.
 
It is going to be interesting to see how performance stacks up and prices change over the next 6 months, but I am pretty sure that if the 3090 is $1,400 at launch, you will be hard pressed to find it for under $1,600 from Asus, MSI, etc.

I feel no massive urge to replace my 2080... my next upgrade will be a Ryzen 4000-series CPU. Big Navi and maybe even Intel's new graphics offerings will also play a big role in how the market moves.
 
@JarredWaltonGPU - NOT Gainward... I meant the first five or so comments. As to the specs, sure. For Gainward's offerings I am pretty certain that they are, as you put it, 'spot on'.

I think my take on how this will all play out is also 'spot on', but probably a bit optimistic on the time frames.

BTW, what happens to Nvidia's partners when they break their NDA?
 
The term has been around for some years, and I didn't really delve into PC hardware until 2011-2012.

As I remember, the term was often used to describe electric and hybrid vehicles. These products create the impression that their manufacturers are environmentally friendly, even though they mainly make fossil-fuel vehicles.

Using "halo" to describe prestige products is a new phenomenon. Somehow the meaning has shifted from social responsibility to self-indulgence.
 
I'll upgrade my GTX 1070 after 10 years. The 3080 will be affordable by then, or I'll buy a newer series with similar power.
 
BTW, what happens to vendors of nVidia that break their NDA?
Nvidia expresses its displeasure. Maybe a fine, but it happens so often that I doubt it. I mean, if there's clear proof of intentionally leaking things, there would be legal possibilities. More likely if Nvidia was angry enough someone at Gainward or wherever would end up fired as the scapegoat.
 
Nvidia always scares customers a lot. Who on earth will pay $1,499 for an RTX 3090 when Nvidia will surely release a Super card a year later?
There are people who are probably sitting on something like a GTX 980 or older that would really like to upgrade now. Besides that, there's always "something better around the corner." If you can't commit to an upgrade, you'll be sitting on it forever.

I demand that Nvidia give customers a way to upgrade to any "Super" card that appears in the future, via some trade-in program.
Why? AMD has done this too. The RX 500 series was basically the "super" RX 400 series.

In the past, flagships were $400 (ATI 9700 Pro)... it was bearable. Now we are talking about $1,499, and when you demand people pay that much, you should give them the best for more than six to nine months, or at least offer a trade-in option.
The cost of a transistor hasn't come down much in relation to how many we're willing to stuff in a given area. The Radeon 9700 Pro had about 110M transistors; compare that to the 2080 Ti's 18,600M, a 169-fold increase in count. The cost per transistor hasn't kept pace, though; depending on which source you look at, it has only dropped about fourfold at best, flatlining around the 28nm node.
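The arithmetic in that comparison can be checked quickly. A sketch only: the transistor counts are the approximate figures quoted here, and the ~4x cost-per-transistor improvement is the poster's estimate, not a verified number:

```python
# Back-of-envelope check on the transistor count vs. cost-per-transistor argument.
r9700_pro_transistors = 110e6        # Radeon 9700 Pro: ~110 million
rtx_2080_ti_transistors = 18_600e6   # RTX 2080 Ti: ~18,600 million

count_ratio = rtx_2080_ti_transistors / r9700_pro_transistors
print(f"Transistor count grew ~{count_ratio:.0f}x")  # ~169x

# If cost per transistor only improved ~4x over the same span,
# the cost of the whole transistor budget rose roughly 169 / 4 ≈ 42x.
cost_improvement = 4
budget_cost_ratio = count_ratio / cost_improvement
print(f"Relative cost of the transistor budget: ~{budget_cost_ratio:.0f}x")  # ~42x
```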

Moreover, we know that Nvidia's "Super" cards were released just to compete against AMD, not as a real "refresh".
Again, so was the RX 500 series to the RX 400 series. At least NVIDIA doesn't try to make it sound like an entirely brand new generation of cards.
 
The cost of a transistor hasn't come down much in relation to how many we're willing to stuff in a given area. The Radeon 9700 Pro had about 110M transistors; compare that to the 2080 Ti's 18,600M, a 169-fold increase in count. The cost per transistor hasn't kept pace, though; depending on which source you look at, it has only dropped about fourfold at best, flatlining around the 28nm node.

Sorry, trying to justify overpricing the cards won't help at all. Electronics are the only thing that goes down in price over time...

We used to buy a simple calculator with a few transistors for $2,000; now you get one for $1.

Which also means that transistor manufacturing cost went down 2,000-fold!

And you don't have any information about their base cost... it is a kept secret.
 
I'm currently running an overclocked 1080 Ti and it's doing just fine for me so far. I stayed away from the 2000 series since it seemed like the only way to get a real performance increase would've been the 2080 Ti. Even the 2080 Super wasn't THAT much faster than what I have, even without ray tracing turned on. If the 3000 series can give me a card that offers framerates similar to what I get now, but with all the RT goodness enabled and without costing me $1,000 or more, then I'll take a look.
 
One thing these insanely priced GPUs do is trickle last gen's top-tier cards into the used/eBay supply chain. I think I've only paid top dollar for a brand new, just-released card once in my 3+ decades of computer/component buying.
Otherwise, I bide my time and browse eBay when I want to upgrade, or buy last gen's top tier when it goes on super sale. My current Vega 64 was still expensive when I nabbed it off eBay 18 months ago, but once I slapped on a Morpheus Vega cooler and OC'd the crap out of it, I was happy with the price/performance. 😉