News Nvidia will increase shader counts but not ROPs on RTX 50-series GPUs — except on the lowest-tier GB207, according to leak

Admin

Administrator
Staff member

edzieba

Distinguished
Jul 13, 2016
ROP scaling is more closely tied to output resolution than to render complexity, since the ROPs write finished pixels into output buffers rather than working in intermediate shading. If you're busting out a triple-UHD monitor setup running at 240 Hz each, then ROP count may be something to take note of, but otherwise not.

Remember when pixel fillrate was a figure of note in GPU performance? Remember how it isn't even a footnote now, and hasn't been for years? ROPs are what contribute to pixel fillrate.
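To put rough numbers on that, here's a back-of-the-envelope sketch. The ROP count and clock are illustrative (RTX 4090-class figures), and real-world fillrate is further limited by memory bandwidth and blending:

Code:
# Rough check: is ROP throughput ever the bottleneck for display output?
# ROP count and clock are illustrative RTX 4090-class figures.
rops = 176                   # render output units
clock_hz = 2.5e9             # ~2.5 GHz boost clock
peak_fill = rops * clock_hz  # theoretical pixels/s at 1 pixel per ROP per clock

# The worst case mentioned above: three UHD monitors, each refreshing at 240 Hz.
monitors, w, h, hz = 3, 3840, 2160, 240
scanout_demand = monitors * w * h * hz  # pixels/s actually sent to the displays

print(f"peak fillrate:  {peak_fill / 1e9:.0f} Gpix/s")       # ~440 Gpix/s
print(f"scanout demand: {scanout_demand / 1e9:.2f} Gpix/s")  # ~5.97 Gpix/s
print(f"headroom:       {peak_fill / scanout_demand:.0f}x")  # ~74x

Even that extreme setup consumes a tiny fraction of the theoretical fillrate; it's overdraw and blending within a frame, not scanout, that actually eat ROP throughput, which is why ROP count rarely matters in practice.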
 

Deleted member 2731765

Guest
These are all fake, "made up" specs. I can fully confirm this after talking to other sources (AIB board-channel forums and Benchlife members).

"Harukaze5719" just copied the data from Kopite7kimi's leaked tweet (which is itself wrong), made some of his own predictions, and then added assumptions regarding ROPs.

Nothing is final.

So don't bother scratching your head too much over this leaked spec data, or over any SM or ROP theory. This is literally the fourth or fifth time the said "Kopite7kimi" leaker has changed his own prediction/tweet.

Earlier, he himself debunked the 512-bit memory-bus rumor for the flagship GB202 die. And now 512-bit is back on track. Of course, the full die might have a 512-bit interface, but Nvidia will never use the FULL die for any gaming GPU.

And then recently he tweeted that the GB202-based RTX 5090 might sport a 448-bit memory bus instead. So it's all confusing and messed up right now.
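For context on what that bus-width back-and-forth actually means: peak memory bandwidth is just bus width times the per-pin data rate. A minimal sketch, assuming the widely rumored (unconfirmed) 28 Gbps GDDR7 launch speed:

Code:
# Peak memory bandwidth = bus width (pins) * per-pin data rate / 8 bits per byte.
# The 28 Gbps GDDR7 data rate is a rumor-stage assumption, not a confirmed spec.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

for bus in (512, 448):
    print(f"{bus}-bit @ 28 Gbps -> {bandwidth_gb_s(bus, 28):.0f} GB/s")
# 512-bit -> 1792 GB/s, 448-bit -> 1568 GB/s
# For comparison, AD102's 384-bit bus with 21 Gbps GDDR6X gives ~1008 GB/s.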

Expect a dozen more random tweets like these to spawn until these cards arrive on shelves, with each tweet having some changed/altered info from before!


EDIT:

Those total core counts for the new Blackwell gaming GPUs are based purely on the assumption that the Blackwell lineup will also use Ada's 128-cores-per-SM layout. But these are not yet final/official.

Hence we get GB202 with 24,576 cores.
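The arithmetic behind that figure, as a minimal sketch; the 128-cores-per-SM layout is Ada's, and the 192-SM count is the rumored full GB202 die implied by the leak, neither confirmed for Blackwell:

Code:
# Total CUDA cores = SM count * cores per SM.
# 128 cores/SM is Ada's layout, assumed (not confirmed) to carry over;
# 192 SMs is the rumored full GB202 die, implied by the 24,576-core figure.
CORES_PER_SM = 128
gb202_sms = 192
print(f"GB202: {gb202_sms} SMs x {CORES_PER_SM} = {gb202_sms * CORES_PER_SM} cores")
# GB202: 192 SMs x 128 = 24576 cores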
 

mac_angel

Distinguished
Mar 12, 2008
I kinda feel like they are purposely gimping the gaming GPUs by a LOT, to stop companies from buying and using them for AI and other professional applications, forcing them to pay hugely premium prices for fully working chips.
 

valthuer

Prominent
Oct 26, 2023
Deleted member 2731765 said:
These are all fake, "made up" specs. [...] Nothing is final.

Sometimes I wonder: is there any truth to any of these rumours at all?

And I'm not just talking about the specs.

Can we even begin to trust the speculated release dates?
 
I would completely expect this if AMD continues with their current strategy of pricing their cards too close to Nvidia's. People will not be happy, but they will still buy it, especially because Nvidia is making more from its AI cards than GeForce could ever bring in.
 

baboma

Respectable
Nov 3, 2022
>Only thing I feel of this new gen is the GDDR7 will be the same with 30% price increase.

At low/mid, i.e. 5060/5070, every rumor indicates a continuing "cost optimization" trend of downspec'ing to fit within specific price ranges. Ballparking the rumored nerfs for the 5060 vs the 4060, the 5060 should still stay at the $300 mark.

While users obviously don't like it, it's a sound business move in light of, yes, the higher priority given to AI. It's practiced by both Nvidia and AMD, and presumably by Intel for Battlemage.

At the high end, i.e. the 5090, rumored specs show a substantial perf increase, and a proportional price increase should be expected. I'm ballparking a $2K MSRP for the 5090, with street pricing possibly higher.

Again, good business sense. High-end users are price insensitive and will pay more to get the best perf. Low/mid users are price sensitive, so downspec'ing to maintain specific price points is the norm.

As usual, the Videocardz piece on this is clearer, with accompanying charts comparing Blackwell vs Ada SKUs. It's still rumor status, so while specific figures may be off, the gist gleaned above should be fairly on point.

https://videocardz.com/newz/geforce-rtx-50-blackwell-gb20x-gpu-specs-have-been-leaked
 
I'm waiting for NV or even AMD to price themselves out of the market. One day they'll release a generation that costs too much even for die-hards to swallow, and somebody like Intel, or even one of the Chinese companies, will come along with a range capable of doing everything, maybe a tad slower, but at a massive price drop. By the time NV/AMD realise and cut their own pricing, hopefully it will be too late to recover sales numbers, and they will learn for the next generation. As consumers we put up with a lot of gouging, but eventually we come to our senses and fight back; electric vehicles are a prime example: car companies assigned stupidly high prices and now can't sell them!
 

baboma

Respectable
Nov 3, 2022
>I'm waiting for NV or even AMD to price themselves out of the market.

That won't happen. There'll be parts for every existing price point. Only at the high end should there be a price increase.

Just as with the 3K->4K progression, 4K->5K will bring a marginal perf increase at each price point, probably ~5-10%: not enough to entice gen-on-gen upgrades, but enough for 3K->5K upgrades.

Enthusiasts are upset because they can see how the sausage is made, viz. the downspec'ing. For regular users, it's business as normal: they get a "good enough" perf bump at the same price points as the previous generation. The disappointment expressed here isn't shared by regular buyers.
 

Deleted member 2731765

Guest
valthuer said:
Sometimes I wonder: is there any truth to any of these rumours at all? And I'm not just talking about the specs. Can we even begin to trust the speculated release dates?

Well, that's a bit of a tricky question to answer, but yes, there is some truth to them. It depends on what type of leak/rumor we are dealing with. Not all leaks are the same, though.

No leak or rumor should be fully trusted, no matter how reliable its source is. Though if the poster/leaker has at least some sort of proof to back up their claim, then it's okay to trust it.

But in this case the specs are random guesstimates, and are just made up. And Kopite7kimi's leaks are unpredictable: he himself keeps changing the specs, so the data remains inconsistent.

Anyone with prior knowledge and experience of gaming GPUs, who has been keeping track of their development/architecture, could make similar assumptions and guesses.

Even the speculated release dates are tricky to trust, because even Nvidia themselves won't have a concrete, set timeline for the release of these GPUs; that is totally dependent on current and future market trends.

And recently at Computex there was also a change of mind/plan regarding the release date of the GPUs, so the previous rumors are sort of debunked for now.

But yes, the gossip that Nvidia might release the RTX 5080 before the 5090 is correct, because this has happened in the past as well.

Also, the article's claim that Nvidia won't be increasing the ROP count is not correct. That is just pure nonsense.
 