News: RTX 5070 Ti rumor points to 8,960 CUDA cores and 300W TDP — Blackwell GPU may use the same GB203 die as the RTX 5080

I just checked TPU’s GPU database and the 4080 Super has 10,240 cores; the core count in the rumoured specs shows only a paltry increase. I do hope the 5080 doesn’t disappoint…

Hopefully, it will be better than the 4090 and come with 24 GB of GDDR7 VRAM.

That will give many people the incentive to upgrade.
 
I figure they will gimp it with a 192-bit mem bus, 12GB of VRAM, and save the 18GB model for the Super version.
Seriously, the 4070 Ti Super is already on a 256-bit bus, so the 5070 Ti is not regressing; it's on GB203, same as the 5080. I agree the regular 5070 will be 12GB on a 192-bit bus initially, and the Super version will get 3GB GDDR7 dies and reach 18GB with a $100 price increase.
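As a sanity check on those capacities, here's a quick Python sketch of the bus-width math; the 32-bit-per-chip channel width is standard for GDDR, while the 2GB/3GB module sizes are the rumoured configurations, not confirmed specs:

```python
# Rough VRAM capacity math. GDDR chips sit on 32-bit channels (standard),
# while the 2 GB and 3 GB module sizes below are the rumoured configurations.

def vram_capacity_gb(bus_width_bits: int, gb_per_module: int) -> int:
    """Total VRAM in GB for a given bus width and per-module capacity."""
    modules = bus_width_bits // 32  # one GDDR chip per 32-bit channel
    return modules * gb_per_module

print(vram_capacity_gb(192, 2))  # 12 GB -> rumoured vanilla 5070 launch config
print(vram_capacity_gb(192, 3))  # 18 GB -> possible Super refresh with 3 GB GDDR7 dies
print(vram_capacity_gb(256, 2))  # 16 GB -> a 256-bit GB203 card such as the rumoured 5070 Ti
```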

The 5070 Ti is an upgrade compared to the 4070 Ti. The 4080 had 27% more cores than the 4070 Ti, yet the 5080 has only 20% more cores than the 5070 Ti. The 5070 Ti should see a healthy boost, relatively larger than the 5080's, assuming it isn't gimped with much slower memory relative to the 5080.
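For the curious, here's the gap math in Python, using TPU's published Ada core counts; the Blackwell numbers are the rumoured figures from the article, not confirmed specs:

```python
# Relative core-count gaps. Ada counts are published specs;
# the Blackwell counts are the rumoured figures from the article.
cores = {
    "RTX 4070 Ti": 7680,
    "RTX 4080": 9728,
    "RTX 5070 Ti": 8960,   # rumoured
    "RTX 5080": 10752,     # rumoured
}

print(f"4080 over 4070 Ti:    +{cores['RTX 4080'] / cores['RTX 4070 Ti'] - 1:.0%}")     # ~27%
print(f"5080 over 5070 Ti:    +{cores['RTX 5080'] / cores['RTX 5070 Ti'] - 1:.0%}")     # 20%
print(f"5070 Ti over 4070 Ti: +{cores['RTX 5070 Ti'] / cores['RTX 4070 Ti'] - 1:.0%}")  # ~17%
print(f"5080 over 4080:       +{cores['RTX 5080'] / cores['RTX 4080'] - 1:.0%}")        # ~11%
```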
 
I just checked TPU’s GPU database and the 4080 Super has 10,240 cores; the core count in the rumoured specs shows only a paltry increase. I do hope the 5080 doesn’t disappoint…
I bet they will disappoint in the price department; with AMD not really competitive at the high end, they will surely be very overpriced…

Well, it's kind of a reminder to ask whether gaming at ultra quality is even remotely worth it.
 
Question: Will the 5080 have 33% more performance to go with its 33% higher power draw? If not, then they aren't scaling performance properly.
Looking at the modest CUDA core increase and the same TSMC 5nm-class process, I am skeptical that you will see a 30% performance uplift in most cases. The usual improvements are likely to come from newer RT and Tensor cores and the optical flow accelerator for frame generation. I suspect they may introduce more bespoke technology on Blackwell, because there is very limited room for performance improvement through hardware alone. The use of GDDR7 may contribute some performance gain, but knowing Nvidia, they may simply use the opportunity to cut down the memory bus.
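One way to frame the scaling question from the post above: if the performance gap ends up smaller than the power gap, perf-per-watt actually regresses within the stack. A small sketch using the rumoured 300W/400W TBPs and purely hypothetical uplift figures (no benchmarks implied):

```python
# Does 33% more power buy 33% more performance? If not, perf-per-watt regresses.
# TBPs are the rumoured 300W (5070 Ti) and 400W (5080); the uplift figures
# below are purely hypothetical placeholders, not benchmark results.

def perf_per_watt_change(perf_uplift: float, power_uplift: float) -> float:
    """Relative change in perf/W when performance and power both increase."""
    return (1 + perf_uplift) / (1 + power_uplift) - 1

power_uplift = 400 / 300 - 1  # ~33% more board power

for perf_uplift in (0.20, power_uplift):  # core-count-limited vs power-proportional scaling
    change = perf_per_watt_change(perf_uplift, power_uplift)
    print(f"+{perf_uplift:.0%} perf at +{power_uplift:.0%} power -> perf/W {change:+.1%}")
```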
 
so a 4080 released at $1,200
and the 5070 is closer to the 4080 than to the 4070
and we'll have tariffs.

Will this be the first 70-tier GPU to cost a grand?
No, it won't, because the so-called “4080” is in reality already an xx70 Ti GPU, so it already hit the $1K mark.
 
I can already see Nvidia funneling PC gamers to the 5090.

It worked on me the last time they did it, with the 4090. 😂😂
I do wonder how long this strategy can work; outside of crypto mining and lockdowns, it's not like people will keep upgrading every gen or two at a $2,000 price tag, especially when the AI bubble looks close to bursting.
 
I do wonder how long this strategy can work; outside of crypto mining and lockdowns, it's not like people will keep upgrading every gen or two at a $2,000 price tag, especially when the AI bubble looks close to bursting.

At this point, even if the AI bubble does burst (which I very much doubt), Nvidia has established itself well in other areas too, so I don't think they'd face any kind of fundamental threat to their long-term survival.

Well, at least not to the point of reverting to their 2017 GPU prices, when one could buy a flagship card like the 1080 Ti for $700.
 
If AMD's pricing is more reasonable, Nvidia will be funnelling GPU buyers to their main competitor...

I very much doubt either of the two has any intention of lowering prices.

After all, consumer GPUs represent a rather insignificant source of profit compared to their total earnings.

But even if AMD did that, they would first have to drastically improve their cards' ray tracing in order to make them more attractive.

When people opt for Nvidia, they partly do it under the perception that they're buying from the best GPU company around. To a certain extent, they are not wrong to think this way.

From that perspective, AMD's unwillingness to compete at the high end doesn't help much either.

But, like Nvidia, they just don't care at this point.

PC gaming is an ever-shrinking market, and there's much more money to be made elsewhere: AI, to be exact.
 
Get your concern, but the math indicates a 25% increase in power draw... 320W to 400W is 80 watts, and 80/320 is 0.25.


I agree with your math, but my comment was based on the article's contents:


Compared to the RTX 5080 specs Kopite shared earlier, the RTX 5070 Ti represents an enormous downgrade in specs. The RTX 5080 has 20% more CUDA cores (10,752) and, as a result, could pull 33% more power (400W TBP).
(Emphasis mine)
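For what it's worth, both percentages check out; they just use different baselines. The 25% presumably compares the 5080's rumoured 400W to the 4080's 320W TBP, while the article's 33% compares it to the 5070 Ti's rumoured 300W. A quick check:

```python
# The 25% vs 33% disagreement is just a baseline difference.
# 320W is the RTX 4080's TBP; 300W and 400W are the rumoured 5070 Ti / 5080 figures.
tbp_watts = {"RTX 4080": 320, "RTX 5070 Ti": 300, "RTX 5080": 400}

print(f"5080 vs 4080:    +{tbp_watts['RTX 5080'] / tbp_watts['RTX 4080'] - 1:.0%}")     # +25% (the reply's math)
print(f"5080 vs 5070 Ti: +{tbp_watts['RTX 5080'] / tbp_watts['RTX 5070 Ti'] - 1:.0%}")  # +33% (the article's comparison)
```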
 
No. It's rumored to be around a 4090 in performance. That would be about 25% faster at 4K. TDP is peak power draw, so it doesn't tell you much about typical in-game power draw.
But I remain skeptical about the performance claims; IIRC, they have a tendency to compare the latest DLSS against the old DLSS on the previous generation.