News GeForce RTX 3090, RTX 3080 Specifications Reportedly Exposed In Vendor Spec Sheets

sstanic

Reputable
Aug 6, 2016
3090 seems like an xx80 Ti with a bit more VRAM, but the 3080 definitely seems like an xx70. Nvidia seems to have simply 'upgraded' the naming scheme to justify higher pricing. They probably learned that lesson with the RTX 2000 series.
 
Jun 12, 2020
I think before I plop down 600-800 dollars on a GPU, I will wait until AMD brings out its RDNA2 cards. Why pay these premium prices until we see what both AMD and NVIDIA have to offer? I will probably not purchase a card until I see reviews of both the 3000 series and whatever AMD calls theirs, and I need one! I am still on a Pascal card, and it's not a 1080 Ti. If I could go back 4 years, I would have purchased the 1080 Ti. It was a bargain compared to Turing and Ampere, even with their performance increases.
 
Reactions: Kridian and gg83

spongiemaster

Prominent
Dec 12, 2019
sstanic said:
3090 seems like an xx80 Ti with a bit more VRAM, but the 3080 definitely seems like an xx70. Nvidia seems to have simply 'upgraded' the naming scheme to justify higher pricing. They probably learned that lesson with the RTX 2000 series.
The 3080 has the same number of CUDA cores as the 2080 Ti, and we know there will be an IPC improvement, so the 3080 will be faster in rasterized graphics. Rumors indicate it will crush a 2080 Ti in ray tracing, which is where Nvidia's focus is now, and it has 25% more memory than a 2080. How does that sound like an xx70?
 

spongiemaster

Prominent
Dec 12, 2019
I am staying with my 2080 Ti, it's enough for my 4K60. But 7,552 CUDA cores for a Titan? Seriously? I think it will be the first time that a Titan is worth its price.
It will only appear if Nvidia needs it, which isn't likely. If AMD can challenge the 3090 with an RDNA2 refresh next year, we may see Nvidia drop the hammer with a 3090 Ti. Otherwise, I would expect a 3080 Ti to slot in between the 3080 and 3090 in a refresh next year.
 

sstanic

Reputable
Aug 6, 2016
spongiemaster said:
The 3080 has the same number of CUDA cores as the 2080 Ti, and we know there will be an IPC improvement, so the 3080 will be faster in rasterized graphics. Rumors indicate it will crush a 2080 Ti in ray tracing, which is where Nvidia's focus is now, and it has 25% more memory than a 2080. How does that sound like an xx70?
Yes, I am aware of all that, and hopefully it is even better, but usually the xx80 and xx70 were the first two to be introduced. That used to be a midrange chip and a slightly cut-down midrange chip, which Nvidia would then market as high-end. If these rumours are correct, this time they are again introducing a midrange chip and its slightly cut-down version, but under different names. The increase in CUDA cores and overall performance is the natural result of two years of development, just as it always was, so we see a smaller node, more VRAM, etc., but it is still essentially a midrange chip.

So why keep chip development on the same-ish track but change the marketing names? There could be various reasons: leaving room to slot in future products once they see how AMD's chips are doing, and so on. One obvious reason is that newer, smaller nodes cost significantly more than the old 16-12 nm, probably upwards of 75% more, so Nvidia gets more performance but at a higher cost. And since the market didn't really like the large price hike with the RTX 2000 series, it seems logical to go with more performance = more money = higher-ranking name, since buyers' discontent is subdued significantly that way. And so the 3070 becomes the 3080, and the 3080 Ti becomes the 3090.
 

mdd1963

Polypheme
VideoCardz leaks Gainward's GeForce RTX 3090 and GeForce RTX 3080 Phoenix Golden Sample graphics cards.

GeForce RTX 3090, RTX 3080 Specifications Reportedly Exposed In Vendor Spec Sheets
24 GB of VRAM, it said? Wow!

That would seem largely wasted at 'only' 4K, I'd think; so, short of a few folks attempting to tinker with 8K gaming, having a bunch of idle VRAM at only 1440p hardly seems like money well spent. (Well, short of a handful of folks thinking twice the VRAM must be twice as fast.)
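For perspective, a quick back-of-envelope sketch (plain Python, raw framebuffer arithmetic only; real VRAM use is dominated by textures, geometry, and driver overhead, which this deliberately ignores) of why a single frame, even at 8K, is a rounding error next to 24 GB:

```python
# Raw framebuffer size for a single 32-bit-per-pixel frame.
# This is NOT total VRAM usage, just the frame itself.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

print(f"4K: {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
print(f"8K: {framebuffer_mib(7680, 4320):.1f} MiB")  # ~126.6 MiB
```

Even with double or triple buffering, the frames themselves take well under 1 GB at 8K; the rest of a 24 GB pool would be there for assets, not resolution.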
 
Reactions: RodroX

RodroX

Estimable
I hope there's a lot more going on under the hood, because increasing the CUDA core count alone, plus faster memory, won't justify a huge price increase. Not to mention the substantial power requirements (at least on these spec sheets).

I think Nvidia didn't get the memo about PC tech getting smaller and more efficient while keeping the same or lower price.
 
