NVIDIA Ampere RTX 3000 Series Discussion Thread (Updated Launch Specs)

NVIDIA Ampere is rumored to be announced in August and launched in September at Computex. It is said to be a decent improvement over Turing, with around a 40% performance uplift throughout the lineup. With the launch of the RTX3080Ti (most probably what the RTX2080Ti successor will be called), we can expect 4K 60fps+ in 90%+ of titles at ultra settings, which was not possible with the current high-end cards, though 4K 100fps+ will still be out of reach for the majority of demanding titles.
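As a rough sanity check on that 4K claim (assuming a hypothetical current-gen card averaging 45 fps at 4K ultra in a demanding title; the baseline is illustrative, not a measured result):

```python
# Hypothetical baseline: a current high-end card at 4K ultra in a demanding game
baseline_fps = 45             # assumed figure, not a benchmark result
uplift = 1.40                 # the rumored ~40% improvement
print(baseline_fps * uplift)  # 63.0 -> a 40% uplift would just clear 60 fps
```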

GA102(TITAN, RTX3080Ti):
5376 CUDA Cores
24GB GDDR6

GA103(RTX3080):
3840 CUDA Cores
10GB GDDR6

GA104(RTX3070,RTX3060):
3072 CUDA Cores
8GB GDDR6

GA106(RTX2660Ti,RTX2660):
1920 CUDA Cores
6GB GDDR6

GA107(RTX2650):
1280 CUDA Cores
4GB GDDR6

I doubt there will be non-RTX cards even at the low end; I'd guess all the cards will be ray-tracing enabled and come packed with RT cores.
There is very little info out there, but the potential spec leaks and rumors are interesting. We did expect even more than this, but a 40% performance improvement over Turing is not too bad. I hope there is at least a comparatively bigger jump in ray-tracing performance than in raw performance, making ray tracing a bit more mainstream.

Are we finally getting to the point where a single GPU can handle 4K@60Hz without any limitations?

UPDATED SPECS AFTER LAUNCH
RTX3090(TITAN)
10496 CUDA Cores
24GB GDDR6X
$1499

RTX3080
8704 CUDA Cores
10GB GDDR6X
$699

RTX3070
5888 CUDA Cores
8GB GDDR6
$499

That is pure insanity in comparison to what we were expecting. Let's see how the performance goes in the real world. On paper the specs are insane; if the cards deliver similarly in the real world, then we are in for a big win.
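One crude way to frame the value question, going purely by the listed core counts and MSRPs (this ignores clocks, memory, and actual game performance):

```python
# CUDA cores per dollar, from the launch specs listed above
cards = {"RTX3090": (10496, 1499), "RTX3080": (8704, 699), "RTX3070": (5888, 499)}
for name, (cores, price_usd) in cards.items():
    print(f"{name}: {cores / price_usd:.1f} CUDA cores per dollar")
# RTX3090: 7.0, RTX3080: 12.5, RTX3070: 11.8 -> the 3080 looks like the value pick
```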
 
Last edited:
Jun 6, 2020
After watching Linus's 8K gaming RTX3090 video, I cannot accept that the RTX3090 is just a 10-20% performance improvement over the RTX3080. The performance jump cannot be that small.

You can approximate the performance quite accurately from the number of CUDA cores and their frequencies. The 2080 Ti had almost 50% more CUDA cores than the 2080, yet performed "only" around 30% better in games. The 3090 has a mere 20% more CUDA cores than the 3080, so the theoretical maximum gain in pure performance would be 20%, but as the 2080 Ti example showed, it's always less than that in practice, so around 10-15% seems like a good guess.
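A minimal Python sketch of that back-of-the-envelope estimate, using the reference boost clocks from the public spec sheets (real-game scaling will be lower still, as the 2080 Ti example shows):

```python
# Theoretical FP32 throughput: cores * 2 FLOPs/cycle (FMA) * boost clock (GHz) = GFLOPS
cards = {
    "RTX 2080":    (2944, 1.710),   # (CUDA cores, reference boost clock in GHz)
    "RTX 2080 Ti": (4352, 1.545),
    "RTX 3080":    (8704, 1.710),
    "RTX 3090":    (10496, 1.695),
}

def gflops(cores, clock_ghz):
    return cores * 2 * clock_ghz

for faster, slower in [("RTX 2080 Ti", "RTX 2080"), ("RTX 3090", "RTX 3080")]:
    ratio = gflops(*cards[faster]) / gflops(*cards[slower])
    print(f"{faster} vs {slower}: {100 * (ratio - 1):.0f}% theoretical uplift")
# RTX 2080 Ti vs RTX 2080: 34% theoretical uplift (about 30% in real games)
# RTX 3090 vs RTX 3080: 20% theoretical uplift (so 10-15% in games is plausible)
```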
 
NVIDIA are fools for stating that it is only 10-15% better in performance than the RTX3080. They were probably calculating the difference backwards (taking the RTX3090 as 100% and computing the RTX3080's deficit, which comes out to 10-15%).
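To see the "backwards" effect, a quick check (using 15%, the top of Nvidia's claimed range, as an assumed deficit):

```python
# If the RTX3080 is measured as 15% behind with the RTX3090 as the 100% baseline,
# the uplift in the other direction is larger than 15%:
deficit = 0.15
uplift = 1 / (1 - deficit) - 1
print(f"{uplift:.1%}")  # 17.6%
```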

The RTX3090 is a good 20-30% faster at 4K than the RTX3080 and 50-60% better than the RTX2080Ti.
For the RTX2080Ti crowd who are into high-end GPUs, the RTX3090 is not as bad as many were thinking.

Is it worth double the price? IT IS, and IT IS NOT.

For an uncompromised gaming experience it is worth the extra expenditure, but I think it will be more worthwhile once prices come down a bit.
 
Tom's own review showed that at most it was 20% better than the 3080, and that was in maybe two games; on average it was about 15% faster.
 
Actually, it only averaged 10.5% faster at 4K in their standard test suite, and even less at lower resolutions. : P

In an "expanded" set of tests adding mostly games supporting raytracing and DLSS it averaged 14.4% faster at 4K. Those titles were probably best-case-scenario options that Nvidia likely suggested though.

So yeah, pretty much fitting that 10-15% faster description that Nvidia themselves claimed the other day, probably leaning more toward the lower end of that range for most of today's AAA games, and even lower still for older or otherwise less demanding titles. This card is very much a "Titan" going by a different name. I guess some might consider paying over twice as much for slightly more performance to be a reasonable option, but they would probably be better off donating the difference to starving ocelots or something.
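For anyone wanting to reproduce that kind of average from per-game results, a minimal sketch (the fps pairs below are made up for illustration, not Tom's actual data; they just happen to average out near 10.5%):

```python
# Hypothetical (RTX3080 fps, RTX3090 fps) pairs at 4K -- illustrative only
results = [(95, 104), (61, 68), (72, 80), (110, 121)]
uplifts = [b / a - 1 for a, b in results]
print("per-game:", [f"{u:.1%}" for u in uplifts])
print(f"average uplift: {sum(uplifts) / len(uplifts):.1%}")
# per-game: ['9.5%', '11.5%', '11.1%', '10.0%']
# average uplift: 10.5%
```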
 
