News: Nvidia GeForce RTX 3090 and GA102: Everything We Know


nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
$1600 for a gaming GPU is not acceptable, and we are to blame: we let Nvidia sell at crazy prices when we didn't refuse to buy their products. I think Apple ditched them and went all AMD over their crazy prices, nothing else.

Come on, AMD, we need competition ...
 

mr_tan

Distinguished
Sep 20, 2006
28
2
18,535
Again, people are pulling numbers out of thin air ($1600). No one knows how much these cards will cost, especially in a slow economy where value may matter more to many than reckless spending. Let's wait for launch day to see where this lands.
 

SkyStormy

Reputable
Nov 10, 2015
26
0
4,540
$1600 for a gaming GPU is not acceptable, and we are to blame: we let Nvidia sell at crazy prices when we didn't refuse to buy their products. I think Apple ditched them and went all AMD over their crazy prices, nothing else.

Come on, AMD, we need competition ...
This time Intel is going to be in the competition too.
 
Aug 16, 2020
21
6
15
If it's above $1200, PlayStation and Xbox will be happier, because people won't have the budget to buy it, especially during COVID when the economy isn't stable and workers won't spend a month's salary on something they only play from time to time ... Are these cards only for the "elite"?
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
I doubt that Intel will compete against the flagships in their first gen ... well, we'll see ...
No, but what TH has indicated is that Intel's integrated graphics are going to make lower-tier discrete graphics cards obsolete. Their new CPUs pack graphics capabilities equivalent to Nvidia's 1050-1060 GPUs, which comprised 25% of the Steam hardware in use in July, not even counting the AMD equivalents. Intel is going to squeeze AMD and Nvidia from the bottom up. It wouldn't be hard to imagine Intel's next iteration landing at 1080 Ti equivalency.

The new MS W10 update is supposed to include better switching between integrated and discrete GPUs, allowing the CPU to send only the more demanding GPU workloads to the discrete GPU.
 

Norwegian_Nurse

Distinguished
Dec 31, 2007
2
3
18,515
I for one hope all this "21" numerology means they finally ship HDMI 2.1, with variable refresh rate, quick frame transport, auto low latency mode, and the like. I'm a home cinema nerd, after all. HDR10+ would be great too.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
My hopes:
  1. No Founders Edition 'tax' -- just launch at the base price and let partner cards do whatever (like the 20 Super cards)
  2. No more than 1-2 months between releases
  3. Pricing similar to 20-series, so 3090 would take over for 2080 Ti
Of those three, only number two seems likely to happen. One and three are long shots at best.

We pretty much agree here. My order from most likely to least likely would be 2, 3, 1. I expect #2 to happen.

#3 is tricky to predict because it looks like the model numbers aren't going to match up with previous generations. I expect there to be cards in the same price ranges as Turing, with an additional higher tier above the $1200 range that's closer to $2000 than the $2500 the Titan RTX was. What the names of the cards will be, I don't know and I don't care. If there is a $700-800 card that beats a 2080 Ti in rasterized performance and crushes it in ray tracing performance, I don't care if it is a 3080, or 3070, or whatever else. Price/performance matters, not the names.

No chance for #1 if the leaked dual-sided PCB and complicated dual-sided cooler are the real FE design. It will cost more, and it should cost more, than the basic low-end traditional designs.
 
  • Like
Reactions: JarredWaltonGPU

Chung Leong

Reputable
Dec 6, 2019
494
193
4,860
If it's above $1200, PlayStation and Xbox will be happier, because people won't have the budget to buy it, especially during COVID when the economy isn't stable and workers won't spend a month's salary on something they only play from time to time ... Are these cards only for the "elite"?

Then again, a lot of people have cut down on their recreational spending. The money saved from not going on an overseas vacation is more than enough for Nvidia's top-end product. There could be some serious "revenge buying" this holiday season.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
No, but what TH has indicated is that Intel's integrated graphics are going to make lower-tier discrete graphics cards obsolete. Their new CPUs pack graphics capabilities equivalent to Nvidia's 1050-1060 GPUs, which comprised 25% of the Steam hardware in use in July, not even counting the AMD equivalents. Intel is going to squeeze AMD and Nvidia from the bottom up. It wouldn't be hard to imagine Intel's next iteration landing at 1080 Ti equivalency.

The new MS W10 update is supposed to include better switching between integrated and discrete GPUs, allowing the CPU to send only the more demanding GPU workloads to the discrete GPU.

I don't think Intel will replace the integrated graphics in their desktop CPUs at all. They will release special CPUs at much higher prices with the new GPU, and in notebooks only.

Want proof? Look at their Iris integrated graphics: it never made it into the whole line, even though it's twice as fast as the current integrated graphics ... find me a desktop CPU with integrated Iris graphics.

And Iris was introduced back in 2012 ... eight years ago!

HD Graphics in desktops will continue. The new integrated GPU will be in notebooks only, and maybe some special desktop CPUs, but certainly not across the line; desktop CPUs will stay on HD Graphics.
 

colson79

Distinguished
Apr 16, 2012
71
13
18,635
The 2080 Ti was one of the biggest ripoffs Nvidia ever released. It was double the cost of the 1080 Ti and only 25% faster. I'm really hoping the 3090 is at least 50% faster and the same price as the 2080 Ti. Nvidia is moving down to 7nm for this generation, which should provide a higher yield per wafer and make the chips cheaper to produce.
 

bigdragon

Distinguished
Oct 19, 2011
1,113
559
20,160
Pricing is the biggest concern to me right now.

If Nvidia continues their current pricing or upward pricing trend, then I'll be going with one of the new consoles. I don't see the point of paying top-dollar for just a graphics card when I can get a full console providing a similar gaming experience for far less money. I'm also worried about just how bad this fall and winter will be in terms of economic damage due to COVID-19. Yeah, I've saved money by not going on trips, but elevated grocery prices have already chewed through that money.

If Nvidia drops their pricing or AMD/Intel adds some much-needed competition to the GPU market, then upgrading my PC's graphics looks more appealing than a console. I'm not paying a premium to play the same games (and miss out on some great console exclusives).

Show me that pricing.
 
  • Like
Reactions: nofanneeded
The 2080 Ti was one of the biggest ripoffs Nvidia ever released. It was double the cost of the 1080 Ti and only 25% faster. I'm really hoping the 3090 is at least 50% faster and the same price as the 2080 Ti. Nvidia is moving down to 7nm for this generation, which should provide a higher yield per wafer and make the chips cheaper to produce.
Totally agree with the first part about the RTX 2080 Ti being priced too high. The second bit about 7nm, though ... Indications (TSMC doesn't disclose pricing details on their contracts) are that 7nm costs about 70% more than 12nm. Meaning, if a chip is the same size (because it's more complex and has more transistors), it will cost 70% more. TSMC and other fab companies also don't publicly disclose yields, though all indications are that the 7P lithography is doing pretty well (better than Intel's 10nm+ at any rate). Yields do factor into the price increase, of course.

Basically, if GA102 ends up as a 700mm2 die size, it will be significantly more expensive to produce than the TU102 chips used for RTX 2080 Ti. In fact, a 450mm2 chip would probably cost as much to produce as the TU102 due to the higher 7P costs.
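To put rough numbers on that reasoning, here's a back-of-the-envelope sketch. The wafer costs and defect densities below are my own illustrative assumptions (neither TSMC's nor Nvidia's figures), using the standard dies-per-wafer approximation and a simple Poisson yield model:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough dies-per-wafer estimate (ignores scribe lines and partial edge dies)."""
    r = wafer_diameter_mm / 2
    # Usable wafer area divided by die area, minus a loss term for the edge.
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2):
    """Poisson yield model: fraction of dies that land zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2, wafer_cost, defects_per_cm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_cm2)
    return wafer_cost / good_dies

# Illustrative inputs only: assume a 12nm wafer costs X and a 7nm wafer ~1.7X
# (the ~70% figure from the post), and that the newer 7nm node runs at a
# somewhat higher defect density than mature 12nm.
X = 6000  # hypothetical 12nm wafer cost in dollars
tu102_cost = cost_per_good_die(754, X, 0.1)          # TU102 is ~754 mm^2 on 12nm
hypo_7nm_cost = cost_per_good_die(450, 1.7 * X, 0.2) # hypothetical ~450 mm^2 7nm die
print(f"TU102-class die: ~${tu102_cost:.0f}; 450 mm^2 7nm die: ~${hypo_7nm_cost:.0f}")
```

Under these assumptions the two per-die costs land in the same ballpark, which is the point being made: a much smaller 7nm die can cost about as much to produce as the big 12nm TU102.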
 

Chung Leong

Reputable
Dec 6, 2019
494
193
4,860
If Nvidia continues their current pricing or upward pricing trend, then I'll be going with one of the new consoles. I don't see the point of paying top-dollar for just a graphics card when I can get a full console providing a similar gaming experience for far less money.

Hard to sneak a game console past the wife. A video card, on the other hand, is an indispensable piece of equipment :)
 

bigdragon

Distinguished
Oct 19, 2011
1,113
559
20,160
Hard to sneak a game console past the wife. A video card, on the other hand, is an indispensable piece of equipment :)
I used to think that way too! However, PC hasn't had that "killer app" that demands powerful graphics since the days of Crysis. There's also far less opportunity to create mods and custom content for games given how hostile the industry has become towards modders in the past decade. And, let me be completely honest: I suck at modern multiplayer games. All of these factors combine to reduce the value of graphics cards to me.

I don't see Cyberpunk 2077 as being the PC's next Crysis given that it's a multi-platform game. I also don't see a developer embracing Unreal Tournament-like user content in a big, popular, mainstream game. The things I love about the PC platform have been on the decline for a long time. It's hard to justify an expensive new GPU when the gaming experience is nearly identical to the consoles, aside from the input method.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Totally agree with the first part about the RTX 2080 Ti being priced too high. The second bit about 7nm, though ... Indications (TSMC doesn't disclose pricing details on their contracts) are that 7nm costs about 70% more than 12nm. Meaning, if a chip is the same size (because it's more complex and has more transistors), it will cost 70% more. TSMC and other fab companies also don't publicly disclose yields, though all indications are that the 7P lithography is doing pretty well (better than Intel's 10nm+ at any rate). Yields do factor into the price increase, of course.

Basically, if GA102 ends up as a 700mm2 die size, it will be significantly more expensive to produce than the TU102 chips used for RTX 2080 Ti. In fact, a 450mm2 chip would probably cost as much to produce as the TU102 due to the higher 7P costs.

This is all guessing, because you don't "know" the base cost of anything or how much profit Nvidia is asking for. I don't think this is about process cost at all; it's the high margin Nvidia asks for on everything. The relationship isn't linear, and we didn't see this at all when other chip makers moved to 7nm ...

Your "70% more than 12nm" doesn't translate into a 70% higher selling price if they're asking for some absurd 300% profit margin.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
I used to think that way too! However, PC hasn't had that "killer app" that demands powerful graphics since the days of Crysis. There's also far less opportunity to create mods and custom content for games given how hostile the industry has become towards modders in the past decade. And, let me be completely honest: I suck at modern multiplayer games. All of these factors combine to reduce the value of graphics cards to me.

I don't see Cyberpunk 2077 as being the PC's next Crysis given that it's a multi-platform game. I also don't see a developer embracing Unreal Tournament-like user content in a big, popular, mainstream game. The things I love about the PC platform have been on the decline for a long time. It's hard to justify an expensive new GPU when the gaming experience is nearly identical to the consoles, aside from the input method.

Consoles are not good for strategy games, sadly, because a console doesn't come with a mouse as a standard controller the way the gamepad is. I always wished all consoles shipped with a mouse AND a joystick together, but it didn't happen. And when the controller (the mouse) isn't standard, devs won't waste time coding for it.
 
Aug 18, 2020
1
1
15
Totally agree with the first part about the RTX 2080 Ti being priced too high. The second bit about 7nm, though ... Indications (TSMC doesn't disclose pricing details on their contracts) are that 7nm costs about 70% more than 12nm. Meaning, if a chip is the same size (because it's more complex and has more transistors), it will cost 70% more. TSMC and other fab companies also don't publicly disclose yields, though all indications are that the 7P lithography is doing pretty well (better than Intel's 10nm+ at any rate). Yields do factor into the price increase, of course.

Basically, if GA102 ends up as a 700mm2 die size, it will be significantly more expensive to produce than the TU102 chips used for RTX 2080 Ti. In fact, a 450mm2 chip would probably cost as much to produce as the TU102 due to the higher 7P costs.
GA102 is Samsung 8nm. Only GA100 is at TSMC 7nm.
 
  • Like
Reactions: spentshells

wgulker

Commendable
Jan 2, 2020
11
2
1,515
That would be awesome if it happens. I’m trying to brace myself for far less awesome prices, though.
My guess is that if someone is going to pay $1300 for a gaming card that is 50-100% faster than a 2080 Ti, they would also pay $2000 to feed their obsession. It would be interesting to know how many more they would sell at $1300 vs. $2000. I tend to believe the difference in sales numbers would not be as large as you'd think. At these price points, this is a very niche market.
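As illustrative arithmetic on that point (the prices are the hypothetical ones from this thread, not announced figures): at the higher price, Nvidia needs far fewer unit sales to earn the same revenue, so even a real drop-off among niche buyers can still pay off.

```python
# Hypothetical price points from the discussion above -- not announced figures.
price_low, price_high = 1300, 2000

# Fraction of the $1300 unit volume that must still sell at $2000
# for revenue to break even.
breakeven_fraction = price_low / price_high

print(f"At ${price_high}, matching ${price_low} revenue takes only "
      f"{breakeven_fraction:.0%} of the unit sales.")
```

In other words, if more than about 65% of would-be $1300 buyers would also pay $2000, the higher price wins on revenue.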
 
  • Like
Reactions: JarredWaltonGPU
I used to think that way too! However, PC hasn't had that "killer app" that demands powerful graphics since the days of Crysis. There's also far less opportunity to create mods and custom content for games given how hostile the industry has become towards modders in the past decade. And, let me be completely honest: I suck at modern multiplayer games. All of these factors combine to reduce the value of graphics cards to me.

I don't see Cyberpunk 2077 as being the PC's next Crysis given that it's a multi-platform game. I also don't see a developer embracing Unreal Tournament-like user content in a big, popular, mainstream game. The things I love about the PC platform have been on the decline for a long time. It's hard to justify an expensive new GPU when the gaming experience is nearly identical to the consoles, aside from the input method.
I guess if you're okay with 30fps.
 

SkyStormy

Reputable
Nov 10, 2015
26
0
4,540
Imagine you bought an RTX 3090 today, and then a couple of months later its Super version is released with 20% more performance. It would be a slap in the face. So don't buy any Nvidia card until its Super version is out.
 
Imagine you bought an RTX 3090 today, and then a couple of months later its Super version is released with 20% more performance. It would be a slap in the face. So don't buy any Nvidia card until its Super version is out.
That's extremely unlikely. Even if you take the 2080/2070/2060 Super launches, those cards came out at least six months after the initial non-Super GPUs. To be specific:

2060 Super came out six months after the RTX 2060. This was purely to counter AMD's RX 5700/5700 XT launch. (Also, price was $50 higher.)
2070 Super release was nine months after the RTX 2070.
2080 Super release was ten months after the RTX 2080.

Your 20% more performance claim is way off as well. The 2060 Super is 13% faster than the RTX 2060. The 2070 Super is 12% faster than the 2070. The 2080 Super is only 7% faster than the 2080. I suppose you could be referring to the 1660 Super, which launched seven months after the 1660. Or the 1650 Super, which came out five months after the 1650, but those are in a very different category from the RTX 3000 launch -- budget to midrange, not high-end and extreme.

Plus, the RTX 2080 Ti never got an updated 'super' version, and the 3090 is clearly going after that top-of-the-line classification. I don't think you'll see any 3000 series 'super' cards until at least 2021, if ever -- that or the Super branding will be present at launch. I'm still anticipating one of the cards being called "Ultimate" or similar, though -- RTX 3090 Ultimate would certainly make the "Ultimate Countdown" accurate.

If you think you'll buy an RTX 3090 GPU, the best advice I can give is to buy as soon as the price reaches a level you're comfortable paying -- or just don't buy it at all, because it's too expensive for your budget. For those who have the money: the RTX 2080 Ti cost $1200 at launch, and for nearly two years the only faster GPU available was the $2500 Titan RTX. I strongly suspect the RTX 3090 will be the top Nvidia consumer GPU for at least a year, if not two.

Still not worth $1500-$2000, but if you must have the 'best,' that's where it's likely to hit your wallet.
 
  • Like
Reactions: Avro Arrow