News: Nvidia GeForce RTX 3090 Ti Officially Launches, Starting at $1,999

Comparing the 3090, and now its Ti sibling, to the "proper" Titan line is just hurting Nvidia itself. Talk about a self-inflicted wound... Until they release a driver that unlocks proper CUDA compute ops for the GA102 die, it will still be a "kiddies" toy. This is something Linus already covered back when the comparison first came up, and Nvidia, it seems, took the hint and stopped drawing the parallel.

Anyway, that's all beside the point of this card. People who can make use of it will get it regardless. As someone who doesn't need this card, I can only be thankful that I don't. A $2K starting price is just too bonkers for something that isn't a pro card. But it'll do or die in the benchmarks. Looking at the numbers, this thing will be at best 10% better than the 3090 while using 100W more and costing over ~30% more ($500 over the 3090's MSRP, but probably more like $1K on the street).

I'd also love to see "room" temperature tests. I'm really curious how much this thing raises room temperature, and how the ambient temperature affects the card in turn. We're getting to the point where the climate you live in is going to matter, which is quite frankly bonkers.

Regards.
 
We're getting to the point where the climate you live in is going to matter, which is quite frankly bonkers.
If you live near the Arctic Circle, you could get a gaming PC and a heater all in one.
 
This is something Linus already covered back when the comparison first came up, and Nvidia, it seems, took the hint and stopped drawing the parallel.
Nvidia did still mention the Titan RTX, rather than the 3090, in its Reviewer's Guide -- which makes sense, as this card is only a modest bump over the 3090.
 
This card is, in a word, dumb. It should come as little surprise, though, as Nvidia is going to milk as much money as they can out of these chips. They'd be better off, in my opinion, making more 3080 cards, seeing as that's where the bulk of the high end will buy. The minuscule differences between the 3090 and this card just don't justify the monetary outlay, even for e-peen braggarts.
 
What exactly are you referring to here, FP64 (double precision) performance? Titans have always used the same drivers available to regular gaming cards.
I had to go and refresh my memory, because I do remember the Titan being better at FP compute than the regular GeForce cards. That changed slightly with later generations, but the difference was made mainly via driver locks. Linus already demonstrated that, so I won't go over it again. In certain compute workloads, the Titan cards are just different, or at least used to be.

So, in short, from what I can see, while they do use the same driver suite, there are still differences in how they behave.

Regards.
 

TJ Hooker

Titan
Ambassador
That changed slightly with later generations, but the difference was made mainly via driver locks.
Yes, FP64 performance is limited by drivers/firmware, but that isn't new. The majority of Titans have had compute performance (including FP64) equivalent to the regular GeForce cards, or only slightly higher where they have a few extra cores.
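If anyone wants to sanity-check this on their own card, here's a minimal sketch (assuming a Python environment with PyTorch and a CUDA GPU; the matrix size and iteration count are just illustrative) that compares FP32 vs FP64 matrix-multiply throughput. On GeForce parts the ratio is typically heavily cut down:

```python
# Minimal FP32 vs FP64 throughput comparison on the current CUDA GPU.
# On most GeForce cards FP64 runs at a small fraction (e.g. 1/64) of the
# FP32 rate; a full compute part shows a much healthier ratio.
import time
import torch

def matmul_tflops(dtype, n=4096, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    # Each n x n matmul costs ~2*n^3 floating-point operations.
    return 2 * n**3 * iters / (time.time() - start) / 1e12

fp32 = matmul_tflops(torch.float32)
fp64 = matmul_tflops(torch.float64)
print(f"FP32: {fp32:.1f} TFLOPS, FP64: {fp64:.2f} TFLOPS, "
      f"ratio ~{fp32 / fp64:.0f}:1")
```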

The only consistent, defining characteristic of Titan cards that I've found is that they cost more than all the regular GeForce cards at the time they're released. Every other definition I've seen people come up with for what constitutes a 'real' Titan (usually in contrast to the RTX 3090) is broken by at least one previous Titan.
 

Phaaze88

Titan
Ambassador
"Guys/gals, please help me, my room gets like a jungle after a few rounds of gaming. Temps are good, so I don't know why this is happening.
Plz reply immediately, any help is appreciated, thx!" :D


On a more serious note: living in the subtropical southern US - even if I did have that kind of money to blow - the 3090 was already a no-go, and this thing is even worse.
An undervolt is only going to help so much...
 

escksu

Reputable
The biggest disappointment is the power consumption. I'd expect a 2-5% performance increase over the 3090, but a ~100W (nearly 30%) increase in power just for that?

At this rate, a 1000W graphics card is not that far away...
 

samopa

Distinguished
They'd be better off, in my opinion, making more 3080 cards, seeing as that's where the bulk of the high end will buy.

In the first production wave, silicon yields are not so good, so to salvage partially defective chips they created the 3080. As time goes by and yields improve, more chips come out 'perfect', so it's better to sell them as the 3090 Ti.

Considering that the 3080 and the 3090 Ti use the same chip, once yields are good it doesn't make sense to sell a good chip (capable of being a 3080 Ti or better) as a 'crippled' chip (the 3080). So naturally 3080 production will decrease, and 3080 Ti and 3090 Ti production will increase.

They could disable part of a 'near perfect' chip to make a 3080, but that is not without risk. Some years ago they tried this kind of thing (disabling part of the GPU) because yields were good but demand was high for the lower-level product, and it did not take long for someone to mod those cards, re-enabling the disabled parts and effectively making them function like the higher-level product.
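To put a toy model behind the yield argument (the defect densities below are pure guesses for illustration; only the GA102 die size is a real figure), the classic Poisson yield approximation says the fraction of defect-free dies is exp(-area x defect density):

```python
# Toy binning model using the Poisson yield approximation:
# P(die has zero defects) = exp(-A * D), A = die area (cm^2),
# D = defect density (defects/cm^2). D values are made-up guesses.
import math

DIE_AREA_CM2 = 6.28  # GA102 is roughly 628 mm^2

for d0 in (0.10, 0.05, 0.02):  # hypothetical: early ramp -> mature process
    perfect = math.exp(-DIE_AREA_CM2 * d0)
    print(f"D0={d0:.2f}/cm^2: ~{perfect:.0%} dies defect-free (3090 Ti grade), "
          f"~{1 - perfect:.0%} need units fused off (3080/3080 Ti grade)")
```

As yields mature, the share of fully working dies grows, which is exactly why fusing them down to 3080s stops making economic sense.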
 

Geef

Distinguished
A better idea than getting a 3090 Ti would be to buy a second computer instead, set it up next to yours, and invite friends over to game with you sometimes.
 
As time goes by and yields improve, more chips come out 'perfect', so it's better to sell them as the 3090 Ti.

I'm aware of all that, and I'm not a neophyte when it comes to the technology and processes, or capitalism for that matter. What you have failed to consider is that most people have no desire to spend $1500+ for a 3080 Ti or above. They're looking for value, to a degree, and high FPS. The 3080 will do that, and do it well, for a long time. The advantage anything above it may offer isn't really worth the money for the average consumer. They'd move more product as 3080s than they would as higher-priced SKUs.
 

watzupken

Reputable
I feel this product was released for a few reasons:
  1. As a pipe cleaner - with next-gen RTX cards rumoured to use up to 600W, this is basically, as Igor's Lab mentioned, about having the foundations (board design, power delivery, cooling) ready for the newer chips.
  2. To soften the power-requirement shock - again, with next-gen RTX cards using up to 600W or more, Nvidia probably wanted to lessen the negative publicity around an inefficient card. If the 3090 Ti did not exist, the highest reference power would be the RTX 3090's 350W, and jumping from there to 600W at the top end is almost double. Coming from a 450W card, the increase doesn't sound as bad.
Having said that, I think we have reached a point where this is becoming unsustainable. The power draw is so high that it directly impacts heat output. Even with an RTX 3080, my computer room heats up really fast, and I can imagine the heat dumped into the room is far worse with 450W-and-up cards. Water cooling doesn't solve the problem either: it keeps the card cooler, but the same amount of power is drawn and the same heat is dumped out. We used to worry about heat getting trapped in the case; now we also need to worry about heat getting trapped in the room.
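To put numbers on the "room heater" point, here's a back-of-the-envelope sketch. Essentially all the power a PC draws ends up as heat in the room; the GPU figures are the reference board powers, while the rest-of-system number is just an assumption:

```python
# Convert whole-system power draw into space-heater terms.
# GPU values are reference board powers; REST_OF_SYSTEM_W is a guess.
CARD_WATTS = {"RTX 3080": 320, "RTX 3090": 350, "RTX 3090 Ti": 450}
REST_OF_SYSTEM_W = 200  # hypothetical CPU + board + PSU losses

for card, gpu_w in CARD_WATTS.items():
    total_w = gpu_w + REST_OF_SYSTEM_W
    btu_per_hr = total_w * 3.412  # 1 W = 3.412 BTU/h
    print(f"{card} system: ~{total_w} W ≈ {btu_per_hr:.0f} BTU/h into the room")

# For scale, a typical small space heater is ~1500 W ≈ 5100 BTU/h.
```

Under those assumptions, a 3090 Ti system at full load is roughly 40-45% of a small space heater running continuously, which squares with rooms warming up fast.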
 

WrongRookie

Reputable
As a content creator myself... yeah, I don't think I'm going to get this one. Obviously, this card is for content creators operating at a much larger scale. It isn't for smaller creators, and frankly, there are better alternatives.

I am, however, shocked that this one requires more power than most GPUs use. It raises the question of whether to get a 1200W or an 850W PSU... What were they thinking with this one?
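For what it's worth, here's a rough sizing sketch (every figure is an illustrative assumption, not vendor guidance) showing why the 450W board power pushes people toward the bigger PSU once transient spikes are counted:

```python
# Rough PSU sizing estimate. All numbers are assumptions for illustration;
# check your actual parts and your PSU vendor's recommendations.
GPU_W = 450            # RTX 3090 Ti reference board power
GPU_SPIKE_FACTOR = 2.0 # Ampere cards reportedly spike briefly to ~2x board power
CPU_W = 250            # hypothetical high-end CPU under load
REST_W = 100           # drives, fans, RAM, motherboard
HEADROOM = 1.2         # ~20% margin so the PSU isn't running flat out

sustained = GPU_W + CPU_W + REST_W
transient = GPU_W * GPU_SPIKE_FACTOR + CPU_W + REST_W
print(f"Sustained draw: ~{sustained} W, transient peak: ~{transient:.0f} W")
print(f"Suggested PSU: ~{sustained * HEADROOM:.0f} W or more, "
      f"with good transient response")
```

Under those assumptions an 850W unit is cutting it close on spikes, which is presumably why the 1200W question comes up at all.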
 
Looking at the numbers, this thing will be at best 10% better than the 3090 while using 100W more and costing over ~30% more ($500 over the 3090's MSRP, but probably more like $1K on the street).

I'm not seeing a 10% improvement; it will probably be closer to 5-7%. One hell of a premium for a modest improvement in performance. In some specific workloads it will do a bit better than that (10-15%), but in most usage situations I think you'll be hovering around 5-7%.
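Running the thread's own numbers makes the value picture explicit. A quick sketch, assuming the 3090's $1,499 MSRP as the baseline and treating the $2,500 street price as hypothetical:

```python
# Perf-per-dollar of the 3090 Ti relative to the RTX 3090 ($1,499 MSRP),
# across the uplift range discussed above. $2,500 is a hypothetical
# street price, not a quoted figure.
BASE_PRICE, BASE_PERF = 1499, 1.00

for price in (1999, 2500):
    for uplift in (1.05, 1.10):
        rel_value = (BASE_PERF * uplift) / (price / BASE_PRICE)
        print(f"${price} at +{(uplift - 1) * 100:.0f}% perf: "
              f"{(rel_value - 1) * 100:+.0f}% perf-per-dollar vs the 3090")
```

Even in the most charitable case (MSRP, +10%) the 3090 Ti delivers noticeably worse performance per dollar than the 3090.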
 

lyrx

Distinguished
I read about cars like Lamborghinis, Ferraris, McLarens, and Bugattis the same way I read about a $2,000 video card. Those of us who made billions for companies like Sony (which owned EverQuest) and Blizzard (which created World of Warcraft) by paying monthly subscriptions remember having hours of fun with a $300 video card. Nvidia has priced people like me out of RPG gaming.
 
Nvidia has priced people like me out of RPG gaming.
What is forcing you to play at high resolutions and ultra-fast framerates?
 

samopa

Distinguished
What you have failed to consider is that most people have no desire to spend $1500+ for a 3080 Ti or above. They're looking for value, to a degree, and high FPS.

The product is already moving fast enough that even the 3080 Ti still sells 30-50% above MSRP. That tells you demand is still way over supply, and they do not have enough inventory.

Where I live, whenever a store gets RTX 3090 stock, it usually doesn't last more than a few hours. So your statement that "most people have no desire to spend $1500+ for a 3080 Ti or above" is not borne out (at least in my area), partly because people do not have many choices and partly because many of them are willing to pay a higher price for the value they get.
 
So your statement that "most people have no desire to spend $1500+ for a 3080 Ti or above" is not borne out (at least in my area), partly because people do not have many choices and partly because many of them are willing to pay a higher price for the value they get.

There isn't any value to be found in any of these cards. None. If they were at or near actual MSRP and not grossly inflated, perhaps then there would be value. Admittedly, I had a friend pick up an EVGA 3080 for me, and I paid $1,200 for it, which included a bit of a travel fee. Do I like the card and the FPS it gives? Yes. Did I need it? Not really. I have a perfectly good 1080 now collecting dust until I build an HTPC rig from my old parts. Should I have waited, seeing that prices are coming down somewhat? Probably... but I had the money and it wasn't that big of a deal for me. That's not to say the same holds for the whole host of others who are still sitting on 900-series or low-tier 10-series cards.