Jarred, how could you make so many obvious mistakes in one post? Did you think that you were replying to some teenage noob? You didn't watch the video, clearly, because some of your "points" are completely nonsensical as the video demonstrated quite clearly.
It's a successor to the Titan price bracket.
You say Titan price bracket and I say RTX 2080 Ti price bracket:
RTX Titan - $2,500
RTX 2080 Ti - $1,200
RTX 3090 - $1,500
So, you're off by $1,000 and I'm off by $300. I'm sorry, Jarred, but the theory to which I subscribe is $700 more accurate than yours. The RTX 3090 is not priced like a Titan; it's priced the way an RTX 3080 Ti could easily have been priced under Turing's pricing structure, if nVidia wanted to squeeze rich people a little more.
RTX 3090 lacks the Titan drivers, which only matter for a few specific professional use cases. If you need those professional features, there's still Quadro -- or Nvidia A6000.
Now you're denying the exact purpose of the RTX Titan's existence. Those drivers are what made a Titan a Titan instead of a gaming card. You were ALWAYS able to get a Quadro; hell, the Quadro is older than the Titan, but the Titan was still made. Also, the Titan of today would have 48GB of VRAM because, with only one exception, Titans doubled their VRAM every generation. For the RTX 3090 to be a successor to the Titan RTX and its 24GB, it would need 48GB of VRAM.
Yes, they'll make even Titan cards look relatively affordable
You've just contradicted yourself. You said it was in the Titan price bracket. Now you're saying otherwise.
Quadro has drivers that enable additional features / enhancements for professional workloads that not even Titan cards got.
That's not news; that has ALWAYS been true, and it hasn't stopped nVidia from making Titans and selling them like mad. Therefore, as an argument, it doesn't hold water.
So, for the pro market that needs those features, Nvidia gets to milk them a bit more, while at the same time selling a slightly lower priced Titan that won't cannibalize the pro market sales.
Just how is $1,000 cheaper only slightly lower priced? That's a massive 40% reduction in price ($1,000 off the RTX Titan's $2,500). I'm sorry, Jarred, but that argument doesn't hold water either. Slightly less expensive would be $2,250 (10% off) or maybe even $2,000 (20% off). Nobody in their right mind calls a 40% price cut a slight reduction.
Well, outside of pros that don't need CATIA or SolidWorks or similar enhanced performance.
So, outside of people who buy Titans.
Jarred, you have basically said the following:
- The RTX 3090 is in the Titan price bracket, except that it's $1,000 below that.
- You said that the Quadro is the reason a Titan wasn't released, but the Quadro has always existed and it never stopped nVidia from making Titans.
- You called a 40% reduction in price only slightly less expensive.
- You said that this card is for people who are not the people who have historically bought Titans.
Jarred, I don't know who you're trying to convince, but it sounds like you've bought into the nVidia BS to the point that you're parroting it without giving it due analysis. It's like you're trying to make up excuses for the RTX 3090's existence, and I don't understand why. Your post is a total mess and I would be embarrassed if it were mine, especially if I were in your position.
You did the same thing when you told me that there was NO WAY AMD hadn't teased its most powerful video card at the Ryzen 5000 launch. You dismissed as impossible and unrealistic my speculation that the card teased was in fact the RX 6800 XT. Well, you were wrong: AMD confirmed that it was, in fact, the 72CU model that was teased, at clock speeds that hadn't yet been finalised. The 72CU model will be the RX 6800 XT, which means my intuition was 100% correct.
Maybe you should pay more attention to users who have been members for over a decade and are heavily decorated instead of just tossing our logic aside like we're noobs. There's a reason I have the stats I have, and the fact that I've been here since 2009 should tell you that I'm not some teenage noob who started gaming when the GTX 1080 Ti first came out. Maybe my speculations could actually be valuable.