News Nvidia teases RTX 50 Blackwell Gaming GPUs for launch next month — The Witcher IV's first cinematic trailer likely leveraged the upcoming RTX 5090

They did with the 3090 ti though. History usually doesn't repeat itself, but it rhymes.

There was an 18 month gap though. So this isn't something we are likely to see any time soon. They know they can price gouge twice if they handle it the same way.

Though I was surprised when the 20 series launched with the full set of high end cards on day one. That could happen too.
 
The RTX 4090's 24GB is struggling with AI workloads, so the RTX 5090's 32GB and 21,760 CUDA cores will be a welcome bump for home users with enough cash to run AI on a local PC.
I have an RTX 4080 Super, and its 16GB of VRAM has become a significant roadblock to generating output at a decent resolution in a reasonable amount of time. The 5080 with only 10,752 CUDA cores is disappointing to read. Reading AI performance test reviews will be interesting, as it's not just about gaming anymore.
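For a rough sense of why VRAM capacity becomes the wall before raw compute does, here is a minimal back-of-the-envelope sketch (Python; the parameter counts, per-parameter byte costs, and overhead factor are all assumptions for illustration, not measured figures):

```python
# Rough VRAM estimate for running a generative model locally (inference only).
# Real usage depends heavily on framework, resolution/context length, and quantization.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def vram_needed_gb(params_billion: float, precision: str, overhead: float = 1.2) -> float:
    """Weights-only footprint times a fudge factor for activations and caches."""
    weights_gb = params_billion * BYTES_PER_PARAM[precision]
    return weights_gb * overhead

for model_b in (7, 13, 30, 70):
    for prec in ("fp16", "int8", "int4"):
        need = vram_needed_gb(model_b, prec)
        verdict = "fits in 16GB" if need <= 16 else "needs more than 16GB"
        print(f"{model_b:>3}B @ {prec}: ~{need:5.1f} GB ({verdict})")
```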
 
The RTX 4090's 24GB is struggling with AI workloads, so the RTX 5090's 32GB and 21,760 CUDA cores will be a welcome bump for home users with enough cash to run AI on a local PC.
I have an RTX 4080 Super, and its 16GB of VRAM has become a significant roadblock to generating output at a decent resolution in a reasonable amount of time. The 5080 with only 10,752 CUDA cores is disappointing to read. Reading AI performance test reviews will be interesting, as it's not just about gaming anymore.
Likely the 5090 will be too slow for AI needs as well; time to pony up and get a B100/B200 or last-gen server-grade hardware.
 
The RTX 4090's 24GB is struggling with AI workloads, so the RTX 5090's 32GB and 21,760 CUDA cores will be a welcome bump for home users with enough cash to run AI on a local PC.
I have an RTX 4080 Super, and its 16GB of VRAM has become a significant roadblock to generating output at a decent resolution in a reasonable amount of time. The 5080 with only 10,752 CUDA cores is disappointing to read. Reading AI performance test reviews will be interesting, as it's not just about gaming anymore.
RTX cards are probably meant for gaming and hobbyists only, except for the 5090, which is just an absurd exaggeration. If Nvidia could, they'd call you and tell you to buy a server-grade card.

AMD doesn't, but nobody uses their cards for that anyway.
 
They did with the 3090 ti though. History usually doesn't repeat itself, but it rhymes.
Crypto mining made the 3090 Ti seem like a good business decision to Nvidia, but the market crashed just as they released it, and the product got major discounts leading up to the release of Ada. With AMD not competing in the high end any more, there won't be 90 Ti cards any more. Why sell a card for $2,500 when you can sell it as an A6000 for $9,000?
 
Think like Nvidia:
Ever-increasing prices put the 60 tier at $450 minimum... and with tariffs coming, let's add 20%... so $540 for a 60-tier GPU.
The 5090 will be $1,800+.
Nvidia is not increasing the price of the x60 from $300 to $450. Stop the nonsense. On the flip side, no one is expecting the 5090 to go for less than $2,000. $1,999 is the absolute bottom of the price estimate, with the top being around $2,500 (Titan RTX MSRP). If a 5090 with 32GB of VRAM lands at an $1,800 MSRP, anyone in the market for one will probably be thrilled.
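For what it's worth, the tariff math in the quoted scenario above is just a flat markup on a hypothetical MSRP; a trivial sketch (every number here is an assumption, not an announced price or rate):

```python
# Hypothetical tariff pass-through: new_price = msrp * (1 + tariff_rate).
tariff_rate = 0.20  # assumed rate, not an announced figure

hypothetical_msrps = {"x60": 450, "x60 Ti": 500, "x90": 1800}  # made-up numbers
for tier, msrp in hypothetical_msrps.items():
    print(f"{tier}: ${msrp} -> ${msrp * (1 + tariff_rate):.0f} after a {tariff_rate:.0%} markup")
```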
 
The 60 series has been $350 in the past, so I wouldn't put it past them. They have said they are going to compete on features, not price.

The 60 Ti is at a $400 MSRP now and could easily show up at $450. Given they are willing to charge $100 for double the memory, it will certainly be interesting.
 
The 60 series has been $350 in the past, so I wouldn't put it past them. They have said they are going to compete on features, not price.

The 60 Ti is at a $400 MSRP now and could easily show up at $450. Given they are willing to charge $100 for double the memory, it will certainly be interesting.
The 60 Ti and the 60 are different tiers. The regular x60 class is not going to increase 50% in price in one generation. Even Nvidia must be aware of how toxic the gaming community would get (4080 12GB, anyone?) if their lowest-tier card was $450.
 
The next Witcher game needs to add a cat partner to help out when battling monsters, and to add some cuteness.

Beyond that, Nvidia needs to learn how to stop price gouging.
 
The 60 Ti and the 60 are different tiers. The regular x60 class is not going to increase 50% in price in one generation. Even Nvidia must be aware of how toxic the gaming community would get (4080 12GB, anyone?) if their lowest-tier card was $450.

Well, I can see one way it happens.

If they choose to reserve their smallest GPU for mobile only and push out a 192-bit 12GB entry-level desktop card, or a 128-bit 16GB one (to compete with Intel and potentially AMD), they would basically be pushing the 60 Ti class down a notch, raising the price for that tier while actually lowering the price of the performance tier.

Intel is at $250 with a decent card, and AMD could swoop in with an RX 8600 that is surprisingly good at $250-300, if Nvidia decides not to compete on price.

I haven't seen any hints of a 107 or 106 Blackwell GPU, so this is pure speculation.
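As a rough sketch of how those hypothetical configurations fall out of bus width and module choice (GDDR7 chips use a 32-bit interface; I'm assuming 2GB modules, a clamshell layout for the 16GB case, and a 28 Gbps data rate purely for illustration):

```python
# Capacity and bandwidth implied by bus width + memory module choice.
# GDDR7 chips use a 32-bit interface; clamshell mounts two chips per channel.

def memory_config(bus_bits: int, module_gb: int, data_rate_gbps: float = 28.0,
                  clamshell: bool = False) -> tuple[float, float]:
    channels = bus_bits // 32
    capacity_gb = channels * module_gb * (2 if clamshell else 1)
    bandwidth_gbs = bus_bits / 8 * data_rate_gbps  # GB/s
    return capacity_gb, bandwidth_gbs

configs = {
    "192-bit, 2GB modules": (192, 2, False),
    "128-bit, 2GB modules, clamshell": (128, 2, True),
}
for label, (bus, mod, clam) in configs.items():
    cap, bw = memory_config(bus, mod, clamshell=clam)
    print(f"{label}: {cap:.0f} GB, ~{bw:.0f} GB/s")
```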
 
x60 from $300 to $450.
The 1060 was $250.
The 2060 was $350.
The 3060 was $329 (and by the time third parties took their cut, it was over $350).

The 4060 was $299, but was at best a side-grade, not an upgrade.

If the 5060 isn't gimped like the 4060, the price will be raised, and ALL GPU prices will go up because of tariffs, since every single company will pass the cost on to customers.

They have a history of raising prices between generations by $100.
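Laying the figures quoted in this thread out generation by generation makes the trend easier to eyeball (the $250 vs. $300 1060 launch-price dispute later in the thread notwithstanding):

```python
# x60-class launch prices as cited in this thread (USD); disputed figures left as quoted.
x60_prices = [("GTX 1060", 250), ("RTX 2060", 350), ("RTX 3060", 329), ("RTX 4060", 299)]

prev = None
for name, price in x60_prices:
    delta = "" if prev is None else f" ({price - prev:+d} vs. previous gen)"
    print(f"{name}: ${price}{delta}")
    prev = price
```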
 
The 1060 was $250.
The 2060 was $350.
The 3060 was $329 (and by the time third parties took their cut, it was over $350).

The 4060 was $299, but was at best a side-grade, not an upgrade.

If the 5060 isn't gimped like the 4060, the price will be raised, and ALL GPU prices will go up because of tariffs, since every single company will pass the cost on to customers.

They have a history of raising prices between generations by $100.
Yeah, this is because by the 4000 series the 4060 was essentially a 3050, the 4070 was a 3060, and the 4080 was a 3070. Remember, they tried to sell us what is really a 4070 as a 4080 for $1,000...
 
There is absolutely ZERO possibility that the Witcher 4 reveal trailer was showcased with anything other than the NVIDIA RTX 5090. NVIDIA can call it an “unannounced GeForce graphics card” all they like, but true hardcore gamers know the truth. The RTX 5080, while it will undoubtedly be a capable GPU, simply cannot match the jaw-dropping graphical fidelity showcased in that trailer. It’s not just about power—it’s about capacity. With only 16 GB of GDDR7 VRAM, the RTX 5080 is deliberately gimped by NVIDIA’s segmentation strategy, ensuring it remains far below the capabilities needed for such cutting-edge rendering. For a game like The Witcher 4, which clearly aims to push the boundaries of next-gen visuals, this VRAM limitation would bottleneck performance and prevent the kind of breathtaking imagery we saw.

That reveal trailer wasn’t just rendered on any high-end GPU; it was undoubtedly created using the NVIDIA RTX 5090 with its unparalleled 32 GB of GDDR7 VRAM. The RTX 5090 isn’t just an incremental upgrade—it’s a monumental leap forward, blowing even the RTX 4090 completely out of the water. The graphics in the trailer were inconceivable, bordering on photo-realism, and no lesser GPU could come close to achieving such fidelity. From the hyper-detailed textures to the realistic lighting and shadows, every frame screams the power of NVIDIA’s flagship card.

If anything, this trailer is a wake-up call for gamers and enthusiasts alike. It's time to prepare your wallets—or perhaps even consider the kidney fund :) —because the RTX 5090 is not just a luxury; it's the new benchmark for gaming excellence. Honestly, even if NVIDIA releases a hypothetical RTX 5080 Super with at least 24 GB of GDDR7 VRAM, the RTX 5090 will remain in a league of its own, the only GPU capable of delivering The Witcher 4's true potential. UDNA and an RTX 5080 Super may come within a moderate performance gap, but IMO they will never match the RTX 5090. I would love to be wrong on this more than anything, as well.

AMD and Intel need to buff UDNA and Battlemage to the stars to stand any chance against NVIDIA in the GPU space now; NVIDIA is in a different dimension entirely!
 
The 1060 was $250.
The 2060 was $350.
The 3060 was $329 (and by the time third parties took their cut, it was over $350).

The 4060 was $299, but was at best a side-grade, not an upgrade.

If the 5060 isn't gimped like the 4060, the price will be raised, and ALL GPU prices will go up because of tariffs, since every single company will pass the cost on to customers.

They have a history of raising prices between generations by $100.
Stop playing with the numbers. The 1060 Founders Edition launched at $300. None of the AIBs had any interest in undercutting Nvidia's own price, so there were no launch 1060s at the imaginary $250 MSRP, or anytime soon after that. Nvidia launched a 3GB version months later at $200 to fill the void left by the nonexistent $250 6GB versions. The 1060 FE launched at $300, the 2060 FE launched at $350. There is no precedent for your ridiculous $150 price increase in one x60 generation.
 
There is absolutely ZERO possibility that the Witcher 4 reveal trailer was showcased with anything other than the NVIDIA RTX 5090. NVIDIA can call it an “unannounced GeForce graphics card” all they like, but true hardcore gamers know the truth. The RTX 5080, while it will undoubtedly be a capable GPU, simply cannot match the jaw-dropping graphical fidelity showcased in that trailer. It’s not just about power—it’s about capacity. With only 16 GB of GDDR7 VRAM, the RTX 5080 is deliberately gimped by NVIDIA’s segmentation strategy, ensuring it remains far below the capabilities needed for such cutting-edge rendering. For a game like The Witcher 4, which clearly aims to push the boundaries of next-gen visuals, this VRAM limitation would bottleneck performance and prevent the kind of breathtaking imagery we saw.

That reveal trailer wasn’t just rendered on any high-end GPU; it was undoubtedly created using the NVIDIA RTX 5090 with its unparalleled 32 GB of GDDR7 VRAM. The RTX 5090 isn’t just an incremental upgrade—it’s a monumental leap forward, blowing even the RTX 4090 completely out of the water. The graphics in the trailer were inconceivable, bordering on photo-realism, and no lesser GPU could come close to achieving such fidelity. From the hyper-detailed textures to the realistic lighting and shadows, every frame screams the power of NVIDIA’s flagship card.

If anything, this trailer is a wake-up call for gamers and enthusiasts alike. It's time to prepare your wallets—or perhaps even consider the kidney fund :) —because the RTX 5090 is not just a luxury; it's the new benchmark for gaming excellence. Honestly, even if NVIDIA releases a hypothetical RTX 5080 Super with at least 24 GB of GDDR7 VRAM, the RTX 5090 will remain in a league of its own, the only GPU capable of delivering The Witcher 4's true potential. UDNA and an RTX 5080 Super may come within a moderate performance gap, but IMO they will never match the RTX 5090. I would love to be wrong on this more than anything, as well.

AMD and Intel need to buff UDNA and Battlemage to the stars to stand any chance against NVIDIA in the GPU space now; NVIDIA is in a different dimension entirely!
Lol, relax bro, it's not like you need to run the game at native 4K with ultra ray-gimmicks turned on to enjoy it. At 1080p, or in 4K performance mode, it's going to look good at high settings even with a 4070. CDPR is an Nvidia-partnered studio that launches games that tax cards so you believe you need to buy the latest and greatest, but they're just "optimized" to showcase the most expensive card.

Two things: no matter how good graphics get, they're not even close to being "realistic", they're just good graphics; and second, the other day I played Returnal at low settings, 17 fps average, and still had a good time with it, even though the game's plot was utter garbage. Let's just enjoy gaming without caring so much about graphics and content, and we're good.
 
Stop playing with the numbers. The 1060 Founders Edition launched at $300. None of the AIBs had any interest in undercutting Nvidia's own price, so there were no launch 1060s at the imaginary $250 MSRP, or anytime soon after that. Nvidia launched a 3GB version months later at $200 to fill the void left by the nonexistent $250 6GB versions. The 1060 FE launched at $300, the 2060 FE launched at $350. There is no precedent for your ridiculous $150 price increase in one x60 generation.
The 1060's MSRP at launch was $250, regardless of whether anyone actually sold them at that.
https://nvidianews.nvidia.com/news/...ry-gamer:-nvidia-unveils-the-geforce-gtx-1060

My point is that Nvidia has a history of charging $100+ more than the prior generation for the same tier.