Rumored $999 RTX 3080 Ti Could Face Radeon RX 6900 XT

If Nvidia can't make enough of the 3070, 3080, and 3090, how the heck are they going to get any number of 3080 Ti cards out?

If you've heard the "Moore's Law Is Dead" take on NVIDIA's master plan, he claims this was intentional. The GPUs were subsidized by NVIDIA for the first months of the launch, so NVIDIA wanted to control how many cards got out so it didn't lose that extra $50 per GPU.

There were complaints from OEMs that, given the BOM for parts (verified by Gamers Nexus), margins were absurdly low and they weren't going to make any money, so they asked NVIDIA for price breaks. Reports are coming in that chip supplies are now flowing to manufacturers, but without the rebate/kickback.

NVIDIA was originally going to say "no" and instead offer higher-memory models with much fatter margins for both parties, while continuing to intentionally limit the lower-memory cards. Thus the higher-end cards, with fatter margins and less friendly MSRPs, would be pushed on consumers.

It's almost a bait-and-switch tactic of old.

But AMD kind of ruined that plan. (Much to my happiness.)

A 3080 Ti would put a serious dent in the 6900 XT's price. The 6900 XT is slower than a 3080 in ray tracing, and its Super Resolution upsampling is reported to be inferior quality-wise and isn't available until spring now. Plus you need a 500-series motherboard and a Ryzen 5000-series chip. This is not a value proposition.

However, since the 6900 XT is a mining monster, it will likely stay at $999. (Which is stupid; the only loyal fans AMD will have will be the miners.) $800 should have been the price point for the 6900 XT given the RT performance.

AMD and NVIDIA both could have easily stopped miners in their tracks. They could force cards to operate at no less than PCIe x8 link width. The drivers could check how many cards are enumerated on the bus. They could check executable names and compute programs for common hash algorithms. They could then have differentiated the lineup with mining-specific parts that cost over twice as much. That would have been a win/win to balance things out.
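Purely as an illustration of what those checks could look like, here's a minimal sketch in Python using the pynvml bindings for NVIDIA's NVML. The thresholds and the whole approach are my own assumptions, not anything either vendor ships, and it only covers the first two ideas (card count and link width); the executable/kernel checks would have to live in the driver itself.

```python
# Hypothetical mining-rig heuristic; NOT anything AMD or NVIDIA actually implement.
# Uses the pynvml bindings for NVML (pip install nvidia-ml-py).
import pynvml

MAX_EXPECTED_GPUS = 2  # assumption: a typical gaming box has 1-2 GPUs
MIN_LINK_WIDTH = 8     # assumption: gaming cards sit in x8/x16 slots, mining risers are x1

def looks_like_mining_rig() -> bool:
    pynvml.nvmlInit()
    try:
        count = pynvml.nvmlDeviceGetCount()
        if count > MAX_EXPECTED_GPUS:
            return True  # unusually many cards enumerated on the bus
        for i in range(count):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
            if width < MIN_LINK_WIDTH:
                return True  # card is running on a narrow link, e.g. an x1 riser
        return False
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    print("Possible mining rig:", looks_like_mining_rig())
```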

Did either of them do this? NO! Why? Because of money. And they have that right. But it does generate bad will.
 
Let's just assume the RX 6900 XT is faster than or equal to the RTX 3090,
and Nvidia is releasing a card slower than the RTX 3090 for the price of the AMD RX 6900 XT?!

Who the hell is going to buy this out-of-stock 3080 Ti?!
Why bother, the card is out of stock 😛

The 6900XT is SLOWER than a 3080 in ray tracing. It also lacks the quality AI upsampling of DLSS 2.x which is a huge performance boost. AMD's implementation is delayed till Spring and rumored to be inferior.
 
Source(s)?

If you can't find the price of GDDR6X, how do you know it is very close to that of HBM2?

1- GDDR6

https://www.guru3d.com/news-story/gddr6-significantly-more-expensive-than-gddr5.html


2- HBM2

https://www.gamersnexus.net/guides/3032-vega-56-cost-of-hbm2-and-necessity-to-use-it

Regardless, we’re at about $150 on HBM2 and $25 on the interposer, putting us around $175 cost for the memory system.

That's $175 for 8GB on the Vega 56, so ~$22 per 1GB (quick math in the sketch after this list).


3- GDDR6X: I read somewhere that it sits between GDDR6 and HBM2 but closer to HBM2; I wrote that from memory.
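Just to make the arithmetic in point 2 explicit, using only the figures quoted from the GamersNexus article:

```python
# Per-GB cost of Vega 56's memory system, using the figures quoted above.
hbm2_cost = 150       # ~$150 for 8GB of HBM2 (GamersNexus estimate)
interposer_cost = 25  # ~$25 for the interposer
capacity_gb = 8

per_gb = (hbm2_cost + interposer_cost) / capacity_gb
print(f"~${per_gb:.2f} per GB")  # -> ~$21.88, i.e. roughly $22/GB
```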
 
The 6900XT is SLOWER than a 3080 in ray tracing. It also lacks the quality AI upsampling of DLSS 2.x which is a huge performance boost. AMD's implementation is delayed till Spring and rumored to be inferior.
It's rumored to be inferior by who? At the higher resolutions where upscaling makes the most sense, people are likely to be hard-pressed to pick out pixel-level differences between the two upscaling methods while playing a game anyway. If they improve on AMD's existing Contrast Adaptive Sharpening algorithm, great, but I doubt one solution will look massively better than the other in real-world scenarios at higher resolutions.

As for their RT implementation, I suspect it won't be quite on par with Nvidia's, or else they would have shown off some direct performance comparisons during the announcement. There is the possibility that it could be faster at some effects than others though, and we might not have a clear picture of how the two compare until games come out that are optimized with both implementations in mind.

Those numbers are from over three years ago though, and at a time when RAM prices in general were much higher, so I'm not sure how relevant they are today. And they didn't estimate $175 for 8GB, but rather "around $150", plus another estimated $25 for packaging the memory on an interposer. Had they gone with an HBM2 solution though, those interposer costs probably wouldn't scale linearly for a higher-VRAM variant of the card. There's also the possibility that going with HBM2 close to the graphics processor would have allowed them to get similar or better performance out of a somewhat smaller chip, and reduced power draw and heat output a bit, potentially saving costs in other areas. So, it's not exactly a direct comparison, but we can assume that the cost savings of GDDR6X were enough to convince Nvidia to go with that instead.

That GDDR6 pricing article is also almost two years old, from not long after the RTX 20-series cards started using it. And as the article points out in the second paragraph...
"The mentioned prices correspond to a purchase quantity of 2,000 pieces. Manufacturers of video cards are likely to buy larger quantities, which means they could get the parts cheaper. 3dcenter.org estimates the possible discount at 20 to 40 percent, the prices in the table below would only be estimates."

So, according to that source, a company like Nvidia might have only been paying somewhere in the vicinity of $7-$9 per GB for the GDDR6 used in their cards a couple years ago. From what I've heard, GDDR6X does in fact cost more, but how much more is pretty vague. More than $100 for 10GB is probably almost certain, but I would be surprised if Nvidia were paying over $150 for it.
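To put rough numbers on that: the list price per GB below is just back-derived from the $7-$9 figure and the 20-40% discount range above, so treat it as an assumption rather than a quoted price.

```python
# Rough GDDR6 bulk-price estimate from the numbers in this thread.
list_price_per_gb = 11.5                  # ASSUMED ~$11-12/GB list price, back-derived from the $7-$9 figure
discount_low, discount_high = 0.20, 0.40  # 3dcenter's estimated bulk-discount range

best_case = list_price_per_gb * (1 - discount_high)   # ~$6.90/GB
worst_case = list_price_per_gb * (1 - discount_low)   # ~$9.20/GB
print(f"GDDR6 bulk price: ~${best_case:.2f}-${worst_case:.2f} per GB")
print(f"10GB of GDDR6:    ~${10 * best_case:.0f}-${10 * worst_case:.0f} total")  # vs. '>$100' guessed for GDDR6X
```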