AMD Radeon RX 6900 XT Review: Powerful and Pricey

So why did you put games with ray tracing in the standard gaming mix when you have a dedicated section for ray tracing? By your numbers the 6900 XT beats the 3090, except the ray tracing games brought it down below. Pretty shady, skewing the numbers like that... Another Tom's review, eh.
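
To put numbers on the skew (completely made-up FPS figures, not the review's data), here's a toy sketch of how folding one RT-heavy title into an overall geometric mean can flip which card "wins":

```python
from math import prod

# Hypothetical per-game average FPS -- illustrative only, not review data.
results = {
    #                (6900 XT, RTX 3090)
    "raster_game_a": (150, 140),
    "raster_game_b": (120, 115),
    "rt_game":       (45, 70),  # ray tracing heavily favors the 3090
}

def geomean(values):
    # Geometric mean: the usual way multi-game averages get computed.
    return prod(values) ** (1 / len(values))

for label, games in (("raster only", ["raster_game_a", "raster_game_b"]),
                     ("raster + RT", list(results))):
    amd = geomean([results[g][0] for g in games])
    nv = geomean([results[g][1] for g in games])
    print(f"{label}: 6900 XT {amd:.1f} fps vs RTX 3090 {nv:.1f} fps")
```

Raster only, the 6900 XT leads; add the RT title to the mix and the 3090 comes out ahead overall.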
 
Most of the reviews have been shadey AF. "Let's compare vs. DLSS but turn SAM off because reasons." Let's focus on ray tracing even though it's only 0.5% of new games... even better, let's omit the games where it crushes Nvidia because reasons.

Coreteks and Hardware Unboxed had good reviews; everyone else is an obvious mess. Yeah, it's worse than a 3070 sometimes... facepalm.
 
To anyone who got any piece of tech this year:

HOW?

I was on Walmart's site for every PS5 drop, I was there for the 3080 drops, and now I was there for the 6900 XT, refreshing sites on my phone and computer.
The Add to Cart button went straight to Out of Stock.
Every. Single. Time.
Is it really just luck? Browser dependent? I've never had so many issues trying to secure something online.
 
The review only made the 6800 XT look more appealing to me. I feel like the difference between the 6800 XT and 6900 XT is too small to justify the higher price.
To anyone who got any piece of tech this year:

HOW?
You must be a 1) streamer or YouTuber, 2) scalper, 3) miner, 4) game developer, or 5) friend of 1-4. Just being a gamer or PC enthusiast is not enough anymore.
 
"A small step up from RX 6800 XT performance, a bigger step up in price. "

- If that's what you think about AMD, you ain't gonna like Nvidia....

Nvidia 3090: a small step up from 3080 performance, a MASSIVE step up in price.

Let's remind everyone exactly who Nvidia themselves are marketing the 3090 to, right now, straight from their front page:
"The GeForce RTX™ 3090 is a big ferocious GPU (BFGPU) with TITAN class performance. It’s powered by Ampere—NVIDIA’s 2nd gen RTX architecture—doubling down on ray tracing and AI performance with enhanced Ray Tracing (RT) Cores, Tensor Cores, and new streaming multiprocessors. Plus, it features a staggering 24 GB of G6X memory, all to deliver the ultimate gaming experience. "
--> The 3090 is clearly marketed PRIMARILY as a GAMING CARD...!!!
--> A small step up from 3080 performance, a MASSIVE step up in price. Indeed...!!!
 
To anyone who got any piece of tech this year: HOW? Is it really just luck?

3 reasons....
  1. I took advantage of queue systems. I've purchased two EVGA 3080s because I signed up early that morning and got in the queue.
  2. Preorders... it's just that simple. I work from home, so I was ready and waiting during all the console preorder days for retailers to go live. Then I followed popular Twitter accounts that tweet when the retailers go live.
  3. Just being lucky. For example, I was able to get an order in with AntOnline for a 5900X on release day. I try to pay attention to the smaller companies for stuff like this because they get less site traffic and have fewer bots sitting on them.

See, I'm a sneakerhead, so we go through stuff like this all the time for popular shoes... fighting with bots and all. Grabbing these consoles and tech items was difficult, but I'm used to the grind. I'll be honest and say grabbing multiple consoles since the first preorder days has honestly been quite easy. GPUs and CPUs, on the other hand, were a nightmare, but I got what I wanted. Patience is key, too.
 
'That's all well and good, but there are a couple of problems when comparing the GeForce RTX 3080 and 3090 to the Radeon RX 6800 XT and 6900 XT. First, RTX 3090 more than doubled the VRAM of the 3080'

That comment almost made me stop reading the rest of the review.
Simply because it's a ridiculous comment; the simple fact is the 3080 should have shipped with 16GB of VRAM anyway, imo.

Commending Nvidia for the extra 14GB of VRAM on the 3090 is completely nonsensical when the 3080 is actually lacking.

The 6800, 6800 XT and 6900 XT all have 'enough' VRAM for any scenario; the 3070 and 3080 arguably don't.
 
Most of the reviews have been shadey AF. "Let's compare vs. DLSS but turn SAM off because reasons." Let's focus on ray tracing even though it's only 0.5% of new games... even better, let's omit the games where it crushes Nvidia because reasons.

Coreteks and Hardware Unboxed had good reviews; everyone else is an obvious mess. Yeah, it's worse than a 3070 sometimes... facepalm.

It's not "shadey" (sp) at all. Turning SAM off makes sense because it is currently AMD-specific and works only in a very specific set of circumstances, namely a Ryzen Zen 3 CPU paired with a 500-series motherboard. If you take that 6900 XT and drop it in an Intel motherboard, SAM is a non-factor. DLSS on an Nvidia card, however, operates regardless of the CPU/motherboard combo in use. Also, SAM is a feature of the motherboard/CPU/PCIe interface, not of the GPU itself, so it is reasonable to leave it out when trying to compare the performance of GPUs.

It may be fair to say that DLSS shouldn't be included either, but it's not Nvidia's fault that AMD was unable to prepare its equivalent technology in time for launch, and I do not feel it's unfair to measure the performance of each company's crown-jewel cards together notwithstanding that fact. If AMD can produce its version of DLSS, then by all means the benchmarks should be rerun for a fair comparison to see if it matches.

As for comparing ray tracing, that's again on AMD, not on Nvidia. AMD is trying to offer a competitive product; you cannot treat them with kid gloves when their own offering simply cannot perform at the same level as their competitor's. The fact that it's in only 0.5% of new games (a made-up number, to be sure, so not really defensible here) is irrelevant, in much the same way that you could look at Steam's measurements showing that the vast majority of current gamers play at resolutions lower than 4K and claim it's silly to focus on 4K game benchmarks. Considering that AMD's own brag page about the RDNA 2 architecture discusses "Hardware Accelerated Raytracing," I think AMD deserves to have ray tracing be a focus of benchmarks, since they themselves are using it as a selling point. If you're going to talk it up, you have to be prepared to handle the comparison, after all.
 
I miss when $700-750 could get me a halo card...

It does no good to tell people not to buy the 6900 XT and 3090 for gaming; they're going to do it anyway, because they can, and they have the money for it. Those are questionable reasons/excuses, but whatever.

The 6800, 6800 XT and 6900 XT all have 'enough' VRAM for any scenario; the 3070 and 3080 arguably don't.
Those last two are expected to launch with higher-VRAM models, and if true (I haven't been keeping up with the news lately), people who jumped on the bandwagon early got shafted.
 
Commending Nvidia for the extra 14GB of VRAM on the 3090 is completely nonsensical when the 3080 is actually lacking. The 6800, 6800 XT and 6900 XT all have 'enough' VRAM for any scenario; the 3070 and 3080 arguably don't.
The point is that if you're trying to justify a massive increase in price, the 3090 offers tangible benefits over the 3080. The 6900 XT doesn't offer much more than the 6800 XT.

And I maintain that most games will be just fine with the 10GB 3080 for the next several years. Worst case, you drop from 4K textures to 2K textures, which won't actually affect image quality much at all, because 4K textures don't get sampled unless a texture occupies more than 2K pixels wide/tall on screen; mipmapping will just use a 2K or lower resolution texture when the polygon only covers a small part of the display. There are edge cases (professional workloads being a good one) where 24GB vs. 10GB makes a big difference. That's why in Blender the 3090 is around 30% faster than the 3080, but the 6900 XT is only 5-8% faster than the 6800 XT.
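
To illustrate the mipmap point, here's a simplified sketch: assume one mip level per halving of on-screen coverage relative to the texture's native size. (Real GPUs derive the level of detail per pixel from UV derivatives, so treat this as an approximation.)

```python
from math import log2

def mip_level(texture_res, screen_coverage_px):
    # One mip level per halving of on-screen coverage relative to the
    # texture's native resolution (simplified rule of thumb).
    if screen_coverage_px >= texture_res:
        return 0  # sample the full-resolution mip
    return min(int(log2(texture_res / screen_coverage_px)),
               int(log2(texture_res)))  # clamp at the smallest mip

# A 4096x4096 "4K" texture on a surface covering only ~1000 pixels across:
level = mip_level(4096, 1000)
print(level, f"-> samples the {4096 >> level}x{4096 >> level} mip")
```

With ~1000 pixels of coverage, the sampler never touches the 4096x4096 data; it reads the 1024x1024 mip, which is why dropping the 4K texture pack costs so little visually.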
Nvidia Radeon RX 6900 XT: Impressive Mainstream Appeal
I think there's a mistake in the wording.
Yup. Last night the conclusion header and text (which was still the RTX 3060 Ti review content in need of editing) got put into the CMS before I had finished writing that part. We missed updating the header when the conclusion got updated. (Specifically, the correct conclusion header was in a GDoc and was "Fast but Expensive"; I've now put it into the article.)
Those last 2 are expected to launch with higher Vram models, and if true - I haven't been keeping up with the news lately - people who jumped on the bandwagon early got shafted.
Not at all. They got exactly what they paid for, and as early adopters they got to enjoy the products in question for potentially many months. I think we'll get a 3080 Ti or 3080 20GB in 2021, but it's not going to be hugely faster than the 3080 in most games for a long time. See above. Being shafted is when you order one product and get something else, or are somehow treated "unfairly." Trying to order an RTX 3080 and having bots buy out all the stock before you can might qualify as being shafted. Buying a 10GB card when 16GB and 20GB cards will come out in the future? That's a choice you make; no one is forcing people to buy 10GB cards. (And Cyberpunk 2077 seems to be just fine with 8GB of VRAM, FWIW.)
 
Not at all. They got exactly what they paid for... Buying a 10GB card when 16GB and 20GB cards will come out in the future? That's a choice you make.
I was referring to what Nvidia did with the Super models on Turing. People who waited got a minor performance bump for the same price.
Nvidia could do that with those higher-VRAM SKUs, right? They may not get more performance, but if they offer extra resources for the same price...
 
I was referring to what Nvidia did with the Super models on Turing. People who waited got a minor performance bump for the same price.
Nvidia could do that with those higher-VRAM SKUs, right? They may not get more performance, but if they offer extra resources for the same price...
They're not going to double the RAM and sell it for the same price. The 16/20GB models were cancelled; that's pretty much been confirmed at this point. The new rumors are Ti models for both the 3070 and 3080, which will have to be more than a RAM increase. A 3080 with 20GB of RAM and a CUDA core count similar to a 3090's will almost assuredly have an MSRP of $1000 and compete with the 6900 XT. No one who bought a 3080 for $700-800 at launch is getting shafted because months later Nvidia releases a $1000+ card that's faster.
 
The bump in specs from the 6800 XT to the 6900 XT is very minor, so I agree that the extra $350 may not be justified. But there are a few things here we also need to consider:
  1. This is full-fat Navi 21 (no CUs cut), and a chip binned for higher clocks at the same power requirements, so it's going to be rare out of the fab. Even the RTX 3090 doesn't give you the full-fat GA102, and it's also a silicon lottery if you want a better-binned chip.
  2. $999 seems like a lot of money, but when you consider that the RTX 3080 is selling way above its MSRP and the RX 6800 XT is almost non-existent, the price of the RX 6900 XT looks somewhat reasonable for a halo product.
 
What I am seeing is that the 6900 XT is the best card until you turn on the Nvidia bells and whistles.
Not an AMD fanboy. I prefer Nvidia but make do with more affordable products.

That is, if the Nvidia bells and whistles are available in the game to begin with. Not all games run RT and DLSS. RT will get a higher adoption rate since both AMD and Nvidia support it now. But RT without DLSS may be a problem, because performance will likely tank (AMD will tank more). If you have a one-grand card, would you prefer higher FPS + resolution, or RT on?
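
For context on why DLSS claws performance back: it renders internally at a much lower resolution and upscales. A quick sketch of the arithmetic at 4K output (the per-axis scale factors are Nvidia's published mode ratios; the rounded resolutions are approximate):

```python
# DLSS internal render resolution at 4K output. Per-axis scale factors
# are Nvidia's published mode ratios; rounded resolutions are approximate.
modes = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}
out_w, out_h = 3840, 2160

for mode, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = (w * h) / (out_w * out_h)
    print(f"{mode:>17}: renders {w}x{h} (~{share:.0%} of output pixels shaded)")
```

Shading under half the pixels (about 44% in Quality mode) is what offsets most of the RT cost; without something equivalent, RT is a straight frame-rate hit.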
 
To anyone who got any piece of tech this year: HOW? Is it really just luck?

Automation: lots of bots hammering the sites as soon as stock becomes available.
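
For the curious, a minimal sketch of what the simplest of those bots does: poll the product page and alert the moment the out-of-stock text disappears. (The URL and marker string are placeholders, not any real retailer's page.)

```python
import time
import urllib.request

URL = "https://example.com/product/rx-6900-xt"  # placeholder, not a real listing
OUT_OF_STOCK_MARKER = "Out of Stock"            # placeholder page text

def in_stock():
    # Fetch the page and check whether the out-of-stock text is gone.
    with urllib.request.urlopen(URL, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return OUT_OF_STOCK_MARKER not in html

while True:
    if in_stock():
        print("\a*** stock detected ***")  # \a rings the terminal bell
        break
    time.sleep(30)  # polite polling; real bots hit far harder than this
```

The real ones add auto-checkout and run many sessions in parallel, which is why a human with a refresh button never stands a chance.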
 
To anyone who got any piece of tech this year: HOW? Is it really just luck?

Got my 3090 through EVGA's queue system.

Got my 5800X by watching a YouTube stream of macros which hit all the big retailers and make a sound when stock drops. I think it's called Shiny Tech Deals.
 
What I get from the latest GPU reviews (either Nvidia or AMD) is that:

RT is something only the top-end Nvidia GPUs can handle in most games, and at high resolutions it often becomes viable only if DLSS is enabled (which is only present in very few games, and it does not always work as well as you'd like).
It seems GPU stock won't be normalized at least until Q2 2021, which sucks for us gamers.
GPU prices are insane right now.
Cyberpunk fans thought the new RTX cards would suffice to play it at high resolution, details and refresh rate, and at least for now that does not seem to be the case (it may change with future patches and driver updates). Then again, these same cards struggle to run MSFS and Watch Dogs: Legion, so I don't see why this is a surprise with Cyberpunk 2077.
If you own (like me) an RTX 2000-series card, take good care of it and enjoy it. These new GPUs are probably not the best launch for us. Especially not for now, not at these stupid prices.
Seeing the glass half full, at least we may get an update for The Witcher 3 that makes this awesome-looking game even more impressive. Let's hope we don't need an RTX 4000-series or RX 7000-series card to play it lol.