Question RTX 3080 vs RX 6900 XT

Dec 2, 2020
Hi, I'm hoping to get everyone's views here so I can decide what to do.

So first, I know this is not the right moment to purchase a GPU, but since I have the opportunity and prices will not be reasonable for the next 6 months (if not a year), I'm considering making the purchase now.

I have a Ryzen 5 3600X and 16GB of RAM, and most of the time I play Warzone. I'm looking to upgrade for better 1440p settings and higher FPS.

When checking several sites, I'm finding the NVIDIA RTX 3080 at €1,600 and, at the same price, the AMD RX 6900 XT (Sapphire Nitro+).

My question is whether to go for the AMD GPU, which has better specs than the NVIDIA card, or for the RTX. DLSS favors the RTX, but since the AMD card's specs are better, the two are often close in FPS.

What are your opinions?

Thanks
 

Deleted member 362816

Guest
The RTX 3080 is around 7% faster and has DLSS; the 6900 XT has FidelityFX and more VRAM.

If prices are the same, I think this boils down to what you prefer and like better. Watch some reviews, etc., and make that choice.

I have used both cards and I personally like the 6900 XT better, but if it came down to it I would buy whichever is cheaper.
 
First off, both cards will perform spectacularly. You will probably be CPU-bound in some instances.

I would go for the RX 6900 XT. The extra 6GB of VRAM (over the RTX 3080) will ensure that performance won't drop off a cliff when all the VRAM gets used up, as it can on the RTX 3080. Even today, some AAA titles will use (not just allocate) more VRAM than the RTX 3080 has when settings are maxed. Here's a very interesting post showing off the 'VRAM cliff' I'm referring to -

View: https://www.reddit.com/r/nvidia/comments/itx0pm/doom_eternal_confirmed_vram_limitation_with_8gb/


Also, note that the latest Unreal Engine 5 prefers its software-based 'Lumen' ray-tracing technology to hardware ray tracing. Initial tests show that RX 6000 cards benefit more from Lumen than RTX 3000 cards. However, RTX cards had such an advantage previously that this may just be more of an evening out, performance-wise.
 

Deleted member 362816

Guest
Source?
(Tom's Hardware shows the RX 6900 XT leading the RTX 3080 in its massive multi-game test)

I read that article; never mind. Good to know the card I like best is performing better. From what I have personally seen, though, with DLSS the 3080 still pulls ahead in many titles.

Keep in mind I am an AMD fanboy, so it's not like I'm trying to knock the 6900 XT, which is what I personally use.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3080-vs-AMD-RX-6900-XT/4080vs4091
 
Definitely. AMD has some massive catching up to do with FidelityFX and ray tracing. That said, I haven't found the ray-tracing hit on AMD cards to be as bad as some make it out to be. Metro Exodus Enhanced looks spectacular and is smooth as butter with RT on and all settings (except one) maxed on my RX 6900 XT.

I'm a fanboy of all technology. ;) I've switched back and forth between AMD and Intel for CPUs and AMD (ATI) and NVIDIA for GPUs so many times that there's no way I can pick one over the other.
 

Deleted member 362816

Guest

I agree with everything you're saying. I really enjoy my 6900 XT and wouldn't trade it for anything currently on the market. I have the ASRock Radeon RX 6900 XT Phantom Gaming D. The temps are great, and to date I've had zero issues with it.

That being said, the PCs I've built with Founders Edition 3080s have also given me no issues, other than some weird power draw behavior.
 
Mine is an AMD reference RX 6900 XT. It actually boosts to 2500MHz on its own, without any voltage or settings changes. I'm hoping that when I get the time to tweak it (sigh, looking like after the holidays now) I'll get some great results.

I had an RTX 3080 for a bit, but it had stability issues. It was a refurb, and I knew what I was getting into when I bought it. When it did work, it was awesome. The power draw issue on the RTX 3080 is well known: these cards sometimes produce transient power spikes that can trip sensitive PSU overcurrent protection, even on some very expensive PSUs.
 
Jan 12, 2022
He said "7% faster," which for CPUs and GPUs never guarantees actually better. Clock speed (MHz) is just one metric contributing to a GPU's overall performance, and you gave just one reviewer's example of it. Other metrics are the processor type and the power of the chip itself; the VRAM amount, type, and speed; features; driver functionality and quality; and wattage/TDP. As an example on the wattage front: in the CPU race for "gaming supremacy," Intel stated that their 12th-gen i9 regained the "crown" of highest per-core scores, but didn't mention the power consumption. They basically overclock the CPU out of the box, causing it to draw 238-241W indefinitely at load, versus the Ryzen 9 5900X and 5950X, which both have only a 105W TDP and will boost up to approximately 140-150W, about 100W shy of the Intel "winner." And that figure is itself almost 100W lower than its 11th-gen 11900K predecessor, which consumes as much as 340+W!

Keep an open mind about the '22 cards from NVIDIA and AMD, as something as trivial as driver compatibility may ultimately decide the eventual champ!