News GPU Face Off: GeForce RTX 3090 vs Radeon RX 6900 XT

Status
Not open for further replies.

watzupken

Respectable
Mar 16, 2020
821
385
2,270
1
I am not sure what the point is of comparing DLSS against native-resolution rendering. The fact that the two cards don't differ that greatly in games in the first place is a clear indication that DLSS will be faster, assuming the game supports it. While it's true that AMD will have their DLSS competitor at some point this year, until then this comparison is moot.
 

d0x360

Honorable
Dec 15, 2016
70
27
10,570
3
Nvidia is the only one that supports TensorFlow, and even without TensorFlow, 24 GB matters a lot in AI.
We are talking gaming here. Anyway, I've been looking to upgrade my 2080 Ti FTW3 Ultra, and I figured this generation I'd get a 3080 Ti when it gets released, but next generation... it's highly likely I'll be going AMD.

They are right up there in performance except with ray tracing, BUT it's going up against Nvidia's 2nd-gen RTX, so I wasn't expecting AMD to win there. Navi 3 is going to be a different story: improvements to ray tracing, their own version of DLSS, and likely a lower price.

Hell, AMD could probably beat the 3090 with Navi 2. Increase the memory bus, increase the Infinity Cache, switch to GDDR6X, and that's probably all it would take.

Truth is, ray tracing still isn't in that many games. That will obviously change, but for now you really aren't missing out on much...

My biggest concern is how DLSS is becoming a crutch for better performance, especially at 4K. Yes, it does look better than native 4K with TAA, simply because of how TAA works. Surely a better method of anti-aliasing exists somewhere: something that can handle everything TAA does, but with the quality of MSAA.
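For a sense of why DLSS is such an effective performance crutch, here's a quick pixel-count sketch. The 2/3 per-axis render scale is an assumption based on DLSS's typical "Quality" mode; exact factors vary by game and DLSS version.

```python
# Rough pixel-count comparison: native 4K vs a typical DLSS "Quality"
# internal resolution (~2/3 scale per axis -- an assumed value).
NATIVE_W, NATIVE_H = 3840, 2160
SCALE = 2 / 3  # assumed per-axis render scale for Quality mode

native_pixels = NATIVE_W * NATIVE_H                              # 8,294,400
internal_pixels = round(NATIVE_W * SCALE) * round(NATIVE_H * SCALE)  # 2560 x 1440

print(native_pixels)
print(internal_pixels)
print(internal_pixels / native_pixels)  # ~0.44: the GPU shades ~56% fewer pixels
```

So a 4K DLSS frame is shaded at roughly 1440p cost and then upscaled, which is where most of the free performance comes from.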

I just don't want devs to skip proper optimization and just toss in DLSS like they did with Horizon Zero Dawn and Death Stranding. The fact that a 2080 Ti needs DLSS to run Death Stranding at 4K60 is ridiculous... it's a game designed for the PS4. A high-end PC should not need DLSS to get to 60 fps; hell, it should probably run at 4K90 no problem. Then we have Horizon: that game doesn't have DLSS and it runs terribly. It's the same engine, so what's the issue?

By comparison, I can run Forza Horizon 4 at max settings at 4K with 8x MSAA at a locked 60.

If I drop it down to 4x MSAA, which is still insane at 4K in terms of performance penalty, the game runs at a solid 90 fps.

So this is a game with insanely detailed car models, really good AI, and a 300 Hz physics engine that's far more taxing than anything in HZD or Death Stranding... So it must come down to optimization, because 4K with 8x MSAA carries essentially the same performance penalty as ray-traced GI, shadows, and reflections all turned on. It might even be a bigger penalty, because MSAA is expensive at 1080p, let alone 4K.
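To put a number on how expensive MSAA gets at 4K, here's a back-of-the-envelope framebuffer size calculation. It assumes an uncompressed 4-byte color target plus a 4-byte depth/stencil target per sample; real GPUs use compression, so treat these as upper bounds.

```python
# Upper-bound framebuffer memory for MSAA at 4K, assuming RGBA8 color
# (4 bytes) + D24S8 depth/stencil (4 bytes) stored per sample.
W, H = 3840, 2160
BYTES_PER_SAMPLE = 4 + 4

def msaa_bytes(samples: int) -> int:
    """Total color + depth bytes for a 4K target at the given sample count."""
    return W * H * samples * BYTES_PER_SAMPLE

for samples in (1, 4, 8):
    print(samples, msaa_bytes(samples) / 2**20, "MiB")
# 1x ~63 MiB, 4x ~253 MiB, 8x ~506 MiB
```

Every one of those samples also has to be rasterized, depth-tested, and resolved, which is why 8x MSAA at 4K lands in the same performance bracket as heavyweight effects.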

Ok rant/ramble over lol
 
Reactions: Phaaze88

Karadjgne

Titan
Ambassador
3090 vs 6900XT: Which is the better card.....

Obviously the one that you can actually get your hands on. I certainly do not care about a few measly % differences, or whether one card gets 200 fps at 4K and the other gets 185 fps. All that matters is having either one of the cards actually physically inside your PC.

You'd have to be a complete and total moron to pass up either card at this point in time, just for a few fps or a better picture on a 4K monitor.
 
Reactions: Phaaze88

carocuore

Notable
Jan 24, 2021
382
88
840
33
AMD is dead to me. It took the ripofftracing bait just to justify a 50% increase in their prices.

AMD cards died with the 5700XT. It's not an alternative anymore, it's the same with a different logo.
 

watzupken

Respectable
Mar 16, 2020
821
385
2,270
1
AMD is dead to me. It took the ripofftracing bait just to justify a 50% increase in their prices.

AMD cards died with the 5700XT. It's not an alternative anymore, it's the same with a different logo.
I am confused. What are you comparing to show that 50% price increase? 6700 XT vs 5700 XT? And in case you have not noticed, the prices of ALL graphics cards have gone through the roof.
 

dehdstar

Reputable
Nov 30, 2017
12
0
4,510
0
GeForce RTX 3090 takes on Radeon RX 6900 XT in this extreme GPU match up, where we look at performance, features, efficiency, price, and other factors to determine which GPU reigns supreme.

GPU Face Off: GeForce RTX 3090 vs Radeon RX 6900 XT : Read more

Yeah, I found it really interesting what AMD did to brag about their efficiency this gen. We knew it was coming when they abstained from releasing a high-end GPU after Vega failed. I mean, there's speculation as to why GCN failed. Some articles proposed that the adoption rate of optimizing for Graphics Core Next was the issue, so we never saw it really shine. The same was said about the 1000 and 2000 series, even though the former did really well and won that generation.

This time around, AMD had the more efficient tech... and they still do. AMD ups the TDP to 450W next gen, and Nvidia will be pushing closer to 600W, with some articles even reporting 900W, though that might just be for the 4090 Ti. My wife and I notice our energy bill even with me gaming on my 300W card alone, lol. I am not going to Nvidia until they overhaul their tech, as AMD did, and make it efficient. But someone to whom cost is no object? A die-hard Nvidia fan? Well, they will do everything in their power to swallow that cost and the pretentious Mac-like pricing scheme, lol, that Nvidia has been known to impose, at least where their high-end cards are concerned. You'll pay $3,000 for a high-end GPU that draws 1,000W if that's what Nvidia expects you to do for high-end gaming. That's just the enthusiast market for you.
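The energy-bill point is easy to sanity-check with simple arithmetic. The hours per day and the $0.15/kWh rate below are illustrative assumptions, not figures from this thread; plug in your own.

```python
# Hypothetical monthly cost of GPU power draw while gaming.
# Hours/day and electricity rate are assumed for illustration.
def monthly_cost(watts: float, hours_per_day: float, rate_per_kwh: float = 0.15) -> float:
    """Approximate monthly electricity cost (30-day month) in dollars."""
    kwh = watts / 1000 * hours_per_day * 30
    return kwh * rate_per_kwh

print(round(monthly_cost(300, 3), 2))  # a 300 W card, 3 h/day: a few dollars
print(round(monthly_cost(600, 3), 2))  # a rumored 600 W card doubles that
```

The absolute dollar amounts are modest at typical rates, but the gap scales linearly with wattage, which is why 600W+ rumors make efficiency a real selling point.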

And here's where it gets interesting. I'm reminded of the time Phil Dunphy says "My, how the turn tables have..."
Just as the Radeon X1800 XT did a long time before it, they schooled Nvidia. Of course, this was known to happen with ATI-designed cards... all the way up until the AMD takeover, when Radeon finally dissipated into a more budget-minded brand, as did their CPUs at the time. Only in modern times has AMD finally stepped up to become a true competitor that is after much more than just the middle market.

The 6000 series represents the Ryzen 1000 series, in that it steps up the game to provide a major headache to the competition, and I think this reborn AMD is here to stay... it has to be, if Intel is not coming into the GPU market. Anyway, I feel like the 5700 XT released to send a message: "Be glad we're not releasing a high-end GPU this year!" I knew something was up when I saw the 5700 XT, with only half the cores, outperform Vega 64 and Radeon VII, which have 64 and 60 CUs respectively (something equivalent to a 6800).

The 6900 XT released with one arm tied behind its back... which is to say, "power starved," but only to make a point: "We can draw a little over 250W (something around 255W, tops) and nearly match the 3090." But should you open up a little-known tool called MorePowerTool? You can bring the 6900 XT neck and neck with a 3090, maybe even ahead. Don't touch the clocks; just set the TDP to something closer to the 3090's (just under 350W), and done. It's, effectively, a budget-minded 3090. Next gen should be interesting, to see what Nvidia does to tighten things up a bit. You can only push existing tech so far before a redesign is warranted, and they are effectively riding the same tech as they had with the RTX 2000 series. It's no lie: they're going to have to throw more processors and power at it to compete. If AMD plans to throw a whopping 450W at the 7900 XT? Things are going to get ugly on Nvidia's side of the table, especially with power and gas prices going up.
 