GeForce RTX 4070 vs Radeon RX 6950 XT: Which GPU Is Better?


Sleepy_Hollowed

Distinguished
Jan 1, 2017
This is basically a non-question.
If you can fit the bigger card and don't care about raytracing, get the AMD one.
Otherwise, honestly, skip that nvidia card unless your card died. It's a staggeringly terrible card unless you're only going to play older games, and even then, it's too expensive.
I don't know how this is even a question, but the cost of optimizing for VRAM and ray tracing in 2023 is still too high.
Until there's some sort of middleware or underlying framework for games that does this automatically, there's just no way I could recommend a 12 GB card that calls itself "mid range" at those prices.
 
Clearly a sponsored article from "ngreedia," and the author and editors of this site should hang their heads in shame for trying to pass this off as anything but.

This is something akin to what I'd expect to see from "userbenchmark." This site has fallen pretty low. When I thought their "just buy it" coverage of the 2000 series was peak shill... they release this userbenchmark-quality article.
 
Jul 7, 2022
This is such a limited view of things. RX 6950 XT is not "much faster" at raster. It's faster, yes. In some cases it's barely faster, in others it's 20% faster (one GPU tier), and in a select few (generally AMD promoted) games, it might even be 35% faster.

But it's slower in RT, yes. Provably so in virtually every case (except games that do very little with RT, like Dirt 5, Far Cry 6, Resident Evil Village...)

It's also slower in games that support DLSS and not FSR2. And it's potentially much slower in games that support DLSS 3 (though again, that's not quite as big of a performance jump as Nvidia promotes).

And if you do anything with AI, you'd have to be heavily sponsored by AMD to think that's the best option. Getting most AI projects to run on AMD hardware right now is a pain. It's why so many of the projects don't even bother.
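For illustration, a minimal check of what that looks like in practice (a sketch assuming a PyTorch install; the ROCm build of PyTorch reuses the torch.cuda API on AMD, but it's a separate install that many projects never document):

```python
# Minimal sketch: why "does my GPU just work?" usually favors Nvidia.
# Assumes PyTorch is installed; on AMD this only succeeds with the ROCm build.
import torch

if torch.cuda.is_available():
    # ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API,
    # so this branch runs there too -- but only if that specific build is installed.
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"GPU found via {backend}: {torch.cuda.get_device_name(0)}")
else:
    print("No supported GPU backend; falling back to CPU.")
```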

But let's go back to the RT and DLSS topic. Because Nvidia has tied RT with DLSS so heavily, most games that use significant RT effects also support DLSS. So if we're being real, it's not just potentially a bit slower, it's the difference between fully playable 1440p with DLSS Quality versus not playable unless you turn off RT (on the 6950).

That right there is one of the biggest things in favor of Nvidia RTX 40-series right now, and even 30-series to a lesser degree. For better or worse, Nvidia has the market share to make things happen. It's pushing ray tracing and DLSS, because it knows it can win every time those are factored in. As a gamer, if I'm just playing games and not testing, I will generally turn on DLSS upscaling and Frame Generation if they're available. Even on an RTX 4090! Because again, being real, it makes games feel better (smoother) and that easily outweighs the potential loss in image fidelity or slight increase in latency (with Frame Generation).

Also, it means you can do stuff like this:
[Screenshot: Cyberpunk 2077 with path tracing]

That's fully path traced, and while it doesn't make CP77 a better game, it does make it look better. 1440p with Quality or Balanced upscaling plus Frame Generation is very playable on an RTX 4070. The fastest AMD cards barely break 30 fps (with FSR2 Performance mode). That's the worst-case scenario for AMD, and while it's mostly limited to one game (unless you want to count Minecraft, Quake 2, or Portal RTX), I think we're going to see more experimentation with full ray tracing (aka "path tracing") in the coming years.

Would I sacrifice a bit of rasterization performance, dropping from ~113 fps on average to ~98 fps on average at 1440p, in order to potentially get to play around with future path tracing games? Yes, I would. And I think a lot of other gamers would as well. And the longer AMD downplays ray tracing and calls it unnecessary, the more it's going to fall behind until we reach the point where, for a lot of games, it actually won't be unnecessary.
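For what it's worth, that drop works out to roughly 13% (simple arithmetic on the averages above; assuming the ~113 fps figure is the 6950 XT and ~98 fps the 4070):

```python
# Simple arithmetic on the 1440p rasterization averages quoted above.
# Assumes ~113 fps is the RX 6950 XT average and ~98 fps the RTX 4070 average.
fps_6950xt, fps_4070 = 113, 98
deficit_pct = (fps_6950xt - fps_4070) / fps_6950xt * 100
print(f"RTX 4070 trails by ~{deficit_pct:.0f}% in raster at 1440p")  # ~13%
```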

People should stop making excuses for AMD's lack of RT performance and ask them to do better. Because more than any other graphics technology, ray tracing has the potential to truly change the way games look. We're not there yet (well, we sort of are in Cyberpunk 2077 Overdrive mode), but we're moving in that direction. If you want to hate Nvidia for pushing us there, feel free, but there's a reason CGI gets used so heavily in movies. Games are years away from Hollywood quality CGI, but they continue to close the gap. I'm looking forward to seeing more truly next-gen graphics implementations, rather than minor improvements in rasterization techniques.
Well, that is certainly your opinion, and a valid yet minority one at that. However, I would say the predominant opinion is that until native ray tracing performance reaches the point where rasterization is no longer needed, it's a gimmick. It's sort of like how, for car enthusiasts, hybrids don't get any respect, but pure ICE (raster) and pure EV (ray trace) cars are all the rage. The same goes for DLSS and FSR: car purists prefer a high-displacement naturally aspirated engine (high frame rate native performance) over a small-displacement forced-induction engine (low frame rate native / high frame rate upscaled performance). The overwhelming sentiment is "there's no replacement for displacement."

And as for frame generation, an opposite valid opinion is that it's glorified image interpolation that gives a false sense of better-than-native smooth video (with fake frames that sometimes look like kindergarteners drew them) while delivering input latency worse than native. It's not an advantage; it's simply trading one pro for several cons. At least with DLSS, the pros outweigh the cons for the most part.

The reason I believe ray tracing should not be weighted as highly as you weight it is that rasterization is still the most important performance metric. As long as Nvidia insists on hybrid raster/ray-tracing, raster performance is the dependent variable and ray tracing performance the independent one: with a high-performance raster card that has poor RT performance, you still have a playable experience by turning off ray tracing, but with a high-performance RT card that has poor raster performance, the GPU is useless with ray tracing both on and off.

P.S. on a lighter note, I guarantee that Nvidia’s 5000 series will come with proprietary DLSS 4 with “Frame Synthesis” Tech, then DLSS 5 with “Artificial Frame Injection” Tech on the 6000 series, “Simulated Frame Infusion” Tech, etc.
 
May 6, 2023
Performance means performance. Rasterization, ray tracing, upscaling, AI. Ignoring any of those elements is biasing the results. I spent a lot of time discussing them because they are important forward-looking tech that impacts performance.

That they also play a part in features is irrelevant. Had this been RX 7000, there would be potential to at least note DP2.1 (which I noted regardless). For the 6000-series, there's not a single exclusive feature that comes to mind.

I note in the performance conclusion that if you're sure you don't care about RT, DLSS, and AI, then the 6950 wins on rasterization alone. That's ultimately going to be more limiting (IMO) in the settings you might choose to run down the road, more so than 12GB vs 16GB.

As for anyone wondering why this comparison, it’s because it’s my job to do articles like this. “Jarred, write a face off of RTX 4070 versus whatever AMD GPU makes the most sense” was my instruction. If we had RX 7700/7800, I could have done that instead. But AMD is the one that dropped 6950 pricing to try to put it against the 4070.

People — not AMD or Nvidia fanboys, because they’ve already decided — just want information, and are wondering which $600-ish GPU is the better choice. Of course AMD fanboys will spout the AMD party line: More VRAM! Faster rasterization! And Nvidia fanboys will do the same: DXR, DLSS, AI, etc. There are valid points for both sides, but the middle ground is usually where the truth lies.

Is having 16GB better than 12GB? Yes. How often does it make a big difference? It matters in AI LLMs and a select few games with often questionable optimizations. (Nearly every game that has poor performance on a 12GB card has also been lambasted for being a bad port or poorly optimized.)
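As a rough illustration of where the 12GB vs 16GB line falls for local LLMs (a back-of-envelope sketch counting weights only; KV cache and framework overhead push real usage higher, and the model sizes are just common examples):

```python
# Back-of-envelope VRAM needed just to hold LLM weights (ignores KV cache/overhead).
def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params, bits in [(7, 16), (7, 4), (13, 16), (13, 4)]:
    print(f"{params}B model @ {bits}-bit weights: ~{weight_vram_gb(params, bits):.1f} GB")
# 7B @ 16-bit  ~14.0 GB -> fits on a 16GB card, not on 12GB
# 7B @ 4-bit   ~3.5 GB  -> fits on either
# 13B @ 16-bit ~26.0 GB -> fits on neither
# 13B @ 4-bit  ~6.5 GB  -> fits on either
```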

Is faster ray tracing performance better to have? Assuming it doesn’t massively compromise rasterization performance, objectively yes. Why wouldn’t you want better performance in something you might potentially use?

Is having the option to enable DLSS and DLSS Frame Generation good? If you don’t have a horse in the race, yes. Why wouldn’t you want to have more choice?

Now, how important each of those is in the grand scheme of things definitely becomes subjective. I, personally, want them all. I want more efficient parts that cost less as well. But ultimately you have to decide for yourself.
Anybody could construct an article with a much better balanced view of this matchup:
1. Raw performance.
2. Features performance.
3. Price & efficiency.
Your article is constructed to highlight RT/DLSS, so the title should not be a matchup but something like: "(PR) 4070 vs 6950XT: The RT/DLSS King".
Stop digging :)
 

Thunder64

Distinguished
Mar 8, 2016
...If you want to hate Nvidia for pushing us there, feel free, but there's a reason CGI gets used so heavily in movies. Games are years away from Hollywood quality CGI, but they continue to close the gap. I'm looking forward to seeing more truly next-gen graphics implementations, rather than minor improvements in rasterization techniques.

Yes, AMD should do better in RT. But you go on to talk about CGI in movies? It is used "so heavily" because it is far cheaper, and I would guess audiences have adjusted.

But why do people think "Jurassic Park" looks more real than "Jurassic World"? Limited use of CGI.


This may have to do with the "Uncanny Valley" effect, which I am sure you know about.

It may end up as another gimmick like high framerate movies that people didn't like, or 3D TV. Or it could become the future. It is hard to tell just yet.
 

sherhi

Distinguished
Apr 17, 2015
"The GeForce RTX 4070, which finally brings the latest generation hardware down into the realm of affordability for many gamers.".

Affordability?? Please, stop embarrassing yourself!!

And this whole ludicrous article, pitting the latest technology GPU from Nvidia against a 2 year old AMD card is a very transparent PR stunt. What's next? The pitiful 4070 against the Rage 128 from 1998?

When I mention the shameful daily kissing up to Nvidia, the senior editors here are outraged! But we're getting used to this silliness.
Comparing these two cards is absolutely okay even though one is older tech. You know why, and why the author decided to test them against each other? Because they have a similar price... which he then throws out of the window by using MSRP...


This is also why comparing, let's say, a 6-core AMD CPU vs. a 6-core Intel CPU makes no sense if they are a significant amount of money apart. Comparing products within certain price brackets makes sense.
 

Elusive Ruse

Commendable
Nov 17, 2022
A lot of people here are baselessly accusing @JarredWaltonGPU of being paid or influenced by Nvidia... There's no way he or anyone at Tom's would compromise their integrity like that. You all need to calm down and take a deep breath; Tom's Hardware has been doing great work for as long as I have been on the internet.
Do authors and editors have their favourites and biases? Yes, they do, but not at the cost of their professionalism and work ethic.
On the subject at hand, as I said before, there is no reality where the 4070 is a better purchase or GPU than the 6950 XT. The raster performance gap is so wide you'd have to be out of your mind to pay the same money to get the 4070. RT performance is moot because the framerates you get out of the 4070 are so low that playing with the feature enabled doesn't make sense.
I won't even address the bizarre inclusion of Stable Diffusion benchmarks.
Power consumption is a win for 4070, numbers don't lie and Jarred's point is entirely valid.
Did they use an AMD CPU? Perhaps that would skew things in the 6950's favor more.
The question is, why don't you? According to all sources, including Tom's, AMD has the fastest gaming CPUs, yet you insist on using the 13900K (very power inefficient as well) for testing. Preferring Intel CPUs is definitely a proven bias at Tom's that I have observed.
 

sherhi

Distinguished
Apr 17, 2015
Here is what I found with a simple 5-second Google search. As for being real, I am more real than you, and calling a subject backed up by verifiable evidence nonsense sounds like ridiculous conjecture from someone who doesn't have anything more than a layman's understanding of graphics architectures.


View: https://m.youtube.com/watch?v=RQA3cTPNicg


View: https://m.youtube.com/watch?v=oH0UkA7eoCk


View: https://m.youtube.com/watch?v=iSIif657MO4


View: https://m.youtube.com/watch?v=fJc--C01P90

You don't have to go that far.


Reason to avoid cards like 6600: "only 8gigs of RAM"
Reason to avoid 4070: "12gigs may not be enough for future games" (this negative is conveniently left out on some other cards in that article)

Btw, there is a graph in that article with the 4070 and the 6800 and a $100 price difference. Just saying.
 

zx128k

Reputable
Wrong wrong wrong, frame generation significantly increases input latency.


Just quit man
It reduces it compared to native rendering. The card gets higher latency if it renders at native 4K at a lower fps. If you switch on DLSS frame generation, you render an output at 4K, and the latency is lower than what you got with native rendering on that card.

Thus, by a basic understanding of how DLSS works, it always reduces latency, and it is impossible for it to be otherwise.

On PC, nVidia controls the market. AMD has a very small market share. nVidia's cards set the VRAM limits that software has to follow.
 

zx128k

Reputable
Anybody could construct an article with a much better balanced view of this matchup:
1. Raw performance.
2. Features performance.
3. Price & efficiency.
Your article is constructed to highlight RT/DLSS, so the title should not be a matchup but something like: "(PR) 4070 vs 6950XT: The RT/DLSS King".
Stop digging :)
Basically, AMD cards have less RT performance. It's down to hardware design and nothing else. nVidia cards have better upscaling using AI. nVidia costs more and controls most of the market. AMD is a small PC discrete GPU manufacturer.

Basically, the 4000 series has stronger DXR hardware and DLSS Frame Generation. DLSS Frame Generation, in games that support it, means AMD cannot win in FPS. An AMD 6950 cannot match a 3080 10GB in DXR performance.

Coming to the overall PC market share, AMD's share increased by 0.4%, Intel's share declined by 1.1%, and NVIDIA's share increased by 0.68%. Intel still retains a 71% market share within the overall GPU segment, with NVIDIA coming in second place at 17% and AMD at 12%. The discrete GPU market declined to 13 million units, but NVIDIA retained its >80% market share. Intel maintained its 6% share and AMD retained 9% market share in terms of shipments.
With so little of the discrete GPU market, AMD is basically irrelevant. nVidia doesn't compete with AMD; Intel does.


[Chart: latency comparison]

As you can see, latency is below native rendering on the tested card.

[Chart: FPS comparison]

Big increase in FPS.
 

irish_adam

Distinguished
Mar 30, 2010
>Based on the majority of your readers reactions to your article, and the abysmal adoption of 4070’s by the gaming community, you either wrote an objectively biased article, or you are out of touch with the cohort population you target.

Bandwagon arguments aren't compelling. And calling somebody "biased" just calls attention to your own bias. BTW, "objectively biased" is an oxymoron.

I don't have a dog in this fight. But it's always been clear to me that there's a strong contingent of AMD fans here (as on other online forums), that gets vocal every time somebody says something bad about AMD. Even as I scanned through the piece, it was very predictable what the reaction would be in the forum. And true to form, they (you) delivered.

At the end of the day, what's better or worse is just an opinion. It depends on how one weighs the various factors. It doesn't matter which one "wins," but whether the info is good, as then you can make your own judgment based on that info. I don't see anybody claiming Jarred's info is wrong. Take that for what it is, and leave the winning/losing beeswax to the kids to squabble over.

Then again, I get why we see "winning/losing" pieces, on here and elsewhere. They get more user engagement, more than just a simple comparative listing of A-vs-B stats. As said, people are predictable, and a sure-fire way to get clicks is to get them to argue their "side" is better.

One comparison that is striking to me is the drastic difference in power consumption, where the 4070 averages 183.4W vs the Asrock 6950XT's 378W (or MSI's 440W). 200W of additional power is a significant factor for case cooling. It's weird that no AMD fans piped up about that HUGE disparity, especially as they love to carp about power efficiency when it comes to Intel vs AMD CPUs.

Jarred's piece didn't change my mind about which is better. But the info is good (I didn't know about AMD's power hog tendency) which makes it a good piece.

Edit: Upon 2nd reading, the Asrock 6950XT averages 305.4W (and 326.4W for 4K). The above numbers were for peak use. Still a huge difference in power use.
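A rough back-of-envelope on that gap, using the averages quoted above (the daily gaming hours and electricity price are assumptions for illustration, not figures from the article):

```python
# Back-of-envelope cost of the power gap between the two cards while gaming.
# The wattages come from the quoted post; hours/day and $/kWh are assumptions.
avg_4070_w, avg_6950xt_w = 183.4, 305.4   # average gaming power draw (W)
hours_per_day, price_per_kwh = 2, 0.15    # assumed usage and electricity price

delta_w = avg_6950xt_w - avg_4070_w                    # ~122 W extra draw
kwh_per_year = delta_w * hours_per_day * 365 / 1000    # ~89 kWh per year
print(f"Extra draw ~{delta_w:.0f} W -> ~{kwh_per_year:.0f} kWh/yr "
      f"-> ~${kwh_per_year * price_per_kwh:.0f}/yr at that rate")
```

At those assumptions it works out to roughly $13 a year in electricity, so the bigger practical issue is arguably the extra ~120W of heat dumped into the case while gaming.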

It is easy to just throw away everyone's comments here as just "AMD FANBOIS," but apart from the VRAM issue brought up, no one here is bashing Nvidia. I stated in my comments that I didn't have a problem with the writer crowning the 4070 as the winner; there are compelling reasons to think that, and he is entitled to his opinion.

The issue everyone here has is that the writer claims the 4070 generally outperforms the 6950, which it doesn't, as has been highlighted. The 4070 really only wins when using proprietary tech that is incorporated in only a fraction of games.

Now, full disclosure: I currently have a 7900 XT, but before that I had a 2080 Ti. I have always tried to get the best bang for buck at the price range I want to pay, regardless of brand (this goes for CPUs too). I rely on tech sites to give me facts that I can use to make informed decisions, and I'd rather not be misled by sites posting opinion pieces masquerading as factual pieces.
 

zx128k

Reputable
It is easy to just throw away everyone's comments here as just "AMD FANBOIS," but apart from the VRAM issue brought up, no one here is bashing Nvidia. I stated in my comments that I didn't have a problem with the writer crowning the 4070 as the winner; there are compelling reasons to think that, and he is entitled to his opinion.

The issue everyone here has is that the writer claims the 4070 generally outperforms the 6950, which it doesn't, as has been highlighted. The 4070 really only wins when using proprietary tech that is incorporated in only a fraction of games.

Now, full disclosure: I currently have a 7900 XT, but before that I had a 2080 Ti. I have always tried to get the best bang for buck at the price range I want to pay, regardless of brand (this goes for CPUs too). I rely on tech sites to give me facts that I can use to make informed decisions, and I'd rather not be misled by sites posting opinion pieces masquerading as factual pieces.
[Chart: relative performance at 1080p]

At 1080p the 6900xt and 4070 are close in raster.

[Chart: relative ray tracing performance at 1920x1080]


7900xt and 4070 close in DXR.
 
No mention of AV1? I think that's a good thing going for the 4070. Once YT and Twitch add it to their upstream, it'll become a must have, for sure. For the uninitiated, I'm talking about AV1 encoding.

As for the VRAM argument. Sorry Jarred, but I do VR, so for me VRAM matters as much as raw performance in raster, which the 6950XT has plenty of (I have a 6900XT for VR).

While the 4070 is vastly more efficient, let's not kid ourselves here: the target audience of ~£500+ GPUs won't care much about power usage. Maybe heat, but not power; at least not in a significant enough manner to sway a purchase decision at that price range.

Also, ROCm may sway the compute element in favour of AMD soon. They're making really good strides there and it wasn't even mentioned in the article.

Anyway, I think the 6950XT is the smarter buy for most people and whoever wants nVidia will get nVidia no matter what anyone says to them.

Regards.
 

zx128k

Reputable
No mention of AV1? I think that's a good thing going for the 4070. Once YT and Twitch add it to their upstream, it'll become a must have, for sure. For the uninitiated, I'm talking about AV1 encoding.

As for the VRAM argument. Sorry Jarred, but I do VR, so for me VRAM matters as much as raw performance in raster, which the 6950XT has plenty of (I have a 6900XT for VR).

While the 4070 is vastly more efficient, let's not kid ourselves here: the target audience of ~£500+ GPUs won't care much about power usage. Maybe heat, but not power; at least not in a significant enough manner to sway a purchase decision at that price range.

Also, ROCm may sway the compute element in favour of AMD soon. They're making really good strides there and it wasn't even mentioned in the article.

Anyway, I think the 6950XT is the smarter buy for most people and whoever wants nVidia will get nVidia no matter what anyone says to them.

Regards.
The RX 6950 is like the 3090/3090 Ti: very few people will buy one, and no one will advise/recommend you buy these cards.

AMD Radeon RX 6950 XT review: Buy the 6900 XT instead

The RX 6950 XT is a graphics card that doesn’t need to exist. It’s a lot like Nvidia’s 12GB RTX 3080 — a refresh meant to push prices up without offering any clear performance improvements.
For a lot of gamers, it doesn't even matter how fast the RX 6950 XT might be because — much like the RTX 3080 Ti and above from Nvidia — there's no way in Hades that they're going to spend more than $1,000 just on a graphics card.
Also, a 6950 is $629.99 and an RX 7900 XT is $789.99. There is no reason to ever buy a 6950 unless you are helping AMD get rid of unsold 6950s. The 6900 XT/6950 XT is lower in price for a reason.
 
The RX 6950 is like the 3090/3090 Ti: very few people will buy one, and no one will advise/recommend you buy these cards.

AMD Radeon RX 6950 XT review: Buy the 6900 XT instead



Also, a 6950 is $629.99 and an RX 7900 XT is $789.99. There is no reason to ever buy a 6950 unless you are helping AMD get rid of unsold 6950s. The 6900 XT/6950 XT is lower in price for a reason.
I think you're trying to clutch at straws really hard here.

It's ok. Different people have different needs and can, in fact, have a reasonable explanation why they choose what they choose.

Also, the 6900XT is not being manufactured anymore, AFAIK. I wouldn't be surprised if the 6950XT stock on shelves is all there is left as well.

Regards.
 

baboma

Prominent
Nov 3, 2022
>It is easy to just throw away everyone's comments here as just "AMD FANBOIS," but apart from the VRAM issue brought up, no one here is bashing Nvidia. I stated in my comments that I didn't have a problem with the writer crowning the 4070 as the winner; there are compelling reasons to think that, and he is entitled to his opinion.

Excuse the broad brush. Sure, not everyone who disagrees with the piece is a fanboy, just the shrill ones accusing THW of being shills. And there are always enough of them to make a circus.

It's fine to disagree, and to explain your reasons why. It's not fine to get worked up over it and get in a huff about being biased this or shill that. (Not to say that you did or didn't; honestly I haven't kept track.) As you said yourself, the writer is entitled to his opinion, and others are entitled to disagree. No need for all the sturm und drang.

It's downright moronic to use "bias" as some sort of a disqualification because everybody is biased to some degree. Being biased is not an argument, just as having a different opinion doesn't equate to being wrong. Too many people are hung up on having their view as being the only "right" view, and everyone else being "wrong."

As for my opinion, I don't think the 4070-vs-6950XT is a good comparison, as the 6950XT will soon be eclipsed by the 7700/7800XT launch in a month or two. But as Jarred said, he needs to do an A-B piece now, and he can only compare cards that are available. This piece will soon be superseded by a 4070-vs-7800XT piece, which I think would be more relevant.

Again, sorry to you and others who feel slighted from the "fanboy" comment.
 

zx128k

Reputable
I think you're trying to clutch at straws really hard here.

It's ok. Different people have different needs and can, in fact, have a reasonable explanation why they choose what they choose.

Also, the 6900XT is not being manufactured anymore, AFAIK. I wouldn't be surprised if the 6950XT stock on shelves is all there is left as well.

Regards.
Get a 7900xt?
 
Get a 7900xt?
If you make that argument, then why not go up in ~$150 jumps and just get a 4090 instead? Ah, but $1600 is too much! Well, almost $800 for a lot of people is harder to justify than $650. Whoever can get hardware at $800 surely can get a 7900XTX then, right? And then the jump from $950 to $1600 is just small beans! And let's ignore the 4080 and do ourselves a favour.

Sarcasm aside, not a good argument IMO. They're different price points for a reason. I do agree if you can get a 7900XT, then it is a better buy, but this article is not about the 7900 siblings or even the 4070ti/4080 (4080 siblings?).

Regards.
 

zx128k

Reputable
If you make that argument, then why not go up in ~$150 jumps and just get a 4090 instead? Ah, but $1600 is too much! Well, almost $800 for a lot of people is harder to justify than $650. Whoever can get hardware at $800 surely can get a 7900XTX then, right? And then the jump from $950 to $1600 is just small beans! And let's ignore the 4080 and do ourselves a favour.

Sarcasm aside, not a good argument IMO. They're different price points for a reason. I do agree if you can get a 7900XT, then it is a better buy, but this article is not about the 7900 siblings or even the 4070ti/4080 (4080 siblings?).

Regards.
Let's ignore that the 6950xt is basically not worth the money. The RT performance is garbage, and it has no AI cores. At least the 7900xt has decent RT performance and will be good for a few years.

The ray tracing performance of the RX 6950 XT is closer to that of the RTX 3070 Ti, or about 23% slower than the RTX 4070. The RTX 4070 is a more efficient GPU and also offers next-gen features such as DLSS 3 Frame Generation. The ZOTAC Gaming GeForce RTX 4070 Twin Edge is $599.99. Gaming power draw is 200W for the 4070 and 350W for the 6950 XT reference design.

Like I stated, you would never buy a 6950 XT.
 
This is such a limited view of things. RX 6950 XT is not "much faster" at raster. It's faster, yes. In some cases it's barely faster, in others it's 20% faster (one GPU tier), and in a select few (generally AMD promoted) games, it might even be 35% faster.

But it's slower in RT, yes. Provably so in virtually every case (except games that do very little with RT, like Dirt 5, Far Cry 6, Resident Evil Village...)

It's also slower in games that support DLSS and not FSR2. And it's potentially much slower in games that support DLSS 3 (though again, that's not quite as big of a performance jump as Nvidia promotes).

And if you do anything with AI, you'd have to be heavily sponsored by AMD to think that's the best option. Getting most AI projects to run on AMD hardware right now is a pain. It's why so many of the projects don't even bother.

But let's go back to the RT and DLSS topic. Because Nvidia has tied RT with DLSS so heavily, most games that use significant RT effects also support DLSS. So if we're being real, it's not just potentially a bit slower, it's the difference between fully playable 1440p with DLSS Quality versus not playable unless you turn off RT (on the 6950).

That right there is one of the biggest things in favor of Nvidia RTX 40-series right now, and even 30-series to a lesser degree. For better or worse, Nvidia has the market share to make things happen. It's pushing ray tracing and DLSS, because it knows it can win every time those are factored in. As a gamer, if I'm just playing games and not testing, I will generally turn on DLSS upscaling and Frame Generation if they're available. Even on an RTX 4090! Because again, being real, it makes games feel better (smoother) and that easily outweighs the potential loss in image fidelity or slight increase in latency (with Frame Generation).

Also, it means you can do stuff like this:
[Screenshot: Cyberpunk 2077 with path tracing]

That's fully path traced, and while it doesn't make CP77 a better game, it does make it look better. 1440p with Quality or Balanced upscaling plus Frame Generation is very playable on an RTX 4070. The fastest AMD cards barely break 30 fps (with FSR2 Performance mode). That's the worst-case scenario for AMD, and while it's mostly limited to one game (unless you want to count Minecraft, Quake 2, or Portal RTX), I think we're going to see more experimentation with full ray tracing (aka "path tracing") in the coming years.

Would I sacrifice a bit of rasterization performance, dropping from ~113 fps on average to ~98 fps on average at 1440p, in order to potentially get to play around with future path tracing games? Yes, I would. And I think a lot of other gamers would as well. And the longer AMD downplays ray tracing and calls it unnecessary, the more it's going to fall behind until we reach the point where, for a lot of games, it actually won't be unnecessary.

People should stop making excuses for AMD's lack of RT performance and ask them to do better. Because more than any other graphics technology, ray tracing has the potential to truly change the way games look. We're not there yet (well, we sort of are in Cyberpunk 2077 Overdrive mode), but we're moving in that direction. If you want to hate Nvidia for pushing us there, feel free, but there's a reason CGI gets used so heavily in movies. Games are years away from Hollywood quality CGI, but they continue to close the gap. I'm looking forward to seeing more truly next-gen graphics implementations, rather than minor improvements in rasterization techniques.
In the coming years, 12GB will be unusable at 1440p unless you turn down settings. Case in point: "The Last of Us," "Jedi Survivor," and "Hogwarts Legacy" are really pushing things. And if you look at the 3070 or even the 3080 vs. the 6800, the 6800 is faring better due to memory.

Many new games are also going FSR2 because it supports all hardware, including Pascal/Polaris, and is vendor-agnostic. Yes, it is inferior, but it works and is good enough unless you have a 32" monitor and intentionally look for the differences.

The 6950 will still be smoother in these high-texture-memory cases, albeit without RT.

I really agree with you most of the time, but this time I think you are off, Jarred. But it's just personal opinion when the cards are this close.

If money were no object: 4090. The 4080 vs. 7900 XTX is a much closer race because the 4080 costs $200 more. I consider the 4080's RT "usable" and its memory "acceptable."
 

zx128k

Reputable
In the coming years, 12GB will be unusable at 1440p unless you turn down settings. Case in point: "The Last of Us," "Jedi Survivor," and "Hogwarts Legacy" are really pushing things. And if you look at the 3070 or even the 3080 vs. the 6800, the 6800 is faring better due to memory.

Many new games are also going FSR2 because it supports all hardware, including Pascal/Polaris, and is vendor-agnostic. Yes, it is inferior, but it works and is good enough unless you have a 32" monitor and intentionally look for the differences.

The 6950 will still be smoother in these high-texture-memory cases, albeit without RT.

I really agree with you most of the time, but this time I think you are off, Jarred. But it's just personal opinion when the cards are this close.

If money were no object: 4090. The 4080 vs. 7900 XTX is a much closer race because the 4080 costs $200 more. I consider the 4080's RT "usable" and its memory "acceptable."
Cherry-picked games.

The Last of Us.

What's clear is that there is a long road ahead because there's the sense right now that the PC version is effectively a beta - that is, feature complete but in need of a profound level of optimisation and bug-fixing.

The Last of Us Part I v1.0.4.0 targets framerate issues by further optimizing GPU and CPU performance.

The Last of Us Part 1 has a really weird memory management system, if you have an 8GB GPU, the game will seemingly reserve 1.6GB for the OS, even if Windows isn't using anywhere near that amount. If you have a 24GB GPU, it reserves nearly 5GB of VRAM for the OS. It's like they have just copy/pasted the memory management system of the PS5 version (the PS5 uses unified memory) and slapped this on the PC version, without taking into account PCs have separate RAM and VRAM pools.

This comparison is also valid, as 'The Last of Us Part 1' is built using the engine of 'The Last of Us Part II.' So for TLOU2 to have better textures on a PS4 than TLOU1 on PC with Medium textures on a more powerful GPU is quite telling.

In some sections of the game, the PS5 version in 'Performance' mode (1440p, 60fps upscaled) has superior shadow quality to the game running on PC at 4K with everything completely maxed out.

The game in some instances also has superior lighting quality on the PS5 in 'Performance' mode than at 4K maxed on the PC; look at the lighting coming off and around the lamp.

For the amount of VRAM this game uses, the 'Low' and 'Medium' textures are really unacceptable.

In conclusion, The Last of Us Part 1 on PC currently has issues with:
- VRAM and memory management
- Texture issues
- CPU utilisation issues
- Loading times
- Some features on the PC version are flat out broken
- Crashing
- Shader compilation issues
- Other in-game issues, like raining inside buildings

A PC with similar specs to a PS5 will currently run the game with worse performance and graphical fidelity. The PC version of The Last of Us Part 1 currently has many issues; it is an outlier and should not be used to predict future hardware requirements or to try and score points for GPU wars.
source

Review,
The Last Of Us Part 1 50/100
The PlayStation classic remains out of reach on PC due to debilitating performance issues.

Same issues with Jedi Survivor on release, which is also being patched.

Picking these games only seeks to mislead.



View: https://youtu.be/CXB3MHShzlQ
 
May 6, 2023
What are the pros and cons of choosing either AMD or Nvidia?
I have problems with AMD drivers interfering with Windows, so I'm looking at the other option, which is Nvidia.
But does anyone have a bad experience using Nvidia?
 

zx128k

Reputable
What are the pros and cons of choosing either AMD or Nvidia?
I have problems with AMD drivers interfering with Windows, so I'm looking at the other option, which is Nvidia.
But does anyone have a bad experience using Nvidia?
nVidia has fewer serious issues than AMD, but both manufacturers have their problems. AMD is worse, but by how much is the question.
 
Let's ignore that the 6950xt is basically not worth the money. The RT performance is garbage, and it has no AI cores. At least the 7900xt has decent RT performance and will be good for a few years.

The ray tracing performance of the RX 6950 XT is closer to that of the RTX 3070 Ti, or about 23% slower than the RTX 4070. The RTX 4070 is a more efficient GPU and also offers next-gen features such as DLSS 3 Frame Generation. The ZOTAC Gaming GeForce RTX 4070 Twin Edge is $599.99. Gaming power draw is 200W for the 4070 and 350W for the 6950 XT reference design.

Like I stated, you would never buy a 6950 XT.
As pointed out in this same article, the RT performance on BOTH is garbage for the price point. So if you toss out RT for the 6950XT, then you also toss it out as a valid argument for the 4070. It doesn't matter that the 4070 is 25% better when that 25% delivers maybe 5-10 extra frames on average and diminishes with resolution and memory usage.

And one final comment on the AI part: until games actually start using "AI" (call it whatever you want within the FP8 and FP16 capabilities), the "AI" marketing term is completely meaningless and useless for gamers. Whoever is going to work with any form of serious modeling or language-model work won't be buying in this price range. And if they do, its gaming performance won't even be part of the equation. That's like cross-shopping a Ford F250 and a McLaren 720S: "but the McLaren can pull my trailers!"

I'll stop here. You're, again, clutching at straws really badly.

EDIT: The article made a sudden disappearance from the top list. I hope you guys revisit and republish instead.

Regards.
 
Cherry-picked games.

The Last of Us.

The Last of Us Part I v1.0.4.0 targets framerate issues by further optimizing GPU and CPU performance.

source

Review,
The Last Of Us Part 1 50/100
The PlayStation classic remains out of reach on PC due to debilitating performance issues.

Same issues with Jedi Survivor on release, which is also being patched.

Picking these games only seeks to mislead.

View: https://youtu.be/CXB3MHShzlQ

Cherry-picked or not, they are all AAA games and very recent releases. So that makes them VERY relevant.

RT just eats up memory too, as bounding boxes are set up.