RTX 4070 vs RX 7900 GRE faceoff: Which mainstream graphics card is better?


35below0

Notable
When you look at all the data in aggregate, the lead of the 7900 GRE is 3.1% at 4K, 3.4% at 1440p, 1.7% at 1080p, and -1.3% at 1080p medium.
That's why we called it a tie. Not because there aren't outliers, but taken just at face value, the numbers are very close.
I can't help but think that if two GPUs are close in a majority of games with the 7900GRE consistently being a little bit ahead, and they are not close in a small number of games where the 7900GRE pulls into a clear lead, then they are not mostly close and tied on average.
The 7900GRE gets better results.

You have explained how this performance is maybe flattered a little bit by the way tests were done, and why it really should be a tie. But looking at the table it seems obvious which GPU is the winner.

Also, can we have pie charts?

Also, also, as someone pointed out, the games themselves may not be so relevant.
My takeaway here is that the 7900 GRE is better value, and a better performer in a handful of games. Not RT, but that's less than 1% of games that I could play.
Software and features would sway my purchase to Nvidia though.
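(Side note: "all the data in aggregate" figures like the 3.1% / 3.4% leads quoted above are typically geometric means of per-game results. A minimal sketch of that calculation, using made-up per-game ratios rather than the article's data:)

```python
from math import prod

# How an aggregate "X% lead" is typically computed: the geometric mean of
# per-game FPS ratios (7900 GRE fps / 4070 fps). These ratios are made up
# for illustration; they are not the article's data.
ratios = [1.05, 1.02, 0.99, 1.10, 0.97]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"7900 GRE aggregate lead: {(geomean - 1) * 100:.1f}%")  # ~2.5%
```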
 
Fake frames and fake 1440p made from 1080p with DLSS. Best features are FAKE. Not talking about RT.
EVERYTHING we render on PCs and consoles is "fake." Just because you use upscaling doesn't make the end result more fake than something else — and again, framegen is a different matter and I don't recommend it in general. Graphics has been about trying to optimize the way we estimate pixels to create a pleasing end result since day one. AI upscaling via DLSS and XeSS shows that there are indeed better ways to get there than the status quo.
 

35below0

Notable
Fake frames and fake 1440p made from 1080p with DLSS. Best features are FAKE. Not talking about RT.
I thought the article was interesting and fair. Yes, AMD wins in rasterization by a decent margin, but not all gamers are the same. Personally, I value ray tracing highly and appreciate a good power/performance ratio. Also, for me, NVIDIA software features are a must these days; I use my gaming PC for work as well, and the Broadcast package is invaluable. The background removal is miles ahead of MS Teams or Zoom; I couldn't live without it anymore.
Nothing fake about background removal.
 
Yes, particularly with the understanding that 4K with DLSS (or FSR2/3) makes it a far better experience. If you have a 4K 144Hz (or higher) monitor, of course, I'd recommend a much more potent GPU than either of these. :)

While this is technically true, now that the limits have been removed, the reality is that I don't really recommend overclocking these days for about 99% of people. AMD and Intel CPUs are already near their limits (I suppose OC on Ryzen 5 and Core i5 is more worthwhile). AMD and Nvidia GPUs are also very near their limits, and you start to use a lot more power for minor gains in performance.

Of course, the 7900 GRE was intentionally limited at launch with slower GDDR6 clocks, mostly to keep it from being too close to the 7900 XT (that's my take anyway). So the VRAM OC on the GRE is more worthwhile, and doesn't massively increase power. But you generally hit about a 5~10 percent OC with tweaking on most GPUs these days, and an extra few percent that only applies to a small minority of users isn't really a major factor.
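(To put rough numbers on the "more power for minor gains" point, here is a quick perf-per-watt comparison. The stock and overclocked figures are hypothetical, chosen only to illustrate the shape of the trade-off:)

```python
# Hypothetical stock vs. overclocked numbers for a midrange GPU, to show why
# a ~7% fps gain can still be a bad trade in efficiency terms.
base_fps, base_watts = 100, 220   # assumed stock figures, not measurements
oc_fps, oc_watts = 107, 275       # assumed manual-OC figures

print(f"stock: {base_fps / base_watts:.3f} fps/W")  # 0.455 fps/W
print(f"OC:    {oc_fps / oc_watts:.3f} fps/W")      # 0.389 fps/W
```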

Glad to see we are on the same page :)

The entire point of product reviews is to test and tell the readers if a feature is worthwhile or not. You have rightly pointed out the limited OC potential in high end CPUs and GPUs in general. But we would know that information only if a reviewer tests it and tells it to the community!

As for OC, both Nvidia and Radeon have auto OC as a feature. Better to test and give a complete picture - both auto OC and manual. With these features, it's kinda analogous to having XMP on/off.
 
For those asking about chart colors, I still haven't heard any real suggestions / recommendations. Also keep in mind that we may do something like AMD vs AMD GPUs at some point, so "neutral" can be better there. Still, here are red/grey and green/grey options. Does anyone love the way these look? I'm mostly ambivalent on the red; the green is a bit much, though. And I still generally think the blue might be best.

If you think you can come up with something better, please do and show the result. Like, use Paint or whatever to fill in the bars with different colors so others can see what it looks like. For now, this has already taken way more time than I think it warrants.

faceoffchart-RX7900GRE-RX6950XT-1440p-ult.png

faceoffchart-RTX4070TiSuper-RTX3090Ti-1440p-ult.png

faceoffchart-ArcA77016GB-ArcA750-1080p-ult.png
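(For anyone who'd rather mock up alternative bar colors in code than in Paint, per the request above, a minimal matplotlib sketch. Game names and FPS values are made up for illustration:)

```python
import matplotlib.pyplot as plt

# Two-tone mock-up of a faceoff chart; game names and FPS values are made up.
games = ["Game A", "Game B", "Game C"]
gre_fps = [72, 65, 80]
rtx_fps = [70, 68, 75]

fig, ax = plt.subplots()
y = range(len(games))
ax.barh([i + 0.2 for i in y], gre_fps, height=0.4, color="#b22222", label="RX 7900 GRE")
ax.barh([i - 0.2 for i in y], rtx_fps, height=0.4, color="#777777", label="RTX 4070")
ax.set_yticks(list(y))
ax.set_yticklabels(games)
ax.set_xlabel("FPS (1440p ultra)")
ax.legend()
fig.savefig("color-mockup.png", dpi=150)
```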
 

35below0

Notable
For those asking about chart colors, I still haven't heard any real suggestions / recommendations. ...
Run a poll? With some samples for people to pick.

Also consider using fewer items in a chart. If it's bigger, it's easier to read. Split a chart into more pages so all games are present.
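(A trivial sketch of that "split into pages" idea: chunk the game list so each chart stays readable. The chunk size is arbitrary:)

```python
def paginate(items, per_page=8):
    # Break one long chart's game list into several smaller, readable charts.
    return [items[i:i + per_page] for i in range(0, len(items), per_page)]

games = [f"Game {n}" for n in range(1, 20)]
for page_num, page in enumerate(paginate(games), start=1):
    print(f"Chart {page_num}: {page}")
```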
 

TheHerald

Proper
I can't help but think that if two GPUs are close in a majority of games with the 7900GRE consistently being a little bit ahead ... Software and features would sway my purchase to Nvidia though.
This is not directed at you specifically; a lot of people seem to be bothered by this. Why is the personal opinion of a reviewer called into question? The data is there for all to see; if you prefer raster and don't care about RT and DLSS, then the choice is obvious. The reviewer prefers RT and DLSS and he expressed that, who cares. When I read reviews I want the data to be accurate; I don't give a rat's ass about the reviewer's opinion (I mean I do, just like I care about anyone else's opinion). It's just his preference and it has no impact on my buying decision.

Also, saying that only 1% of games have RT is very disingenuous, because on the same note only 2% of games released need such a fast card to begin with. Yes, people are buying new GPUs for that 2-5% of new games that require a fast card, and a big percentage of those do in fact have RT.
 

HWOC

Reputable
This is not directed at you specifically; a lot of people seem to be bothered by this ... It's just his preference and it has no impact on my buying decision.
Very well said. I wish everyone on the internet would think as sensibly as this.
 

35below0

Notable
Why is the personal opinion of a reviewer called into question?
When I read reviews I want the data to be accurate; I don't give a rat's ass about the reviewer's opinion
A reviewer's opinion cannot be his verdict. The verdict has to be data-based, and that verdict is NOT 4070 > 7900 GRE, not according to the performance data.

What the author added to the data was not opinion but auxiliary data and circumstances, such as:
(emphasis mine)
To be clear, 4K showed the largest performance gap, in favor of the 7900 GRE. Actually, it's slightly larger on rasterization (17% lead versus 16% lead), slightly lower in RT (13.5% loss versus 12% loss). When you look at all the data in aggregate, the lead of the 7900 GRE is 3.1% at 4K, 3.4% at 1440p, 1.7% at 1080p, and -1.3% at 1080p medium.


AMD wins in rasterization by a larger amount than it loses in RT.
However, and this is the part people are overlooking, outside of Avatar and Diablo, we didn't test with upscaling (those two games had Quality upscaling enabled).

And I know there are DLSS haters that will try to argue this point, but in all the testing I've seen and done, I would say DLSS in Quality mode is now close enough to native (and sometimes it's actually better, due to poor TAA implementations) that I would personally enable it in every game where it's supported. Do that and the AMD performance advantage evaporates. In fact, do that and the 4070 comes out 10~15% ahead overall.


On the performance side, DLSS Quality beats FSR 2/3 Quality, so comparing the two in terms of strict FPS isn't really "fair." Without upscaling in rasterization games, AMD's 7900 GRE wins. With upscaling in rasterization games, the 7900 GRE still performs better.
This is why, ultimately, the performance category was declared a tie — if anything, that's probably being nice to AMD by mostly discounting the DLSS advantage.

Put another way:
Without upscaling, AMD gets a clear win on performance in the rasterization category, loses on RT by an equally clear margin. So: AMD+ (This is intentionally skewing the viewpoint to favor AMD, which is what many seem to be doing.)


The key point of departure is opinion on whether upscaling matters. The data is there and points to the 7900GRE performing better without upscaling.

It is the author's opinion that DLSS should be enabled in every game, and if this is done then the 4070 is the winner.
What people objected to in this thread is NOT Tom's Hardware favoring Nvidia blindly, but the author claiming that:

4070 is the clear winner overall and tied on performance IF DLSS is enabled in every game
and
4070 is the clear winner overall and tied on performance

are the same thing. They are not.

AMD fanboys and trolls may annoy you, but many gamers will disagree with the author's opinion that DLSS is always good and always on, and that therefore you should forget the 7900 GRE, the 4070 is better.

It may well be, but for other reasons. The verdict is incongruous to say the least and grotesque and biased to say... well, more than least.

I wouldn't even say that the GPUs are tied on performance, though I accept the author's argument about the GRE's advantage being situational and overstated/skewed.
Likewise, I would not really consider testing at 4K relevant to the verdict. It's handy to have the info, but neither GPU is a 4K performer.


Reality is that there are two verdicts: one that favors Nvidia in a BS way, and one that favors AMD in a BS way as well.
We have to have this argument because the author chose one of them and called it a fair verdict.


Buy the GPU that fits your needs and budget. Neither of these is a worthy investment, but between them I have to say the 4070 is completely pointless because the Ti is the better choice.
The 7900 GRE also competes with the 7800 XT, which may just be better value depending on the price you pay and how far you can stretch your budget.

I would sit and anticipate the 50XX series, but that's really neither here nor there.
 

TheHerald

Proper
A reviewer's opinion cannot be his verdict. The verdict has to be data-based ... I would sit and anticipate the 50XX series, but that's really neither here nor there.
It's not possible to add the impact of DLSS into the data, so it's fine to add that in the verdict. I agree with the reviewer about the importance of DLSS, although even with DLSS I wouldn't prefer a 4070 over a 7900 GRE. He would, and that's fine; it's really irrelevant. Verdicts or opinions, it doesn't really matter: as long as his data are pristine, I personally really don't care, and neither should anyone else, to be fair.

As you've said, neither of these cards is great, but the 4070 will have some serious longevity issues. I don't mind it because I change my GPU every gen, but for people that wanna keep it, the 7900 GRE is the better choice, with or without DLSS. But that's my opinion; someone can call it BS and that's also fine.

We might even agree that his verdict should be different, but verdicts are so irrelevant to me that I haven't even bothered to read it yet. My apologies to the guy doing the review, nothing personal; I just care about that juicy data.
 
It's not possible to add the impact of DLSS into the data, so it's fine to add that in the verdict ... I just care about that juicy data.

Personally, and it could just be me, I don't think the 4070 or 4070 Super will be an issue at the moment at 1440p. No one is buying a 4070 Super with the idea of 4K, at least I would hope not. Whatever the reason, I find that AMD GPUs use more memory than their Nvidia counterparts. I don't know if this is because GDDR6X is faster.

The 7900 GRE, and RDNA cards in general, have some teething issues that AMD needs to really iron out properly, like the power draw still being all over the place, and drivers being broken such that anti-aliasing just point blank doesn't work (RDNA 2/3) (I know it's driver related because I can't see this console-wise).

I'm personally wanting to see what Intel offers with Battlemage.
 

g-unit1111

Titan
Moderator
This matchup isn't with a 4070S, which I still wouldn't buy over a 7900 GRE, due to the pitiful 12GB VRAM.

Yeah, the VRAM issue is the main reason why I've been considering the 7900 GRE. I've been through like a thousand searches on which GPU is better but haven't been able to find a clear answer yet. If the 7900 GRE is better, I'll get that. If the 4070 Super is better, I'll get that. I do know one thing: if I get the 7900 GRE, I'm going with the Sapphire Nitro+ for sure.
 

TheHerald

Proper
Personally, and it could just be me, I don't think the 4070 or 4070 Super will be an issue at the moment at 1440p ... I'm personally wanting to see what Intel offers with Battlemage.
I'm not saying there is an issue at the moment, but longevity certainly is. Even if you ignore the VRAM, its raster performance is lacking (compared to the 7900 GRE), and although it's leading in RT, it won't have long legs in heavy RT games.
 

TheHerald

Proper
Yeah, the VRAM issue is the main reason why I've been considering the 7900 GRE ... I'm going with the Sapphire Nitro+ for sure.
I'd say stretch your budget and grab a 4070 Ti Super. Yes, it's an extra $200, but it's really worth it compared to the rest of the lineup. Of course, that is assuming you care about RT; if you don't, then the GRE is fine.
 

logainofhades

Titan
Moderator
Yeah, the VRAM issue is the main reason why I've been considering the 7900 GRE ... I'm going with the Sapphire Nitro+ for sure.

Then you may want to watch this.

View: https://www.youtube.com/watch?v=tFKhlplCNpM
 

TheHerald

Proper
If money were no object I'd get the 4070 Ti Super without even thinking about it. But unfortunately it is.
Okay, so it's between the 4070 and the 7900GRE?

The 4070 will be a lot faster in heavy RT games (I mean a LOT). I'd say that the 7900 GRE can't really play RT-heavy games, and the 4070 will need a lot of compromises to do so. So at this price point I wouldn't care so much about RT myself. DLSS is super nice, but you basically sacrifice VRAM to do so, and assuming you don't plan on upgrading soon, the 7900 GRE is the better choice.

This is coming from what a lot of people call an "Intel / Nvidia fanboy", so..
 
It still feels gross to hear $500-$600 GPUs called "mainstream/midrange".
I remember when that was the price of the fastest high-end card, and the equivalent of a 4070 was considered upper midrange/performance and cost maybe $350-400.
That was around the GeForce 5/6/7/8/9/GTX 200/400/500/600/700/900 era.
The FX 5950 Ultra launched at $500
The 6800 Ultra launched at $500
The GTX 980 launched at $550
The 980 Ti launched at $650 (as did the 1080 Ti)
The 1080 launched at $700 (as did the 2080, but that wasn't the top card anymore)
The 2080 Ti launched at $1200
The 4080 launched at $1200
The 4090 at $1600

From the GTX 10 series to the RTX 40 series, prices increased across 4 generations and 5 years by 128%.
The 11 generations and 10 years before that increased prices by 40%.
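(Sanity-checking those two percentages against the launch prices listed above:)

```python
# Checking the two generational price-increase figures against the launch
# prices listed above.
def pct_increase(old, new):
    return (new - old) / old * 100

print(f"{pct_increase(700, 1600):.1f}%")  # GTX 1080 -> RTX 4090: ~128.6%
print(f"{pct_increase(500, 700):.1f}%")   # FX 5950 Ultra -> GTX 1080: 40.0%
```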

The issue I have with AMD cards is that anti-aliasing is broken on RDNA 2. I don't know if they fixed it on RDNA 3, but I've only ever experienced broken anti-aliasing on AMD cards, which ruins games. It's particularly broken in Tales of Arise.
As an Nvidia user, I can say anti-aliasing is broken on GeForce cards as well.

Or rather, anti-aliasing is broken these days in general.
That's simply the result of the AA methods modern games use. They're either blurry, or they flicker, or both.
 
4070 is the clear winner overall and tied on performance IF DLSS is enabled in every game
4070 is the clear winner overall and tied on performance
Actually, you're incorrectly stating things.

I said the 4070 is the clear winner overall (IMO) and wins on performance if you enable DLSS (it may still be behind if you also enable FSR2/3, but FSR2/3 looks worse). The 4070 still wins overall but loses slightly on performance if you don't use upscaling — and of course loses by a larger amount in rasterization games. So, the overall tie was declared because it's situational: win with DLSS by a modest amount, lose without DLSS by a similar amount — and also because all indications are that AI features and performance will become more relevant over time, not less.

People like to point at the 12GB as being a handicap compared to the GRE's 16GB. And it is, though the impact is often overstated. Few if any games, on Nvidia GPUs, have issues with 12GB — without specific mods, I'm not even sure I could name one. On AMD GPUs, 12GB can sometimes run into VRAM issues in my experience at 4K max settings (particularly with heavy ray tracing). Basically, due to architectural and driver differences, it often feels to me like Nvidia 8GB ~= AMD 10GB, Nvidia 10GB ~= AMD 12GB, and Nvidia 12GB ~= AMD 14GB (which basically means 16GB in this case).

But just as lack of VRAM could limit the future usefulness of a card, the lack of higher AI performance could similarly limit cards. It's almost inevitable at this point that we're going to get games where there will be more complex AI processes that will run on Nvidia's tensor cores. Those same processes will also likely be able to run on AMD's AI accelerator or even via GPU shaders... but they will run much slower. Note that it's also unclear how things will scale when doing concurrent graphics + AI workloads, due to shared execution resources. (As an example, RTX 20-series doesn't scale as well as Ampere/Ada when trying to do concurrent shaders+RT+tensor, but I haven't seen anything to really investigate what happens on RDNA 3 when you try to do shaders+RT+AI.)

When will we get games that truly leverage AI hardware? That's less clear. Some might come out in the next year or two, but those will likely be trailblazers or even just tech demos of a sort. There are certainly game developers looking at Nvidia's ACE stuff, and I'm not sure how that will run on non-Nvidia GPUs. Anyway, it might be five or more years before we have AI-driven conversations and quests and game world interactions with the player(s), running via machine learning algorithms. But I'd wager heavily that we will see such games at some point, with the first coming before five years have passed.

How will such games run on the 7900 GRE versus the 4070? I suspect it will be even worse than the current RT performance... but just as the 12GB vs 16GB really ends up being about adjusting your settings and expectations (i.e. you don't need to use ultra-max settings!), I'm skeptical anyone will need to play the first AI-powered games. I just know I personally will want to try them, if only to see what they're like.
 
Actually, you're incorrectly stating things. I said the 4070 is the clear winner overall (IMO) and wins on performance if you enable DLSS ... I just know I personally will want to try them, if only to see what they're like.
Are you done digging your grave?

Even if we took all your arguments to make your review relevant, the fact of the matter is that Performance and Price carry the majority of the weight.

I would put Performance at 40%, Price at 40%, Memory at 10% and the rest at 10%.
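(That weighting maps to a simple scorecard, as in the sketch below. The per-category scores are purely illustrative placeholders, not measured results:)

```python
# Scorecard using the weights proposed above. The per-category scores are
# purely illustrative placeholders, not measured results.
weights = {"performance": 0.40, "price": 0.40, "memory": 0.10, "other": 0.10}

def overall(scores):
    return sum(weights[k] * scores[k] for k in weights)

cards = {
    "RX 7900 GRE": {"performance": 8, "price": 8, "memory": 9, "other": 6},
    "RTX 4070":    {"performance": 7, "price": 7, "memory": 6, "other": 9},
}
for name, scores in cards.items():
    print(f"{name}: {overall(scores):.1f} / 10")
```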

The fact that you put the 4070 (NON SUPER) above the GRE is literally a farce.

At this tier of GPU, CUDA and Ray Tracing should not even matter. You would definitely not buy a 4070 for some CUDA work, and you will get abysmal Ray Tracing performance at that tier, making this whole analysis a joke.

People are not dumb; we have seen the comparisons all over the internet, and we know that the 4070 pales in comparison to the 4070 SUPER and the GRE.