RTX 4070 vs RX 7900 GRE faceoff: Which mainstream graphics card is better?


LabRat 891

Honorable
Apr 18, 2019
78
59
10,610
The 4070 was the one they kept in the stack and just lowered the price on. It was the 4070 Ti and 4080 which were discontinued.


I think they meant 'dead' as a 'value', though.
Different but the same: buying a 6950 XT when the 7900 GRE exists.

Also, the 4070's MSRP changed to $549, which really does make this review/marketing comparison 'relevant'.
Though I do not agree with THG's 'conclusions' in any way, shape, or form.
 
Last edited:

LabRat 891

Honorable
Apr 18, 2019
78
59
10,610
I'm curious: did you not overclock the VRAM on the 7900 GRE?
(Arguably, AMD should have issued new firmware to overclock the VRAM without having the user do it themselves.)
IIRC, they've done that once before. No, it wasn't well received; this scenario is a little different.
 

aurizz

Distinguished
Feb 22, 2007
3
3
18,515
Exactly what I was thinking.

The 7900 GRE is a full product tier above the 4070 (no Super, no Ti) in rasterization, and DXR ray tracing performance is extremely competitive.
The 7900 GRE is only slightly disadvantaged in Nvidia-optimized titles, and head and shoulders above the 4070 in most others.

On top of the raw performance, the 7900 GRE does not use the notorious 16-pin (12VHPWR) power input, and it costs less than its nearest competitor, the 4070 Super.

The 'top tier' AIB partner 7900 GREs are $559-600, and even the $549 MSRP models perform like a 3090 (Ti) in rasterization out of the box.
[attached: relative performance chart, 1920x1080]


And 3070 Ti / 3080-like RT performance:
[attached: relative RT performance chart, 1920x1080]


IMHO, no "gamer" *actually* cares about 'efficiency'. It's become nonsensical marketing 'wank'; a false virtue.

If people do (or did) *actually* care, they'd be buying the 'fattest' silicon and undervolting plus underclocking.
(Which is what most Folding@home users do with their cards.)

I have hands-on experience with both Vega 10 and Navi 31; underclocked, they 'sip' power.
For example: underclocking the 7900 GRE (no undervolt) can make it a ~100 W or less GPU under load while still outputting 45-60+ FPS at 1080p (UE4's MechWarrior 5 was the one test I did).
It is very sweet when someone posts the exact post that you wanted to post.
The 7900 GRE is one tier higher than the 4070, but victory goes to Nvidia because of "software and features". WTF is wrong with TH?!
 
The CPU reviews here at Tom's have stock and OC results. They also clearly list the test bench and the settings used. I cannot identify which models of the 4070 and the 7900 GRE were used in this test.

The numbers don't lie, but metrics without the complete picture are not the whole truth either.

"The gaming performance gap is less than 5% even at 4K."

Neither of these cards is aimed at 4K gaming; they are 2K-resolution cards. The above statement sends the wrong message to readers. Experienced reviewers should not be doing this!
 
Last edited:

TheHerald

Upstanding
Feb 15, 2024
283
66
260
Exactly what I was thinking.

The 7900 GRE is a full product tier above the 4070 (no Super, no Ti) in rasterization, and DXR ray tracing performance is extremely competitive.
The 7900 GRE is only slightly disadvantaged in Nvidia-optimized titles, and head and shoulders above the 4070 in most others.

On top of the raw performance, the 7900 GRE does not use the notorious 16-pin (12VHPWR) power input, and it costs less than its nearest competitor, the 4070 Super.
RT performance isn't competitive at all, though. It's only games with basically nonexistent RT that bring the average close; in heavy RT games the 4070 is much faster. Also, the 7900 not using the 16-pin power connector is a drawback, not a bonus.

If you care about raster, the GRE is the better choice; if you care about RT, the 4070 all the way.
 

Thunder64

Distinguished
Mar 8, 2016
121
171
18,760
The CPU reviews here at Tom's have stock and OC results. They also clearly list the test bench and the settings used. I cannot identify which models of the 4070 and the 7900 GRE were used in this test.

The numbers don't lie, but metrics without the complete picture are not the whole truth either.

"The gaming performance gap is less than 5% even at 4K."

Neither of these cards is aimed at 4K gaming; they are 2K-resolution cards. The above statement sends the wrong message to readers. Experienced reviewers should not be doing this!

Yup, just more grift from Tom's. I'd like to get an answer as to why this article was so shoddy.
 

HWOC

Reputable
Jan 9, 2020
144
23
4,615
I thought the article was interesting and fair. Yes, AMD wins in rasterization by a decent margin, but not all gamers are the same. Personally, I value ray tracing highly and appreciate a good power/performance ratio. Also, for me, Nvidia's software features are a must these days; I use my gaming PC for work as well, and the Broadcast package is invaluable. The background removal is miles ahead of MS Teams or Zoom; I couldn't live without it anymore.
 

mhmarefat

Distinguished
Jun 9, 2013
43
44
18,560
Welcome to 2024: GPUs with slower performance have more "features" and thus get recommended!!
Wow! BTW, what are those "features"? Generating fake frames.
 
  • Like
Reactions: P1nky

Colif

Win 11 Master
Moderator
If you play the games that have the features Nvidia offers, you buy those cards. If you don't play those games, the choice is more open. It's a shame AAA games take so long to come out now, so Nvidia tech demos are few and far between.

RT may be the future, but we're still not there yet.
 

oofdragon

Honorable
Oct 14, 2017
264
260
11,060
Guys, it's TH; they've been doing this for ages now. Yes, if you have a brain, the 7900 is much better, and yes, anyone who has actually gamed in the last two years doesn't give a damn about ray tracing, since it's a useless, heavy-FPS-penalty gimmick. RT does not deliver what it promises; even in "full path tracing" you can just look at raster, and in many instances it looks better. RT is a selling gimmick that sites like this will keep talking about because their goal is to sell you the product, and that's it.
 

s997863

Distinguished
Aug 29, 2007
144
10
18,685
Exactly. I find it odd that Nvidia's pricing encourages people to move to AMD via a console. Buy a whole GPU, or spend less and buy a whole console? I'm not sure that when my 1080 Ti eventually dies I'll bother with PC gaming in the future; something cheap for my older games maybe, but nothing upper-end again.
I can't even fathom the interest and market demand even if the cards were cheaper and truly "mainstream". None of the benchmarked games are even good or reviewed well, except Minecraft (a meme at this point). I don't understand how the PC gaming market is still alive with such poor service, instead of people turning to the huge 30+ year catalog of older games they can try (I just played HOMM2 for the first time). And by poor service I don't just mean the games' quality.

Modern hardware and software have sucked all the fun and magic out of computing. I tried installing Win10 a few weeks ago to get a feel for it, and the latest Intel graphics drivers were 700+ MB yet excluded the graphics control panel, which MS allegedly wants downloaded from the Windows Store separately, while the oldest Win10-compatible drivers were ~200 MB and included a graphics panel that didn't fully work anyway. I had to download a very old version of RivaTuner just so I could have the convenience of hotkeys to quickly switch between different brightness/gamma/contrast profiles for reading, gaming, or watching videos in a bright or dim room.
 
  • Like
Reactions: LabRat 891
Nvidia is the clear winner...
These chiplet designs will need more data on chip degradation, interconnects, solder, PCB, and other things. The only downside of Nvidia right now is the melting connector.
That is only on higher models; I can confirm the 4070 and 4070 Super have never caught fire. Still bonkers how they can sell a fire hazard.

Chiplet design is in its first generation on the GPU side, so I wouldn't expect anything great until around RDNA 4-5, as Ryzen had similar issues.
 
Guys, it's TH; they've been doing this for ages now. Yes, if you have a brain, the 7900 is much better, and yes, anyone who has actually gamed in the last two years doesn't give a damn about ray tracing, since it's a useless, heavy-FPS-penalty gimmick. RT does not deliver what it promises; even in "full path tracing" you can just look at raster, and in many instances it looks better. RT is a selling gimmick that sites like this will keep talking about because their goal is to sell you the product, and that's it.
The issue I have with AMD cards is that anti-aliasing is broken on RDNA 2. I don't know if they fixed it on RDNA 3, but I've only ever experienced broken anti-aliasing on AMD cards, which ruins games; it's particularly broken in Tales of Arise.
 
It is nice of them to beta test the power connectors; by the time AMD uses them, they might actually work.

I have a 7900 XT and haven't seen any AA problems myself, but I don't play every game.
Yeah, not sure what the issue is myself. I'm still trying to diagnose why it happens in some games and not others. I can't replicate it on Intel or Nvidia, so it just seems like an AMD driver issue. I do have my theories on it, but that's all they are: theories.

 
Last edited:
Was this necessary? Even if it wasn't the best way to do it, the person was providing feedback about how the chart visuals look on other screens, and even suggested an alternative if you, as the professional, couldn't find an appropriate color palette.

Maybe try a darker background color with lighter bars instead?
This was intended as a serious statement. I know from experience that 95% of people simply don't care too much about what colors are in the charts. So here's a person complaining and saying they should be "better" in a very nebulous fashion. And I'm saying: Knock yourself out! Show me something with colors that people here agree are better and I will happily use it.

I did try multiple color options, and red/blue and red/green looked worse, while blue/dark gray was less offensive in my opinion, and also in the opinion of a few other editors here at Tom's Hardware. I honestly don't see a problem with the current charts, but I'm also looking at the charts on a 28-inch monitor on a desktop PC. Meanwhile, someone provided limited feedback basically saying, "the charts suck" and I'm saying, "if you care so much, take some time to do your own work and show me something clearly better."

Here are some examples — not intended to be "final" but to show how red/blue and red/green look. I personally think the red/blue is maybe okay (not great but okay — it was my initial coloring and the other editors didn't like it), while the green/red looks like terrible Christmas puke.

[attached: red/blue and red/green chart mockups]

I'm actually okay with this last one as well, in grey/red, though maybe with minor tweaks, because the Tom's Hardware color is red in our logo. But again, is this so much better that everyone prefers it, or is it just different and some people will still hate it? Because that's sort of the expected result: no matter what you do, someone is going to hate it and complain.

[attached: grey/red chart mockup]
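For anyone who wants to mock up alternative palettes themselves, a minimal matplotlib sketch along these lines makes it easy to compare color pairs side by side; the FPS values and hex colors are placeholders, not actual review data or the real chart tooling.

```python
# Minimal mock-up for comparing bar-chart color palettes.
# The FPS values and hex colors are placeholders, not actual review data.
import matplotlib.pyplot as plt
import numpy as np

games = ["Game A", "Game B", "Game C", "Game D"]
fps_7900_gre = [112, 98, 87, 70]   # made-up numbers
fps_rtx_4070 = [104, 95, 91, 66]   # made-up numbers

palettes = {
    "blue / dark gray": ("#2b6cb0", "#4a4a4a"),
    "red / blue":       ("#c0392b", "#2b6cb0"),
    "gray / red":       ("#9e9e9e", "#c0392b"),
}

x = np.arange(len(games))
fig, axes = plt.subplots(1, len(palettes), figsize=(12, 3), sharey=True)
for ax, (name, (c_amd, c_nv)) in zip(axes, palettes.items()):
    ax.bar(x - 0.2, fps_7900_gre, width=0.4, color=c_amd, label="RX 7900 GRE")
    ax.bar(x + 0.2, fps_rtx_4070, width=0.4, color=c_nv, label="RTX 4070")
    ax.set_title(name)
    ax.set_xticks(x)
    ax.set_xticklabels(games, rotation=30, ha="right")
axes[0].set_ylabel("FPS")
axes[0].legend()
plt.tight_layout()
plt.show()
```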
 
The CPU reviews here in Tom's have stock and OC results. They also have the test bench and the settings used listed in the review clearly. I cannot identify which model of the 4070 and the 7900GRE was used in this test.

The numbers dont lie, but metrics without the complete picture is not the whole truth either.

"The gaming performance gap is less than 5% even at 4K."

Both of these cards are not aimed at 4k gaming. They are 2k resolution cards. The above statement gives a wrong message to the readers. Experienced reviewers should not be doing this!
To be clear, 4K showed the largest performance gap, in favor of the 7900 GRE. Actually, it's slightly larger on rasterization (17% lead versus 16% lead), slightly lower in RT (13.5% loss versus 12% loss). When you look at all the data in aggregate, the lead of the 7900 GRE is 3.1% at 4K, 3.4% at 1440p, 1.7% at 1080p, and -1.3% at 1080p medium.
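(For anyone curious how an aggregate figure like that can be computed, here is a rough sketch using a geometric mean of per-game FPS ratios; the numbers are made up for illustration and the actual methodology may differ.)

```python
# Rough sketch: turning per-game results into a single "% lead" figure using a
# geometric mean of FPS ratios. The FPS numbers are made up for illustration.
from math import prod

# Hypothetical (RX 7900 GRE FPS, RTX 4070 FPS) pairs at one resolution
results = [(112, 104), (98, 95), (61, 70), (87, 78), (55, 58)]

ratios = [gre / rtx for gre, rtx in results]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"7900 GRE aggregate lead: {(geomean - 1) * 100:+.1f}%")
```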

That's why we called it a tie. Not because there aren't outliers, but taken just at face value, the numbers are very close. Pick different games and we could skew those figures, but that's an entirely different discussion. Yes, AMD wins in rasterization by a larger amount than it loses in RT. However, and this is the part people are overlooking, outside of Avatar and Diablo, we didn't test with upscaling (those two games had Quality upscaling enabled).

The more time passes, the more I fall into the camp of thinking anyone not using DLSS on an RTX GPU is basically giving up a decent performance boost with negligible loss of image quality. There are exceptions, and they mostly come down to games with a buggy/poor implementation — in which case swapping DLSS DLLs and using Nvidia Inspector to set mode E with the 3.7 DLL probably fixes the issues. In general, meaning in games where DLSS and FSR are both implemented well, right now DLSS is clearly superior on image fidelity, it's in more games, and DLSS also factors into the overall performance.

FSR 2/3 at present does not match DLSS in image quality. (Note that I never mention framegen as a major win, because I also feel it's not nearly as useful as Nvidia — and now AMD! — like to pretend.) In fact, XeSS in DP4a mode (with version 1.2 and now 1.3) looks better than FSR 2/3 upscaling... well, at least if you stick with the old upscaling factors. XMX mode looks even better than DP4a mode. So, DLSS wins on image quality, XeSS XMX mode is second and relatively close to DLSS, XeSS DP4a is a more distant third (and has a higher performance hit!), and then FSR 2/3 upscaling is last in terms of image quality. There are things where FSR looks fine, but if you do a systematic look at all areas, there are many documented issues — transparency is a big one.

And I know there are DLSS haters that will try to argue this point, but in all the testing I've seen and done, I would say DLSS in Quality mode is now close enough to native (and sometimes it's actually better, due to poor TAA implementations) that I would personally enable it in every game where it's supported. Do that and the AMD performance advantage evaporates. In fact, do that and the 4070 comes out 10~15% ahead overall.

But wait, didn't we already give the feature category to Nvidia? Isn't it getting counted twice? Perhaps, but it really is a big deal in my book. On the features side, Nvidia has DLSS and the AI tensor cores, which also power things like Broadcast, Video Super Resolution, and likely plenty more things in the future. On the performance side, DLSS Quality beats FSR 2/3 Quality, so comparing the two in terms of strict FPS isn't really "fair." Without upscaling in rasterization games, AMD's 7900 GRE wins. With upscaling in rasterization games, the 7900 GRE still performs better but it looks worse. This is why, ultimately, the performance category was declared a tie — if anything, that's probably being nice to AMD by mostly discounting the DLSS advantage.

Put another way:
Without upscaling, AMD gets a clear win on performance in the rasterization category, loses on RT by an equally clear margin. So: AMD+ (This is intentionally skewing the viewpoint to favor AMD, which is what many seem to be doing.)
Price remains a tie.
Features is massively in favor of Nvidia. So: Nvidia+++
Power and efficiency favors Nvidia a lot as well. So: Nvidia++

Based on that, it's five points to Nvidia and one point to AMD. If someone asks me which GPU to get, 4070 vs 7900 GRE, I will tell them to get the Nvidia card. Every time. I would happily give up 10% performance in native rasterization games to get everything else Nvidia offers in its ecosystem.
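(The same tally, spelled out as a trivial script with weights mirroring the plus signs above:)

```python
# Trivial tally of the category scoring above; weights mirror the plus signs.
scores = {
    "Performance (no upscaling)": {"AMD": 1, "Nvidia": 0},
    "Price":                      {"AMD": 0, "Nvidia": 0},  # tie
    "Features":                   {"AMD": 0, "Nvidia": 3},
    "Power/efficiency":           {"AMD": 0, "Nvidia": 2},
}

totals = {"AMD": 0, "Nvidia": 0}
for points in scores.values():
    for vendor, pts in points.items():
        totals[vendor] += pts

print(totals)  # {'AMD': 1, 'Nvidia': 5}
```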
 
  • Like
Reactions: 35below0 and HWOC

Udyr

Honorable
Mar 3, 2021
254
106
9,690
This was intended as a serious statement. I know from experience that 95% of people simply don't care too much about what colors are in the charts. So here's a person complaining and saying they should be "better" in a very nebulous fashion. And I'm saying: Knock yourself out! Show me something with colors that people here agree are better and I will happily use it.
Your argument is valid and understandable, but your rebuttal to the poster's lack of decency came across as a response to a personal attack.

I appreciate you taking the time to explore different options. The focus is not to "offend" or "not offend" anyone, but to get a better view of the information to the readers, which is the final goal. Unfortunately, as you said, not everyone will be happy and someone will always complain, which is why we shouldn't take that feedback/criticism as an offense.
 
  • Like
Reactions: Order 66
To be clear, 4K showed the largest performance gap, in favor of the 7900 GRE. Actually, it's slightly larger on rasterization (17% lead versus 16% lead), slightly lower in RT (13.5% loss versus 12% loss). When you look at all the data in aggregate, the lead of the 7900 GRE is 3.1% at 4K, 3.4% at 1440p, 1.7% at 1080p, and -1.3% at 1080p medium.

I would say that this is again because of 12GB vs. 16GB of VRAM; 4K requires more VRAM, and the games would be GPU-bound performance-wise. It's the same reason why people still test at 1080p and show the peak performance difference without any other bottlenecks.

GPU rasterization performance-wise, both the 4070 and the 7900 GRE are better suited to 120+ FPS gaming at 2K resolution.

You can see the performance ballpark at 4K resolution here:

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

60 to 70 fps.

Would you recommend a card with 60 FPS rasterization performance for 4K? This is what I meant by misleading the readers.

To put it plainly, you could have worded it better by saying 'the maximum performance difference was seen at 4K'. This would have clarified whether the performance comparison was carried out at all relevant game resolutions.

I have no qualms about DLSS, ray tracing, and the extra features. It's up to the user to decide which features are more important for him/her.

What I don't like is half-truths and misleading opinions.

The other part, as shown by other reviewers, is that the 7900 GRE now has much higher overclocking potential than the 4070. This again is a significant factor to consider.
 
Last edited:

HWOC

Reputable
Jan 9, 2020
144
23
4,615
To be clear, 4K showed the largest performance gap, in favor of the 7900 GRE. Actually, it's slightly larger on rasterization (17% lead versus 16% lead), slightly lower in RT (13.5% loss versus 12% loss). When you look at all the data in aggregate, the lead of the 7900 GRE is 3.1% at 4K, 3.4% at 1440p, 1.7% at 1080p, and -1.3% at 1080p medium.

That's why we called it a tie. Not because there aren't outliers, but taken just at face value, the numbers are very close. Pick different games and we could skew those figures, but that's an entirely different discussion. Yes, AMD wins in rasterization by a larger amount than it loses in RT. However, and this is the part people are overlooking, outside of Avatar and Diablo, we didn't test with upscaling (those two games had Quality upscaling enabled).

The more time passes, the more I fall into the camp of thinking anyone not using DLSS on an RTX GPU is basically giving up a decent performance boost with negligible loss of image quality. There are exceptions, and they mostly come down to games with a buggy/poor implementation — in which case swapping DLSS DLLs and using Nvidia Inspector to set mode E with the 3.7 DLL probably fixes the issues. In general, meaning in games where DLSS and FSR are both implemented well, right now DLSS is clearly superior on image fidelity, it's in more games, and DLSS also factors into the overall performance.

FSR 2/3 at present does not match DLSS in image quality. (Note that I never mention framegen as a major win, because I also feel it's not nearly as useful as Nvidia — and now AMD! — like to pretend.) In fact, XeSS in DP4a mode (with version 1.2 and now 1.3) looks better than FSR 2/3 upscaling... well, at least if you stick with the old upscaling factors. XMX mode looks even better than DP4a mode. So, DLSS wins on image quality, XeSS XMX mode is second and relatively close to DLSS, XeSS DP4a is a more distant third (and has a higher performance hit!), and then FSR 2/3 upscaling is last in terms of image quality. There are things where FSR looks fine, but if you do a systematic look at all areas, there are many documented issues — transparency is a big one.

And I know there are DLSS haters that will try to argue this point, but in all the testing I've seen and done, I would say DLSS in Quality mode is now close enough to native (and sometimes it's actually better, due to poor TAA implementations) that I would personally enable it in every game where it's supported. Do that and the AMD performance advantage evaporates. In fact, do that and the 4070 comes out 10~15% ahead overall.

But wait, didn't we already give the feature category to Nvidia? Isn't it getting counted twice? Perhaps, but it really is a big deal in my book. On the features side, Nvidia has DLSS and the AI tensor cores, which also power things like Broadcast, Video Super Resolution, and likely plenty more things in the future. On the performance side, DLSS Quality beats FSR 2/3 Quality, so comparing the two in terms of strict FPS isn't really "fair." Without upscaling in rasterization games, AMD's 7900 GRE wins. With upscaling in rasterization games, the 7900 GRE still performs better but it looks worse. This is why, ultimately, the performance category was declared a tie — if anything, that's probably being nice to AMD by mostly discounting the DLSS advantage.

Put another way:
Without upscaling, AMD gets a clear win on performance in the rasterization category, loses on RT by an equally clear margin. So: AMD+ (This is intentionally skewing the viewpoint to favor AMD, which is what many seem to be doing.)
Price remains a tie.
Features is massively in favor of Nvidia. So: Nvidia+++
Power and efficiency favors Nvidia a lot as well. So: Nvidia++

Based on that, it's five points to Nvidia and one point to AMD. If someone asks me which GPU to get, 4070 vs 7900 GRE, I will tell them to get the Nvidia card. Every time. I would happily give up 10% performance in native rasterization games to get everything else Nvidia offers in its ecosystem.
Very well put, Jarred, and thank you for clarifying the DLSS/FSR points as well. I'm of the same mind that not using DLSS 2 when it's an option just doesn't make sense. Until the AMD feature and software set improves, I can't see any viable alternatives to NVIDIA for the next couple of years myself, even if AMD's pure rasterization performance were 25% higher for the same amount of dollars. I've owned the same number of green and red GPUs over the years, and I'm not a fanboy of either, but as it stands I always recommend NVIDIA cards when someone asks; the big picture just makes so much more sense.

Edit: As for the chart colors, I actually like the red and green. Those colors are too firmly embedded in my brain, like a TV channel logo permanently burned in the corner of an old TV screen.
 

aurizz

Distinguished
Feb 22, 2007
3
3
18,515
Welcome to 2024: GPUs with slower performance have more "features" and thus get recommended!!
Wow! BTW, what are those "features"? Generating fake frames.
Fake frames and fake 1440p made from 1080p with DLSS. The best features are FAKE, and that's not even talking about RT.
 
Would you recommend a card with 60 FPS rasterization performance for 4K? This is what I meant by misleading the readers.
Yes, particularly with the understanding that 4K with DLSS (or FSR2/3) makes it a far better experience. If you have a 4K 144Hz (or higher) monitor, of course, I'd recommend a much more potent GPU than either of these. :)
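(For reference, here is a quick sketch of the internal render resolutions behind those upscaling presets at 4K; the per-axis scale factors below are the commonly published DLSS/FSR 2 values, so treat the exact figures as approximate.)

```python
# Quick sketch: internal render resolution per upscaling preset at 4K output.
# Per-axis scale factors are the commonly published DLSS/FSR 2 presets;
# treat the exact figures as approximate.
modes = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output
for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = w * h / (out_w * out_h) * 100
    print(f"{name:<17} {w}x{h}  (~{share:.0f}% of native pixels)")
```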
The other part, as shown by other reviewers, is that the 7900 GRE now has much higher overclocking potential than the 4070. This again is a significant factor to consider.
While this is technically true, now that the limits have been removed, the reality is that I don't really recommend overclocking these days for about 99% of people. AMD and Intel CPUs are already near their limits (I suppose OC on Ryzen 5 and Core i5 is more worthwhile). AMD and Nvidia GPUs are also very near their limits, and you start to use a lot more power for minor gains in performance.

Of course, the 7900 GRE was intentionally limited at launch with slower GDDR6 clocks, mostly to keep it from being too close to the 7900 XT (that's my take anyway). So the VRAM OC on the GRE is more worthwhile, and doesn't massively increase power. But you generally hit about a 5~10 percent OC with tweaking on most GPUs these days, and an extra few percent that only applies to a small minority of users isn't really a major factor.
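(To put a rough number on the GRE's VRAM headroom: the card ships with 18 Gbps GDDR6 on a 256-bit bus, and the sketch below shows what a hypothetical bump to ~20 Gbps effective would do to bandwidth. The OC target is an example, not a guaranteed result.)

```python
# Back-of-the-envelope GDDR6 bandwidth math for a VRAM overclock.
# The ~20 Gbps OC target is an example, not a guaranteed result.
def gddr6_bandwidth(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8."""
    return effective_gbps_per_pin * bus_width_bits / 8

stock = gddr6_bandwidth(18.0, 256)  # 7900 GRE stock: 18 Gbps, 256-bit bus
oc    = gddr6_bandwidth(20.0, 256)  # hypothetical ~20 Gbps effective after OC

print(f"Stock: {stock:.0f} GB/s, OC: {oc:.0f} GB/s (+{(oc / stock - 1) * 100:.0f}%)")
# Stock: 576 GB/s, OC: 640 GB/s (+11%)
```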
 
  • Like
Reactions: HWOC and Lucky_SLS