Feature: GPU Benchmarks Hierarchy and Best Graphics Cards

Ugh. "This video is not one of those BS clickbait videos!" Proceeds to deliver a BS clickbait video titled "I just found out the RX 6700 Performs the SAME as the RTX 4070..." Um, no. Not even close. Maybe in one particular game that is known to be a poor port, but that's not at all representative, so a title focusing on that aspect is by definition clickbait.

For the record, in a larger test suite, I found the RX 6700 10GB is about as fast as an RTX 2080 Super, and slower than RTX 3060 Ti. That's in rasterization performance. In ray tracing, it's below the RTX 2060 Super. The RX 6950 XT is about 60% faster, and the RTX 4070 is 43% faster (sticking with rasterization performance). Anything that falls well outside that mark is going to be due to whack coding or some other factor.
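As a quick aside on how a figure like "60% faster" gets computed: aggregate each card's average fps across the test suite (a geometric mean is one common choice, so no single game dominates) and compare the aggregates. A minimal sketch, with made-up fps numbers purely for illustration:

```python
from math import prod

def geomean(values):
    """Geometric mean, so one outlier game can't dominate the aggregate."""
    return prod(values) ** (1 / len(values))

# Hypothetical average fps per game for two GPUs (not real test data)
rx_6700  = [62, 71, 55, 80, 49]
rtx_4070 = [90, 98, 81, 115, 70]

base, other = geomean(rx_6700), geomean(rtx_4070)
print(f"RTX 4070 is {100 * (other / base - 1):.0f}% faster overall")
```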

It looks like The Last of Us is hitting CPU limits. Also, the "OC + SAM" testing should really just be dropped and replaced with stock + SAM/ReBAR. It's making the charts messy and muddying the waters. Overclocking is variable and it's not usually representative of the end-user experience IMO. Anyway, there are lots of questions raised, and that video doesn't provide much in the way of answers.

What happens at 1440p or 4K? Because that would tell you if there's a CPU limit. There's very little reasonable explanation for why the RX 6700 would otherwise match an RX 6950 XT.

[Disclaimer: I couldn't handle listening to him after about 15 seconds, so I stopped and just skipped forward and looked at the charts. What I saw in the first bit was enough to make me question the rest.]
 
@VoodooSJ Okay, so proper benchmarking is apparently hard for some places to grasp. I still don't know how he tested for that video, but I just ran a smattering of GPUs through a benchmark sequence. I'm seeing nothing even remotely similar to his claims, which is hardly surprising but still...

[Charts: The Last of Us Part I, all GPUs, at 1080p Medium, 1080p Ultra, 1440p Ultra, and 4K Ultra]

Now, there are some interesting takeaways, like the fact that the RX 7600 basically matches an RTX 4060 Ti. But it's an AMD-promoted game that's known to hit VRAM hard, and I'm going to chalk that up to Nvidia's drivers probably not being as tuned for this particular title on 8GB GPUs like the 4060 Ti. The RX 6700 XT also came relatively close to the RTX 4070 at 4K Ultra... but neither is really delivering a decent experience there, and I'm pretty sure my RTX 4070 result at 4K was "bad." Look at the RTX 4070 Ti and it does much, much better at 4K.

This was quick and dirty benchmarking, so I just ran each test twice and wasn't paying close attention. If I had exited and restarted the game, the RTX 4070 would probably have improved a lot. (Side note: the first time through, the RX 6600 XT was only getting ~5 fps at 4K because I hadn't restarted the game before that test.)

You definitely need/want at least 12GB VRAM if you're going to play on Ultra settings. High would be okay on the 8GB cards, though.
 
Thanks Jarred. So, if I understood correctly, the conclusion would be that, in this particular game, besides the poor coding and the AMD preference, it favors cards with more than 8GB of VRAM? And in this guy's tests, the 6700 wasn't performing at 6800 levels; it was the other way around, with the 6800 performing at 6700 levels due to some sort of bottleneck or poor testing methodology?
And just to be sure, are your tests conducted with SAM/ReBAR enabled? If so, would you run a test on the 6700 without it to see how much difference it makes in this game?

On a totally different matter, since you are one of the few who test MSFS: there is an issue with Navi 22 cards in DX12 mode that has been ignored by Asobo and AMD for at least a year now. Every time there are clouds or it is foggy, you can see vertical bars inside the cockpit. It is widely reported in the community, but they don't even acknowledge that the issue exists. Using DX11 "fixes" it, but DX12 feels smoother and the game even loads faster. Have you tried it in DX12?
Is there something you could do about it? Talking to an AMD representative, perhaps? Even a confirmation that they know about the issue would be useful. Thanks again Jarred, I really appreciate you taking the time to test and answer; being reachable this way is also highly appreciated.
 
On the MSFS stuff, I test AMD in DX12 mode (and if I test with FrameGen, I have to use DX12 on Nvidia as well). DX11 is generally faster for Nvidia and Intel GPUs, while DX12 is faster for AMD. Or at least that was what I saw the last time I checked. But my test sequence is from outside the plane, and I never play the game other than for benchmarking purposes — I am not at all a big simulator fan. If there's a support page or forum and people have created a thread, I don't know that there's much else that can be done. It's clearly not a high priority for Asobo right now.

On the TLOU testing, yes, ReBAR is enabled. It's an Intel PC, so no SAM technically — that's only for AMD on AMD. ReBAR does sort of the same thing but apparently isn't as optimized, according to AMD. Are you wanting a test with the RX 6700 10GB (in addition to the 6700 XT)? I can run that, sure.

TLOU goes a bit above 8GB of VRAM use at 1080p ultra, then around 9.5GB for 1440p ultra, and close to 12GB for 4K ultra. Obviously, it needs a beast of a GPU to handle 4K ultra in the first place — the 4070 Ti only gets 43 fps and the RX 6800 XT gets 36 fps. Alternatively, you could turn on FSR2 upscaling, but that would reduce image quality a bit.

So for 4K native at 60 fps or more, I would think the RTX 4090 is probably the only GPU that can get there right now, depending on what area you're using for testing. I specifically chose a more demanding area (there's a pond and vegetation, right before you take Ellie into the destroyed city), and if you're running around in a sewage pipe or inside a building as an example, the fps would be much higher.
 
Yes please! I do own a 6700 non-XT (with an Intel CPU), but I don't play the game. The reason behind my interest is that I have two nephews who own a PS5 and play it, and they are coming to visit and stay for a few days, so they will probably demand the game on my PC :) I started searching for info about it and found the rumor that it performed really well on the 6700, so it was interesting to find out what was actually going on. I'll be fine anyway because my monitor is only 1080p. Thanks again! Cheers!
 
TL;DR: The RX 6700 10GB basically matches the RX 7600, pulling a bit ahead at higher resolutions.

[Updated charts: The Last of Us Part I at 1080p Medium, 1080p Ultra, 1440p Ultra, and 4K Ultra, now including the RX 6700 10GB]
The game supports FSR2, and some GPUs will default to having that enabled. I wouldn't be shocked if that's what happened with some of the other benchmarks you've seen. Turn on FSR2 Quality upscaling and the RX 6700 will indeed perform like an RX 6800 XT running at native!

Something like High settings at 1440p with FSR2 Quality should get you above 60 fps.
 
Thanks! I had hoped the 6700 would be closer to the XT. 🙁 I also noticed that the 7600 has a narrower gap between average and 1% lows; is that indicative of something like smoother gameplay or better frame times?
 
Sort of, at least in theory. The run-to-run variation in 1% lows tends to be a lot higher than the variation in average fps; basically, a single "bad" frame can skew those results quite a bit more. Realistically, you won't notice the difference between 63 fps and 66 fps on the 1% lows, but if it were 15% lower, that could indicate more frequent stuttering.
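Here's a minimal sketch of why a single bad frame moves the 1% lows so much more than the average. The frame times are synthetic, and "1% low" is computed one common way (the average fps of the slowest 1% of frames); tools differ slightly in the exact definition.

```python
def summarize(frame_times_ms):
    """Return (average fps, 1% low fps) for a list of frame times in milliseconds."""
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)
    slowest = sorted(frame_times_ms, reverse=True)[: max(1, len(frame_times_ms) // 100)]
    low_1pct_fps = len(slowest) / (sum(slowest) / 1000)
    return avg_fps, low_1pct_fps

# 1,000 frames at a steady ~15 ms (about 66 fps)...
smooth = [15.0] * 1000
# ...and the same run with ten 60 ms hitches mixed in
hitchy = [15.0] * 990 + [60.0] * 10

for label, run in (("smooth", smooth), ("hitchy", hitchy)):
    avg, low = summarize(run)
    print(f"{label}: avg {avg:.1f} fps, 1% low {low:.1f} fps")
```

Run it and the hitchy pass loses barely 2 fps on the average, but its 1% low drops from the mid-60s into the teens.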

The RX 7600 is interesting here, as it has substantially more theoretical compute — 21.8 teraflops FP32 peak, compared to just 11.3 teraflops on the RX 6700 10GB. But so far, most games don't seem to benefit much from the changes. Chips and Cheese did an architecture investigation that suggests more effort would need to be applied at the compiling and optimization stage to extract better performance from the RDNA 3 GPUs. Given The Last of Us Part 1 has a lengthy shader compilation process that runs each time you swap GPUs, it's possible that's why the 7600 does generally better with TLOU than in some other games.
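For reference, those teraflop figures come straight from the paper specs: CUs x 64 shaders x 2 ops per clock (an FMA) x boost clock, with RDNA 3 able to dual-issue FP32 for another 2x. A quick sketch using the publicly listed CU counts and boost clocks (treat the exact clocks as approximate):

```python
def peak_fp32_tflops(cus, boost_ghz, dual_issue=False):
    # 64 shaders per CU, 2 ops per clock for an FMA, 2x again for RDNA 3 dual-issue FP32
    ops_per_clock = 64 * 2 * (2 if dual_issue else 1)
    return cus * ops_per_clock * boost_ghz / 1000

print(f"RX 7600 (32 CUs @ ~2.655 GHz):     {peak_fp32_tflops(32, 2.655, dual_issue=True):.2f} TFLOPS")
print(f"RX 6700 10GB (36 CUs @ ~2.45 GHz): {peak_fp32_tflops(36, 2.45):.2f} TFLOPS")
```

That lines up with the Chips and Cheese point above: the extra dual-issue throughput only shows up when the compiler can actually pair FP32 instructions.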
 
Hi Jarred, I really love the summary graphics for the GPU hierarchy. This is the first time I've seen them, versus the written tiers I remember from before. These graphics help me refine my own purchase plans in terms of what resolution I want to target.
 
Thank you for the power usage section of this feature!

It would be useful if there were some sort of way to gauge a card's relative performance at the same time as power usage.

Perhaps some sort of list for each card, collating similarly performing cards and their power usage, so we can more easily pick out how they compare with each other?
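Something like this rough sketch is what I have in mind: group cards whose overall performance lands within roughly 10% of each other, then list their power draw side by side. The performance index and wattage numbers here are placeholders to show the shape of the comparison, not measured data.

```python
# Placeholder data: name -> (relative performance index, typical gaming power draw in watts)
cards = {
    "Card A": (100, 130),
    "Card B": (97, 160),
    "Card C": (95, 180),
    "Card D": (88, 170),
}

target = "Card A"
perf = cards[target][0]

# Collect every card within ~10% of the target's performance, then sort by power draw
peers = {name: vals for name, vals in cards.items() if abs(vals[0] - perf) / perf <= 0.10}
for name, (p, watts) in sorted(peers.items(), key=lambda kv: kv[1][1]):
    print(f"{name}: perf index {p}, ~{watts} W")
```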
 
You can click the specs column to go to the appropriate review, and the power pages there usually have a lot more information. But that mostly applies to GPUs tested in the past year.
 
Thanks for the info. I checked out the review pages, and they do indeed have a lot more information.

Maybe a better way to phrase my post is just to explain my situation (this is to illustrate my request, not a direct question).

I want to buy a new card somewhere around a 3060/4060. I would like it to be an AMD card because I want the best Linux compatibility, and I want to avoid power-hungry cards.
Well, really, I want to see whether that's even a realistic option right now, but the point is the same.

All the data is there, but as far as I can tell, it's not all in one place in a way that makes that decision easy.

Of course, I'm very appreciative that it's there at all; power usage didn't get much attention for a while.
 
I have just noticed that the individual game charts have been updated to the newer test suite, including different settings for RDR2. For the medium settings, "adv. medium/off" is easy enough to understand, but what do you use for anisotropic filtering and for the geometry and grass level-of-detail sliders? And for Far Cry 6 Ultra, is it still tested with the HD texture pack?

BTW, will the Vega and GTX 10/16-series GPUs eventually get updated to the new test suite, or are they going to be phased out of the charts entirely?
 
I've dropped a lot of older GPUs from the charts (i.e., not the tables, just the graphs), and I'm still debating when/if to fully switch to the new testbed. Basically, I was only going to test RT-capable GPUs on the new suite, but I know a lot of people like to see the older stuff for comparison. I just don't know how far back I want to go on testing, and if I update the test suite again in the coming months... well, the new suite would probably punish older GPUs even more than the current one does.

Probably the reasonable cutoff is that anything that lacks currently updated drivers gets the axe. RX Vega / Polaris are now on "life support" from AMD, which means I can cut them if I do retesting (the existing 2021/2022 table would remain on the second page, probably). For Nvidia, everything GTX 900-series and newer is still technically supported, but I only have 970/980/980 Ti cards from that generation. The GTX 1050 is a big problem, since it only has 2GB of VRAM; even 4GB cards can potentially have issues now. If I do test older generation hardware, a lot of cards will be limited to 1080p, and maybe even just 1080p medium in some cases.

As for test settings used, everything uses the presets: "medium/high" or "ultra/very high/extreme/whatever" depending on the game. The only game that doesn't include an actual preset is RDR2, where everything needs to be manually set. For that, I drop all settings to minimum first, then go through the advanced options and set everything at medium, and then set the standard options to medium. Ultra is similar, except "Ultra/max" everywhere.

Except, what I really do is write the settings to the config file before launching. Here are the settings files I use for medium and ultra in RDR2.
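If you want to script the same idea, a minimal sketch is to keep one prepared settings file per preset and copy it over the game's live config before launching. The path below is an assumption (RDR2 normally keeps its settings in Documents\Rockstar Games\Red Dead Redemption 2\Settings\system.xml), and the preset file names are hypothetical; this isn't necessarily exactly how my setup does it.

```python
import shutil
from pathlib import Path

# Assumed location of RDR2's live settings file on Windows; adjust if your
# Documents folder is redirected (OneDrive, etc.).
SETTINGS_DIR = Path.home() / "Documents" / "Rockstar Games" / "Red Dead Redemption 2" / "Settings"

# Hypothetical local copies of prepared settings files (like the attachments below)
PRESETS = {
    "medium": Path("rdr2-presets/system-medium.xml"),
    "ultra": Path("rdr2-presets/system-ultra.xml"),
}

def apply_preset(name: str) -> None:
    """Copy the prepared settings file over the game's live config before launching."""
    shutil.copyfile(PRESETS[name], SETTINGS_DIR / "system.xml")
    print(f"Applied {name} preset to {SETTINGS_DIR / 'system.xml'}")

apply_preset("ultra")
```

The attached XML files would slot in as the two preset copies.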
 

Attachments: RDR2 settings files for medium and ultra

Could you please update the Intel Arc charts with the newest drivers, or at least list below the GPU chart which driver was used for each card?
 
I have not kept meticulous records of which drivers were used for testing, as there have been a lot of changes. I do know I retested most of the Arc cards this past summer/fall, and most of the changes were negligible. The big changes with Arc have been on some older DX11 games where the old driver path was very sub-optimal, but the only game I still test in DX11 mode is Total War: Warhammer 3. (I switched to DX12 for Flight Simulator on Arc GPUs at some point as well, when it began performing better than the DX11 mode.)

As another point of reference, the Arc A580 only launched in October and was thus tested with the October driver codebase. I believe I retested the A750 as well, and possibly the A770 cards (I don't recall exactly). The review only shows the performance on the newer 13900K test PC, but the tables in the hierarchy are from the 12900K results — so yes, each card in the hierarchy gets tested on at least two different PCs. The A580 is about 9% slower than the A750, as expected, which means both it and the A750 numbers are valid.

If there's some particular game where you think the Arc results are outdated or too slow, let me know. Keep in mind that the 1% lows are more variable than the average fps, so there are a few instances where, for example, the A770 8GB has a bigger dip than the A750.
 
I assume you're referring to the 7600 XT?
Yes :) And I was referring to the fact that it's missing from this graph (you're right that it's already present in the table):
[attached chart image]

P.S. Wouldn't it make more sense to start with the 1440p or even the 4K graph? Those are probably what people are really after, and they show the differences between the GPUs more clearly. I don't know if you gather that info on this website, but I'd wager that most people who frequent it have at least a 1440p monitor (though of course they might still game at 1080p).
 
Ah. I'll have to add it there. I thought I had... maybe I just didn't upload the new images.

As for order, I put 1080p ultra first because I test all cards at 1080p ultra/medium. A few cards are dropped at 1440p, and a lot of cards are dropped at 4K. So I show the most cards possible on the initial slide.
 
While we're at it... AMD just dropped 7700 XT pricing to $419, which makes it really close to the 6800. In the US it's now about $400 vs. $420 (6800 vs. 7700 XT). RDNA 2 vs. RDNA 3. 16GB vs. 12GB. So many questions, so few answers.
I live in Europe, so the situation is a bit different at the moment:
6800 is 389 EUR
7700 XT is 433 EUR
I was wondering if I should get the 6800 while it's a good 10% cheaper, more or less on par in terms of performance, and still in stock, or whether it's better to go for the 7700 XT?
 
It's a bit of a toss-up. On the one hand, the 6800 has more memory and bandwidth. But in the games I've tested, it's still usually slightly slower than the 7700 XT. The new features of RDNA 3 are fine to have, though I suspect a lot of people won't really use them. Basically, if you do video streaming, AV1 encoding is a nice extra.

If you're not in a rush, you can always just wait to see how things change in the coming weeks. Prices will always change a bit, but I don't know that things will get much cheaper from where they are now any time soon.
 