GPU Performance Hierarchy 2024: Video Cards Ranked

Again, thanks for the informative reply!

I have only used Nvidia GPUs in desktops. Not that I ever had a reason not to get AMD ... it's just that in almost 20 years I have only had two GPUs of my own, I guess.

Had Windows 95/98/XP with the family ... then got my own newer XP machine in 2005 from my family as a present. That one had a GeForce 8400 GS, which I then upgraded to a GeForce GTX 650 Ti Boost ... and that card is now in the system I tried building a few years ago, when the GPU shortage hit at the same time I was looking for a GPU.

I agree, if there is a useful feature to use, then of course use it. I suppose I would have to try it and see the performance gain or loss for myself before agreeing or disagreeing one way or the other.

This probably isn't the right place to ask, but is there any fast wireless display software or adapter that preserves performance from a PC to a TV or monitor? And do certain graphics cards handle that better than others? I assume a connection via HDMI will always give the best performance, just like a connection via Ethernet beats Wi-Fi. I have been using TVs as my PC display for years now.
WiDi (Wireless Display) was a thing I remember testing over a decade ago on some laptops. That was eventually discontinued and superseded by Miracast. There's now support for up to 4K streaming, but I don't know how it does with latency. Certainly cabled connections will be better on the latency aspect, though for things like video (not gaming) the latency isn't a major issue.
 

HoveringAbove

Prominent
Dec 7, 2022
Is it possible to share the exact settings used for Red Dead Redemption 2 at 1080p medium and ultra? It's annoying that the game doesn't have universal quality presets.
 
Is it possible to share the exact settings used for Red Dead Redemption 2 at 1080p medium and ultra? It's annoying that the game doesn't have universal quality presets.
Sure. I think I wrote this somewhere, but you're right: the lack of presets sucks! (It bases your "preset" preference on how much VRAM you have, not on potential GPU speed.)

  1. Set the preset to the minimum value, then set/confirm everything to minimum; unlock the advanced settings, and also set all of those to the minimum values.
  2. For 1080p medium, now go into the basic settings and put everything up one notch (mostly "medium" but also 2xAF). Do not enable DLSS, MSAA, or FSR.
  3. For the "ultra" settings, go back into the basic settings and turn everything to maximum (high/ultra/16xAF). Do not enable DLSS, MSAA, or FSR.

Those are my current settings, but I'm going to change things for the upcoming reviews. Basically, instead of the above, we'll start by setting everything to "medium/on" including the advanced options for medium, and for ultra everything will be at maximum. I'll still leave DLSS/FSR/MSAA settings off, for apples-to-apples, though I'll show FSR2 or DLSS on initial GPU reviews if the game supports either of those (but only on the reviewed card).
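For anyone scripting their own benchmark runs, the two profiles boil down to something like this (a minimal Python sketch; the keys are my own shorthand, not RDR2's actual config names):

```python
# Shorthand summary of the two RDR2 test profiles described above.
# These keys are hypothetical; RDR2 does not expose settings under these names.
BASELINE = "all settings (including unlocked advanced ones) at minimum"

MEDIUM_1080P = {
    "basic_settings": "one notch above minimum (mostly medium)",
    "anisotropic_filtering": "2x",
    "dlss": "off", "msaa": "off", "fsr": "off",
}

ULTRA = {
    "basic_settings": "maximum (high/ultra)",
    "anisotropic_filtering": "16x",
    "dlss": "off", "msaa": "off", "fsr": "off",
}
```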
 
What most reviews are missing is how many watts the GPU draws at, let's say, 1080p 60 fps...
Most reviews just bench its max potential, but not everybody runs their GPU at max all the time.

From what I can say: I switched from a 1070 Ti, which was eating around 200 watts in lots of games at 1080p 60 fps ultra... now I'm on an RX 6800 and the wattage dropped to around 60 watts. Even at 1440p ultra (60 fps) I'm still under 100 watts.

Why did I go for the RX 6800? Well, I wanted a better GPU... RTX is a bit overpriced here in the EU, and AMD isn't that far behind while being cheaper. Before purchasing I wanted the 6750 XT, but after looking through several YouTube comparisons of the 6750 XT vs. 6800 vs. 6800 XT, the 6750 XT was eating more wattage than the 6800 while producing less fps, and the 6800 XT had a few fps more, but the wattage was also higher (+10% fps / +30% power)... so I went with the winner, lol. And indeed the GPU runs at very low watts; undervolted it's even better :)
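As a quick sanity check on those relative numbers (a back-of-the-envelope Python snippet using only the percentages quoted above):

```python
# Relative figures quoted above: RX 6800 XT vs. RX 6800.
fps_ratio = 1.10    # ~10% more FPS from the 6800 XT
power_ratio = 1.30  # ...at ~30% more power draw

efficiency = fps_ratio / power_ratio  # 6800 XT's FPS/W relative to the RX 6800
print(f"6800 XT efficiency vs. RX 6800: {efficiency:.0%}")  # ~85%
```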
 

HoveringAbove

Prominent
Dec 7, 2022
Sure. I think I wrote this somewhere, but you're right: the lack of presets sucks! (It bases your "preset" preference on how much VRAM you have, not on potential GPU speed.)

  1. Set the preset to the minimum value, then set/confirm everything to minimum; unlock the advanced settings, and also set all of those to the minimum values.
  2. For 1080p medium, now go into the basic settings and put everything up one notch (mostly "medium" but also 2xAF). Do not enable DLSS, MSAA, or FSR.
  3. For the "ultra" settings, go back into the basic settings and turn everything to maximum (high/ultra/16xAF). Do not enable DLSS, MSAA, or FSR.
Those are my current settings, but I'm going to change things for the upcoming reviews. Basically, instead of the above, we'll start by setting everything to "medium/on" including the advanced options for medium, and for ultra everything will be at maximum. I'll still leave DLSS/FSR/MSAA settings off, for apples-to-apples, though I'll show FSR2 or DLSS on initial GPU reviews if the game supports either of those (but only on the reviewed card).

Thanks a lot!
Btw does TAA stay off too?
 

King_V

Illustrious
Ambassador
Hey, @JarredWaltonGPU . . . somehow this one popped back into my head (must have stumbled across a reference to it somewhere recently), but, by any chance, were you ever able to get a hold of an RX 5300?

Yes, of course I mean on purpose, why do you ask? :LOL:

I realize it's got that whole 3GB VRAM issue and all, but, eh, what can I say? I can't shake my obsession with the budget boards.
 
Hey, @JarredWaltonGPU . . . somehow this one popped back into my head (must have stumbled across a reference to it somewhere recently), but, by any chance, were you ever able to get a hold of an RX 5300?

Yes, of course I mean on purpose, why do you ask? :LOL:

I realize it's got that whole 3GB VRAM issue and all, but, eh, what can I say? I can't shake my obsession with the budget boards.
Sadly (not really!), I do not have an RX 5300. I have a bunch of other old cards, but nothing like the 5300. There's a used one on eBay for $70... but I don't actually want the card. LOL
 

King_V

Illustrious
Ambassador
Sadly (not really!), I do not have an RX 5300. I have a bunch of other old cards, but nothing like the 5300. There's a used one on eBay for $70... but I don't actually want the card. LOL
Yeah, I saw that one. Did a bit of digging, and I've read hints of performance ranging from "it's about where a 1650 is" to "edging out a 1650 Ti" (??)

The curiosity is actually making me think I might grab one of those for $70. Of course, the combination of "used" and "more than 10 available" would have me asking where they came from first. Not that I can imagine the RX 5300 being used for mining...

EDIT: a couple of grammar/clarification fixes
 

King_V

Illustrious
Ambassador
@JarredWaltonGPU - I don't know whether to be proud or ashamed, but I actually, after a bit of hemming and hawing, ordered one. It arrived today.

Yep, I'm now the owner of an RX 5300. I suppose I'll play with it soon, but other things do keep getting in the way. After all, I'm also the guy who did this, so whatever I try, hopefully I won't wait nearly that long!
 
@JarredWaltonGPU - I don't know whether to be proud or ashamed, but I actually, after a bit of hemming and hawing, ordered one. It arrived today.

Yep, I'm now the owner of an RX 5300. I suppose I'll play with it soon, but other things do keep getting in the way. After all, I'm also the guy who did this, so whatever I try, hopefully I won't wait nearly that long!
By "play with it soon," do you mean in less than eight months? Or more than eight months? Enquiring minds want to know! :ROFLMAO:
 

King_V

Illustrious
Ambassador
Minor nitpick on the update to the hierarchy chart, and I'm guessing it's a quirk in how the table sorting is done, but the RX 7800 XT is very slightly faster than the 6800 XT at 4K and 1440p ultra (by 1.3 and 1.0 fps, respectively) and at 1080p medium (by 0.3 fps), yet at 1080p ultra it falls short of the 6800 XT by 0.1 fps.

And that knocked it down below the 6800 XT in the chart?

Yeah, I know that for all intents and purposes, it's a dead heat, but that still seemed weird.
 
Minor nitpick on the update to the hierarchy chart, and I'm guessing it's a quirk in how the table sorting is done, but the RX 7800 XT is very slightly faster than the 6800 XT at 4K and 1440p ultra (by 1.3 and 1.0 fps, respectively) and at 1080p medium (by 0.3 fps), yet at 1080p ultra it falls short of the 6800 XT by 0.1 fps.

And that knocked it down below the 6800 XT in the chart?

Yeah, I know that for all intents and purposes, it's a dead heat, but that still seemed weird.
So, the current hierarchy is sorted by the 1080p ultra results, because I have to choose something. I could try to sort by the aggregate (geomean) of multiple columns, but I only try to test (nearly) every GPU at 1080p medium/ultra; older and slower cards generally can't handle 1440p ultra, so sorting by that in any form creates a problem.

Anyway, all four columns are visible just so people can look at what they care about most. I wouldn't expect anyone looking at an RTX 3090 or 4080 or above (or 6900 XT or 7800 XT or above) to be primarily worried about 1080p performance, so high-end GPU shoppers should probably focus on the 1440p or even 4K columns.

Note also that these tables are from the Core i9-12900K, which does reduce CPU performance by maybe 10% compared to the 13900K. Even with the RX 7800 XT, at 1080p CPU limits can become a factor. The tables are also sorted by a combination of average and 1% low FPS — I believe the exact formula is geomean(avg, avg, avg, avg, 1%low), so four times the relative weight is given to the average compared to the 1% low — and if you look at the rasterization charts at the bottom, you can see that Nvidia's 1% lows are often worse than AMD's at 1080p.
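For anyone who wants to reproduce that ranking score, here's a minimal Python sketch of the stated weighting (the function name is mine; it just implements geomean(avg, avg, avg, avg, 1%low)):

```python
def gpu_score(avg_fps: float, low_fps: float) -> float:
    """Geometric mean weighting the average FPS 4x versus the 1% low:
    geomean(avg, avg, avg, avg, low) == (avg**4 * low) ** (1/5)."""
    return (avg_fps ** 4 * low_fps) ** (1 / 5)

# A card with stronger 1% lows can outscore one with a higher average:
print(gpu_score(100.0, 80.0))  # ~95.6
print(gpu_score(102.0, 70.0))  # ~94.6
```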
 

King_V

Illustrious
Ambassador
Fair enough, and yeah, you've got to pick something. And, yeah, trying to account for performance across the columns would get weird for those that aren't/can't be tested at the higher resolutions.

Forgot about that, which is kind of embarrassing given my usual obsession with how the underdog cards perform. 😁
 
Jan 22, 2024
The text should be updated, I think. For example: both the 6700 10GB and the 4060 are 299 USD; the 6700 does 62.6 fps at 1440p raster while the 4060 does 61.2 fps, yet the 4060 is listed as best bang for the buck... And the 7600 at 239 USD seems to be better value than both of them.
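The FPS-per-dollar arithmetic on the figures quoted above, as a quick check in Python:

```python
# 1440p raster FPS per dollar, using the numbers from the post above.
rx_6700_10gb = 62.6 / 299  # ~0.209 FPS per USD
rtx_4060 = 61.2 / 299      # ~0.205 FPS per USD
print(f"6700 10GB: {rx_6700_10gb:.3f} FPS/$, 4060: {rtx_4060:.3f} FPS/$")
```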
 
The text should be updated, I think. For example: both the 6700 10GB and the 4060 are 299 USD; the 6700 does 62.6 fps at 1440p raster while the 4060 does 61.2 fps, yet the 4060 is listed as best bang for the buck... And the 7600 at 239 USD seems to be better value than both of them.
Buried in testing and review hell right now. I will be overhauling things... after the 4080 Super is out the door, most likely.
 
Jan 22, 2024
I find the comparison data to be the best out there (lots of info, nice tables), but the text is slightly outdated :). I would appreciate another take on the FPS/Watt article; it hasn't been updated in quite a long time now.
 
I find the comparison data to be the best out there (lots of info, nice tables), but the text is slightly outdated :). I would appreciate another take on the FPS/Watt article; it hasn't been updated in quite a long time now.
Note that every recent review includes tables on the power page that show FPS/W and (at the time of writing) FPS/$. There's too much coming down the firehose right now for me to update all these other pieces, but I anticipate having a bit more time to do that come February (assuming I don't die in the process).
 
Jan 22, 2024
The general hierarchy is based on TDP (as far as I understand), which is not 100% accurate, and the tables in the reviews are a bit limited in terms of the number of cards compared (which is completely understandable).
I know it takes a lot of time, so they will not be updated every other week, but I cannot find any other source that does this so meticulously, so I appreciate any effort in that direction.
FPS/$ is not THAT interesting for me, as I live in Europe, but it's always a good starting point nevertheless.
 
The general hierarchy is based on TDP (as far as I understand), which is not 100% accurate, and the tables in the reviews are a bit limited in terms of the number of cards compared (which is completely understandable).
I know it takes a lot of time, so they will not be updated every other week, but I cannot find any other source that does this so meticulously, so I appreciate any effort in that direction.
FPS/$ is not THAT interesting for me, as I live in Europe, but it's always a good starting point nevertheless.
If you're talking about this section, the general hierarchy uses the power information from the PCAT testing:
[Attached screenshot: PCAT power test results]
Which means it's actually from the new hierarchy that I've never quite switched to, for various reasons. But I'm about to, because I've discovered changes in my test PCs (Windows 11 security updates, possibly) that have dropped performance in CPU-limited games again. VBS is off, but performance is now basically what I used to get with VBS enabled. [Sigh] So it's time to wrap up some of the 20-series testing and then swap to the new data set, as soon as the last of these Super reviews is done... (Famous last words.)
 
Jan 22, 2024
Seems it was a good idea to switch back to W10 like a week ago :).

4070 seems to be such an interesting card... Too bad it's so expensive.
 

daris98

Distinguished
Jun 8, 2016
Thank you for the hard work updating the list. This is one of the most comprehensive lists on the internet. It can even be considered THE list.

I have several suggestions, though. For readers, it might be easier to visualize value (the most important parameter in a buying decision) as a performance vs. price graph. This would certainly help a ton of readers.
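Something like this minimal matplotlib sketch would do (card names and numbers here are placeholders, not real benchmark results):

```python
import matplotlib.pyplot as plt

# Placeholder data purely to illustrate the suggested chart.
cards = {"Card A": (299, 61.2), "Card B": (299, 62.6), "Card C": (239, 55.0)}

for name, (price, fps) in cards.items():
    plt.scatter(price, fps)
    plt.annotate(name, (price, fps), textcoords="offset points", xytext=(5, 5))

plt.xlabel("Price (USD)")
plt.ylabel("1440p raster FPS")
plt.title("Performance vs. price")
plt.show()
```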

I also think it's about time to consider migrating to a DDR5 + 3D V-Cache system soon. Take your time, because it's a lot of work, but I'm sure you wouldn't regret progressively preparing before it's eventually a necessity.

I'd like to add that power consumption relates to heat density, which is a huge concern for people chasing a quiet or small system. More power doesn't only mean higher power bills; it also demands a better/bigger PSU and a heatsink/fan with a bigger footprint, and it means more noise. I think the concept is important to mention.

Plus, be sure to also mention a GPU's cooling performance (that is, specific AIB coolers). Certain AIBs can sometimes make terrible coolers, which often irritates buyers even more than poor performance value does.

Just some suggestions, of course. Please keep up the good work.
 
Jan 22, 2024
I also think it's about time to consider migrating to a DDR5 + 3D V-Cache system soon.

If it means ditching results for older cards, I'd be strongly against it. I don't think anybody would have time to retest everything from RDNA 1 and Nvidia's 2000 series upwards... and I would think that is the minimum such a database should cover.

Plus, be sure to also mention a GPU's cooling performance (that is, specific AIB coolers). Certain AIBs can sometimes make terrible coolers, which often irritates buyers even more than poor performance value does.

It would be quite difficult to give something more than general guidance (like "Gigabyte cards tend to be noisy" and "the Sapphire Pulse is a good noise/price/performance compromise").
 

daris98

Distinguished
Jun 8, 2016
If it means ditching results for older cards, I'd be strongly against it. I don't think anybody would have time to retest everything from RDNA 1 and Nvidia's 2000 series upwards... and I would think that is the minimum such a database should cover.
Take your time, because it's a lot of work, but I'm sure you wouldn't regret progressively preparing before it's eventually a necessity.

It would be quite difficult to give something more than general guidance (like "Gigabyte cards tend to be noisy" and "the Sapphire Pulse is a good noise/price/performance compromise").
A "general guidance" like:

"Always consider the cooling performance of the exact GPU you are buying, as it is often overlooked"

is good, while

"Always get Sapphire for AMD and ASUS for Nvidia"

is not.