GPU Performance Hierarchy 2024: Video Cards Ranked

abryant

Asst. Managing Editor
Staff member
May 16, 2016
183
17
18,685
Is RX Vega 56 quicker than GTX 1070? How about GTX 1050 vs RX 550? We've tested all the current chips from Nvidia and AMD so you can see how each stacks up. Read more here.
GPU-Hierarchy-Cover-early-2019.jpg

CHRIS ANGELINI
@cangelini

Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
 

Barty1884

Retired Moderator
The legacy chart has only the 2080 and 2080 Ti from the RTX lineup. While they are addressed in the modern ranking, can we expect the long-term chart to be updated to include the 2070, 2060, 1660 Ti, etc.?
 

johnvosh

Distinguished
May 22, 2012
8
1
18,515
When are you going to update the chart? It is missing quite a few new cards, such as the RTX 2070, RTX 2060, and the entire GTX 16 series. The top of the page says updated April 25th, 2019, but it hasn't been....
 
  • Like
Reactions: Frank_G_PGH

olehaus

Distinguished
Dec 31, 2007
1
0
18,510
This might have been pointed out earlier, but I think the 1660 cards should have been mentioned in the 3rd paragraph (together with the 1070 etc.) in the comments beneath the current lineup hierarchy. I am glad these 16xx cards came to market, because they will knock down the prices of the overpriced used (mining) 10xx cards :) .
 
Oh boy here we go...
Radeon VII beats Titan X (or Xp) and 1080 Ti.
RTX 2060 and Vega 56 beat 1070 Ti.
GTX 1070 beats GTX 1660 Ti.
RX 570 4GB beats GTX 1060 3GB.

The power draws on GTX 1080, RX 590, RX 570 and RX 560 are all wrong.

What a mess this chart is.
 

TJ Hooker

Titan
Ambassador
Oh boy here we go...
Radeon VII beats Titan X (or Xp) and 1080 Ti.
RTX 2060 and Vega 56 beat 1070 Ti.
GTX 1070 beats GTX 1660 Ti.
RX 570 4GB beats GTX 1060 3GB.

The power draws on GTX 1080, RX 590, RX 570 and RX 560 are all wrong.

What a mess this chart is.
The only thing that looks wrong to me performance-wise is the 2060 and 1070 Ti being flipped, but they're pretty close regardless. The listed power consumption values are just the rated TDPs, which seem correct other than for the 1080.
 

johnvosh

Distinguished
May 22, 2012
8
1
18,515
It is now May 13th, there's been no reply from the original author of this post, and the chart still has not been updated at all. The RTX 2060 has been out since Jan 15, 2019 and is still not on there. The RTX 2070 has been out since October 2018 and isn't even on the list...
 
I like the update.
I see that several integrated graphics chips were included, all from AMD.
Given the increasing performance of Intel integrated graphics, I would like to see the HD 630 and HD 620 included. From PassMark ratings, they might be in the GT 730 performance tier, about the minimum that can be considered gaming-capable.
It probably does not make sense to include them in the first/current section, which starts with the GT 1030, about the bottom of anything considered decent for gaming.

The excellent Ryzen integrated graphics should be included, likely somewhere in the current list.
These ranking lists are a good way for a prospective builder to settle on a graphics choice.
 
Nov 10, 2019
3
0
10
It's just wrong... made by an AMD fan.

The 2060S and 2070 are faster than the RX 5700.
The 2070S is faster than the RX 5700 XT.

(Forza is an AMD-favoring game, so this is an AMD-biased test; use neutral games, or 20+ games, to get real performance differences.)

relative-performance_2560-1440.png
 

systemBuilder_49

Distinguished
Dec 9, 2010
94
31
18,570
If you are buying a graphics card this Xmas and are spending < $450, I think you want to get an Nvidia 1660 Super or higher. In general we buy only AMD products (Ryzen 1700X, Ryzen 3600, RX 480, RX 490), but at this point Nvidia's streaming encode quality is way, WAY higher than on the RX 5700 / RX 5700 XT. The latest AMD cards just don't hardware-encode video with very good quality compared to Nvidia. If you get the 1660 Super (or any 2060 card), you will be much happier, as the hardware encoder is far superior. We built a computer last Christmas for streaming and broke away from AMD, using a $310 RTX 2060 for exactly that reason!
 

King_V

Illustrious
Ambassador
@JarredWaltonGPU thanks for the update.

And, I am nearly POSITIVE that I am the only person curious at all about this, but way down at the bottom of the hierarchy list, where would the Vega 3 (i.e., Athlon's integrated GPU) fall? I would guess it would definitely be above the UHD 630, and possibly a "competitor" to the Iris Plus... any measurements or guesses on that one?
 
@JarredWaltonGPU thanks for the update.

And, I am nearly POSITIVE that I am the only person curious at all about this, but way down at the bottom of the hierarchy list, where would the Vega 3 (i.e., Athlon's integrated GPU) fall? I would guess it would definitely be above the UHD 630, and possibly a "competitor" to the Iris Plus... any measurements or guesses on that one?
Oh, jeez. Vega 8 was painful to test as it stands. Thankfully, I don't have Vega 3 chips for testing. :sneaky:

UHD 630 probably comes at least relatively close to Vega 3, if you're talking about the Athlon 200GE (maximum GPU clock of 1000MHz). Ryzen 3 5300G on the other hand with Vega 6 might not be too shabby, probably close to GT 1030 performance. The CPU and memory support will be a factor on the really low-end stuff as well. Vega 3 with DDR4-2667 RAM as an example will get hit from both the reduction in CUs and the reduction in bandwidth. And then the slower CPU (2-core/4-thread, at 3.2GHz) will potentially hurt as well.

It will probably play League of Legends, Dota 2, and CS:GO okay, though! 🙃
 

alfema

Prominent
Oct 10, 2021
6
1
515
Hello there,

Some suggestions.

First, apologies for my Google-Translate English.

To make comparisons easier, it would be good if the GPU Ranking listed the position each card occupies, and the Legacy GPU Hierarchy listed the tier of each group.

The way the reference table is organized now, using 100% for the most powerful card, any low- or mid-range card looks like garbage in terms of performance. I think it would be more appropriate to use as the baseline the card that manages 1080p at 60 FPS in triple-A games; that way it is easier to orient yourself when deciding. And if breakdowns by resolution and FPS were also added, it would be fantastic.

Greetings
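For what it's worth, rebasing the published percentages to a different reference card is simple arithmetic: divide each card's score by the baseline's score. A minimal Python sketch; the card names and scores below are made-up placeholders, not actual Tom's Hardware data:

```python
# Rebase relative-performance percentages to a chosen baseline card.
# Scores are hypothetical placeholders, not real benchmark numbers.
scores = {
    "Card A": 100.0,  # fastest card, the original 100% reference
    "Card B": 62.5,
    "Card C": 41.0,
}

def rebase(scores, baseline):
    """Express every score as a percentage of the baseline card's score."""
    ref = scores[baseline]
    return {name: round(100.0 * s / ref, 1) for name, s in scores.items()}

print(rebase(scores, "Card B"))
# → {'Card A': 160.0, 'Card B': 100.0, 'Card C': 65.6}
```

With the hypothetical "1080p/60 card" as the baseline, anything at or above 100% is good enough for that target, which is exactly the orientation alfema is asking for.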
 
The new, revamped GPU hierarchy article looks good @JarredWaltonGPU. Is it gonna get a review thread so people can bash it/take it apart? :LOL:
So, at one point I asked the forum mods if they could clear out all the old comments, since some were from five or seven or whatever years ago. The result? They archived the comments thread and killed off any way to respond to it!
I asked them to bring it back, create a new thread, whatever, and they either didn't care or didn't know how.
We could create a clone of the article on a new URL, which I've suggested before, but the Powers That Be™ do not want to do that for Reasons™. If it were up to me, it would already be at the /features/gpu-benchmarks-hierarchy URL instead of /reviews/gpu-hierarchy,4388.html
¯\_(ツ)_/¯
 
So, at one point I asked the forum mods if they could clear out all the old comments, since some were from five or seven or whatever years ago. The result? They archived the comments thread and killed off any way to respond to it!
Ha! They didn't want to spend the 60 or so minutes to either script something to do it or go through and delete posts individually. Nuclear option = fewer clicks for admins.


I asked them to bring it back, create a new thread, whatever, and they either didn't care or didn't know how.
They probably don't know how to without bringing everything back and going back to square one.


We could create a clone of the article on a new URL, which I've suggested before, but the Powers That Be™ do not want to do that for Reasons™. If it were up to me, it would already be at the /features/gpu-benchmarks-hierarchy URL instead of /reviews/gpu-hierarchy,4388.html
¯\_(ツ)_/¯
Pshh! Silly @JarredWaltonGPU! Original ideas have no place in the realms of the Powers That Be™!!
 
Ha! They didn't want to spend the 60 or so minutes to either script something to do it or go through and delete posts individually. Nuclear option = fewer clicks for admins.

They probably don't know how to without bringing everything back and going back to square one.

Pshh! Silly @JarredWaltonGPU! Original ideas have no place in the realms of the Powers That Be™!!
We actually talked about this in the morning meeting today, and maybe it will get fixed. Eventually. Maybe...
 
Excellent news. Thanks!

Quick sidebar for you. Is the 300W TBP you mentioned in your review of the 6900 XT with or without the +15% power limit slider?
Thanks!
300W is the TBP at "stock" without AMD's Rage Mode or any other power tweaks, using a reference card. I have tested some 6900 XT cards where the power use gets close to 400W (at stock), and can easily exceed that if you move the power slider up:
https://www.tomshardware.com/reviews/asrock-rx-6900-xt-formula/5
 
  • Like
Reactions: alceryes
I have tested some 6900 XT cards where the power use gets close to 400W (at stock), and can easily exceed that if you move the power slider up:
https://www.tomshardware.com/reviews/asrock-rx-6900-xt-formula/5
Yup. That's the XTXH core on that ASRock RX 6900 XT Formula. Although AMD gave the core a physically different name, they, in their infinite wisdom, decided that cards with the XTXH core would still be called the RX 6900 XT, causing endless confusion. These 'special' cores are heavily binned and come with higher voltage and power ceilings at stock for both the core and memory.

On my lowly reference/Zotac 6900 XT I'm hitting a 2600MHz peak core and 2100MHz VRAM clock, all while undervolting and keeping it cool and quiet. No XTXH core needed here. ;)