Desktop GPU Performance Hierarchy Table (Archive)

Status
Not open for further replies.

bit_user

Polypheme
Ambassador
Thanks for the update!

we've assigned each a score where the fastest card gets 100 and all others are graded relative to it. These numbers are based on the geometric mean fps from our Far Cry 5, Forza Motorsport 7, and Ashes of the Singularity: Escalation benchmarks, giving us a good mix of game genres and APIs.
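If I'm reading that right, the scheme amounts to something like the following sketch. The card names are real, but every FPS number here is made up purely for illustration:

```python
# Sketch of the article's scoring scheme: take the geometric mean of each
# card's per-game FPS, then normalize so the fastest card scores 100.
from math import prod

def hierarchy_scores(fps_by_card):
    """fps_by_card maps card name -> list of average FPS, one per game."""
    geomeans = {card: prod(fps) ** (1 / len(fps))
                for card, fps in fps_by_card.items()}
    top = max(geomeans.values())
    return {card: round(100 * g / top, 1) for card, g in geomeans.items()}

sample = {
    "GTX 1080 Ti": [120.0, 140.0, 75.0],   # hypothetical FPS values
    "RX Vega 64":  [95.0, 110.0, 62.0],
    "GTX 1060 6GB": [62.0, 70.0, 38.0],
}
print(hierarchy_scores(sample))
```

The geometric mean is the right call here, since it keeps one high-FPS title from dominating the average the way an arithmetic mean would.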

I like this scoring methodology, except...
1. It's not specified, but I feel that the scores used should be ones at some plausible preset and AA for a given game.
2. Separate scores (and columns) should be provided for 1920x1080, 2560x1440, and 3840x2160.

Otherwise, very nice.
 
Regardless of the reason, the 570 outscored the 1060 3GB in the average and should be ranked as such. Giving the 1060 3GB a lower score yet listing it above the 570 is disingenuous. If the scoring system is to be credible, then the cards need to be listed as they performed. Lack of memory is not an excuse; it's a limitation of the card as designed.
 
Those three games do NOT represent a typical gamer's experience. They are newer games and thus favor AMD more. Hey, I'm not badmouthing AMD; it's just that most gamers would have more OLDER games in the mix.

Practically speaking, too, a Vega 64 with a weaker cooler can throttle down quite a bit.

(I am rooting for AMD to come out with a more power-efficient card even if it's not much more powerful than Vega 64... at some point AMD and Nvidia should switch to multi-die GPUs, which will be interesting.)
 


I have seen little to nothing from AMD. Lots of secrecy around their next GPU. One thing is for sure, though: if they do not get a handle on the crypto-mining inflation, there is no way they will conquer the gaming market. Not at double the price of a GTX 1080 and a much higher (nearly double) TDP.

Short of die-hard AMD fans and crypto-miners, no one would pay $1,200 for a GPU when you can get the 1080 Ti for $850.

And thanks to that price inflation, Nvidia isn't going to drop prices on theirs anytime soon, except when they launch their new top-end GPUs.
 
This new format is a nice way to show roughly how these current-generation cards compare. I definitely want to see the legacy chart stay too though, as it helps provide a rough estimation of how old cards compare to newer ones.

The testing methodology for the new chart doesn't really seem up to par though. Basing numbers on performance in just 3 games is not going to provide particularly accurate results, especially when 2 of the 3 are running at resolutions that some of the cards aren't even designed to handle. If the choice of resolution is going to reposition cards in the chart, why not provide separate results for different resolutions? The vast majority of people are still gaming at resolutions of 1080p or below, so that should be the most relevant resolution to test. If someone is gaming at 1440p or higher with a higher-end card, then they are probably also more likely to read reviews, and less likely to need a chart like this.

I must say that I prefer the performance summary charts included in TechPowerUp's reviews, as they not only provide separate charts for 1080p, 1440p and 2160p, but they also base the results on the combined performance of the 20+ games in their test suite, preventing anomalous results in any one title from throwing off the average too much. Plus, they make the actual frame rate results for each game available in separate charts, so one can get an idea of whether a certain difference in performance might even be relevant at a given refresh rate, and in the games they actually play. Tom's of course provides much more detailed results for each game tested in their reviews, but that doesn't apply in a summary like this. I would at the very least like to see more games being used to calculate these results.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
I like this scoring methodology, except...

    ■ It's not specified, but I feel that the scores used should be ones at some plausible preset and AA for a given game.
    ■ Separate scores (and columns) should be provided for 1920x1080, 2560x1440, and 3840x2160.


Otherwise, very nice.

So, the biggest challenge is using settings that 1) don't bottleneck the top-end cards by being too easy for the low-end stuff, and 2) deliver more than just slide shows on the 2GB cards. What we end up with is imperfect no matter what, but at least it allows a top-to-bottom comparison in one column. Definitely, the best way to counter any ambiguity would be to perform the same exercise at each resolution, as suggested, and watch how the curve changes.

Of course, at the point we hit 20 games, three resolutions, and all of the latest cards, I hope there's an old-school Charts-like repository to plug in all of that data ;)
 

cangelini

Contributing Editor
Editor
Jul 4, 2008


I'll bring up the idea of adding resolutions and games. I think the purpose here was to add a better-quantified alternative to the old legacy chart (which is still included underneath), but to start backing some of the current-gen rankings with data. I am at least satisfied that the three genres/chosen APIs do match our review guidance. However, the suggestions are appreciated and, again, I'll bring them up in our next meeting to see if I can get the green light for more data.
 

bit_user

Polypheme
Ambassador

So, only list two scores for the low- and high-end cards. As you point out, people don't generally buy a 1080 Ti to drive a 1080p monitor, nor would anyone try to game at 4K with a 2GB card.

The main thing is to get inside the head of someone shopping for a card. They either have the monitor they're planning to use the card with, or else they will want to know which resolution to get with a given card. Splitting out the scores for 2 megapixel, 4 megapixel, and 8 megapixel display resolutions helps shoppers discern which resolutions a given card can readily handle. That's no small detail.
 

chalabam

Distinguished
Sep 14, 2015
Great. Tom's Hardware GPU articles had become abstract to me without the GPU hierarchy.

I was unable to compare my card with the new ones.

Tom's hierarchy lists and TechReport's x-y price/performance charts are some of the most valuable info in hardware reviews.
 
This really needs to be turned into a price/performance plot, so you can easily see where the sweet spot is (the elbow in the curve).
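To make the idea concrete, here's a minimal sketch of such a value ranking. The prices and hierarchy scores below are invented placeholders, not the article's actual numbers:

```python
# Hypothetical (street price USD, hierarchy score) pairs. The "elbow" is
# where score-per-dollar stops improving as you move up the stack.
cards = {
    "GTX 1080 Ti":  (850, 100.0),
    "GTX 1080":     (500, 79.0),
    "RX Vega 64":   (590, 78.0),
    "GTX 1060 6GB": (300, 51.0),
    "RX 580 8GB":   (280, 49.0),
}

# Rank cards by performance per dollar, best value first.
value = sorted(((score / price, name) for name, (price, score) in cards.items()),
               reverse=True)
for per_dollar, name in value:
    print(f"{name:12s} {per_dollar * 1000:.0f} points per $1000")
```

With these made-up numbers, the midrange cards land at the top of the value ranking and the 1080 Ti at the bottom, which is exactly the shape the elbow plot would make obvious at a glance.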


I wouldn't really call that favoring AMD. What's happened is that game developers are now spending as much time optimizing for AMD as they are for Nvidia. Older games were optimized for Nvidia but not for AMD. So the FPS in current games is more of a level playing field between the capabilities of cards from the two manufacturers. I.e., older games favored Nvidia; current games don't favor either.

That said, to address your point, this really needs to be turned into a dynamic chart (and graph I mentioned above), where the user can select which game they want to use for the comparison. The website should just send a bunch of FPS data to the browser, and a script on the browser can dynamically generate the chart based on which game the user selects.
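The server side of that could be as simple as the sketch below (game names and FPS values are placeholders): ship the raw per-game FPS as JSON, and recompute the 100-point chart for whichever games the user ticks.

```python
import json
from math import prod

# Hypothetical per-card FPS keyed by game, as it might be sent to the browser.
payload = json.dumps({
    "GTX 1080":   {"Far Cry 5": 90.0, "Forza 7": 110.0, "Ashes": 60.0},
    "RX Vega 64": {"Far Cry 5": 88.0, "Forza 7": 100.0, "Ashes": 63.0},
})

def chart(raw_json, selected_games):
    """Rebuild the 100-point chart using only the selected games."""
    data = json.loads(raw_json)
    geo = {card: prod(fps[g] for g in selected_games) ** (1 / len(selected_games))
           for card, fps in data.items()}
    top = max(geo.values())
    return {card: round(100 * g / top, 1) for card, g in geo.items()}

print(chart(payload, ["Ashes"]))                # one title only
print(chart(payload, ["Far Cry 5", "Forza 7"]))
```

In the browser the same few lines of arithmetic would run in a script against the JSON payload; the point is that the fastest card in the *selected* subset always scores 100, so the chart stays comparable no matter which games are ticked.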
 
I like the new list. I could see it being broken out further along reasonable lines, though, by the resolution being pushed, along with a bump in the number of test-suite titles. Keeping the old-format tier list is good too, as it shows relative performance across generations. :thumbsup:
 

80-watt Hamster

Honorable
Oct 9, 2014


Can't speak much to 560/460 vs 270X, but the 370 is basically identical to the 270, and IS slower than the 270X.

Oh, and keep the legacy hierarchy in some form, plz! Despite limitations, it's the most complete at-a-glance reference I've found.
 


Sounds about right, considering the new architecture. The 580 is about equal with the 390(x), 570 would likely land just ahead of the 380, so the 560 should be around the 370(x).
 
May 30, 2018
The pricing listed in this article is bad and biased. The Vega 64 is $589 per the provided link, and the GeForce 1080 is out of stock.

That basically makes the Nvidia card look like a distinctly better buy when it's not: a $110 difference as shown, a $10 difference in reality, and the 1080 out of stock.
 