Happy holidays, folks! Thanks to each and every one of you for being part of the Tom's Hardware community!

Question: What GPUs do you want to see compared with the Arc B580?

JarredWaltonGPU

Senior GPU Editor
Editor
So, I've totally overhauled both my test PC and my test suite for the coming year, and that means I need to retest every GPU in every game/application for the upcoming Intel Arc B580 review. And... I've run out of time. I will have, for sure, the following:

Intel Arc B580
Intel Arc A770 16GB
Intel Arc A750
Nvidia RTX 4060
AMD RX 7600
AMD RX 7600 XT

Or that's the plan — I still need to test the two AMD GPUs in the next two days. But what other GPUs would you most like to see in the charts? I'll likely need to add some of them after the fact, but there are a lot of potentially interesting comparisons. To name a few:

Arc A580, RX 6700/6750 XT, RX 6600/6650 XT, RX 6600, RTX 3060 12GB, RTX 4060 Ti/16GB, RX 7700 XT. And the easy answer is that you would want to see all of those GPUs and more, but doing that will take a lot of time. It's a solid 10 hours per GPU on my test suite, assuming I don't do anything other than testing. (There's also a ~two-hour window where I can start SPECworkstation and go do other stuff like writing the actual review.) So, best-case, I can probably do two GPUs per day, and that's extremely unlikely unless I work 24/7. I'm not a robot, so that's not happening.
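For a rough sense of that time budget, here's a back-of-the-envelope sketch. The 10-hours-per-GPU figure comes from the paragraph above; the workday length and the particular subset of extra cards are just illustrative assumptions.

```python
import math

HOURS_PER_GPU = 10        # a full run of the test suite, per the estimate above
WORK_HOURS_PER_DAY = 12   # assumption: a long workday, but not 24/7

# A hypothetical subset of the "nice to have" cards listed above.
extra_gpus = ["Arc A580", "RX 6700 XT", "RX 6600", "RTX 3060 12GB",
              "RTX 4060 Ti 16GB", "RX 7700 XT"]

total_hours = len(extra_gpus) * HOURS_PER_GPU
days = math.ceil(total_hours / WORK_HOURS_PER_DAY)
print(f"{total_hours} hours of benchmarking, or roughly {days} extra days of nothing but testing")
```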

There will be a new GPU hierarchy in the coming weeks, once I've at least managed to test all the RTX 40, RX 7000, and Arc GPUs. Then I'll work backward through the generations as far as it makes sense. Most likely, that means I'll stop at perhaps Vega/Polaris on AMD and Pascal on Nvidia, but anything without at least 6GB VRAM will also get skipped — there are multiple games in my new test suite that simply won't run on 4GB cards.

Anyway, if you care enough about graphics cards to see this post in our forums, let me know what GPUs you're most interested in seeing tested first in the new suite, particularly as it applies to the Arc B580 on Friday.
 
  • Like
Reactions: palladin9479
1. The RTX 3060, because it is the most popular GPU on the Steam Hardware Survey.

2. The RTX 2060 Super & the RX 5700 XT, because believe it or not, most people don't actually upgrade their GPU every generation.

3. How about showing us what it can do, rather than giving us the worst-case scenario (maxed-out settings), like most reviewers.

Truthfully, most reviews (including the ones here) are flippin' useless, because they don't show us what a card can do. Most people don't actually play at ultra settings.

For example: who cares what a budget card can do at 4K ultra? People buying $250 cards aren't playing AAA games at 4K ultra settings.

The fact that you can run the test doesn't mean that the data is actually useful for making a buying decision.
 
For which cards to add, this would be my order:
3060 > 6600 > 1660 Ti/Super or 2060 6GB > 5700 XT > 4060 Ti 16GB > 7700 XT > A580 > any others.

If there are any equivalent-performance links to be made, I think it's reasonable to just cite them in the text rather than testing (like the 4060 being about the same performance as the 2080).

Edit: I accidentally typed 6600 XT instead of 6600. Given that the 7600 is slightly faster than the 6650 XT, I think that performance tier is mostly covered.
 
1. The RTX 3060, because it is the most popular GPU on the Steam Hardware Survey.

2. The RTX 2060 Super & the RX 5700 XT, because believe it or not, most people don't actually upgrade their GPU every generation.

3. How about showing us what it can do, rather than giving us the worst-case scenario (maxed-out settings), like most reviewers.

Truthfully, most reviews (including the ones here) are flippin' useless, because they don't show us what a card can do. Most people don't actually play at ultra settings.

For example: who cares what a budget card can do at 4K ultra? People buying $250 cards aren't playing AAA games at 4K ultra settings.

The fact that you can run the test doesn't mean that the data is actually useful for making a buying decision.
That's not the point of reviews, but I digress.
 
I think the RX 6600 is a worthy comparison, to see if the upsell is worth it for the average person. It's also an interesting point of comparison, as it's the GPU that the A750 was frequently compared to. It can give a baseline of what the lowest-end 1080p gaming card worth buying was last gen. Maybe the 6600 XT too, though it and the 6600 aren't radically different.
 
  • Like
Reactions: artk2219
Realistically, since the test is for comparing the B580, anything that is supposed to be close to it in performance should be at the top of the list. But it would also be nice if you tested one of the top cards alongside it, so we know what the real difference is between the newest budget GPU and the top option.
 
  • Like
Reactions: artk2219
Include a card that broke the 80 FPS range at 1440p ultra and a non-Intel card in the 40-60 FPS range. These are two controls I would like to see to ensure consistency with the previous data.
 
  • Like
Reactions: artk2219
1. The RTX 3060, because it is the most popular GPU on the Steam Hardware Survey.

2. The RTX 2060 Super & the RX 5700 XT, because believe it or not, most people don't actually upgrade their GPU every generation.

3. How about showing us what it can do, rather than giving us the worst-case scenario (maxed-out settings), like most reviewers.

Truthfully, most reviews (including the ones here) are flippin' useless, because they don't show us what a card can do. Most people don't actually play at ultra settings.

For example: who cares what a budget card can do at 4K ultra? People buying $250 cards aren't playing AAA games at 4K ultra settings.

The fact that you can run the test doesn't mean that the data is actually useful for making a buying decision.
I always test 1080p medium, then 1080p/1440p/4K ultra (if the card can handle it). Because, while my reviews here might be "useless" (thanks for that, what a tactful way of having a discussion), today's ultra is tomorrow's high is the next day's medium and the future's low. Sure, game X running at ultra probably doesn't look all that much better than the same game running at high, but then high usually only looks a bit better than medium and so why test anything beyond 1080p medium? It does vary by game, but again you have to draw the line somewhere.

As for the idea that I "show what the card can do" by doing bespoke tests, that's a fool's errand. It would mean potentially different benchmarks for basically every single graphics card. And what if a slightly different card does better at different settings? Should a review of the B580 use the settings that work best for the B580, or the settings that work best for the RTX 4060?

Unless you find it super useful to "review" cards in a vacuum by only testing one or a handful of cards? Which is what a lot of YouTube videos tend to do, because it's easier for a single video to only show three or four cards. Personally, and objectively, I prefer to show cards in context. So, the Arc B580 at $249 needs to be compared to similarly priced cards, as well as (time permitting) other cards in the vicinity like the step up and step down.

This is why standardization exists. The GPU reviews are intended to show both the pros and cons of a card, including areas where it struggles as well as areas where it does well. I can try to make an 8GB card look better by avoiding any games and settings where 8GB of VRAM is an issue, but is that helpful or is that just doing the marketing BS for the GPU manufacturers? Or I can go the other way and only test games at settings that make an 8GB card choke, which is equally deceptive.

Picking a suite of games and sticking to it shows I'm not just cherry-picking stuff to try to paint things in a rosier hue. If you prefer places that are constantly changing test suites, there's a very real chance that they are far more biased in what games get tested.

For which cards to add, this would be my order:
3060 > 6600 > 1660 Ti/Super or 2060 6GB > 5700 XT > 4060 Ti 16GB > 7700 XT > A580 > any others.

If there are any equivalent-performance links to be made, I think it's reasonable to just cite them in the text rather than testing (like the 4060 being about the same performance as the 2080).
I most definitely won't be doing any non-RT capable cards for the initial review, simply because that means, at a minimum, going back to GPUs that are more than four years old. Yes, people are still using them. Yes, I will test the non-RT games on such GPUs when I get around to retesting them. But I'll start at the newest cards and work toward older GPUs for the hierarchy.

While interesting to some readers for sure, I think in general that sort of information is less useful overall in a new GPU review — people who really care and are looking for details will generally be more familiar today with where the RTX 4060 lands rather than using an RTX 2080 as a "similar performance" alternative, not to mention potentially very different feature sets as you go back to older GPUs.

Anyone still running something like an RX 5700 XT or RX Vega 64 can look at the current GPU hierarchy and extrapolate for now. Same with GTX 16- and 10-series GPUs (and older). So if you're using an RX 5700 XT, that lands at about the level of the Arc A750, RTX 3060, and RX 6600 XT. I have to assume, barring data to the contrary, that it won't change too drastically with a newer test suite. (It might!)

For now, after the RX 7600 XT testing is done, I think I'll do the RTX 3060 first, and then try for RTX 4060 Ti and 4060 Ti 16GB, then RX 7700 XT. That should be enough for this review, and from there I can start at the fastest GPUs and begin testing them on the new suite and testbed.

But it would also be nice if you tested one of the top cards alongside it, so we know what the real difference is between the newest budget GPU and the top option.
The main hierarchy might be fine to have every GPU tested, but for a review of a $249 card there's little sense in including a $1000 or $2000 GPU in the charts. Hardly anyone says, "I was going to spend $250, but then I saw what spending four times as much money could do for me, so I said screw the budget and upgraded to the fastest card in the charts!" If someone has that much disposable income, then they probably already own an RTX 4090 and the charts aren't actually important data for them — not my main audience, in other words. 🤷‍♂️ Plus, the initial point was that I don't have test data for any of the GPUs right now because everything is changing for this review (and going forward). I can't include a top GPU without testing it, nor can I include a slower GPU without testing it.

Include a card that broke the 80 FPS range at 1440p ultra and a non-Intel card in the 40-60 FPS range. These are two controls I would like to see to ensure consistency with the previous data.

Similar to the above, including cards that are at some specific performance range would mean I'd have to have the performance data already — and I don't have that for the new suite, though I could always pick cards based on previous results. A GPU that does 80 FPS or more at 1440p ultra, however, would mean something like the RX 6800 or RTX 3070 Ti, both of which are in a different price bracket (though used, you might be able to find them for the same ~$250 as the B580). The 4060 Ti does ~75 fps at 1440p ultra, in rasterization mode at least, and I do plan on including that once I've retested. Same goes for the RX 7700 XT. Those will be done by next week. :)
 
Thanks for asking this, Jarred.

I think the best way to approach this inquiry would be:
1- Extreme ends: see how far away from the top and bottom this card sits, so 4090 and, say, 1050 or iGPUs?
2- Price range +/- $100. I think most cards that gravitate around the ~$250 price point have very little flex for price alternatives, especially above. So either $100 or $150 would be a good range. This is also because prices in this range are quite bouncy.
3- For a very interesting topic: the GPU with the closest die size OR memory/bandwidth configuration (I doubt there's one with both). This is to check how the design looks compared to other cards, including Intel themselves.
4- For a few games (or just one): what are the reasonable quality settings you can get away with for a certain target FPS/minFPS you want to hit? This is to test the marketing claims that usually say "1080p ultra card" and that type of nonsense.

EDIT: Added #4

Regards.
 
  • Like
Reactions: artk2219
I most definitely won't be doing any non-RT capable cards for the initial review, simply because that means, at a minimum, going back to GPUs that are more than four years old. Yes, people are still using them. Yes, I will test the non-RT games on such GPUs when I get around to retesting them. But I'll start at the newest cards and work toward older GPUs for the hierarchy.
That makes sense. I always forget the 6000 series was AMD's first RT generation for some reason, but that's why I put the 2060 with the 1660 Ti/Super.
While interesting to some readers for sure, I think in general that sort of information is less useful overall in a new GPU review — people who really care and are looking for details will generally be more familiar today with where the RTX 4060 lands rather than using an RTX 2080 as a "similar performance" alternative, not to mention potentially very different feature sets as you go back to older GPUs.
I mean saying the 2080 is similar in performance to the 4060, as opposed to testing the 2080 (since you will be testing the 4060). As long as you stick to the same vendor, comparisons across generations typically hold unless heavy RT or upscaling is involved. While comparison data is available, and your GPU hierarchy is still my first stop for general comparisons, it always seems like someone asks "what about..." when there's something close enough in the review.

I'll go look up data to try to find comparisons, but I can say right now I've seen zero reviews with 40-series cards that include mine as a comparison point. Now, I know what the rough performance is and what to look at, but I'm also very invested in technology and understanding all of my components. Perhaps once the new GPU hierarchy is done you can include an aside in GPU reviews listing older generations that fall within a single-digit percentage of what you're testing.
 
So, I've totally overhauled both my test PC and my test suite for the coming year, and that means I need to retest every GPU in every game/application for the upcoming Intel Arc B580 review. And... I've run out of time. I will have, for sure, the following:

Intel Arc B580
Intel Arc A770 16GB
Intel Arc A750
Nvidia RTX 4060
AMD RX 7600
AMD RX 7600 XT

Or that's the plan — I still need to test the two AMD GPUs in the next two days. But what other GPUs would you most like to see in the charts? I'll likely need to add some of them after the fact, but there are a lot of potentially interesting comparisons. To name a few:

Arc A580, RX 6700/6750 XT, RX 6600/6650 XT, RX 6600, RTX 3060 12GB, RTX 4060 Ti/16GB, RX 7700 XT. And the easy answer is that you would want to see all of those GPUs and more, but doing that will take a lot of time. It's a solid 10 hours per GPU on my test suite, assuming I don't do anything other than testing. (There's also a ~two-hour window where I can start SPECworkstation and go do other stuff like writing the actual review.) So, best-case, I can probably do two GPUs per day, and that's extremely unlikely unless I work 24/7. I'm not a robot, so that's not happening.

There will be a new GPU hierarchy in the coming weeks, once I've at least managed to test all the RTX 40, RX 7000, and Arc GPUs. Then I'll work backward through the generations as far as it makes sense. Most likely, that means I'll stop at perhaps Vega/Polaris on AMD and Pascal on Nvidia, but anything without at least 6GB VRAM will also get skipped — there are multiple games in my new test suite that simply won't run on 4GB cards.

Anyway, if you care enough about graphics cards to see this post in our forums, let me know what GPUs you're most interested in seeing tested first in the new suite, particularly as it applies to the Arc B580 on Friday.

A750 / A580

RX 6600 / RX 7600 / RX 6800 (yes, they can be found for around £329 in the UK, which beats out the 7600 XT)

Nvidia 3060 / 2070 or 2070 Super / 4060 / 4060 Ti

It would also be interesting to know how the image quality stacks up against Intel's newest, or at least how the drivers act, as I find AMD cards tend to bump the scaling up to 125 or higher by default, or recommend it.
 
Because, while my reviews here might be "useless" (thanks for that, what a tactful way of having a discussion), today's ultra is tomorrow's high is the next day's medium and the future's low.
It's tackful, not tactful 😛.

Jeff from Craft Computing hit it on the head when he reviewed the RTX 4060 - showing us how much a card sucks against the halo card isn't actually useful for making buying decisions.

When we are making a buying decision - we have X amount of dollars to spend. We want to know what is the card's floor, average performance, and where it hits the wall - not which card sucks the least compared to a halo card.

Dropping resolutions that are irrelevant to the card means less work, not more. No one is looking for bespoke settings, btw - just something in place of only Ultra, all the time.

There is absolutely nothing in your current testing that lets us extrapolate what the performance would be at lower detail settings. At the end of the day, we turn down detail settings (which costs nothing) before buying a new card (or a new monitor, for that matter). "Ultra becomes High," etc., isn't quantifiable; give us actionable data.

I would recommend dropping 1080p medium completely (it is 2024) and replacing it with 1440p medium. This has the added advantage of cutting the CPU out of the equation.

Test budget cards ($200-$399 USD) at 1080p High, 1440p Medium, & 1440p Ultra. (3 test runs, not 4.) People buying a card in this price bracket aren't pairing it with a 4K monitor.

Test enthusiast cards ($400-$700 USD) at 1440p Medium, 1440p Ultra, & 4K Medium. (3 test runs, not 4.) People buying a $700 card aren't playing at 1080p medium.

Test Stupid Money cards ($700+ USD) at 1440p Ultra, 4K Medium, & 4K Ultra. (3 test runs, not 4.) People buying an RTX 4080 aren't playing on a 1080p monitor.

Don't like the price break points? Feel free to move them around. There are many price point breaks - those are just mine.

By doing this, we can see what the delta is between the levels, which allows us to extrapolate performance at other resolutions.

Using the B580 as an example: it's a $259 card, so test at 1080p High, which should be considered the performance floor. 1440p Medium is the center, & 1440p Ultra will be pushing the card to its limit.
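To make that proposal concrete, here's a minimal sketch of the suggested matrix as data. The price breaks and the three combos per tier come from the tiers above; the encoding and the tests_for helper are just one hypothetical way to express it.

```python
# Proposed resolution/setting combos per price bracket (USD), per the tiers above.
TEST_MATRIX = {
    (200, 399): ["1080p High", "1440p Medium", "1440p Ultra"],   # budget
    (400, 700): ["1440p Medium", "1440p Ultra", "4K Medium"],    # enthusiast
    (700, None): ["1440p Ultra", "4K Medium", "4K Ultra"],       # "stupid money"
}

def tests_for(price_usd):
    """Return the three suggested test configurations for a card at this price."""
    for (low, high), configs in TEST_MATRIX.items():
        if price_usd >= low and (high is None or price_usd <= high):
            return configs
    return []

print(tests_for(259))   # B580 at $259 -> 1080p High (floor), 1440p Medium, 1440p Ultra
```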

What cards should it be compared to?
RTX 2060 12GB - Same performance as the Super, but the same amount of memory as the B580. Feel free to swap with the 2060 Super, although it will be handicapped by its 8GB of memory in comparison to 12GB cards. These are getting long in the tooth, and people are starting to migrate from them.
RTX 3060 - Most popular card on the Steam survey, price competitive (and still available), same amount of VRAM.
RTX 4060 - Direct competitor; more expensive, but less VRAM.
RTX 4060 Ti - Next step up. The 16GB is 50% more expensive, but would probably be the top end of what a prospective B580 customer would pay when looking at the Nvidia stack.
RX 5700 XT - Top-of-the-line RDNA 1 card, and now due for replacement.
RX 6600 XT - Cost-wise a competitor, but also with 8GB VRAM.
RX 7600 - Current price competitor with the B580, but also an 8GB VRAM card.
RX 7600 XT - Same spot as the RTX 4060 Ti; this would be the next step up.

These should be tested raster only - at this price point, RT isn't realistic.

This provides much more actionable information than 3 different resolutions at the most punishing detail settings.

The problem isn't your test suite of games - it is that you (and everyone else) only use the settings that put a card in its worst light.
 
  • Like
Reactions: artk2219 and Flayed
It's tackful, not tactful 😛.

Jeff from Craft Computing hit it on the head when he reviewed the RTX 4060 - showing us how much a card sucks against the halo card isn't actually useful for making buying decisions.

When we are making a buying decision - we have X amount of dollars to spend. We want to know what is the card's floor, average performance, and where it hits the wall - not which card sucks the least compared to a halo card.

Dropping resolutions that are irrelevant to the card means less work, not more. No one is looking for bespoke settings, btw - just something in place of only Ultra, all the time.

There is absolutely nothing in your current testing that lets us extrapolate what the performance would be at lower detail settings. At the end of the day, we turn down detail settings (which costs nothing) before buying a new card (or a new monitor, for that matter). "Ultra becomes High," etc., isn't quantifiable; give us actionable data.

I would recommend dropping 1080p medium completely (it is 2024) and replacing it with 1440p medium. This has the added advantage of cutting the CPU out of the equation.

Test budget cards ($200-$399 USD) at 1080p High, 1440p Medium, & 1440p Ultra. (3 test runs, not 4.) People buying a card in this price bracket aren't pairing it with a 4K monitor.

Test enthusiast cards ($400-$700 USD) at 1440p Medium, 1440p Ultra, & 4K Medium. (3 test runs, not 4.) People buying a $700 card aren't playing at 1080p medium.

Test Stupid Money cards ($700+ USD) at 1440p Ultra, 4K Medium, & 4K Ultra. (3 test runs, not 4.) People buying an RTX 4080 aren't playing on a 1080p monitor.

Don't like the price break points? Feel free to move them around. There are many price point breaks - those are just mine.

By doing this, we can see what the delta is between the levels, which allows us to extrapolate performance at other resolutions.

Using the B580 as an example: it's a $259 card, so test at 1080p High, which should be considered the performance floor. 1440p Medium is the center, & 1440p Ultra will be pushing the card to its limit.

What cards should it be compared to?
RTX 2060 12GB - Same performance as the Super, but the same amount of memory as the B580. Feel free to swap with the 2060 Super, although it will be handicapped by its 8GB of memory in comparison to 12GB cards. These are getting long in the tooth, and people are starting to migrate from them.
RTX 3060 - Most popular card on the Steam survey, price competitive (and still available), same amount of VRAM.
RTX 4060 - Direct competitor; more expensive, but less VRAM.
RTX 4060 Ti - Next step up. The 16GB is 50% more expensive, but would probably be the top end of what a prospective B580 customer would pay when looking at the Nvidia stack.
RX 5700 XT - Top-of-the-line RDNA 1 card, and now due for replacement.
RX 6600 XT - Cost-wise a competitor, but also with 8GB VRAM.
RX 7600 - Current price competitor with the B580, but also an 8GB VRAM card.
RX 7600 XT - Same spot as the RTX 4060 Ti; this would be the next step up.

These should be tested raster only - at this price point, RT isn't realistic.

This provides much more actionable information than 3 different resolutions at the most punishing detail settings.

The problem isn't your test suite of games - it is that you (and everyone else) only use the settings that put a card in its worst light.
Buy it, test it for yourself (because no sane person is going to test every game at every resolution), and if you aren't satisfied, then return it.


Problem solved.
 
  • Like
Reactions: artk2219
I would recommend dropping 1080p medium completely (it is 2024) and replacing it with 1440p medium. This has the added advantage of cutting the CPU out of the equation.

Test budget cards ($200-$399 USD) at 1080p High, 1440p Medium, & 1440p Ultra. (3 test runs, not 4.) People buying a card in this price bracket aren't pairing it with a 4K monitor.

Test enthusiast cards ($400-$700 USD) at 1440p Medium, 1440p Ultra, & 4K Medium. (3 test runs, not 4.) People buying a $700 card aren't playing at 1080p medium.

Test Stupid Money cards ($700+ USD) at 1440p Ultra, 4K Medium, & 4K Ultra. (3 test runs, not 4.) People buying an RTX 4080 aren't playing on a 1080p monitor.
I'm going to link this HUB video: https://www.youtube.com/watch?v=O3FIXQwMOA4. Watch it. The same is true for GPUs.
Also, I would like to point out that "it's 2024" is not a reasonable argument for dropping 1080p completely. Not only is it the most popular resolution, but it will likely stay that way for a little while yet. Just because you may be using a 1440p or 2160p monitor doesn't mean Joe is.
 
  • Like
Reactions: artk2219
There is absolutely nothing in your current testing that lets us extrapolate what the performance would be at lower detail settings. At the end of the day, we turn down detail settings (which costs nothing) before buying a new card (or a new monitor, for that matter). "Ultra becomes High," etc., isn't quantifiable; give us actionable data.
Ignoring the mistakes in pricing and specs and other stuff from your post, let me run through this in detail.

I test at 1080p medium because that provides a lower demand performance result. It's generally far enough away from 1080p ultra to be more interesting, and also pinpoints where other limits (usually CPU) are a factor. I have done 1080p medium testing since I began testing GPUs at PC Gamer nearly a decade ago. I also tested at 1080p medium, along with 720p low and other resolutions, when doing laptop reviews at AnandTech for the decade prior to that.

Is every tested setting and resolution combination important on every card? Perhaps not, but they can provide data that forms the basis for a deeper analysis.

I have done testing in individual games many times over the years where I've checked even more settings. As you would expect, the vast majority show lots of overlap, like gears on a bicycle. If you have a 30-gear mountain bike with three rings on the front and ten on the rear cassette, that's 30 gears total. But it's really about the ratios, so if your rear cassette has 10–50 teeth and the front rings have 28/34/40 teeth as an example, then your lowest gear would be 28/50 = 0.56 ratio while your highest gear would be 40/10 = 4.00. But in between? You have a bunch of options that would all have a ratio of around 2.0. (This is why a lot of modern mountain bikes only have a single front ring and then a 12-gear rear cassette. That gives 12 'unique' ratios with no overlap, and basically matches what you might have gotten on a more complex "30-gear" setup.)
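To see the overlap in that analogy play out, here's a quick sketch using the ring and cog numbers from the example above; only the 10-50 tooth range was given, so the intermediate cog sizes are assumed.

```python
from itertools import product

# Drivetrain from the example: three front rings, ten rear cogs spanning 10-50 teeth.
front_rings = [28, 34, 40]
rear_cogs = [10, 13, 17, 21, 25, 30, 35, 40, 45, 50]   # intermediate sizes are assumptions

ratios = sorted(front / rear for front, rear in product(front_rings, rear_cogs))
print(f"lowest {ratios[0]:.2f}, highest {ratios[-1]:.2f}")   # 0.56 and 4.00, as above

# Count adjacent ratios within 5% of each other -- the overlap the analogy describes.
overlap = sum(1 for a, b in zip(ratios, ratios[1:]) if b / a < 1.05)
print(f"{overlap} of {len(ratios) - 1} adjacent steps are within 5% of the next ratio")
```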

For GPUs, often 1080p medium ~= 1440p low, 1080p high ~= 1440p medium ~= 4K low, 1080p ultra ~= 1440p high ~= 4K medium. And yes, it's very approximate and varies by game. The point is, there is always overlap, but finding precisely where it happens would require testing everything everywhere. You can't just assume it will be the same without testing, but testing is a huge time sink. So I compromise with four settings, 1080p medium, 1080p ultra, 1440p ultra, and 4K ultra.

That gives you a clear curve, four distinct "ratios" where there should be zero overlap. And from that basis you can extrapolate. If you look at any budget GPU I've reviewed, you'll see that my focus definitely isn't on 4K ultra or even 1440p ultra for GPUs that don't handle those resolutions. And on extreme GPUs, the 1080p results are typically provided merely as reference points with minimal commentary. I show those results for the curious, but they're definitely a sidenote rather than the main attraction.

Someone wants to know approximately how 1440p medium will run? Look at the 1080p medium vs 1080p ultra results, and extrapolate that against 1440p ultra. It's math and it's not perfect, but we can look at other cards to determine where scaling should land. For example:

Find the scaling factor for a different GPU from the same vendor, one that's clearly GPU limited at 1080p. If the RX 7600 runs ~1.7X faster at 1080p medium than at 1080p ultra (which it does), then even though the 7600 starts to choke at 1440p, you can apply that same ~1.7X scaling for a GPU like the 7900 GRE. But CPU limits still come into play for some games, so then you have to look at the 1080p medium results for that GPU and know that 1440p medium wouldn't run faster than 1080p medium.

The 7900 GRE only shows 36% scaling at 1080p medium vs ultra, so CPU limits are definitely a factor. 1080p ultra also runs about 20% faster than 1440p ultra (compared to a 40% delta on the 7600). With that data, which is already in the GPU hierarchy, you can get a pretty good idea about where the GRE will land. It will show higher scaling at 1440p medium vs ultra than at 1080p, but it can't exceed the 1080p medium result. Which means a 7900 GRE should run about 45~55% faster at 1440p medium than 1440p ultra (depending on the game).
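Here's one way to turn that reasoning into a rough estimator, purely as a sketch: the blending heuristic, the function name, and the sample numbers are illustrative assumptions, not actual review data.

```python
def estimate_1440p_medium_fps(fps_1080p_medium, fps_1080p_ultra, fps_1440p_ultra,
                              gpu_limited_scaling=1.7):
    """Rough estimate of 1440p medium FPS from the four tested data points.

    gpu_limited_scaling is the medium-vs-ultra factor measured on a clearly
    GPU-limited card from the same vendor (e.g. ~1.7x on the RX 7600).
    """
    # Scaling actually observed on this card at 1080p (CPU limits included).
    observed_scaling = fps_1080p_medium / fps_1080p_ultra
    # At 1440p the card is less CPU limited than at 1080p, so the true factor lands
    # somewhere between the observed and fully GPU-limited values; use the midpoint.
    blended = (observed_scaling + gpu_limited_scaling) / 2
    estimate = fps_1440p_ultra * blended
    # A higher resolution can't beat the CPU limit already visible at 1080p medium.
    return min(estimate, fps_1080p_medium)

# Illustrative numbers for a 7900 GRE-class card: ~36% scaling at 1080p
# and roughly a 20% drop from 1080p ultra to 1440p ultra.
print(estimate_1440p_medium_fps(fps_1080p_medium=150, fps_1080p_ultra=110,
                                fps_1440p_ultra=92))   # ~141 fps, ~53% above 1440p ultra
```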

Is that an exact answer? No, but it's close enough and doesn't involve massively increasing the amount of testing time. Which gets to the final point.

If VRAM limitations aren't a factor, 1440p medium usually performs about the same as 1080p ultra. I have tested this in the past, and it's usually quite binary: A bunch of games show nearly identical 1080p ultra and 1440p medium performance, and then a handful have different requirements and may run out of VRAM at ultra settings. But otherwise, when VRAM doesn't kill performance, increasing the resolution from 1080p to 1440p drops performance about 25%, while going from medium to ultra drops performance by 25~50 percent.

Testing 1080p medium and ultra shows one thing very clearly on a per-game basis: Is this game hitting some bottleneck — VRAM capacity, bandwidth, or compute? That's a useful piece of information and lets you see when and where 6GB, 8GB, or even 12GB might be a problem.

Suppose it didn't hit a limit in some of the games. Well, the only place to go from there is up — if you change both resolution and settings, you don't know which was more impactful. That's why from 1080p medium to ultra, the only thing that I changed was the settings. And from 1080p ultra to 1440p ultra, I only changed the resolution (and the same for 4K ultra). Eliminate the variables, don't add to them.

If I start testing all the "it makes sense on this card" settings, and then looking at the cards around, above, and below that level as comparison points, I can quickly end up with 1080p, 1440p, and 4K all tested at low, medium, high, and ultra. Which would be lovely to do, given a lot more time (and more test PCs and people doing the testing). But even if you trim out the cruft (like dropping 4K entirely on budget GPUs), it generally ends up being more work, plus going back and swapping GPUs more often because invariably there are things that get missed.

It's also important to do testing of cards at settings that find where they collapse. If I only test 8GB budget cards at 1080p high, 1440p medium, and 1440p ultra, the first two will probably be equivalent workloads while the third may be a problem, but we don't know if the problem was 1440p vs 1080p or ultra vs medium (or more likely both). You're eliminating some questions and adding others.

Basically, I choose to keep my workload somewhat manageable and leave it to the readers to be smart enough to interpolate results where necessary. Most of our TH readers are thankfully pretty intelligent and can do that. It's also why the GPU hierarchy, which is literally hundreds of hours of work (and retesting regularly takes it to thousands of hours), is so useful. It's generally the most trafficked page on TH as a result. That's what 20 years of actual benchmarking has taught me.
 
and leave it to the readers to be smart enough to interpolate results where necessary. Most of our TH readers are thankfully pretty intelligent and can do that.
Well said. Even though I stay behind the latest-and-greatest curve with GPUs, I never feel like I'm lacking by not running top-notch hardware.

Your reviews strike a refreshing balance: there's some grit to them, not just fluff, and I'll keep coming back to them here and there, since one of these days I'll have one of the cards being reviewed. Call me a delayed GPU buyer; your reviews will have an impact on what I just might buy in the semi-short-term future.
 
  • Like
Reactions: artk2219
I'm just going to give a non-specific answer: currently available sub-$300 GPUs, plus the 3060 and 2060, because those are the cards people are most likely to be upgrading from into current sub-$300 GPUs. For people who are seriously looking in this segment, it's really not useful to see a lot of $500+ cards.
 
I think everyone has pretty well said it. The cards in direct competition should be included. I'm all about bang for the buck, so what do I get for similar money from other brands? And of course, what do I get if I spend a little bit more? I'm not looking at a $250 card because I'm undecided between it and a $2000 card, but I am definitely wondering what $300 will get me vs. $250.

I tend to look for that sweet spot right before the diminishing returns get out of hand.

1080p medium and then ultra is a good idea to me, and then 1440p. This will show me what the most taxing games will do at medium settings and then what the less taxing games will do at max settings. I know each game will be different and I won't be able to max out every game, but being able to estimate using the two test settings gives a very good educated guess as to what I can expect. I don't chase numbers, but it's still helpful to have a basic idea.

At this level, 4K is a waste of time. It's fun to see who "wins" at an impressive 22 fps, but only the top-end cards really matter at 4K.
 
I think an RX 6600 would have been a good data point to have in there, given that it can be found for around $180 or so. It would give a pricing and performance floor from which you could climb, given the price point of the B580.
I'll see about testing that after a few of the other GPUs. Time is ticking, I'm still retesting a few things with an updated driver on the B580. LOL
 
  • Like
Reactions: artk2219
All I really care about is whether Intel was being straight with their published numbers.
I don't care about upscaling smoke and mirrors, and I certainly don't care about RT in a GPU that I know isn't going to be great at it, to the point that almost no one will turn it on in any recent AAA titles.
I can figure it out and slot it in appropriately as far as where it stacks up with other GPUs. There are hundreds of benchmark runs with all the other GPUs that are already on the market; there is no need to re-skin the cat again.
 
I'll see about testing that after a few of the other GPUs. Time is ticking, I'm still retesting a few things with an updated driver on the B580. LOL

I hope everyone understands that you guys have a limited amount of time, so while it would be nice to have specific benchmarks, it's just not feasible with the clock ticking like it is. If you guys really want that kind of information, GN usually does that stuff, but it takes a while for them to publish it.

On the Intel Battlemage front, it's a $200~300 USD GPU; nobody is going to be putting it into a high-end gaming rig and playing demanding games at 4K Ultra on a 240Hz monitor. There are two types of systems this is going into. The first are budget gaming PCs with budget CPUs and memory, the old $600~1000 price range that we used to have before GPUs and inflation made for very bad build decisions. The second type of system is going to be SFF HTPCs or similar "living room gaming PCs," which are a very specific category. These are computers that sit in a living room and are used to play games on a very large 4K TV at 60Hz. The player(s) sit farther back on a couch and the entire system has to blend in with the aesthetics of the room; we can't have a loud RGB Christmas tree, as that spoils the mood.

That scenario, I think, is the most interesting, as that's when you really get into tuning game settings, and common build suggestions become unfeasible as decor and mood become bigger decision factors.

For example, this is what I'm using.

Its sleek appearance matches my receiver and TV, all on a black glass stand. I use wireless 360 controllers to play action platformers and other similar games casually while munching on snacks and having a beer. It's a great experience, but that case only has room for a two-slot, low-profile card with limited power and, most importantly, thermal capacity. We can't have small fans whining nonstop; that would completely ruin the experience. In these situations, cards with lower power draw, and therefore lower thermal requirements, end up doing very well.

I'd love to see Tom's do something for that segment, as I think that's where cards like the B570/B580, Nvidia 4050/4060, and whatever AMD comes out with soon do really well.

Oh, and on the whole "Ultra" thing: Ultra is pretty useless as a default setting because many game developers use it as a place to stress test their engine and essentially show off. It's an indirect way of trying to market their product as having higher quality because it's "so demanding." In reality, it's a bunch of overtuned settings that don't generally do much for visual quality.
 
  • Like
Reactions: JarredWaltonGPU