AMD Radeon RX 5600 XT vs. Nvidia GeForce RTX 2060: Which is the best mainstream GPU?

Maybe, and while the thought of an undervolted/power-save setting for a Vega 56 running 2560x1080 sounds fascinating, I think it's a hard sell considering most of them were run stock, and thus pushed pretty hard (AMD pushing the 56 to outdo the 1070), and there's a good chance they were used for cryptomining, unfortunately, given the prices at the time.

That, and with no warranty, even if I found a Vega 56 at $200, I'd still be hesitant.

OTOH, what they said about some RX 5700 cards dipping below $300 was true, and I took advantage of the NewEgg discount to get one for my son at $273 after rebates (and posted it to the forum, because I only read the mention of it here after the fact).

Better performance, new with full warranty, and less risk are worth something extra, though.
 
I'd take a used Vega 64 for $50 less or a 56 for $100 less.
Based on my testing, the RX 5600 XT is overall a match for the Vega 64. Across 12 tested games, it's slower (by ~5-8%) in Red Dead Redemption 2, Shadow of the Tomb Raider, and Strange Brigade. It's faster (again, by ~5-8%) in Division 2, Hitman 2, Outer Worlds, Warhammer 2, Metro Exodus, and Forza Horizon 4. That's using the Sapphire Pulse, of course, so subtract 5% for a reference 12Gbps card. For the RTX 2060, the story is basically the same.
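(For anyone curious how "overall a match" shakes out from per-game swings like that, here's a tiny sketch using the geometric mean of relative performance. The per-game ratios below are made-up placeholders in the ±5-8% range, not the actual test data.)

```python
from math import prod

# Hypothetical RX 5600 XT vs. Vega 64 performance ratios -- placeholders
# only, not the real benchmark results: a few games ~5-8% slower,
# several ~5-8% faster, the rest roughly tied.
ratios = [0.93, 0.94, 0.95,                      # slower
          1.05, 1.06, 1.06, 1.07, 1.07, 1.08,    # faster
          1.00, 1.00, 1.01]                      # roughly tied

geomean = prod(ratios) ** (1 / len(ratios))
print(f"Geometric mean: {geomean:.3f}")  # ~1.02, i.e. within a couple percent of parity
```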

Now, add in the fact that Vega 64 is a 295W card (and often consumes more than that) and it's not really that great an idea. Vega 56 is still worth a thought, but again, power hungry, older architecture, not sure I'd really want it. There's a reason they're selling for so cheap -- but not $200 cheap unless you go eBay. https://www.newegg.com/p/pl?N=100007709 601302833 4814&Order=PRICE

And there is no way I would recommend a used Vega card. Cryptomining was a major factor in Vega selling well for most of its shelf life. Used cards that are more than two years old are a major risk.
 
I'd still like to kick the guys in the nuts who decided this price counts as "mid-range". When is the marketing industry going to come to terms with reality and resource allocation? Remember when a new product came out, it actually made the former product cheaper, and you only paid more when you wanted a non-production-line model? Now you're paying for the same <Mod Edit> we used to get from prototyping, along with their inconsideration about updates they should otherwise be obliged to fulfill for the customer. Alas, I guess this is the devolution of humankind as we take everything for granted in exchange for complacency.
 
Then you'll have to kick us, the consumers, in the nuts.

We're not happy at 1920x1080 anymore. We're not happy with a mere 60Hz anymore.

The R9 285 was released in September 2014. Definitely a mid-range card FOR ITS TIME. I purchased a new one in January 2015 at $235. That's about $260 today by inflation calculator estimates (I ran 2014 to 2019, since I couldn't enter 2020 as the "final year" given 2020 isn't over yet).

$290 doesn't seem THAT outrageous.
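For anyone who wants to sanity-check the inflation math, here's a quick sketch (the CPI index values are approximate figures plugged in for illustration, not official numbers from the calculator):

```python
# Rough inflation adjustment: price_then * (CPI_later / CPI_earlier).
# The CPI levels below are approximate US CPI-U values (assumed for
# illustration), roughly January 2015 and late 2019.
price_2015 = 235.0
cpi_2015 = 233.7   # approximate
cpi_2019 = 257.0   # approximate

adjusted = price_2015 * (cpi_2019 / cpi_2015)
print(f"${price_2015:.0f} in early 2015 is roughly ${adjusted:.0f} in 2019 dollars")
# -> about $258, in line with the ~$260 estimate above.
```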

EDIT: Like the Sapphire Pulse, my R9 285 was also an overclocked card, the Gigabyte Windforce OC R9 285.
 
I'd still like to kick the guys in the nuts who decided this price counts as "mid-range". When is the marketing industry going to come to terms with reality and resource allocation? Remember when a new product came out, it actually made the former product cheaper, and you only paid more when you wanted a non-production-line model? Now you're paying for the same <Mod Edit> we used to get from prototyping, along with their inconsideration about updates they should otherwise be obliged to fulfill for the customer. Alas, I guess this is the devolution of humankind as we take everything for granted in exchange for complacency.
I think part of the problem is that 'budget' GPUs have become largely pointless. If you can only afford a $100 GPU for gaming, and considering most modern games cost $50+, you probably shouldn't be buying games or hardware to play games. I'd suggest either saving your money for at least a $200 GPU, using the integrated GPU inside any recent Intel CPU, or getting a Ryzen APU. Of course, gaming has moved from being primarily for teens and maybe 20-somethings into a hobby for a lot of adults, and adults tend to have more money for their hobbies.

So if $100 dedicated GPUs are out, the next step up becomes $150, and that's the new 'budget' price. And elsewhere I recommend moving up to the $230 1660 Super as a better choice than an RX 5500 XT 4GB or GTX 1650 Super. That has a cascading effect of pushing 'mid-range' or 'mainstream' up to around $300.

It's also important to put things in a different perspective. A $300 PC graphics card in any decent PC is more powerful than any current-gen console, by far. My guess is that the PS5 and Xbox Series X are going to cost $500 or more -- I could be wrong, but at a minimum it looks like you're getting a $150 CPU + $300 GPU + $100 SSD + $100+ in other bits. Even at $500 they'd be sold at a substantial loss, and $600 seems more likely at launch. Guess we'll find out come fall, though.
 
Based on my testing, the RX 5600 XT is overall a match for the Vega 64. Across 12 tested games, it's slower (by ~5-8%) in Red Dead Redemption 2, Shadow of the Tomb Raider, and Strange Brigade. It's faster (again, by ~5-8%) in Division 2, Hitman 2, Outer Worlds, Warhammer 2, Metro Exodus, and Forza Horizon 4. That's using the Sapphire Pulse, of course, so subtract 5% for a reference 12Gbps card. For the RTX 2060, the story is basically the same.

Now, add in the fact that Vega 64 is a 295W card (and often consumes more than that) and it's not really that great an idea. Vega 56 is still worth a thought, but again, power hungry, older architecture, not sure I'd really want it. There's a reason they're selling for so cheap -- but not $200 cheap unless you go eBay. https://www.newegg.com/p/pl?N=100007709 601302833 4814&Order=PRICE

And there is no way I would recommend a used Vega card. Cryptomining was a major factor in Vega selling well for most of its shelf life. Used cards that are more than two years old are a major risk.
Or you can undervolt the 64 or 56 for power savings or a performance boost, and it's still $50 or more in savings. Plus, you're gonna have to have your PC running full bore a lot for the extra wattage to make a difference. You're talking what, a $0.015 an hour difference?
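To put a rough number on that, here's the back-of-the-envelope math, assuming something like 100W of extra draw and roughly $0.13 per kWh for electricity (both assumed figures -- the real power gap and your local rate will vary):

```python
# Hourly cost of extra GPU power draw (all inputs are assumptions).
extra_watts = 100          # assumed extra draw under load, in watts
rate_per_kwh = 0.13        # assumed electricity price, $/kWh

cost_per_hour = (extra_watts / 1000) * rate_per_kwh
print(f"~${cost_per_hour:.3f} per hour of gaming")               # ~$0.013/hour

hours_per_week = 20
print(f"~${cost_per_hour * hours_per_week * 52:.0f} per year")   # ~$14/year at 20 hrs/week
```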
 
Well, yes, if you want to gamble with a used card with no warranty, that has a well-above-average chance of having been used hard in cryptomining, AND you want to put in the work of dealing with manual tweaking.

But the article is about comparing two different new cards with each other.
 
Gigabyte RX 5600 XT Windforce is $279 on Amazon.
EVGA RTX 2060 KO is $299 on EVGA's website.

Of the two I'd just get the RTX card.
I wouldn't go any lower than the RX 5700 for Navi 1.0.
The RX 5700 destroys the RTX 2060 and gives the 2060 Super a run for its money.

MSI RX 5700 Evoke is $324 on Amazon.
 
Gigabyte RX 5600 XT Windforce is $279 on Amazon.
EVGA RTX 2060 KO is $299 on EVGA's website.

Of the two I'd just get the RTX card.
I wouldn't go any lower than the RX 5700 for Navi 1.0.
The RX 5700 destroys the RTX 2060 and gives the 2060 Super a run for its money.

MSI RX 5700 Evoke is $324 on Amazon.

Unless you want ray-tracing, there's no point in paying extra for the 2060 KO.

Also, the MSI RX 5700 Evoke OC is $273 at NewEgg after rebates ($30 instant, $30 mail-in). For now, at least, it sort of nullifies the point of either the 5600 XT or the 2060 KO.
 
Maybe, and while the thought of an undervolted/power-save setting for a Vega 56 running 2560x1080 sounds fascinating, I think it's a hard sell considering most of them were run stock, and thus pushed pretty hard (AMD pushing the 56 to outdo the 1070), and there's a good chance they were used for cryptomining, unfortunately, given the prices at the time.

That, and with no warranty, even if I found a Vega 56 at $200, I'd still be hesitant.

OTOH, what they said about some RX 5700 cards dipping below $300 was true, and I took advantage of the NewEgg discount to get one for my son at $273 after rebates (and posted it to the forum, because I only read the mention of it here after the fact).

Better performance, new with full warranty, and less risk are worth something extra, though.

When undervolted and overclocked, Vega 56 can match and surpass GTX 1080 in performance at lower power draw than GTX 1080.
Adding to that, Vega 56 has far more potent compute hardware for productivity (which is part of the reason it's more power hungry).

Sure, the 5600 XT will probably beat the V56, but it does have lower compute.

Also, even if the V56 you get is used, it was probably undervolted to run more efficiently, and therefore the chip/hardware itself should be fine for some time to come.

It depends on what you use the GPU for.
If it's mainly for gaming, it might be better to go with the 5600 XT... if it's for both gaming and productivity, the V56 might be better.

Also worth noting is that you can undervolt the 5600 XT too.
 
I take most claims of amazing undervolting + overclocking with a healthy dose of skepticism. Some cards will do better, some worse -- just like with any overclocking. My experience with Vega cards is that they're nowhere near as good as some make them out to be in terms of undervolting. Yes, you can improve performance characteristics in most cases, but that's true of just about any card. It's the usual 5-10% improvement.

Over the years of testing GPU hardware, I've had cards that refuse to run at 'stock' settings after a couple of years. My GTX 970 (Zotac) and R9 390 (Sapphire) both fall into that category, and my original Vega 56 card went belly up (and had to be replaced). I also had a GTX 980 Ti (Gigabyte) bite the dust, and a second GTX 970 (Zotac) that appears to have died in the past six months of sitting around on my shelf. And I have a super flaky RX 570 4GB card (from Asus) -- it never did work right. The Vega 64 (reference) still works, but it requires tweaks to the fan speed if I don't want it to throttle way down.

Those are only the most recent examples. I don't think I've treated any of the cards particularly badly. The various other GTX, RTX and RX cards are all still functional and appear to work properly, but going back a bit further I had a lot of HD 5000/6000/7000 cards that had issues over time. Fans burned out, mostly. That's my experience with GPUs of the past five or so years, anyway.

As for the statement that "even if the V56 you get is used, it was probably undervolted to run more efficiently, and therefore the chip/hardware itself should be fine for some time to come": there is zero evidence to back that up. It's a straight-up guess, a shot in the dark as to what sort of GPU you get if you buy used. Realistically, 80-90% of gamers don't even bother with GPU overclocking -- or GPU undervolting -- and not many sell used hardware (via eBay) either.

Fundamentally, Vega was a rather poor GPU design, for gaming in particular -- just like Fiji / Fury before it. For gaming, the RX 5700 XT even beats the Radeon VII, which is 20-30% faster than a Vega 64. And looking at performance in newer games, there are a bunch where the Navi architecture shows how badly Vega is aging. If you have one, sure, keep using it, but I wouldn't recommend buying a new (or used) Vega card these days. Just like I wouldn't recommend buying used GTX 900 or GTX 10 series GPUs. The Turing architecture is simply better at every price point, and you won't be getting a card and GPU that are four or more years old.
 
When undervolted and overclocked, Vega 56 can match and surpass GTX 1080 in performance at lower power draw than GTX 1080.
No way. If this was possible, AMD absolutely would've done it and claimed the performance crown. This sounds like wishful thinking.

The Vega 56 was a power hog because AMD was trying to push it to run fast enough to outperform a GTX 1070, which it did by maybe, I suppose, 8-10% or so.

Now, consider what was shown in the review on Tom's Hardware:
Using the secondary BIOS with a power limit reduced by 25% gets us 159.4W and 32.7 FPS. Compared to the stock settings, just 71.6% of the power consumption serves up 89% of the gaming performance.
So, it would've performed at the 1070's level, maybe a hair less, and consumed maybe a tiny bit more power than the 1070.
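Just to put numbers on that quote, here's a quick sketch of the implied efficiency change. The stock figures are back-calculated from the quoted percentages, so treat them as approximations:

```python
# Figures from the quoted Tom's Hardware numbers for Vega 56:
# power-limited BIOS = 159.4 W and 32.7 FPS, described as 71.6% of the
# stock power draw and 89% of the stock performance.
limited_power = 159.4   # watts
limited_fps = 32.7

stock_power = limited_power / 0.716   # ~223 W (back-calculated)
stock_fps = limited_fps / 0.89        # ~36.7 FPS (back-calculated)

print(f"Stock (approx.): {stock_power:.0f} W, {stock_fps:.1f} FPS -> {stock_fps / stock_power:.3f} FPS/W")
print(f"Power limited:   {limited_power:.0f} W, {limited_fps:.1f} FPS -> {limited_fps / limited_power:.3f} FPS/W")
# Roughly a 24% gain in FPS per watt (0.89 / 0.716 ≈ 1.24).
```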

Fundamentally, Vega was a rather poor GPU design, for gaming in particular -- just like Fiji / Fury before it. For gaming, the RX 5700 XT even beats the Radeon VII, which is 20-30% faster than a Vega 64. And looking at performance in newer games, there are a bunch where the Navi architecture shows how badly Vega is aging. If you have one, sure, keep using it, but I wouldn't recommend buying a new (or used) Vega card these days. Just like I wouldn't recommend buying used GTX 900 or GTX 10 series GPUs. The Turing architecture is simply better at every price point, and you won't be getting a card and GPU that are four or more years old.
The only reason I think it was "bad" is that they pushed its limits, ruining power/performance, because they needed to outdo the 1070 (for the 56) and the 1080 (for the 64). Otherwise, I think, for its time, it wasn't too shabby.

I don't know if the same kind of power cut would work as well for the Vega 64 as it did for the Vega 56 in that review.


And yet, after all this, the article is STILL about new cards, so I don't understand why people keep arguing that a used Vega is better than a new card when the article is comparing two current-generation cards.
 
A couple of notes about the drivers (GPU-independent).
  1. NVIDIA has very good OpenGL support in its drivers; AMD has always been lacking in this regard.
  2. When it comes to tweaking, both drivers let you customize per-app settings, but the GeForce drivers provide more options (that is, if you don't use third-party software for tweaking).
 
No way. If this was possible, AMD absolutely would've done it and claimed the performance crown. This sounds like wishful thinking.

The Vega 56 was a power hog because AMD was trying to push it to run fast enough to outperform a GTX 1070, which it did by maybe, I suppose, 8-10% or so.

Now, consider what was shown in the review on Tom's Hardware:

So, it would've performed at the 1070's level, maybe a hair less, and consumed maybe a tiny bit more power than the 1070.

The only reason I think it was "bad" is that they pushed its limits, ruining power/performance, because they needed to outdo the 1070 (for the 56) and the 1080 (for the 64). Otherwise, I think, for its time, it wasn't too shabby.

I don't know if the same kind of power cut would work as well for the Vega 64 as it did for the Vega 56 in that review.

And yet, after all this, the article is STILL about new cards, so I don't understand why people keep arguing that a used Vega is better than a new card when the article is comparing two current-generation cards.
I guess 'poor design' might be a bit harsh, but it was very underwhelming. It's a relatively large chip (486mm^2), which is larger than the GP104 (314mm^2), plus it has to have the interposer for HBM2. That makes it more expensive to produce, which is obviously a problem. Then AMD pushes the clocks higher to try to 'win' the performance competition with the 1080 vs Vega 64 and 1070 vs Vega 56, which pushes the chips out of the ideal efficiency zone.
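As a rough illustration of why die size matters for cost, here's the standard gross dies-per-wafer approximation applied to those two die sizes. It's only a sketch: it ignores yield, scribe lines, and the fact that Vega 10 and GP104 were built on different processes with different wafer costs.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: usable wafer area minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print("Vega 10 (486 mm^2):", gross_dies_per_wafer(486))  # ~115 die candidates
print("GP104   (314 mm^2):", gross_dies_per_wafer(314))  # ~187 die candidates
# Larger dies also tend to yield worse, so the gap in cost per good chip is
# bigger than the raw candidate counts suggest -- before even counting the
# HBM2 stacks and interposer.
```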

Except that's a big part of the problem. You look at how much power scales up on Vega as you increase clock speeds, and it's definitely worse than the competing Nvidia chips. I could maybe go do testing to confirm, but it feels like overclocking Nvidia's GTX 10-series GPUs is more linear -- so a 10% overclock uses 10-15% more power for 10% more performance. On Vega, a 10% "overclock" (by AMD) required 30% more power for 10% more performance.
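A back-of-the-envelope sketch of why that happens, assuming dynamic power scales roughly with frequency times voltage squared; the extra voltage figure below is an assumption chosen to illustrate the shape of the curve, not a measured value:

```python
# Dynamic power scales roughly as P ~ f * V^2 (ignoring leakage).
def relative_power(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

# 10% higher clock at the same voltage: ~10% more power.
print(f"{relative_power(1.10, 1.00):.2f}x power")   # 1.10x

# 10% higher clock that also needs ~9% more voltage (assumed, as happens
# when a chip is pushed past its efficiency sweet spot): ~30% more power.
print(f"{relative_power(1.10, 1.09):.2f}x power")   # ~1.31x
```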

Anyway, I stand by my opinion that Vega was not anywhere near what we needed/wanted from AMD for it to be competitive. RDNA is better but still behind Nvidia's Turing architecturally. Maybe RDNA 2 will close the gap? Guess we'll find out with Ampere and Navi 2x this fall/winter.
 
If you do decide to try something like that, I'd definitely be curious to see what happens.

That setting mentioned above that brings the Vega 56 down to 159.4W -- does it match the GTX 1070 in performance, exceed it by a hair, fall short by a hair, etc.?

And for the Vega 64, if similar things were done, does it match or even still outdo the GTX 1080? Or perhaps slot between the 1070 Ti and the 1080? How much power does it use in that case? Etc.

I can't imagine they were losing money even if they sold at MSRP (presumably set to compete against their targets, the 1070 and the 1080). Of course, cryptomining threw all reality out the window. I can't recall exactly, but I thought they were originally released to undercut the 1070 and 1080 prices as well (and the 1070 Ti was released just to try to top the Vega 56 without infringing on 1080 territory, though it got a little awkward).

But you've piqued my curiosity - I don't have the resources to try the testing you suggested, but if the curiosity about this is nagging at your brain as much as it is mine, I'd love to see what results you get.

As a quick reference, this page is what I quoted from about the "71.6% of the power gives 89% of the performance."
 
If you do decide to try something like that, I'd definitely be curious to see what happens.

That setting mentioned above that brings the Vega 56 down to 159.4W -- does it match the GTX 1070 in performance, exceed it by a hair, fall short by a hair, etc.?

And for the Vega 64, if similar things were done, does it match or even still outdo the GTX 1080? Or perhaps slot between the 1070 Ti and the 1080? How much power does it use in that case? Etc.

I can't imagine they were losing money even if they sold at MSRP (presumably set to compete against their targets, the 1070 and the 1080). Of course, cryptomining threw all reality out the window. I can't recall exactly, but I thought they were originally released to undercut the 1070 and 1080 prices as well (and the 1070 Ti was released just to try to top the Vega 56 without infringing on 1080 territory, though it got a little awkward).

But you've piqued my curiosity - I don't have the resources to try the testing you suggested, but if the curiosity about this is nagging at your brain as much as it is mine, I'd love to see what results you get.

As a quick reference, this page is what I quoted from about the "71.6% of the power gives 89% of the performance."
1070 FE was $449 at launch, and 1070 cards were generally $379 after that. 1070 Ti was weird because it landed half-way between 1070 and 1080 but with a price closer to 1080 -- it was $449, at a time when 1080 was officially $499. Vega was supposed to be $399 for the 56, and $499 for the 64 -- with a launch premium that pushed prices up $100. And then mining took off and it all went out the window. LOL

I'm not sure how easy it is to reliably undervolt Nvidia. I know I've toyed with stuff and it can definitely cause lockups if you push clocks or voltages too low (same on AMD). And even if you're 99% stable, it's that 1% that's a concern. Every game you run works fine until one doesn't -- just like with overclocking, really. I've got lots of other things to do right now (retesting the full suite of GPUs for example), but if there's ever a chance I may revisit some of the older GPUs and talk a bit about them.
 
I'd say in my own (mental) case, I was interested in the results of the Vega cards simply because they were pushed to very inefficient levels, and wondered how they'd do if they were in a more efficient power zone, whereas the Nvidia cards were already pretty power efficient.

If you ever do have the time and the curiosity does push you to revisit it, let me know. I figure, though, that spare time is hard to come by given how much time the full retest suite, among other things, is taking up.