AMD Radeon RX 6500 XT Review: The Return of the 'Budget' GPU

Jan 19, 2022
Been reading the content on this site for a decade and finally had to register to comment on the amount of nonsense in this 'review.'

It's obvious this was not released with AAA titles in mind; it was released for esports games such as CS:GO / OW / DotA. Hence, your excluding resolutions from the review for games that run excellently even with much less VRAM makes absolutely no sense.

Were you aware you can run the most popular esports titles listed above at 2K resolution, paired with a 165 Hz monitor, comfortably on a decade-old 2GB GPU such as the GTX 770? Apparently not.

The price and the x4 performance on PCIe 3.0 are major negatives worth noting, but given how excessively you advertise yourself as a GPU guru with nearly 20 years of experience, it's shocking how ignorant a review can be. This card is literally what the community needed amid the GPU shortage, and it serves anyone playing some of the most popular games at 1k/2k and high refresh rates, with low power draw and noise.

Entry-level cards are crucial, since old hardware dies over time and the vast majority of games run exceptionally well even on these specs. This review misses the target audience entirely.
 

InvalidError

Titan
Moderator
Were you aware you can run the most popular esports titles listed above at 2K resolution, paired with a 165 Hz monitor, comfortably on a decade-old 2GB GPU such as the GTX 770? Apparently not.
People generally don't buy new GPUs to run titles that already do 200 fps on 10-year-old cards. They buy new GPUs when their current one cannot keep up with the resolutions, details, and frame rates they want for more GPU-intensive games, and most of those games will quite easily pass the 4GB mark. That is where the RX 6500 drops behind every other 4GB GPU from the last four years that has a PCIe 3.0 x16 or 4.0 x8 link.
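For a rough sense of scale, here's the back-of-the-envelope bandwidth math (a quick sketch using the standard theoretical per-lane rates; real-world throughput runs lower):

```python
# Theoretical one-way PCIe bandwidth: transfer rate (GT/s) times
# 128b/130b encoding efficiency, divided by 8 bits per byte.
PER_LANE_GBPS = {
    "3.0": 8 * (128 / 130) / 8,    # ~0.985 GB/s per lane
    "4.0": 16 * (128 / 130) / 8,   # ~1.969 GB/s per lane
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("4.0", 4), ("3.0", 4), ("3.0", 16), ("4.0", 8)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# PCIe 3.0 x4 (~3.9 GB/s) has a quarter of the bandwidth of the
# 3.0 x16 / 4.0 x8 links on those older 4GB cards, which is why
# anything that spills past the 4GB of VRAM hurts so much more here.
```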
 
You threw out the highly inconsistent runs? Those may have been a side effect of AMD only giving it an x4 interface on top of 4GB of 64-bit memory, and possibly the most important detail people need to be aware of. AMD has cut far too many corners on this thing for it to be worth more than $150, were it not for current market conditions.
The inconsistent runs, mostly in Far Cry 6, occurred on all 4GB cards. At one point I ran the GTX 1650 Super through the 1080p Ultra + HD test sequence maybe a couple dozen times — I even restarted the game and restarted the PC a couple times just to try and get the results to stabilize at one value. Performance ranged from as low as ~7 fps average (once) to as high as ~55 fps (once). Most of the results clumped into the ~44 fps range and the ~15 fps range. At times it seemed like every other run would be "bad." It's almost certainly a game bug/issue, where the game in DX12 mode isn't managing VRAM effectively and sometimes ends up in a bad state. I could have tested in DX11 mode and hoped the drivers would "fix" things, but then that opens the door to having to test both APIs on every GPU — and then do the same for every other game that supports multiple APIs.
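For anyone curious how runs like that could be flagged automatically, here's a minimal sketch; this is not the actual test harness, and the run numbers below are hypothetical, just shaped like the FC6 behavior described above:

```python
import statistics

def flag_outlier_runs(fps_results, threshold=3.5):
    """Split benchmark runs into kept and suspect sets using the
    modified z-score (based on the median absolute deviation), which
    handles bimodal clumping better than a mean-based test."""
    med = statistics.median(fps_results)
    mad = statistics.median([abs(x - med) for x in fps_results])
    if mad == 0:  # all runs (nearly) identical; nothing to flag
        return list(fps_results), []
    keep, suspect = [], []
    for fps in fps_results:
        score = 0.6745 * (fps - med) / mad
        (keep if abs(score) <= threshold else suspect).append(fps)
    return keep, suspect

# Most runs clump near ~44 fps, with a ~15 fps clump and two extremes.
runs = [44.1, 43.8, 15.2, 44.5, 7.3, 43.9, 15.8, 55.0, 44.2]
keep, suspect = flag_outlier_runs(runs)
print("kept:", keep)
print("suspect:", suspect)
```

In practice I'd still eyeball anything flagged and rerun it rather than silently discard it.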

For those wondering about the test suite, I need to test all GPUs on the same set of games, for the GPU benchmarks hierarchy as well as other guides. I specifically focused on the 1080p medium performance as that's what this card is designed to handle. However, even at those settings it's a questionable card. This is profiteering from AMD and the GPU market in general. The RTX 3050 will at least potentially warrant a $200-$250 MSRP (in "normal" times). The RX 6500 XT would have been a $100-$120 GPU normally. It probably would have been called the RX 6400 actually, leaving the RX 6500 name for a card with better specs and performance.

If I were to only test lighter fare to show how this card can handle games that even cards from eight years ago run just fine, what good does that do anyone? "Hey everyone, this is PEBKAC6 Marketing, letting you know that games that don't demand much from your GPU don't demand much from your GPU! Isn't that amazing? Aren't you happy to know that CS:GO, which can run at reasonable performance even on Intel's integrated graphics solutions from five years ago, can run on a $200 graphics card? You should all just go buy it! This is the card we all need! Never mind that it's slower than the card it replaces and has worse specs (RX 5500 XT 8GB). 5-stars, would buy again!"
 

King_V

Illustrious
Ambassador
I was afraid of this on a few points.

Even back in the Polaris days, while my memory is a bit hazy, I seem to recall that there was greater benefit from increasing memory bandwidth than from raising GPU clocks. Yes, they went with fast GDDR6, but the 4GB limit is an issue.

Cutting down the PCIe lanes on the 5500 was a problem. Cutting it even more for the 6500 was a mistake.

I mean, just look at the difference between the GTX 1650 GDDR5 and GDDR6 variants: Nvidia was able to reduce the GPU's clock speeds, and the card still performed notably better, while keeping things in the same TDP envelope.

Similarly, the performance per watt of RDNA in the 5000 series seemed to suffer at the budget level: the 5500 had notably lower performance per watt than the 5600. The 6500 XT does draw less power, but the cut-down GPU needs higher clocks, so the TDP is not reasonable for the performance it delivers, especially given the RDNA 2 architecture.

On the other hand, I guess they'd need, say, a 6600-sized chip, or at least something not cut down quite THIS small, and run the clocks lower to get decent performance with less power consumption. I'd say they may be trying to make the most of limited supply, but this is on N6, so I would guess there are no supply constraints there.

The particular cuts they made seem to be the wrong kind. Keeping in mind that I am NOT an electronics engineer of any sort.

Call me unrealistic, but I think this is a card that should have been designed to at least equal the 5500 XT in performance, and to do so without needing a PCIe power connector.
 

VforV

Respectable
BANNED
Assuming you can actually purchase the Radeon RX 6500 XT for $199, or at least somewhat close to it, it's not a bad card.
Sorry, as much as I like AMD, this is a bad GPU! And unless someone is "forced", no one should buy it.

We should not accept this kind of garbage GPU, any more than we should accept Nvidia's scumminess with its new policy of NO reviews before launch and, once again, NO MSRP!

Whatever the haters want to say about them, HUB and GN are once again the voice of impartiality, and they tell it exactly how it is: in this case, a bad GPU from AMD.
https://www.youtube.com/watch?v=M5_oM3Ow_CI

https://www.youtube.com/watch?v=ZFpuJqx9Qmw
 
If repeatability is so problematic on 4GB cards overall, it may be a sign that graphs need to get variance bars to represent how (un)repeatable results are.
Usually it's only a problem if you change settings and don't exit and restart the game (in a game that lets you do that). I've done lots of testing over the years, and generally speaking, cards with 8GB or more VRAM can go from 1080p to 1440p to 4K in testing without exiting and restarting. With a 4GB card, you sometimes end up in a severely degraded performance state and need to exit and relaunch. But with low-level DX12/Vulkan APIs, sometimes it's just a periodic glitch that causes performance to suffer. I've seen issues with 4GB and even 6GB cards in Watch Dogs Legion at times. Usually, exiting and restarting the game will clear the problem. Far Cry 6 was unusual in that it had high run-to-run variance seemingly whenever it exceeded the card's VRAM. The game also changed its estimate of how much VRAM it needed over time, which was certainly odd.

Anyway, the problem with charts is that lots of people can only really grok relatively simple bar charts. Start adding variance bars and all that other stuff, and only stock traders and stats majors are likely to get what we're showing. Plus, how many runs should I do for each setting? I try to stay with three, but I do more if there's clear variance happening, and sometimes a lot more if I'm just unable to figure out what's happening (e.g., with FC6). If I had to do 10 runs of each setting and then run the results through stats to generate variance bars, it would dramatically cut my throughput, and the potential gain (people understanding a bit better what's happening) likely wouldn't even materialize.
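To put rough numbers on that trade-off, here's a sketch of the stats a variance bar would need; the fps figures are made up for illustration:

```python
import statistics

def summarize(runs):
    """Mean, sample stdev, and a crude variance bar (2x the standard
    error of the mean) for one benchmark setting."""
    mean = statistics.fmean(runs)
    stdev = statistics.stdev(runs)
    bar = 2 * stdev / len(runs) ** 0.5
    return mean, stdev, bar

# Three stable runs vs. ten runs that caught one glitched result.
stable = [62.1, 61.8, 62.4]
glitchy = [62.1, 61.8, 62.4, 61.9, 15.0, 62.2, 62.0, 61.7, 62.3, 62.1]

for label, runs in [("3 stable runs", stable), ("10 runs, 1 glitch", glitchy)]:
    mean, stdev, bar = summarize(runs)
    print(f"{label}: {mean:.1f} fps +/- {bar:.1f} (stdev {stdev:.1f})")
```

The glitchy set still reports roughly 57 fps +/- 9 even with ten runs, so more runs alone don't fix an FC6-style problem; you have to catch and rerun the bad results.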

Fundamentally, I want to provide a relatively "modern" look at how graphics cards perform in a variety of situations. I have a couple of very recent games in my updated test suite, along with some slightly older games that are still useful (and relatively easy to use). I don't want to show a bunch of games that specifically won't tax 4GB cards, but neither do I only want to look at games that basically require 8GB or more. And then when I create the aggregate scores for the GPU hierarchy, I definitely don't want a bunch of random "bad" results that penalize slower GPUs. So, I figure most people will use settings that run at closer to acceptable levels of performance, and running a few extra tests to see where the high water mark is helps keep the charts more sensible. For people who care, there's the text that will call out anomalies. 🙃

As an aside, RDR2 strictly prevents you from selecting settings that exceed a card's VRAM, but you can edit the config file to try to get around that. I was able to force 1080p "ultra" and 1440p "ultra" to run, though 1440p resulted in periodic graphical corruption. 4K ultra is too much, however, and just crashes to desktop. Anyway, I need some meaningful number to avoid skewing the overall results: leaving a low score out can inflate the calculated average, but putting in a "0" result goes too far in the other direction.
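To illustrate the skew problem, here's a hedged sketch of one way a crashed result could be imputed for an aggregate score. The "half the reference card's score" penalty is a made-up placeholder, not the formula the GPU benchmarks hierarchy actually uses:

```python
from math import prod

def geomean(values):
    """Geometric mean: the usual way to aggregate fps across games."""
    return prod(values) ** (1 / len(values))

def aggregate(results, reference, penalty=0.5):
    """Fill a missing (crashed / won't-run) result with a penalized
    fraction of a reference card's score in the same game, instead of
    dropping it (which inflates the average) or scoring it 0 (which
    tanks it, and breaks a geometric mean outright)."""
    filled = [r if r is not None else penalty * ref
              for r, ref in zip(results, reference)]
    return geomean(filled)

# Hypothetical fps results across four games; None = crashed at 4K ultra.
card_a = [55.0, 48.0, 60.0, None]
card_b = [50.0, 45.0, 58.0, 40.0]

print(f"card A: {aggregate(card_a, card_b):.1f}")  # imputed 4th result
print(f"card B: {aggregate(card_b, card_b):.1f}")
```

With the imputed value, card A still lands below card B (about 42 vs. 48) rather than leapfrogging it (result dropped) or cratering toward zero (result scored as 0).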
 

btmedic04

Distinguished
This GPU should have been labeled the RX 6500 and made OEM-only. The x4 electrical connection, while good enough on PCIe 4.0, is far too narrow on PCIe 3.0 to justify this as a budget gaming GPU, when most systems in that class are limited to PCIe 3.0. At that spec, you're basically getting the same performance as the 5.5-year-old GTX 1060, with unusable ray tracing and no hardware acceleration for media encoding.
 

InvalidError

Titan
Moderator
Sorry, as much as I like AMD, this is a bad GPU! And unless someone is "forced", no one should buy it.
It would have made more sense as an RX 6400; then people wouldn't be bashing it quite as hard for performing roughly like a rehashed RX 580, and often worse than the RX 5500.

Price-wise, it cannot be helped much, since the manufacturing cost is ~$130, largely thanks to GDDR6 and support component prices. Add shipping, marketing, AMD and middleman profit margins, etc., and $200 is just about as low as this sort of garbage can go for everyone to get their cut under current conditions. So much of the baseline cost being the same across all GPUs is why the likely far superior RTX 3050 coming out next week has an MSRP only $50 higher.

How does this card get 3 stars? How bad would a card have to be to get 1 or 2 stars? Would it need to poison your pet's water bowl a little every night? Burn down your house?
As long as you go out of your way to make sure you don't exceed what comfortably fits within 4GB, it is usable.

Although the pricing really sucks for about the same performance as $200 GPUs from 4-5 years ago, there currently aren't many options anywhere near its MSRP, especially if you're only looking to buy new. At least until the RTX 3050 launches next week and delivers 1.5-2x the performance for $50 more, while supplies last.

It is one of those "best of the worst" kinds of situations.
 

spongiemaster

Admirable
As long as you go out of your way to make sure you don't exceed what comfortably fits within 4GB, it is usable.

Although the pricing really sucks for about the same performance as $200 GPUs from 4-5 years ago, there currently aren't many options anywhere near its MSRP, especially if you're only looking to buy new. At least until the RTX 3050 launches next week and delivers 1.5-2x the performance for $50 more, while supplies last.

It is one of those "best of the worst" kinds of situations.
The card shouldn't be rated by the best-case scenario. It should be rated based on the target market. This card is a bottom-rung budget gaming card by every spec on the spec sheet. It's going to be used in PCIe 3 motherboards more than Gen4 boards, so that's the performance it should be judged on. Missing H.264/HEVC encoding and AV1 decoding, which all the other 6000-series GPUs have, is a pretty significant omission as well. A low-power card like this could end up in an HTPC now, or six months down the line when it can't game anymore. Cards should not be rated by their third-party scalper prices either, but the MSRP is what it is, and it's terrible for this card. As for other options, Gamers Nexus actually recommended checking out the used market first before considering this card. I have never seen that in a GPU review. 3 stars should mean an average product that doesn't do anything to make itself stand out from the crowd. This card is a complete pile of garbage and should be rated lower than 3 stars.
 
The card shouldn't be rated by the best-case scenario. It should be rated based on the target market. This card is a bottom-rung budget gaming card by every spec on the spec sheet. It's going to be used in PCIe 3 motherboards more than Gen4 boards, so that's the performance it should be judged on. Missing H.264/HEVC encoding and AV1 decoding, which all the other 6000-series GPUs have, is a pretty significant omission as well. A low-power card like this could end up in an HTPC now, or six months down the line when it can't game anymore. Cards should not be rated by their third-party scalper prices either, but the MSRP is what it is, and it's terrible for this card. As for other options, Gamers Nexus actually recommended checking out the used market first before considering this card. I have never seen that in a GPU review. 3 stars should mean an average product that doesn't do anything to make itself stand out from the crowd. This card is a complete pile of garbage and should be rated lower than 3 stars.
I consider this more of a "60%" product, not a "3-star", but perhaps that's my fault. It's not unusable, but it has serious caveats that you need to consider. D's still get degrees after all. But if you can get this for $200, compared to other $200 cards that are out there, I think that's at least justifiable in the current market. I would not get a used GPU over a new card unless the price was significantly better. Right now, the GTX 1060 6GB (which was still generally slower than the RX 6500 XT) goes for a minimum of $160 on eBay, with an average price over the past 30 days of $313. I don't necessarily think this is a better card in every way, but I'd take a new card over a potentially five-year-old used 1060 6GB for sure. Now, if it were $300 for the 6500 XT and $200 for the 1060, that would be more difficult.
 

InvalidError

Titan
Moderator
Cards should not be rated by their third-party scalper prices either, but the MSRP is what it is
You cannot compare cards based on MSRPs when manufacturing costs for some of them have almost doubled since their MSRPs were originally published and most of them cannot actually be obtained for anywhere near that price either.

As for other options, Gamers Nexus actually recommended checking out the used market first before considering this card. I have never seen that in a GPU review.
He also said that the RX6500 at ~$200 actual retail price is still decent vs what you can get used anywhere near that price on eBay, especially if you have 4.0x4 available.

The RTX3050 will provide massively superior bang-per-MSRP-dollar, except it'll likely be out of stock in 5 minutes, never to be seen again under $400 until ETH goes PoS.
 
This is profiteering from AMD and the GPU market in general

This, for me, is the most relevant point. We can discuss the performance and be disappointed with it, but it's what AMD is doing here that bugs me more than the performance itself. It's just taking advantage of model numbers (6500 vs. 5500) with unsuspecting consumers who will actually buy these cards, taking full advantage of FOMO for those not in the know.
 

spongiemaster

Admirable
I consider this more of a "60%" product, not a "3-star", but perhaps that's my fault. It's not unusable, but it has serious caveats that you need to consider. D's still get degrees after all. But if you can get this for $200, compared to other $200 cards that are out there, I think that's at least justifiable in the current market. I would not get a used GPU over a new card unless the price was significantly better. Right now, the GTX 1060 6GB (which was still generally slower than the RX 6500 XT) goes for a minimum of $160 on eBay, with an average price over the past 30 days of $313. I don't necessarily think this is a better card in every way, but I'd take a new card over a potentially five-year-old used 1060 6GB for sure. Now, if it were $300 for the 6500 XT and $200 for the 1060, that would be more difficult.
The 6500 XT is slower than the card it supposedly replaces, the 4GB 5500 XT. Have we ever seen that before? If you're stuck on PCIe 3, the 6500 XT is a lot slower. Over the last month, 4GB 5500 XTs have typically been selling used for $250-$300. These aren't 6-year-old cards, and they likely haven't been used for mining. Based on a guesstimate of what the street price of a 6500 XT is going to be, would you recommend it over a used 5500 XT for people who don't have a PCIe 4 motherboard? 8GB 5500 XTs are selling for about $350.

 

spongiemaster

Admirable
You cannot compare cards based on MSRPs when manufacturing costs for some of them have almost doubled since their MSRPs were originally published and most of them cannot actually be obtained for anywhere near that price either.


He also said that the RX6500 at ~$200 actual retail price is still decent vs what you can get used anywhere near that price on eBay, especially if you have 4.0x4 available.

The RTX3050 will provide massively superior bang-per-MSRP-dollar, except it'll likely be out of stock in 5 minutes, never to be seen again under $400 until ETH goes PoS.
You have to factor in a price when rating a product. All you're doing is saying no to everything. So what price would you attach to this card?

Steve also said none of these cards are likely to actually list at $200, since AIB models are pretty much always more than the base MSRP. Going by recent history, they could be a lot higher than the base MSRP even before they get picked up by scalpers.
 

larkspur

Distinguished
I consider this more of a "60%" product, not a "3-star", but perhaps that's my fault. It's not unusable, but it has serious caveats that you need to consider. D's still get degrees after all.
I think your conclusions are waaaaayyyyy too nice to this GPU. It's a prime example of how not to design a modern GPU, and for everyone on PCIe 3 (the majority of people) it is simply not viable for modern gaming. This card should be completely avoided unless you have PCIe 4 or better. That should be made clear in your conclusion and in the cons section at the beginning, with something like "Significant performance drop on PCIe 3 systems."

On a side note, I graduated from university like 20 years ago so maybe this has changed, or maybe it varies by institution, but when I was in college a D was a failing grade and meant you had to retake the class. In high school a D was passing.
 

InvalidError

Titan
Moderator
You have to factor in a price when rating a product. All you're doing is saying no to everything. So what price would you attach to this card?
In a normal market, $120-150.

However, the cost of GDDRx and VRM components has doubled over the last couple of years, the cost of shipping has tripled thanks to COVID, the cost of silicon wafer starts has increased by 10-30%, prices of most metals have gone up by 10+%, and pretty much everything whose input costs went up tacked on another 10% on top of passing those costs along. So even junk like the RX 6500 still needs to retail near $200 under current market conditions to be worth manufacturing.

Unfortunately, AMD cannot realistically sell a GPU for cheaper than it costs to make and get to store shelves. The alternative to the RX6500 costing $200 right now is having nothing at all.
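To make that arithmetic concrete, here's a rough stack-up. Only the ~$130 build cost comes from the estimate above; the freight figure and every margin percentage are hypothetical placeholders:

```python
# Illustrative retail price stack-up for a card with a ~$130 build cost.
bom = 130.00       # GDDR6, VRMs, silicon, PCB, cooler, assembly
freight = 8.00     # elevated shipping costs (assumed figure)
price = bom + freight
for stage, margin in [("AMD/AIB margin", 0.15),
                      ("distributor margin", 0.08),
                      ("retailer margin", 0.10)]:
    price *= 1 + margin
    print(f"after {stage:>18}: ${price:.2f}")
# Lands near $190: even modest cuts at each step push a ~$138 landed
# cost within sight of the $200 MSRP, which is the point about the floor.
```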
 

VforV

Respectable
BANNED
At least until the RTX 3050 launches next week and delivers 1.5-2x the performance for $50 more, while supplies last.
I agree with what you said, except this part.

We will see about performance, but the "$50 more" part is very optimistic.

It was already sold in Japan at $500, so expect at least that price. It's a BS show that Nvidia is putting on, too; this 3050 will be at least double the price and will vanish in minutes thanks to its 8GB of VRAM, a magnet for miners and then for scalpers.

Those 5 gamers who will buy it don't count at all.

Both of these GPUs are terrible in their own ways: the Radeon one is the worst technically, and the Nvidia one price-wise.
 

InvalidError

Titan
Moderator
It was already sold in Japan at $500, so expect at least that price.
It isn't launching for another whole week. Bribing sellers into selling you one pre-launch is going to come with its own special tax on top of everything being more expensive in Japan to begin with.

Also, I doubt Nvidia would be launching the 3050 using GA106 silicon unless it had a growing pile of silicon that failed to make the 3060 grade, so I'm taking it as a hint that RTX3060s are about to get cheaper.

ETH 2.0 is currently slated to hit the next phase of its merge in Q2. Anyone buying GPUs at a premium for crypto-mining now may not have time to recover that premium before mass GPU sell-offs occur. That would be another reason why Nvidia is launching the RTX 3050 now: flood the entry-level tier with new GPUs before the market gets crashed by a wave of used mining GPUs.
 
I think your conclusions are waaaaayyyyy too nice to this GPU. It's a prime example of how not to design a modern GPU, and for everyone on PCIe 3 (the majority of people) it is simply not viable for modern gaming. This card should be completely avoided unless you have PCIe 4 or better. That should be made clear in your conclusion and in the cons section at the beginning, with something like "Significant performance drop on PCIe 3 systems."

On a side note, I graduated from university like 20 years ago so maybe this has changed, or maybe it varies by institution, but when I was in college a D was a failing grade and meant you had to retake the class. In high school a D was passing.
My college (university) required a C- or better in major courses, but a D- or above was okay for electives and general studies. Also, if you stay under 4GB of VRAM, the PCIe Gen3 penalty was only 8% on average. Granted, in some cases (well, one game of the seven) it was closer to 25%, but that's likely because in those games the 4GB was a factor even at medium settings. I don't know... Borderlands 3 was a bit of an outlier at 1080p medium. Low-level APIs also come into play, as they often turn memory garbage collection over to the software rather than the drivers. Sigh. I almost need to go back and retest with DX11 to see if that runs better. I know in the past DX12 games have often refused to work on the GTX 1050 because it only had 2GB of VRAM, and they even had some issues on 4GB cards.
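For the curious, here's how one outlier among seven games produces that 8% average. Only the ~25% figure and the 8% average are real numbers; the other per-game penalties are hypothetical fillers:

```python
# Hypothetical per-game PCIe 3.0 penalties (fractional fps loss at
# 1080p medium, staying under 4GB of VRAM).
penalties = [0.05, 0.04, 0.06, 0.05, 0.25, 0.04, 0.07]
average = sum(penalties) / len(penalties)
print(f"average PCIe 3.0 penalty: {average:.0%}")  # 8%
# A single VRAM-bound outlier pulls six ~5% results up to 8% overall.
```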