AMD Radeon RX 6500 XT Review: The Return of the 'Budget' GPU

The 6500 XT is slower than the card it supposedly replaces, the 4GB 5500 XT. Have we ever seen that before? If you're stuck on PCIe 3.0, the 6500 XT is a lot slower. Over the last month, 4GB 5500 XTs have typically been selling used for $250-$300. These aren't six-year-old cards, and they likely haven't been used for mining. Based on a guesstimate of what the street price of a 6500 XT is going to be, would you recommend it over a used 5500 XT for people who don't have a PCIe 4.0 motherboard? 8GB 5500 XTs are selling for about $350.
The 8GB 5500 XT is certainly a better card, which I mentioned in the conclusion. It all basically comes down to the actual street price. GTX 1650 Super goes for $325 on eBay, but Nvidia cards often cost more even on eBay. This will probably end up in the $250-$300 range. At $250, for some it might be worth getting this over the previous gen card, but honestly I'd try really hard to go up to a true midrange (performance) card like the RX 6600 or RTX 3060, or just wait out the shortages. Even Xbox Series S is a better option right now than a $300 ultra-budget GPU like the 6500 XT. If you need something to tide you over, and you can get this card or something similar for $200, that's probably okay — not great, but okay.

Based on other RTX 30-series cards, unfortunately the RTX 3050 may end up selling for $450-$550 on eBay, at which point it's again a hard pass. Or, just be patient and bid on auctions or make offers until you can get one for no more than $350. Looking at the eBay history, about 1% of RTX 3060 12GB cards sold last month for $500 or less. Ouch.

FWIW, the lack of encoding support doesn't really bother me much. I don't think many streamers are going to buy a card like this, and decoding support is still fine. Again, it's a step back so "fine" is not at all "good," but I could certainly live with it. Just like I could live with running this on a PCIe Gen3 interface if I had nothing better.
 
Also, if you stay under 4GB of VRAM use, the PCIe Gen3 penalty was only 8% on average. Granted, in some cases (well, one game out of seven) it was closer to 25%, but that's likely because in those games the 4GB was a factor even at medium settings.
For me, the real benchmark of how badly GPUs suffer from having only 4GB of VRAM and truncated PCIe bandwidth is to compare them against the highest detail settings the 1650 Super or 4GB RX 5500 XT on 4.0 x8 are comfortable at.

The RX6500's performance tends to fall off a cliff under conditions that are still playable on anything else.

As I have been saying for a few years, 4.0x16 will become essential to keep 4GB GPUs viable much beyond 1650S-level performance.
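For anyone curious how an "8% average" penalty shakes out, here's a rough sketch of the arithmetic. The FPS numbers below are made up for illustration, not the review's actual results:

```python
# Per-game FPS on Gen4 x4 vs Gen3 x4 (hypothetical numbers, purely
# illustrative of how an average PCIe penalty would be computed).
gen4_fps = {"Game A": 60, "Game B": 75, "Game C": 48}
gen3_fps = {"Game A": 55, "Game B": 70, "Game C": 36}

# Penalty = fraction of performance lost when dropping to Gen3.
penalties = {g: 1 - gen3_fps[g] / gen4_fps[g] for g in gen4_fps}
avg_penalty = sum(penalties.values()) / len(penalties)

for game, p in penalties.items():
    print(f"{game}: {p:.0%} slower on Gen3")
print(f"Average penalty: {avg_penalty:.0%}")  # ~13% with these made-up numbers
```

One game taking a 25% hit drags the average up fast, which is why the average alone understates how bad the worst case feels.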
 
In a normal market, $120-150.

That's still a cop-out theoretical answer that doesn't address the reality of the market. This card isn't going to cost anywhere near $120-150. You can't claim that no other similar option can be had for $200 in the current market, based on inflated market prices, when you're not going to be able to buy a 5600 XT for $200 either.


Unfortunately, AMD cannot realistically sell a GPU for cheaper than it costs to make and get to store shelves. The alternative to the RX6500 costing $200 right now is having nothing at all.
Somehow, Nvidia was able to launch an OK GPU in the 3050, with 8GB and 16 PCIe lanes, for only $50 more. Nvidia has never been known as the value champion and is likely still pulling in a decent profit on that card, so I'm not buying that AMD can't slap its unneeded mobile GPUs and 4GB of RAM on a PCB and turn a profit without charging $200.
 
You can't claim that no other similar option can be had for $200 in the current market, based on inflated market prices, when you're not going to be able to buy a 5600 XT for $200 either.
The RX5600 launched at $280. If you meant RX6500, that was available at $200 for a few hours. Since launch units were shipped to stores via priority air, supply was necessarily quite limited by extreme shipping costs. You'll have to wait about a month for the first bulk priority shipments by sea to land and another two or three months beyond that for normal supply. Availability will be spotty until then.

Somehow, Nvidia was able to launch an OK GPU in the 3050, with 8GB and 16 PCIe lanes, for only $50 more. Nvidia has never been known as the value champion and is likely still pulling in a decent profit on that card, so I'm not buying that AMD can't slap its unneeded mobile GPUs and 4GB of RAM on a PCB and turn a profit without charging $200.
The RX6500's die costs ~$30, GDDR6 costs $10-12/GB depending on speed grade and brand, the HSF costs ~$15, the VRM costs ~$40. Add the PCB, assembly, box, shipping, marketing, support, US import taxes, etc. and there isn't much room left for profit at $200, especially while cards are being shipped by express air at ~$30 a pop.
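Totaling those estimates shows how thin the margin gets. Every figure below is a guess from this post (plus a couple of assumed line items for PCB/assembly and overhead), not an official AMD number:

```python
# Back-of-envelope BOM total for a $200 card. All figures are the
# post's estimates or assumptions, NOT official AMD cost data.
costs = {
    "die": 30,
    "4GB GDDR6 (@ ~$11/GB)": 4 * 11,
    "heatsink/fan": 15,
    "VRM": 40,
    "PCB + assembly + box": 25,      # assumed
    "express air freight": 30,
    "marketing/support/import": 20,  # assumed
}
total = sum(costs.values())
print(f"Estimated cost: ${total}")          # $204 with these guesses
print(f"Margin at $200 MSRP: ${200 - total}")
```

With these guesses the card is underwater at $200 before anyone in the chain takes a cut, which is the point being made.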

This also means there won't be much of a profit margin to be had on the RTX 3050 either; the extra $50 on MSRP barely covers the added VRAM cost. The RTX 3050 being a far more desirable product also guarantees it'll fetch a much higher price on the scalper market than the RX 6500, which isn't really scalping-worthy at all.
 
The RX5600 launched at $280. If you meant RX6500, that was available at $200 for a few hours. Since launch units were shipped to stores via priority air, supply was necessarily quite limited by extreme shipping costs. You'll have to wait about a month for the first bulk priority shipments by sea to land and another two or three months beyond that for normal supply. Availability will be spotty until then.

I remember hearing the exact same thing early in the 3000-series launch: early models were airlifted on limited Covid flights; wait a couple of months until the container-ship shipments arrive for improved availability. That turned out well. Time will tell how this one goes. One good sign was that Newegg 6500 XTs were listed on the website, and none were in today's Shuffle.


The RX6500's die costs ~$30, GDDR6 costs $10-12/GB depending on speed grade and brand, the HSF costs ~$15, the VRM costs ~$40. Add the PCB, assembly, box, shipping, marketing, support, US import taxes, etc. and there isn't much room left for profit at $200, especially while cards are being shipped by express air at ~$30 a pop.
This also means there won't be much of a profit margin to be had on the RTX 3050 either; the extra $50 on MSRP barely covers the added VRAM cost. The RTX 3050 being a far more desirable product also guarantees it'll fetch a much higher price on the scalper market than the RX 6500, which isn't really scalping-worthy at all.

Anyone can buy GDDR6 for about $13-15/GB

Digi-Key

AMD and Nvidia probably pay about half that with the amount they order.
 
It isn't launching for another whole week. Bribing sellers into selling you one pre-launch is going to come with its own special tax on top of everything being more expensive in Japan to begin with.

Also, I doubt Nvidia would be launching the 3050 using GA106 silicon unless it had a growing pile of silicon that failed to make the 3060 grade, so I'm taking it as a hint that RTX3060s are about to get cheaper.

ETH 2.0 is currently slated to hit the next phase of its merge in Q2. Anyone buying GPUs at a premium for crypto-mining now may not have time to recover that premium before mass GPU sell-offs occur. That would be another reason why Nvidia is launching the RTX 3050 now: flood the entry-level tier with new GPUs before the market gets crashed by a flood of used mining GPUs.
Do you really believe everything you said here?

Who cares if the 3050 is even $400 in the US, when in most of the rest of the world it will be like it is in Japan, more expensive? It's always like that... except for the US, everywhere is more expensive. The US is a fairy-tale land for tech prices; it doesn't matter how "good" the prices are there at all. There are 20x more people in the rest of the world who will suffer, and are already suffering, worse prices. I don't give a flying F about the US.

So, it will be $500 at least, if not more, and it will vanish very fast too. Those are two big minuses right there.

Also, people are still dreaming of ETH going PoS? Really? Haven't you people learned by now not to trust BS promises and predictions when it comes to crypto?
How many times since this last crypto boom started did we hear that it would crash, that we would have better GPU prices, that the market would be flooded with used GPUs, and that everybody would be in ecstasy? And how many times have those people been proven wrong, for more than a year now?

And even if this incredible dream (of some) happens this year, ETH going PoS, do you really think mining will be over? Pffft...

How naive can people still be...

I'll believe all these "high as a kite" feelings and impressions and dreams and wishes when they actually come true. Not a day sooner.
 
Anyone can buy GDDR6 for about $13-15/GB
You cannot: all of the GDDR6 listed on DigiKey is 0-stock and unable to accept back-orders due to no ETA on supply. The $15 listed there likely hasn't been updated since the last time DK had any in stock, possibly over a year ago.

Mouser has 1,099 1GB 16Gbps GDDR6 chips in stock at $43 per chip.

The thing to keep in mind is that OEMs usually pay a higher contract price than the then-prevailing bulk market price when they want to secure a long-term supply of a hot commodity. Distributors like Digi-Key and Mouser get whatever is left after that, hence little to no ability to take back-orders, and ludicrously high prices when OEMs have contracted nearly all production directly from manufacturers.

Do you really believe everything you said here?
PoW isn't sustainable. It is only a matter of time before more governments jump in to protect the environment, power grid, energy production, etc. from the resource drain that generates near-zero benefits for the local economy.
 
PoW isn't sustainable. It is only a matter of time before more governments jump in to protect the environment, power grid, energy production, etc. from the resource drain that generates near-zero benefits for the local economy.
And that will still take years... because, as you can see, there is still much division in the world between who is PRO and who is AGAINST crypto, in all its forms.

We have countries banning it, others adopting it, people hating it, others loving it, and even banks and other institutions adopting crypto. It's not that simple, and like I said, it will take years; it's not now, not this year...

ETH going PoS can be delayed again, like it was so many times before. The probability of a delay is much, much higher than of it actually happening this time.
 
Imagine an old man, but he's sturdy enough to keep a healthy routine of going to the gym, going out, even biking. Then you have a teenager, full of energy, but someone broke their legs, blinded them, and forced them to use a cane that hunches their back so badly you'd think they're 150 years old.

Now you make these two run a marathon as they are. That is what the 6500XT is: that poor teenager who got roughed up against their will and put in a situation outside their control. Under other circumstances it would've been a fair competition, but AMD decided to rough up Navi 24 so hard it's borderline tech-gore.

If this card were $100, I think it would be decent enough to be passable, but this card, as presented, is complete garbage. I already feel sorry for whoever is forced to buy it. As mentioned above, this card should've been a 6400 XT or a 6300 XT at best, for way fewer shekels.

Oh well. This is our new "budget option." Better get used to it? Personally, I'd be surprised if Nvidia even tried. The 3050 is going to be up against a tough crowd given its price range, and while not as terrible as the 6500XT, I think it won't fare much better.

EDIT: This is a good video explaining things, not only from AMD's perspective.
View: https://www.youtube.com/watch?v=8nAwu6CnmgY


Regards.
 
Imagine an old man, but he's sturdy enough to keep a healthy routine of going to the gym, going out, even biking. Then you have a teenager, full of energy, but someone broke their legs, blinded them, and forced them to use a cane that hunches their back so badly you'd think they're 150 years old.

Now you make these two run a marathon as they are. That is what the 6500XT is: that poor teenager who got roughed up against their will and put in a situation outside their control. Under other circumstances it would've been a fair competition, but AMD decided to rough up Navi 24 so hard it's borderline tech-gore.

If this card were $100, I think it would be decent enough to be passable, but this card, as presented, is complete garbage. I already feel sorry for whoever is forced to buy it. As mentioned above, this card should've been a 6400 XT or a 6300 XT at best, for way fewer shekels.

Oh well. This is our new "budget option." Better get used to it? Personally, I'd be surprised if Nvidia even tried. The 3050 is going to be up against a tough crowd given its price range, and while not as terrible as the 6500XT, I think it won't fare much better.

EDIT: This is a good video explaining things, not only from AMD's perspective.
View: https://www.youtube.com/watch?v=8nAwu6CnmgY


Regards.
Despite our wishes, if you watch MLiD, he clearly shows why this could not have been under $200, and why re-releasing the 5500 XT would have cost more than $250...

I think this ugly little GPU can only work if you never go past medium settings in older games and mostly stay on low settings in new ones. So clearly this is for those who really "need" any kind of GPU now.

The 3050 won't fare much better, because at $500 it's in midrange GPU territory, 3070 level, or at least it should be, and paying that much is as stupid as paying $1100 for a 3070 now. Or any other 2x+ MSRP example we have, because we have many.

Now I'm really curious to see how Intel handles the Arc launch and, more important than performance, at what MSRP and at what real price those cards will sell.
 
You cannot: all of the GDDR6 listed on DigiKey is 0-stock and unable to accept back-orders due to no ETA on supply. The $15 listed there likely hasn't been updated since the last time DK had any in stock, possibly over a year ago.

Mouser has 1,099 1GB 16Gbps GDDR6 chips in stock at $43 per chip.
The cards that AMD and Nvidia are currently selling or about to release: when were the contracts signed for the memory being used on them? This week, or months to over a year ago? GDDR6X in particular. Does anyone besides Nvidia use it? If Micron wants the volume to get a return on investment on it, or to be able to partner with other companies in the future, ripping off Nvidia isn't a winning strategy.
 
The cards that AMD and Nvidia are currently selling or about to release: when were the contracts signed for the memory being used on them? This week, or months to over a year ago?
They use the same chips across their entire product portfolio, so I doubt they have supply contracts with every memory manufacturer for every single chip they may pair with each given SKU on a per-combination basis, and the contracts from the pre-RX6000/RTX3000/PS5/XB-XS launches, before GDDR6 prices shot up, have likely run out a while ago. In all likelihood, they have multiple staggered supply contracts to mitigate the liability of all of them expiring simultaneously in the middle of a pricing bubble.
 
Been reading the content on this site for a decade and finally had to register to comment on the amount of nonsense in this 'review.'

It's obvious this card was not released with AAA titles in mind. It was obviously released for esports games such as CS:GO, OW, and DotA. Hence, excluding resolutions from the review for games that run excellently even with much less VRAM makes absolutely no sense.

Were you aware you can comfortably run the most popular esports titles listed above at 2K resolution, paired with a 165Hz monitor, on a decade-old 2GB GPU such as the GTX 770? Apparently not.

The price and x4 performance on PCIe 3.0 are major negatives worth noting, but given how excessively you advertise yourself as a GPU guru with nearly 20 years of experience, it's shocking how ignorant a review can be. This card is literally what the community needed amid the GPU shortage, and it serves anyone playing some of the most popular games at 1080p/1440p at high refresh rates, with low power draw and noise.

Entry-level cards are crucial, since old hardware dies over time and the vast majority of games run exceptionally well even on these specs. This review is way off on the target audience.
People don't buy a graphics card with no intention of playing anything other than the usual old esports titles. Even esports may get new games added to it, and do you think a new game will use old game engines? In this case, AMD infamously declared that it made this card so that gamers can get a GPU; it did not say so that esports gamers can get a GPU.

Given the asking price and the limitations on this card, I would ask any esports gamer to stay put with whatever card they are using UNLESS it dies and this is the only GPU you can find close to MSRP. Entry-level GPUs always get the worst specs, for sure, but I don't recall seeing one this bad, where everything is thrown out just so that AMD can hit a certain cost. Even if they sold the card for 5-10% more while offering at least x8 bus support and an AV1 decoder, I think people would still buy it. This card, however, is bare-bones to the max.

This is the only GPU launched in my country where there is still ample stock for sale, and the cheapest Sapphire Pulse is at least 265 USD excluding tax; most are selling for 300 USD or higher. The cheaper GTX 1650 4GB GDDR6 costs around 220 USD and is, in my opinion, a better option for people whose computers don't have PCIe 4.0 slots.
 
Somehow, Nvidia was able to launch an OK GPU in the 3050, with 8GB and 16 PCIe lanes, for only $50 more. Nvidia has never been known as the value champion and is likely still pulling in a decent profit on that card, so I'm not buying that AMD can't slap its unneeded mobile GPUs and 4GB of RAM on a PCB and turn a profit without charging $200.
Nvidia's card will certainly invite scalpers to try and flip it at a steep premium, even though the paper MSRP is just 50 bucks more. Especially since the RX 6500 XT flopped so hard that it can barely beat a GTX 1650 Super.

Personally, I feel the RTX 3050 is not going to perform very well, though nowhere near as poorly as the RX 6500 XT. I estimate performance could be around GTX 1660 Super level. While the CUDA core count is double that of the GTX 1650 Super, it doubled only because Nvidia changed the way it counts cores. But overall, it is not expected to have all the draconian cutbacks AMD inflicted on the RX 6500 XT, and so it will be quite a fair bit faster and a much better value at MSRP.
 
Nvidia's card will certainly invite scalpers to try and flip it at a steep premium, even though the paper MSRP is just 50 bucks more. Especially since the RX 6500 XT flopped so hard that it can barely beat a GTX 1650 Super.
If you count FC6 with the texture pack and other scenarios that push 4GB cards to their limits, though, the RX6500 drops off a cliff, while everything else can still chug along fine by borrowing system memory, thanks to having 2-4X as much PCIe bandwidth available.
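The 2-4X figure follows straight from lane math. Here's a quick sketch; the per-lane rates are the usual payload approximations (~0.985 GB/s per Gen3 lane, ~1.97 GB/s per Gen4 lane), per direction:

```python
# Approximate per-direction PCIe payload bandwidth in GB/s per lane.
GBPS_PER_LANE = {3: 0.985, 4: 1.969}

def pcie_bw(gen, lanes):
    """Aggregate link bandwidth for a given generation and lane count."""
    return GBPS_PER_LANE[gen] * lanes

# RX 6500 XT is x4; most other cards are x8 or x16.
for gen, lanes in [(4, 4), (3, 4), (3, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bw(gen, lanes):.1f} GB/s")
```

On a Gen3 board, an x8 card has twice the RX 6500 XT's x4 bandwidth and an x16 card four times it, which is the 2-4X gap being described.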
 
If you count FC6 with the texture pack and other scenarios that push 4GB cards to their limits, though, the RX6500 drops off a cliff, while everything else can still chug along fine by borrowing system memory, thanks to having 2-4X as much PCIe bandwidth available.

Yes, but only as long as you have a PCIe Gen4 motherboard, which, for people looking for a budget GPU, is probably not the case.
 
Yes, but only as long as you have a PCIe Gen4 motherboard, which, for people looking for a budget GPU, is probably not the case.
Now this argument is a bit weird, I have to say.

I'm not saying the overall premise is wrong; not at all. I do agree that people with either old systems or budget builds are PCIe 3.0 dwellers, but no one has made any assumptions about the CPU. If a budget user needs/wants to upgrade whatever bottom-of-the-barrel GPU they had before, is it too far of a stretch to also assume they have a bottom-of-the-barrel CPU? I say this because most of the huge gaps shown in games are only with CPUs that pretty much eliminate any bottleneck a CPU would have.

So, we know that pre-9K-series i5s are all 4c/4t and already cap FPS hard, so if someone with such a CPU (or a lower/older one, mind you) were to upgrade to this thing (the 6500XT), would they still see the same drop in FPS as in the reviews? Would the PCIe x4 debacle be as important then?

I'm not saying this to refute the findings, but to add another angle/perspective to the discussion that I think many have ignored. It's also to ask: under those conditions, wouldn't just getting the cheaper option ultimately benefit whoever wants to buy a new/used GPU? Be it a used 1650S or a new 6500XT, if you're still CPU-bottlenecked, then get the cheaper alternative if you don't care about the missing encoding features.

Regards.
 
Now this argument is a bit weird, I have to say.

I'm not saying the overall premise is wrong; not at all. I do agree that people with either old systems or budget builds are PCIe 3.0 dwellers, but no one has made any assumptions about the CPU. If a budget user needs/wants to upgrade whatever bottom-of-the-barrel GPU they had before, is it too far of a stretch to also assume they have a bottom-of-the-barrel CPU? I say this because most of the huge gaps shown in games are only with CPUs that pretty much eliminate any bottleneck a CPU would have.

So, we know that pre-9K-series i5s are all 4c/4t and already cap FPS hard, so if someone with such a CPU (or a lower/older one, mind you) were to upgrade to this thing (the 6500XT), would they still see the same drop in FPS as in the reviews? Would the PCIe x4 debacle be as important then?

I'm not saying this to refute the findings, but to add another angle/perspective to the discussion that I think many have ignored. It's also to ask: under those conditions, wouldn't just getting the cheaper option ultimately benefit whoever wants to buy a new/used GPU? Be it a used 1650S or a new 6500XT, if you're still CPU-bottlenecked, then get the cheaper alternative if you don't care about the missing encoding features.

Regards.

I get where you are going, and yes, you are right: very old systems with 4c/4t CPUs will limit GPU performance somewhat. But in a world where no budget GPUs are available, launching a product with such a limitation (PCIe Gen4 x4) is a really sad and strange move. I mean, AMD surely knows the potential market for a low-end GPU, right? I don't know, everything is so strange.

What I mean is, a lot of gamers are still using an i5 9400/i3 10xxx or a Ryzen 1600/2600/3600/3100/3300X, and those are still decent CPUs, not the best but decent, and they are all most likely running PCIe Gen3.
 
So, we know that pre-9K-series i5s are all 4c/4t and already cap FPS hard, so if someone with such a CPU (or a lower/older one, mind you) were to upgrade to this thing (the 6500XT), would they still see the same drop in FPS as in the reviews? Would the PCIe x4 debacle be as important then?
If you look at 4.0x4 vs 3.0x4 benchmarks at lowered details, so that everything fits within 4GB and that variable is taken out of the equation, the RX6500 typically loses around 8% simply from frame data taking too long to get onto the GPU so it can do its thing. That handicap should persist wherever the CPU isn't a hard bottleneck, and old i5s were still perfectly fine driving RX 580s under most circumstances.

But in world where no budget GPUs are available launching a product with such "limitation" (PCIe gen 4X) is really a sad and strange move.
When asked what's up with the RX6500's crippled features, AMD responded that Navi24 was originally only intended for use in Ryzen 6000 laptops. The lack of media encode/decode and limited video output ports makes sense if the platform was always meant to rely on the IGP for those functions. The narrow 4.0x4 PCIe and memory busses help cut down board space and power too.
 
If you look at 4.0x4 vs 3.0x4 benchmarks at lowered details, so that everything fits within 4GB and that variable is taken out of the equation, the RX6500 typically loses around 8% simply from frame data taking too long to get onto the GPU so it can do its thing. That handicap should persist wherever the CPU isn't a hard bottleneck, and old i5s were still perfectly fine driving RX 580s under most circumstances.
True. Then again, it is not far-fetched to consider that a person upgrading their GPU will toy around with the settings to find a new sweet spot, so if their previous GPU was 4GB or under, they'll most likely keep the same settings, so what will matter there is not PCIe, and I agree... But this thing also has a 64-bit bus, making its effective bandwidth way less than mid-tier cards from even five years ago. So, this is to say: even at the same quality settings, assuming you're not going over the 4GB boundary, would performance drop anyway with an old CPU? Those are the sorts of questions I would like answered.

I am starting to believe more and more that this entry-level card should also be tested with bottom-of-the-barrel CPUs like Pentiums, Celerons, Core i3s, and whatever AMD has in the lower end, even Bulldozer. That would really tell us whether this card can do something for people with older GPUs. I'm guessing it will be more like a sidegrade at worst, but for most people coming from an iGPU and/or "just video output" cards, this thing may actually be their only alternative if they want a warranty, care about driver support, and don't want to play the lottery with used items. Which is really sad.

Regards.
 
True. Then again, it is not far-fetched to consider that a person upgrading their GPU will toy around with the settings to find a new sweet spot, so if their previous GPU was 4GB or under, they'll most likely keep the same settings, so what will matter there is not PCIe
If you take the absolute worst cases for the RX6500, it drops all the way down to GTX 1050 Ti levels of performance when you blow through the 4GB cap. So, for someone who habitually plays at levels of detail where the 1050 Ti relies significantly on system memory, the RX6500 can be a $200 sidegrade.

As for the 64-bit GDDR6: cards from five years ago used GDDR5, so bandwidth is similar to a 128-bit card from back then. Add some architectural efficiency on top and it isn't really that much of an issue. Polaris was a stinker on efficiency, which is how the RX6500 can beat the RX580 most of the time, despite having nearly half as much memory bandwidth, as long as you stay comfortably within 4GB.
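The bandwidth comparison is easy to sanity-check: bandwidth = (bus width / 8) x per-pin data rate. A quick sketch using the commonly cited specs for these cards:

```python
# Memory bandwidth in GB/s from bus width (bits) and per-pin rate (Gbps).
def mem_bw_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

rx6500xt = mem_bw_gbs(64, 18)   # 64-bit GDDR6 @ 18 Gbps -> 144 GB/s
rx580    = mem_bw_gbs(256, 8)   # 256-bit GDDR5 @ 8 Gbps -> 256 GB/s
old_128  = mem_bw_gbs(128, 8)   # 128-bit GDDR5 @ 8 Gbps -> 128 GB/s

print(rx6500xt, rx580, old_128)
```

144 GB/s is a bit more than a 128-bit GDDR5 card of five years ago and roughly 56% of the RX 580's 256 GB/s, which is the "nearly half" figure above.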

I am starting to believe more and more this entry level card should also be tested with bottom of the barrel CPUs like Pentiums, Celerons, Core i3s and whatever AMD has in the lower end
That is kind of redundant, since benchmarking the CPUs with the fastest GPUs available tells you the best those CPUs are capable of, and then you can simply look up GPU benchmarks to find the cheapest GPU that should be able to keep up with your CPU most of the time in your games of interest.

Testing every possible CPU-GPU combination isn't realistically feasible. The best you can hope for is an occasional sanity check in something like an "(ultra-)budget build" story.
 
I was about to say that the 6500XT is AMD's version of the GTX 1650 (GDDR5 version)...

The 1650 couldn't keep up with an old RX 570 4GB.

BUT... no, the 6500XT is worse. At least the 1650 could outdo its predecessors, the 1050 and 1050 Ti.

The 6500XT matches or falls short of the 5500XT, even its 4GB version. It ONLY manages an improvement in lower power consumption.
 
You didn't try to cause a buzz like some YouTubers who reviewed this card in bad faith.
I don't see the "bad faith" in bashing the RX6500 for often failing to deliver performance on par with many other 3-5 year old 4GB GPUs, at the same quality settings those older GPUs can deliver playable performance at, mainly thanks to their 128-bit VRAM bus and x8/x16 PCIe. It is an overall regression for the $200 price point and the 50(0) tier, and it rightfully should be bashed as such to reduce the likelihood that AMD or Nvidia will do it again.

The RX6500's performance and feature-cutting might have been excusable if it had been branded as a 6400 instead.