AMD Radeon RX 5500 XT vs. GeForce GTX 1660: The Battle for Mainstream Gaming Supremacy

joeblowsmynose

Distinguished
I would have easily given the features category to AMD for the goodies they add to their driver suite.

GPU and VRAM overclocking, voltage adjustments on both, power-state programming for both, fan curve tuning (when it works properly), Radeon Boost, etc.
 

AlistairAB

Distinguished
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
 

artk2219

Distinguished
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.

If you plan on keeping it a few years, those extra 4GB are definitely needed. It will be a better card in the long run, as even the 6GB on the 1660 can be a bottleneck in some titles (the 6GB of VRAM is also something I'm not keen on with the new RX 5600). If you plan to upgrade every two years, meh.
 
We can’t call the Radeon RX 5500 XT a bad deal. It’s far and away a better card than you would have bought for the same price a year ago.
That's simply not true. A year ago, you could get an RX 580 8GB for about the same price, and a 5500 XT isn't much more than 5% faster. Even 3 years ago, one could get an RX 480 8GB for around $200, which is within 15% of the performance of this card. The only notable advantage the 5500 XT has over those older models is improved power efficiency. The performance gains are relatively poor after 3 years, and even Nvidia's lineup is offering better performance for the money right now.
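
To put rough numbers on that, here's a minimal sketch of the implied yearly improvement, using the approximate figures from this post rather than measured benchmark data:

```python
# Rough annualized performance gain implied by the figures above.
# All numbers are the approximations from this post, not benchmark results.

def annualized_gain(total_gain: float, years: float) -> float:
    """Convert a total performance gain into a compounded per-year rate."""
    return (1 + total_gain) ** (1 / years) - 1

# RX 5500 XT vs. RX 580 8GB (similar price, roughly one year apart): ~5% total
print(f"vs. RX 580 8GB: ~{annualized_gain(0.05, 1) * 100:.1f}% per year")
# RX 5500 XT vs. RX 480 8GB (~$200, roughly three years apart): ~15% total
print(f"vs. RX 480 8GB: ~{annualized_gain(0.15, 3) * 100:.1f}% per year")
# Around 5% per year either way, at essentially the same price point.
```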

What the heck? You tested the non-super 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
Why should no one buy a 1660? Keep in mind that while the 1660 SUPER might perform around 10% faster, at current prices (online, in the US), it also typically costs at least 10% more, so both versions of the card tend to offer similar value for someone buying a graphics card as an upgrade.
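
As a quick sanity check on that value point (a sketch only; the ~10% figures are the rough estimates above, and the $220 base price is just a placeholder, not a quoted price):

```python
# A ~10% faster card at a ~10% higher price works out to essentially the same
# performance per dollar. The $220 base price is a placeholder, not a real quote.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

base_price = 220.0
print(f"GTX 1660:       {perf_per_dollar(1.00, base_price):.5f} perf/$")
print(f"GTX 1660 SUPER: {perf_per_dollar(1.10, base_price * 1.10):.5f} perf/$")
# Both print 0.00455 -- the price premium cancels out the extra performance.
```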

And in the case of the 5500 XT, the 4GB model's limited VRAM is a notable concern, particularly since the card only uses an x8 PCIe connection, causing performance to tank more than usual when VRAM is exceeded. Neither version of the 5500 XT is all that competitively priced right now. The 4GB model needs to be priced lower than a 1650 SUPER, not higher, and the 8GB model needs to be priced lower than a 1660, which is simply a faster card. Both versions of the card should have launched for around $20 less than they did.
 

alextheblue

Distinguished
I'd like to see those cards tested battling while overclocked. IMHO the 5500 XT is a bit overpriced, but they seem to have decent headroom.
Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
This has already been discussed. If you're running these kinds of settings, there are already games that are hindered by the lack of VRAM. This is only going to get worse. The problem is exacerbated if you're on a PCIe 3.0 platform, since Navi 14 only has 8 lanes.

But yes, if you can get a good deal on one, the Super offers superb bang for the buck.
 

NP

Distinguished
Strangely, the RX 5500 XT is a stronger competitor with medium settings, but why would you settle for medium if you don’t need to?

I own an Nvidia card, and will probably buy another GeForce card when I need to upgrade, but I still don't understand why you assume the answer would be "obviously for no reason"?

I think there is a valid reason for not just settling for, but deliberately picking, medium. You pick medium because the added frame rate allows you to perform better in whatever competitive online FPS you intend to play with either an RX 5500 XT or a GTX 16-series GPU.

That may or may not be a consideration for you, but it is nevertheless among the considerations many people would regard as being of key importance when buying a new GPU.
 

InvalidError

Titan
Moderator
The RX5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there were true competition between AMD and Nvidia GPUs, the 4GB RX5500 would be a $120 GPU.

Also, most people who buy RX5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future, where it incurs sometimes significant performance penalties, which makes it a "nope" GPU in my book.
 
The RX5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there were true competition between AMD and Nvidia GPUs, the 4GB RX5500 would be a $120 GPU.

Also, most people who buy RX5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future, where it incurs sometimes significant performance penalties, which makes it a "nope" GPU in my book.
True - but a couple of months from now, the RX5500 4GB will be a nice pairing with B550-based motherboards on PCIe 4.0 for mainstream gaming rigs. By then its price may have dropped a bit.

In the meantime, I'll pray that my reference RX480 8GB doesn't fry - it's still going strong since I bought it in the summer of 2016.
 
I agree, the article looks weird.
It claims that the difference between 100W and 130W of consumption should matter little in games, and I agree. But then the article discusses that consumption at length.

However, what might matter is noise during games.
It matters if other family members object to my gaming because the computer then generates unpleasant noise.
The article does not mention it. Different card models might have different coolers, so is noise even directly and generically attributable to the chip itself? Maybe not, but power consumption also depends on the vendor and model - voltage, factory overclock, RAM chips, power delivery built from components of varying efficiency - and it is still discussed.

Another thing that may matter is coolness and silence OUT of games. Say that out of 24 hours in a day my computer spends 1 hour gaming, 3 hours web surfing and doing generic office work, and the remaining 20 hours downloading torrents and serving music and movie files to other devices in my family with nobody sitting at the desk. Then I would argue the comfort and efficiency of the non-gaming modes becomes even more important than peak gaming performance. However, this was not mentioned at all.
 
I wonder how a full Navi 14 would match up: a card with the full 24 CUs (1,536 shaders) that only Apple gets, running at a 1607MHz base and 1845MHz boost clock with 8GB of 1750MHz GDDR6 memory. I guess that may come once AMD and TSMC get the 7nm process refined enough to produce larger quantities of near-perfect Navi 14 chips.
 
I agree, the article looks weird.
It claims that the difference between 100W and 130W of consumption should matter little in games, and I agree. But then the article discusses that consumption at length.

However, what might matter is noise during games.
It matters if other family members object to my gaming because the computer then generates unpleasant noise.
The article does not mention it. Different card models might have different coolers, so is noise even directly and generically attributable to the chip itself? Maybe not, but power consumption also depends on the vendor and model - voltage, factory overclock, RAM chips, power delivery built from components of varying efficiency - and it is still discussed.

Another thing that may matter is coolness and silence OUT of games. Say that out of 24 hours in a day my computer spends 1 hour gaming, 3 hours web surfing and doing generic office work, and the remaining 20 hours downloading torrents and serving music and movie files to other devices in my family with nobody sitting at the desk. Then I would argue the comfort and efficiency of the non-gaming modes becomes even more important than peak gaming performance. However, this was not mentioned at all.
Noise will depend on the cooler of the card, something AMD and Nvidia don't dictate. It's up to the card manufacturers to decide what sort of cooling solution they put on each card. Generally, the higher priced variants of a given model will be provided with more substantial multi-fan coolers that don't need to run their fans as fast, potentially making them quieter under load, though that isn't always the case. Seeing as both cards have relatively similar power draw, I would expect their typical noise output under load to also be relatively similar for a given cooler design.

And as far as idle power consumption goes, it isn't likely to matter that much. Even most high-end cards tend to not draw much more than 10 watts when doing typical desktop tasks. Some manufacturers will even set their fans to shut off entirely at idle, though again, that can vary from one manufacturer to the next.
 

Exia00

Distinguished
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.

The thing is, people can get a 1660 used for a great price, and a lot of people have already tested the 1650 Super against the 8GB 5500 XT, with the 1650 Super ending up on par or better in gaming, and the 4GB 5500 XT is slightly weaker. That would make the 1650 Super better all around than the 5500 XT 4GB version.
 

InvalidError

Titan
Moderator
and the 4GB 5500 XT is slightly weaker
Drastically weaker if you exceed the 4GB frame buffer on a platform that only has PCIe 3.0, which covers the bulk of the platforms likely to see a 1650S or RX5500 in the foreseeable future. Between being limited to PCIe x8 and only having 4GB of VRAM, the 4GB RX5500 is almost as good as DoA in my book.
 

Gruia Novac

Honorable
If you plan on keeping it a few years, those extra 4GB are definitely needed. It will be a better card in the long run, as even the 6GB on the 1660 can be a bottleneck in some titles (the 6GB of VRAM is also something I'm not keen on with the new RX 5600). If you plan to upgrade every two years, meh.

Wrong. I bet you've been touting the same idea for 5-10 years.
Memory is not that big a deal.
 

Gruia Novac

Honorable
I find it disgusting that the article dismisses the consumption part like that.
Every darn watt matters, and it matters a lot more than 5 degrees of temperature. Watts = money. A few degrees mean <Mod Edit>
 
Last edited by a moderator:
I find it disgusting that the article dismisses the consumption part like that.
Every darn watt matters.
The article was pointing out that power consumption is relatively similar between these two cards, and that the exact amount of power drawn by either card can vary depending on how the manufacturer decides to configure voltage and clock rates for a given model.

As for "every watt" mattering from a cost perspective, that's not really going to make a huge difference with cards of relatively similar efficiency, like these. If the cards were under full load 24/7, sure, something like a 10 watt difference might cost you a little over $10 more a year at the average cost of electricity in the US. However, people are typically not running their card under load all day, every day. With an average of 3 hours of gaming a day, a 10 watt difference might amount to a little over a dollar a year. That's not exactly going to result in a significant difference in value over the life of the card.

Now, with the previous generation of cards, the RX 580 definitely did draw a fair amount more power than the GTX 1060 under load, to the point where a few hours of gaming each day could result in more like a $10 difference in electrical costs over the course of a year. We don't see anything like that with the current generation of cards though, as AMD's more efficient process has put them pretty much on par with Nvidia for the time being.
 
