AMD Radeon RX 5500 XT vs. GeForce GTX 1660: The Battle for Mainstream Gaming Supremacy

joeblowsmynose

Distinguished
Jun 14, 2011
I would have easily given the features category to AMD for the goodies they add to their driver suite.

Overclocking GPU and VRAM, voltage adjustments on both, power state programming for both, fan curve tuning (when it works properly), Radeon Boost, etc. ...
 

AlistairAB

Honorable
May 21, 2014
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
 

artk2219

Distinguished
Jun 30, 2010
AlistairAB said:
You tested the old 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
If you plan on keeping it for a few years, those extra 4GB are definitely needed. It will be a better card in the long run, as even the 6GB on the 1660 can be a bottleneck in some titles (the 6GB of VRAM is also something I'm not keen on with the new RX 5600). If you plan to upgrade every two years, meh.
 
The article said:
We can’t call the Radeon RX 5500 XT a bad deal. It’s far and away a better card than you would have bought for the same price a year ago.
That's simply not true. A year ago, you could get an RX 580 8GB for about the same price, and a 5500 XT isn't much more than 5% faster. Even 3 years ago, one could get an RX 480 8GB for around $200, which is within 15% of the performance of this card. The only notable advantage the 5500 XT has over those older models is improved power efficiency. The performance gains are relatively poor after 3 years, and even Nvidia's lineup is offering better performance for the money right now.

AlistairAB said:
What the heck? You tested the non-super 1660, and the 8GB RX 5500? The two cards nobody should buy?

Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
Why should no one buy a 1660? Keep in mind that while the 1660 SUPER might perform around 10% faster, at current prices (online, in the US), it also typically costs at least 10% more, so both versions of the card tend to offer similar value for someone buying a graphics card as an upgrade.

And in the case of the 5500 XT, the 4GB model's limited VRAM is a notable concern, particularly since the card only uses an x8 PCIe connection, causing performance to tank more than usual when VRAM is exceeded. Neither version of the 5500 XT is all that competitively priced right now. The 4GB model needs to be priced lower than a 1650 SUPER, not higher, and the 8GB model needs to be priced lower than a 1660, which is simply a faster card. Both versions of the card should have launched for around $20 less than they did.
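As a back-of-the-envelope illustration of that value argument, here is a minimal sketch in Python. The prices and relative-performance figures are assumptions made up for the example (chosen to roughly match the ratios discussed above), not benchmark data:

```python
# Rough perf-per-dollar comparison. All figures are illustrative
# assumptions, not benchmark results: performance is normalized to
# GTX 1660 = 100, prices are approximate US street prices.
cards = {
    "GTX 1650 SUPER": {"perf": 85, "price": 160},
    "RX 5500 XT 4GB": {"perf": 88, "price": 180},
    "RX 5500 XT 8GB": {"perf": 92, "price": 200},
    "GTX 1660": {"perf": 100, "price": 220},
    "GTX 1660 SUPER": {"perf": 110, "price": 240},
}

# Sort by normalized performance per dollar, best value first.
for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["perf"] / kv[1]["price"],
                      reverse=True):
    value = c["perf"] / c["price"]
    print(f"{name:15s} perf={c['perf']:>3} price=${c['price']:>3} perf/$={value:.3f}")
```

With those assumed numbers, the 1660 and 1660 SUPER land at nearly identical perf-per-dollar, while both 5500 XT models trail the 1650 SUPER, which is the pricing complaint above in a nutshell.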
 

alextheblue

Distinguished
Apr 3, 2001
I'd like to see those cards tested battling while overclocked. IMHO the 5500 XT is a bit overpriced, but they seem to have decent headroom.
AlistairAB said:
Test the 1650 Super and the RX 5500 4GB next time. Useless article. If you're not getting one of those, you should be looking at a 1660 Super, or RX 5600 or 5700.
This has already been discussed. If you're running these kinds of settings, there are already games that are hindered by the lack of VRAM. This is only going to get worse. The problem is exacerbated if you're on a PCIe 3.0 platform, since Navi 14 only has 8 lanes.

But yes, if you can get a good deal on one, the Super offers superb bang for the buck.
 

NP

Honorable
Jan 8, 2015
The article said:
Strangely, the RX 5500 XT is a stronger competitor with medium settings, but why would you settle for medium if you don’t need to?

I own an Nvidia card, and will probably buy another GeForce when I need to upgrade, but I still don't understand why you assume the answer would be "obviously for no reason".

I think there is a valid reason for not merely settling for medium, but deliberately picking it. You pick medium because the added frame rate allows you to perform better in any competitive online FPS you intend to play with either an RX 5500 XT or a GTX 16-series GPU.

That may or may not be a consideration for you, but it is nevertheless among the considerations that many people would regard as key when buying a new GPU.
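For a sense of scale, the benefit is easy to quantify, since frame time falls off as the inverse of frame rate. A quick sketch (the frame rates are arbitrary examples):

```python
# Frame time shrinks as 1/fps, which is why competitive players trade
# visual quality at medium settings for higher frame rates.
for fps in (60, 100, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```

Going from 60 to 144 fps cuts nearly 10 ms off every frame, a difference competitive FPS players can genuinely feel.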
 

InvalidError

Titan
Moderator
The RX 5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there was true competition between AMD and Nvidia GPUs, the 4GB RX 5500 would be a $120 GPU.

Also, most people who buy RX 5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future, where it incurs sometimes significant performance penalties, which makes it a "nope" GPU in my book.
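For context, the bandwidth gap behind that complaint follows from the PCIe per-lane rates. A minimal sketch (theoretical link rates, so real-world throughput will be somewhat lower):

```python
# Theoretical PCIe link bandwidth for the x8 connection on Navi 14.
# Both generations use 128b/130b line encoding.
ENCODING = 128 / 130
LANES = 8

for gen, gt_per_s in (("PCIe 3.0", 8), ("PCIe 4.0", 16)):
    gb_per_s = gt_per_s * ENCODING / 8 * LANES  # /8 converts bits to bytes
    print(f"{gen} x{LANES}: {gb_per_s:5.2f} GB/s")
```

On a PCIe 3.0 platform the x8 link tops out near 8 GB/s, half of what PCIe 4.0 offers, which is why spill-over traffic hurts the 4GB card more than usual once VRAM is exceeded.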
 

mitch074

Distinguished
InvalidError said:
The RX 5500 is a pitiful upgrade over what was already available for sub-$200 three years ago. If there was true competition between AMD and Nvidia GPUs, the 4GB RX 5500 would be a $120 GPU.

Also, most people who buy RX 5500-class GPUs will be stuck on PCIe 3.0 x8 for the foreseeable future, where it incurs sometimes significant performance penalties, which makes it a "nope" GPU in my book.
True - but a couple of months from now, the RX 5500 4GB will be a nice pairing with B550-based motherboards running PCIe 4.0 for mainstream gaming rigs. By then its price may have dropped a bit.

In the meantime, I'll pray that my reference RX 480 8GB doesn't fry - still going strong since I bought it in summer 2016.
 
Dec 18, 2019
I agree, the article looks weird.
It claims that the difference between 100W and 130W of power consumption should matter little in games, and I agree. But then the article discusses this consumption at length.

However, what might matter is noise during games.
It may matter if other family members object to my gaming because the computer generates unpleasant noise.
The article does not mention it. Different card models have different coolers, so is noise not directly and generically attributable to the chip itself? Maybe, but power consumption also depends on vendor and model: voltage, factory overclocks, RAM chips, and power-delivery circuitry assembled, Lego-like, from components of differing efficiency. Still, consumption is discussed.

Another thing that may matter is coolness and silence OUT of games. Say, out of 24 hours in a day, my computer spends 1 hour gaming, 3 hours web surfing and doing generic "office work", and the remaining 20 hours "downloading torrents" and serving music and movie files to other devices in my family with nobody sitting at the desk. Then I would argue that the comfort and efficiency of the non-gaming modes become even more important than peak gaming performance. However, this was not mentioned at all.
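Putting rough numbers on that usage split makes the point concrete. A minimal sketch, where the wattages are assumed round figures rather than measurements:

```python
# Illustrative daily energy budget for the usage pattern above.
# All wattages are assumptions, not measured values.
day = {
    "gaming": (1, 130),              # hours, watts
    "web/office": (3, 40),
    "idle, serving files": (20, 15),
}

total = 0
for mode, (hours, watts) in day.items():
    wh = hours * watts
    total += wh
    print(f"{mode:20s} {hours:>2} h x {watts:>3} W = {wh:>4} Wh")
print(f"{'total':20s} {total:>14} Wh/day")
```

With these assumed figures, the 20 idle hours consume more than twice the energy of the single gaming hour, so idle behavior really can dominate the daily total.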
 
I wonder how a full Navi 14 would match up? A card with the full 24 CUs (1536 shaders) that only Apple gets, running at a 1607 MHz base and 1845 MHz boost with 8GB of 1750MHz GDDR6 memory. I guess that may come once AMD and TSMC refine the 7nm process enough to yield larger quantities of near-perfect Navi 14 chips.
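For reference, the theoretical peaks for that fully enabled part follow from the standard formulas. A quick sketch using the specs quoted above:

```python
# Theoretical peaks for a full Navi 14 from the specs quoted above:
# 24 CUs x 64 shaders, 1845 MHz boost, 14 Gbps GDDR6 on a 128-bit bus.
shaders = 24 * 64                        # 1536 stream processors
boost_mhz = 1845
tflops = shaders * 2 * boost_mhz / 1e6   # 2 FLOPs/clock per shader (FMA)

gbps_per_pin = 1750 * 8 / 1000           # 1750 MHz -> 14 Gbps effective
bandwidth_gb_s = gbps_per_pin * 128 / 8  # 128-bit bus, 8 bits per byte

print(f"FP32 compute:     {tflops:.2f} TFLOPS")       # ~5.67
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 224
```

At the same clocks, the full 24-CU die would offer roughly 9% more shader throughput than the 22-CU retail 5500 XT, so the gap to a full die is fairly modest.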
 
The post above said:
However, what might matter is noise during games. [...] Another thing that may matter is coolness and silence OUT of games.
Noise will depend on the cooler of the card, something AMD and Nvidia don't dictate. It's up to the card manufacturers to decide what sort of cooling solution they put on each card. Generally, the higher priced variants of a given model will be provided with more substantial multi-fan coolers that don't need to run their fans as fast, potentially making them quieter under load, though that isn't always the case. Seeing as both cards have relatively similar power draw, I would expect their typical noise output under load to also be relatively similar for a given cooler design.

And as far as idle power consumption goes, it isn't likely to matter that much. Even most high-end cards tend to not draw much more than 10 watts when doing typical desktop tasks. Some manufacturers will even set their fans to shut off entirely at idle, though again, that can vary from one manufacturer to the next.
 
