[SOLVED] Upgraded Radeon RX 580 to RX 5700 XT

eryanh

Distinguished
Dec 22, 2014
9
0
18,510
Upgraded like the title says. I did so because I was having a bottleneck: CPU is a Ryzen 7 1800X and usage was 10%-15% playing Star Wars Battlefront 2, while GPU usage was a constant 98%-99% with 15-20 FPS at low settings. Monitor is a Dell 34" ultrawide at 3440x1440, connected via DisplayPort. Upgraded the card to a GIGABYTE RX 5700 XT as stated, and updated the drivers after installing it. The same game now runs at 30 FPS, but GPU usage is still 98%-99% at low settings. Both the old and new cards are 8 GB. The only noticeable change is that the new card runs cooler. I was under the impression this card was made for 2K gaming, or am I wrong? I can give other system info if needed. I'm not very knowledgeable when it comes to this stuff.
 
Are you using DX12 in the graphics menu? Try DX11 and see if it improves fps. I remember reading and commenting last year about people having issues with BF1 or some other Frostbite-engine game that has DX12 rendering available in the options.
 
Full system spec? Include the make and model of the PSU.
What are the CPU/GPU temps?

I'm not sure how to do that, but I will when I find out.
A snapshot of AMD's usage panel while in BF2 looks like this:

FPS 34
GPU UTIL 99%
GPU SCLK 1853 MHz
GPU PWR 190 W
GPU TEMP 66 C
GPU JUNCTION TEMP 88 C
GPU FAN 1921 RPM
GPU VRAM UTIL 8017 MB
GPU MCLK 1750 MHz
CPU UTIL 20%
RAM UTIL 11.3 GB

The system also has 24 GB of RAM.
 

I don't know. How do I change it to DX11?
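For reference, Battlefront 2 exposes an API toggle in the in-game Video settings. Frostbite titles also keep the flag in a plain-text profile file, so it can be flipped outside the game; the path and key name below are assumptions based on other Frostbite games (BF1/BFV), so verify them on your own install and back the file up first:

```shell
# Sketch only: the path and the GstRender.Dx12Enabled key name are
# assumptions for Frostbite titles; back up the file before editing.
PROFILE="$HOME/Documents/STAR WARS Battlefront II/settings/ProfileOptions_profile"

# 1 = DX12, 0 = DX11; flip the flag off if the file is present.
if [ -f "$PROFILE" ]; then
  sed -i 's/GstRender.Dx12Enabled 1/GstRender.Dx12Enabled 0/' "$PROFILE"
fi
```

If the key isn't in the file, the in-game menu toggle is the safer route.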
 
Even when changing to a card from the same brand that uses the same drivers, I will still do a complete driver clean with Display Driver Uninstaller (DDU), then remove the old card, install the new card, power up, and install the fresh drivers.

That's what I did when I upgraded my son's PC from an RX 580 to an RX 5700.
 
In this case I think it might be the bug that causes low fps when using the DX12 API. A Ryzen 7 1800X and RX 580 should have been capable of 60+ fps at medium or higher settings.
 
Solution