AMD Radeon RX 560 4GB Review: 1080p Gaming On The Cheap

Status
Not open for further replies.

firerod1

Distinguished
Jul 25, 2011
17
2
18,525


I meant this card, since it's priced at 1050 Ti levels while only offering 1050 performance.
 
4GB on the Radeon RX 560 = "Mining Card"

The minimal architecture (even with the extra CUs) can't use 4GB for gaming the way its big brother, the 570, can. The 2GB RX 560 even trades blows with its 4GB twin, along with the 2GB GTX 1050, at the $110-$120 price point for the gaming crowd.

Leave the RX 560 4GB for the "Entrepreneurial Capitalist" crowd ...

 

bit_user

Titan
Ambassador
I think your power dissipation figure for the 1050 Ti is wrong. While I'm sure some OC'd models use more, there are 1050 Ti cards with a 75 W TDP.

Also, I wish the RX 560 came in a low-profile version, like the RX 460 did (and the GTX 1050 Ti does). This excludes it from certain applications. It's the most raw compute available at that price & power dissipation.
 

senzffm123

Prominent
Oct 3, 2017
1
0
510
Correct, I got one of those 1050 Ti cards with a 75 W TDP in my rig; it doesn't need a power connector either. Hell of a card!
 

jdwii

Splendid
Man, AMD, what is up with your GPU division? For the first time ever you're letting Nvidia walk all over you in performance per dollar, performance per watt, and overall performance. This is very sad.

Whatever AMD is doing with its architecture and leadership in the GPU division needs to change. I can't think of a single time, two years ago or earlier, when Nvidia offered the better value.
 

nukedathlonman

Commendable
May 3, 2016
6
0
1,520
I flashed my XFX RX-460 to an RX-560 (no issues in doing this simple BIOS flash) and I have no complaints about it. It performs well (60 FPS is all I aim for, given my older 60 Hz non-FreeSync display) in gaming at 1920x1200 on high (Deus Ex: Mankind Divided) or medium-high settings (GTA V), with an overclocked Phenom II X6 backing it up. The GPU is still the bottleneck, but I don't care given the system's age and how little I've spent on it.
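For anyone attempting the same flash, a quick sanity check on the ROM dump can't hurt before programming it. The sketch below relies only on the standard PCI option-ROM signature (bytes 0x55 0xAA at offset 0); the size ceiling and the function name are illustrative assumptions, not anything specific to XFX's BIOS or the flashing tool.

```python
def looks_like_vbios(data: bytes, max_size: int = 512 * 1024) -> bool:
    """Rough pre-flash sanity check for a video BIOS dump.

    Every PCI expansion ROM begins with the 0x55 0xAA signature;
    the size ceiling here is just a loose assumption for VBIOS images.
    """
    return (
        len(data) >= 2
        and data[0] == 0x55
        and data[1] == 0xAA
        and len(data) <= max_size
    )

# A dump starting with the signature passes; junk does not.
fake_rom = bytes([0x55, 0xAA]) + bytes(254)
print(looks_like_vbios(fake_rom))     # True
print(looks_like_vbios(b"\x00\x00"))  # False
```

This only catches an obviously truncated or garbage dump; it says nothing about whether the ROM actually matches your board, so keep a verified backup regardless.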
 

nukedathlonman

Commendable
May 3, 2016
6
0
1,520
I flashed my 4GB XFX RX-460 to an RX-560 and have no complaints about it as either an RX-460 or an RX-560. I only aim for 60 FPS (older 60 Hz panel, no FreeSync) and get that at high or medium-high settings at 1920x1200. I do like how quiet my card is.
 
But they said they "couldn't wait" to review it, when they apparently could. :P

And technically, the RX 560 was released in the spring, not the summer, though it's possible they didn't get a unit in for review until a bit later. It is worth pointing out that the GT 1030 came out around the same time, and they had no problem getting a review up for that over two and a half months ago.

It also seems like an RX 560 review might have been worth prioritizing, in light of the fact that any higher-end cards from AMD have been priced out of the market for months due to cryptocurrency mining. Had it not been for the miners, the RX 570 would have likely been available for not much more than $150 by this point.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795


Good catch--should be 75W. Fixed!
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795


We didn't get RX 560s for our U.S. and German labs until recently--after all of the other craziness this year.
 
As someone pointed out, you can unlock the shaders on the RX 460 to match the full RX 560. This was figured out when it emerged that the Mac versions of the RX 460 were already shipping fully unlocked.
 

bit_user

Titan
Ambassador

It's actually not bad, if you look at the benchies. Particularly in DX12 and Vulkan, it's very close to the more expensive 1050 Ti. Even beats it, in one case.

As to your question about how this came about, the game changer seems to have been Nvidia's switch to tile-based rendering in Maxwell (the 900 series). AMD hasn't been able to catch up since.
 

Cryio

Distinguished
Oct 6, 2010
881
0
19,160
RX 560: As fast or faster than 1050.

If we add a tessellation override to x16 or x8 in the driver: substantially faster than the 1050 across the board, and probably on the same playing field as, or faster than, the 1050 Ti.

Conclusion: Must buy as a low-end GPU.
 


I don't think it's so much that they're not trying, it's that their cards were found to be better for cryptocurrency mining than Nvidia's, resulting in them being in short supply, and prices rose accordingly. From the launch of the RX 400 series last year, up until earlier this year, they were offering very good performance per dollar, and had compelling products readily available at the levels most people buy.

Just six months ago, you could find plenty of RX 480s for well under $200, offering performance close to a GTX 1060 for considerably less. At times, some 4GB RX 480s even went on sale with rebates bringing them down near $150-$160, about what you would currently pay for a 1050 Ti with far less performance. The only real reason to consider a 1050 Ti then would have been if you had a pre-built system with an underpowered PSU or small form factor, since for a little more you could get an RX 470 or 480.

They did take too long to fill in the high end of their range with Vega though. And I suspect that Vega would have been a much more impressive launch had it not been for mining messing up the market. Vega 56 and 64 might have had significantly lower official launch prices, and the cards would have probably been available for those prices, and not marked up further. Considering that their official launch price for the 8GB RX 580 was $229, it wouldn't have surprised me if Vega 56 would have been around $329 to $349, and Vega 64 around $429 to $449.

I would definitely like to see AMD work on their efficiency though, since Radeon cards used to be quite good when it came to that, often better than Nvidia. I'd rather not have the noise and heat from a 200+ watt graphics card in my system if possible, and Nvidia currently has them beat on that. Of course, with the recent mining shortages, having better efficiency at a similar performance level could have actually made availability even worse.
 

Nintendork

Distinguished
Dec 22, 2008
464
0
18,780
Using dual cores is plainly obsolete when, for pretty much the same price, you can get a Ryzen 5 1400/1500X, which is close to a non-OC 6700K.
 

Nintendork

Distinguished
Dec 22, 2008
464
0
18,780
Just stop using dual cores. If you want mainstream, use the Ryzen 5 1400/1500X; they're basically an i7-6700 at half the price. Why keep using Intel...
 

rafael_1414

Honorable
Apr 22, 2012
11
0
10,510
Because most current budget systems that will benefit from a GPU upgrade with a 560 or 1050 have an Intel dual-core CPU.
 

ddferrari

Distinguished
Apr 29, 2010
388
6
18,865

So based on your fictional drivers, you're stating this card is a must buy? Wow, is there a single AMD fanboi who doesn't do all their thinking in the "if and when" zone? The 1050 was equally fast overall, and cheaper. But of course, the 560 will surely age like fine wine and eventually surpass the 1080 Ti...
 