AMD Radeon RX 480 8GB Review



It's funny you should mention Doom. It's Vulkan rather than DX12, but the results are the same (if not even better for AMD).
 


Nvidia killed the 1060 when they removed SLI from it. I don't know what they were thinking... this is a stupid move.
 
What are the use cases for SLI'd 1060s that cannot be accomplished cheaper or more simply using a single more powerful card? The mock SLI tests I've seen through DX12 put them about on par with (or ever so slightly ahead of) a single 1080. They will not give you a noticeable benefit at 4K, and every lower resolution can be handled just as well with a 1080 or 1070, the latter of which is cheaper than two 1060s.

If there are other tests I've not seen, by all means I'd love to look at them. But I think the lack of SLI is almost a non-issue.
 


True, but my point was that DX12 is not an API that is used enough, and DOOM uses a more popular API, OpenGL, with the option to use Vulkan added later.

However the 10 series from nVidia is also benefiting from Vulkan. Not as much as AMD though.

My only point is that if I were to buy a GPU tomorrow, I would look at performance in the games I play and plan to play, rather than benchmarks. It is much like 3DMark: I have never considered that or any other synthetic benchmark important.
 


Because some people can't pay $650 up front, so they add the second card later, after a year or so, next holiday... And it is not about 4K; 1440p is common now, and so is triple FHD.

The RX 480 is the mid-range winner, IMO...
 


Yeah that makes sense. I guess it also depends how long you tend to keep your graphics card - people who upgrade every 1-2 years don't really have to worry about DX12 and Vulkan much at this point, while those who keep their cards longer are in a different position.

I do believe Vulkan will pretty soon become more popular than OpenGL. I can't see many reasons to stay with OpenGL. How Vulkan will fare against DX12 is more uncertain.
 


I can think of 3 major scenarios panning out that will decide how Vulkan does in the market:

1.- Windows 10 adoption: since DX12 is (very stupidly) tied to Win10, those (like me) who want to stay with 7 or 8 have no choice but to suck it up and wait.
2.- The XB1 will dictate that most devs still use DX12 for easier porting. The PS4 using a custom OGL variant (AFAIK) doesn't help Vulkan either.
3.- Vulkan and DX12, since both are based on Mantle, should be close enough to make it somewhat easy to jump from one to the other.

I really wish Vulkan takes off. Most of the big criticisms of OGL were left behind with Vulkan. I really hope devs embrace it now 🙁

Cheers!
 


I am sure it will. It is finally at the point where it is superior. It had a rough start, but it is good to see it leapfrog OGL. I just hope they continue the support. One of OGL's biggest downfalls was that it would get ignored for long stretches while DX gained features OGL didn't have.



1. Same thing happened with DX11 and XP/Vista. At some point you will just have to move on.

2. DX12 is not a big change for the devs, since their custom variant of DX11 was already using a lot of what DX12 has to offer.

3. That's the problem. People assume it is based on Mantle. While it employs similar features, those features are not Mantle-specific and are done in a different way. Vulkan is Mantle-based but DX12 is not, so it will not be as easy to just switch back and forth.

Mantle was good in that it forced Microsoft to push DX12 faster but I don't think it deserves all the credit when it was not the first to do what it did (GLIDE anyone?) and DX11 was already doing "closer to the metal" on the XB1.
 


🙁 Glide... I miss the old days. I wish 3dfx had never gone bankrupt, or at least that Intel had bought it like AMD bought ATI...

Now we are stuck with only AMD and Nvidia... I miss Matrox too; they should enter the 3D market...
 


3dfx got bought by nVidia, and its SLI tech became nVidia's in the long run.

It would be nice to have more competition, but the biggest problem is that it would dilute the market with even more hardware configurations, which would mean even more issues at launch for games.
 


Well, I always believe the market solves most of the issues segmentation brings; as long as the patent trolls stay in their cages, haha.

In particular, in a world with even more graphics APIs, GPU makers would have to make their chips bigger and clunkier, or make cards specialized for one API or another (if the APIs were very different, which wouldn't be the case). And for developers, graphics frameworks would become a nice market to explore. It still is one, even with just two major APIs and GPU makers (ignoring Intel on purpose, haha), so you can imagine how good it would be, cash-wise, for them.

History has proven that when there is too much choice, you end up with three at best. Carnivores, herbivores, and omnivores! xD

Cheers!
 


Looking at mobile 3D GPUs, we have Adreno, Mali, PowerVR, Apple, etc., and mobile games work on all of them without issues.

They should enter the PC market !!!!

Intel made a stupid mistake twice: first when they did not buy 3dfx, and second when they did not buy ATI.

I don't know which idiot CEO of Intel passed on those two purchases...

I'd prefer ATI inside an Intel CPU instead of an AMD CPU...

Also, I wish 3dfx had lived on inside Intel, giving us three options: AMD (ATI), Intel (3dfx), Nvidia.

Oh yes, and I have no idea why Silicon Graphics ignored the PC market; Nvidia itself was founded by engineers who left SGI.

 
Don't buy this. Looking at the specifications and reading all the reviews, this card does not have audio capability over HDMI. Neither does nVidia's 10 series. I've asked if anyone knew anything about sampling rates, but clearly that information does not exist.
 

AMD's GPUs have had audio over HDMI at least since the HD5xxx series, likely earlier. When I check audio outputs on my PC, the AMD HDMI Audio driver lets me select 24 bits at 48 kHz max.
 
None of that addresses any of my points. So let's try again and hopefully you'll follow along.

Who said you had to drop $650 on graphics for 1440p? Did you even look at the benchmarks? Both the 480 and 1060 are pushing 50 fps or more in nearly every game in these benchmarks at 1440p (aside from The Division, at around 40 fps, they're above 45 fps). Meaning you can get good 1440p performance for $300 or less. If you want to max out 1440p, a 1070 is only $430 right now. Hell, you could even get a 970 on the cheap right now and still get good 1440p performance.

And no, actually, 2560x1440 and 5760x1080 are not common resolutions, going by Steam's stats (not all-inclusive, but a pretty good representation of the users who would actually consider SLI 1060-level performance). 2560x1440 primary displays make up only 1.56% of Steam gamers surveyed. Triple FHD is even lower at 0.92% (2.52% of multi-screen setups means 2.52% of the 36.5% of people with 1920x1080 primary displays). Now, those numbers are a little skewed because they include all the people doing casual gaming on old laptops, integrated GPUs, etc., and those people aren't in the dGPU market. But even if you were to drop every entry in those stats below 1600x900, that doesn't quite double the remaining percentages. You still have less than 5% of total Steam users with 1440p or triple-1080p setups.
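For what it's worth, the arithmetic above checks out. A quick sketch, using only the survey figures quoted in the post:

```python
# Sanity-check of the Steam survey arithmetic quoted above.
# All figures come from the post itself: 36.5% of surveyed users have a
# 1920x1080 primary display, 2.52% of those run multi-screen setups,
# and 1.56% of all users have a 2560x1440 primary display.
fhd_primary_share = 0.365     # 1920x1080 primary displays
multi_screen_share = 0.0252   # of FHD users, fraction with multi-screen setups
qhd_primary_share = 0.0156    # 2560x1440 primary displays

# Triple-FHD users as a share of ALL surveyed users.
triple_fhd_share = fhd_primary_share * multi_screen_share
print(f"Triple-FHD share of all users: {triple_fhd_share:.2%}")  # ~0.92%

# 1440p plus triple-FHD combined.
combined = qhd_primary_share + triple_fhd_share
print(f"1440p + triple-FHD combined:   {combined:.2%}")          # ~2.48%
```

Even doubling that combined figure (to discount the casual-laptop entries) still lands under the 5% ceiling mentioned above.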

As for the tired argument that you can spread the cost across a longer time with getting SLI "in the future," I asked about the use case, not the money. Tell me why you would want SLI'd 1060s down the road. What would you hope to gain out of it that it would be a better overall option to the many alternatives? Especially if you consider the new cards that will become available in that future when it comes time to buy a second 1060. Diffusing the cost alone does not completely answer that question. I can diffuse the cost of getting a good CPU by buying an i3 now and then buying an i7 in the future. But that doesn't mean it's a great idea to do that with the many alternative routes.
 


Well, I always buy high-end cards... For me, get the fastest first, then go dual. But people who can't pay it all at the beginning go the mid-range SLI road, and Nvidia made a mistake losing those buyers. They do exist; I've seen them all the time.

As for triple FHD, well, it is common around me; almost everyone I know uses three monitors, either 3x FHD or 3x 1440p. In the past it was expensive, but now you can get an FHD monitor for only $100... 3x FHD has become affordable.

 

It is only a "mistake" if Nvidia actually wanted to keep that segment of the market (increasing the ticket price by $50 along the way would be a strange way to go about it if that was Nvidia's intention) and fails to convert a significant chunk of those presumably lost sales into extra 1070/1080 sales.

From Nvidia's perspective, getting rid of low-end SLI is a good thing: it saves development costs on "budget" GPUs and eliminates potential competition within its product lineup.
 


Well, the way I see it is that they handed the mid-range market to AMD on a silver platter... The RX 480 is still CF-enabled. Actually, just a week ago, a student I know bought the RX 480 and passed over the GTX 1060 for this particular reason.
 
SOME go that road. Most do not. Many people in the $200 - $300 GPU budget range don't upgrade more than every three years. Consider that cards such as the GTX 650, 660, and 760, as well as the 7700/260 and 7800/270/370, are all still very common in Steam's stats. Each more common than even the 980.

I never said they don't exist. I'm saying, just like InvalidError, they are an almost insignificant segment of the GPU market and catering to them doesn't make NVidia any money.

You're making the classic Kael error of assuming something is much more prevalent than it really is simply because you're surrounded by it and insulated from everything else. The truth is Nixon did win, SLI/CF is relatively uncommon across all gamers, and triple-screen gaming even less so. This is a short but good article on this very phenomenon in the tech world.

Which is among the rarest setup of all. So seldom encountered, it's simply grouped under "Other" in Steam's stats.

Certainly monitor prices have come down. That wasn't in debate. However, just because you can get a 5760x1080 setup for ~$350 doesn't mean many people do it. First, it takes a LOT of desk space. Second, not all games support it. Third, it requires a lot more graphical horsepower, meaning more expensive GPUs. Many gamers can afford ~$300 for a decent 1080p monitor and a $150 - $200 GPU to game on. Far fewer can afford the $450 extra it takes for triple-screen gaming. I think this is one reason the 21:9 monitor market is growing as well as it is. You get a little extra periphery without monopolizing desk space.
 


:) A lot of people on low budgets love to play on three screens, and as I said, they get the other card for more GPU power later...

They start by getting a PC with one monitor and a mid-range card, and the next holiday they add a second card and buy two extra monitors...


Maybe not that common if you're looking for a high percentage, but the market exists.

21:9 screens are not as good; they fall short for productivity outside gaming. People use PCs for other things, not only gaming, and three screens are much, much better.

Thanks for your reply.
 
Problem is that dual-GPU support is shifting to be the responsibility of the game developers, at their own expense. SLI and CrossFire are well on their way to becoming a thing of the past. The value of purchasing that second GPU will only continue to deliver diminishing returns over time. I think Nvidia is just being realistic about this trend and not trying to make promises that won't pan out.

RX 480 CrossFire loses out on its overall relative performance big time due to the number of games that don't scale well (6 out of 16)... In games that do scale, you're treated to upwards of an 85% performance uplift. From our Radeon R9 Nano CrossFire review till now, we see that AMD hasn't really spent a lot of time optimizing CrossFire for more games. We can't fault AMD too much, though, because since then, a lot of games that came out don't support multi-GPU at all due to engine limitations. There's not much AMD or NVIDIA can do about such games, and this is what scares us about multi-GPU solutions going forward.

With the advent of DirectX 12, we are promised new multi-GPU rendering modes thanks to Microsoft giving developers more control over per-GPU resource allocation, but we doubt that we'll see widespread use of those techniques. Nowadays, games are developed for consoles first (which are single-GPU), and publishers have little interest in spending a lot of developer time (= money) on adding support for exotic multi-GPU configurations that are used by only a small percentage of their customers.

https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/19.html
(chart: relative performance at 1920x1080)
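A rough back-of-envelope shows how a handful of non-scaling titles drags down the overall result. The 85% uplift and 6-of-16 figures come from the quoted review; the assumption that the non-scaling games gain nothing at all is mine:

```python
# Hypothetical average CrossFire uplift across the 16-game suite,
# assuming the 10 scaling games each gain ~85% and the 6 non-scaling
# games gain nothing (figures per the TechPowerUp quote above).
games_total = 16
games_scaling = 10            # 16 titles minus the 6 that don't scale
uplift_when_scaling = 0.85    # ~85% uplift where CrossFire works

# Average uplift over all 16 games; non-scaling titles contribute 0%.
avg_uplift = games_scaling * uplift_when_scaling / games_total
print(f"Average uplift over all games: {avg_uplift:.0%}")  # ~53%
```

So even with near-ideal scaling where it works, the second card averages out to roughly half a card's worth of extra performance across the whole suite.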
 
And as usual, you fail to cite any supporting evidence or facts for this. So why should I, or anyone, take your word over mine or anyone else's?

Do you even think about the things you post before you hit the submit button? You just admitted to nearly everything I and others have been saying, yet tried (unsuccessfully) to twist it in your favor. Just because a market exists does not make it profitable. On the contrary, it's typically difficult to make a profit on a small market with low margins, and lower margins are exactly what mainstream and mid-range GPUs have.

This suggests you know very little about monitor markets. You do know 21:9 was marketed first to business and professional users before gamers picked up on it, right? Partially because it's the same aspect ratio many movies are shot in, so it works well for video editing. But also because you can split windowed applications across the screen and still have a good view of both (1280x1080 and 1720x1440 are much more productive than 960x1080). Meaning a single monitor can function as two while taking up less space and requiring a single video port.
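The split-window widths quoted above follow directly from the panel resolutions; a trivial check:

```python
# Width of each window when splitting a monitor into two side-by-side
# panes. Resolutions are the common 16:9 FHD and 21:9 panels mentioned
# in the post.
monitors = {
    "1920x1080 (16:9 FHD)": 1920,
    "2560x1080 (21:9)": 2560,
    "3440x1440 (21:9)": 3440,
}
for name, width in monitors.items():
    print(f"{name}: two {width // 2}-px-wide windows side by side")
```

Half of 1920 is 960 px per window, while the 21:9 panels give 1280 px and 1720 px respectively, which is where the productivity advantage comes from.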

Complete strawman and non sequitur. I never suggested computers were only for gaming. And pivoting to productivity when we're talking gaming GPUs makes no sense at all.
 

I think far many more people simply don't see the appeal of having triple displays with bezel gaps in-between. I have tried a few games across three displays and found the bezel gaps to be simply too distracting and irritating to bother with and would much prefer a single larger display for gaming. Also, as Jaron said, nobody I know has a computer desk large enough to comfortably host three displays larger than 20". I own one of the biggest computer desks of anybody I know and I have barely enough space to fit my three monitors on it, almost makes me wish I could replace them with a single 30" UHD display.
 


Thanks InvalidError,

I'm a new buyer. I don't want to read or research old reviews and specifications; I care about the current specs and reviews. If they don't state what the supported sampling rates are (like 88.2 kHz or 176.4 kHz), then I'm not interested.
All the reviews I've seen so far are incomplete and useless. I don't game, but I would like a powerful card for madVR use and for my music collection, which includes the mentioned sampling rates.
Cards like these are not just for gamers, but all the reviews and specs are aimed at gamers only. Useless.

Does anyone know the audio sample rates supported?
 