AMD Radeon RX 480 8GB Review



Aug 7, 2015
To me, it appears that AMD built the Reference cards to barely meet the minimum requirements of the GPU and their marketing.

Regarding the power issue, as long as you never overclock the Reference card it seems to work as well as any modern, high-performance card. The moment you OC, though, the GPU needs more than the Reference card can deliver.

Don't overclock the Reference cards!



I suspect it will be a temporary power-limit driver fix, followed up with a new BIOS. Hopefully it doesn't affect performance or stability... I am half keen on one of these cards if they can fix this issue in hardware on the partner cards. IMO, if you have a PCIe power connector, that should be maxed out first, with the PCIe slot power taking up the slack. There is very little chance a PCIe power connector is going to fry, compared to the PCIe slot, and the noise and distortion caused by overdraw across the whole motherboard.
Yes, but the 960 being only 1 watt behind the RX 480 means it's overdrawing too. The post even said as much: it's pulling more power than normal through the 75-watt PCIe power connector. That could damage the PSU, which in turn could damage the motherboard. Even with all the PSU safeguards there is still a risk. Note this was the same issue noted here last year: ,review-33113-8.html
Just like now, AMD already has a beta driver to fix this issue. Here is AMD's statement:
As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8Gbps for GDDR5. Recently, we identified select scenarios where the tuning of some RX 480 boards was not optimal. Fortunately, we can adjust the GPU's tuning via software in order to resolve this issue. We are already testing a driver that implements a fix, and we will provide an update to the community on our progress on Tuesday (July 5, 2016).
I've been doing it for many years. So unlike watching a fire, or other people working, I'm getting tired of it.
It's fun to watch them hyping before release and then facepalming at the actual product, but that's like 5 minutes of fun :)



Well if you go to the first post on this thread and click on the "Stop tracking this thread" link then that should stop notifications.


They might not test it, I wouldn't... don't just read the gaming benches, read the power section. Wait for the AIB cards; I expect they will come with 8-pin connectors and thereby avoid grossly exceeding the PCIe spec.




RX480 Crossfire review at TPU:

Radeon RX 480 CrossFire is not a viable solution if you plan to buy two cards upfront. When averaged over all our games, it is consistently slower than a single GeForce GTX 1070 at all the resolutions that matter - 1080p, 1440p, and 4K. Instead of buying two cards upfront, you're much better off putting your monies into a single GTX 1070, not just for better performance but to dodge the spectre of application multi-GPU support, which continues to haunt both SLI and CrossFire.


Sep 25, 2015

I wonder if that bit better performance will cost a bit more money?

As a side note: a piece of me does find it humorous that manufacturers have managed to convince a generation of consumers that to get the performance you paid for (and for some reason deserve?), you MUST tune your products up yourself. Underperforming cards from the factory are now a benefit, and you only get your "money's worth" if you push your equipment out of spec.

This applies whether a design team is presenting a 'reference' unit for sale or OEM manufacturers are improving on the reference design at their factory. Even if clocked higher than reference from the factory, the card is still expected to be 'nicely overclockable' to be worth more than mere dust.
My guess (which is just a guess) is that it will be priced around $250, or $230 if Nvidia decides to seriously beat the AMD team.
The overclocking generation forced manufacturers to accept that some users want to get the most out of their HW and thus should be allowed to play with settings while the stock settings will work for all.
Don't forget that the overclockability of a part depends not just on the silicon lottery, but also on environmental factors like ambient temperature and airflow within the case, which vary greatly.

Nvidia does not set prices. Third parties like Asus do that, so it will boil down to what is selling. If the GTX 1060 isn't selling, they will be forced to drop the price.

Nvidia just sells the GPUs to the companies (ignoring the Founders/reference edition; I'm not sure how Nvidia's profit works with that).

Nvidia (and AMD) set the prices, since the GPU is the most expensive part of the card, basically costing more than the rest of the components together.
So the price of the GPU will define the card's price range. The partners' margin is very slim.


Yes, it was on one of the AIB GTX 960s, not all of them.

Not true if you buy a factory-overclocked GPU. Those chips are binned, so you pay to "win" the silicon lottery. My EVGA 970 SSC ACX 2.0+ GPUs overclock to 980 performance without breaking a sweat or needing any voltage increase. I am running the same overclock on those GPUs as these guys did:


Don't take the sentence out of context :p My claim as a whole is true. And even within the binned parts, you still have a lottery.


Jul 2, 2016
What are the settings for GTA 5?
Because I have an RX 480 on very high settings and I'm getting 35 to 50 fps.
Am I missing something here?

I didn't...I directly addressed your silicon lottery comment and nothing else. There's a reason factory overclocked GPUs overclock higher than reference speed GPUs from the same baseline starting point. Ambient temps and case airflow that you mentioned are just controls. Now what can be debated is how well each OEM bins their chips for factory overclocked GPU models. For example, ASUS's STRIX GPUs have a long and distinguished history of not overclocking nearly as well as say a Gigabyte G1 GPU.

GPU binning is an expensive process. Most (Nvidia) GPUs are only slightly overclocked at the factory compared to their full potential, so there is no need to bin them. The only truly binned GPUs I'm aware of are EVGA's, which have minimum ASIC quality levels. And even that does not guarantee overclock headroom. No company is building mainstream GPUs and testing each one individually; they assume a performance level that will work for close to 100% of chips under average conditions, based on the average chip specs.
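The binning described above can be sketched as a simple filter: measure a quality metric per chip and keep only chips above a cutoff for the binned SKU. All numbers here are made up for illustration; the 70% cutoff is an assumption, not EVGA's actual threshold.

```python
# Toy sketch of GPU binning: hypothetical ASIC quality scores per chip,
# filtered against an assumed minimum for the factory-overclocked bin.
chips = [68.2, 74.5, 71.0, 65.8, 80.1, 69.9]  # hypothetical ASIC quality %
MIN_ASIC = 70.0                                # assumed bin cutoff (illustrative)

binned = [q for q in chips if q >= MIN_ASIC]   # chips that qualify for the binned SKU
rejected = [q for q in chips if q < MIN_ASIC]  # sold as lower/reference SKUs instead

print(f"binned: {binned}, rejected: {rejected}")
```

Note that, as the post says, even the chips that pass the cutoff still vary among themselves, so binning narrows the lottery without eliminating it.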


I don't get all the negativity this card is receiving. It performs on par with last-gen parts a tier above it, such as the 390 and 970, and it destroys the 960 and 380. This is what I expect from a new GPU: chase the previous gen one tier higher, and beat the last-gen same-tier cards. The 960 was slow and the 380 wasn't a huge leap from the 280, but this is different. If you have an older card such as a 7850, GTX 760, or even a 280X, this thing is a solid upgrade. Hell, it's a solid upgrade from a 380X or a GTX 770 even. View it for what it is: a solid mid-range card using a smaller, cheaper die, not a flagship part with a massive die. A 280X still holds its own at 1080p, and the 770 seems to be holding on too. But the 480 is a definite upgrade from either of those cards.

As far as the power draw goes, there have been plenty of cards that drew more than 75 watts from the PCIe slot with nobody complaining. The 960 was one of them, as were a few 950s. The 960 was known to spike to over 200 watts at times, surely drawing more than specified. The gaming load tested was also Metro at 4K, something no mid-range card can actually run; it must have looked like a PowerPoint presentation. I wouldn't be at all afraid to slot one of these in my board, especially for 1080p at 60 FPS. It can even handle 1440p decently. I'm quite impressed with it personally.
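The distinction this post leans on, brief spikes versus sustained draw, is easy to see numerically. Here is a minimal sketch with made-up slot-power samples (not measured data) showing how a card can spike well past the slot budget while its average stays closer to it:

```python
# Hypothetical 12 V slot-power samples in watts; values are invented
# purely to illustrate average vs. peak, not taken from any review.
samples_w = [60, 65, 72, 90, 68, 74, 110, 66, 70, 80]

SLOT_12V_BUDGET_W = 66  # PCIe slot 12 V rail budget (5.5 A at 12 V)

avg_w = sum(samples_w) / len(samples_w)
peak_w = max(samples_w)
frac_over = sum(1 for w in samples_w if w > SLOT_12V_BUDGET_W) / len(samples_w)

print(f"average {avg_w:.1f} W, peak {peak_w} W, "
      f"{frac_over:.0%} of samples over {SLOT_12V_BUDGET_W} W")
```

A short spike stresses the connector far less than a sustained average over budget, which is why reviewers report both numbers.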

Looking forward to the 490's for sure.



Occasionally peaking above 75 W, sure. Averaging 80 W with continuous peaks above 100 W on the PCIe slot's 12 V rail? Not so much. That's 50% over the 66 W spec for the slot's 12 V pins, and if you pass 50% more current through those pins, pads, and solder joints, that's 2.25x the I²R losses (a 125% increase), before accounting for the resistance rise from the increased local heating.
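The arithmetic behind that claim is straightforward: at a fixed 12 V, current scales linearly with power, and resistive losses in the pins scale with the square of the current. A quick check using the numbers from the post:

```python
# I^2R scaling for the PCIe slot's 12 V pins. At constant voltage,
# current is proportional to power, and conductor losses go as I^2.
SPEC_W = 66.0   # 12 V budget of the PCIe slot (5.5 A * 12 V)
PEAK_W = 100.0  # sustained peaks cited in the post

current_ratio = PEAK_W / SPEC_W  # ~1.52x the rated current
loss_ratio = current_ratio ** 2  # I^2R losses scale with current squared

print(f"current: {current_ratio:.2f}x spec, I^2R losses: {loss_ratio:.2f}x")
```

So 100 W through a 66 W budget means roughly 1.5x the current and about 2.3x the resistive heating in the pins and solder joints, matching the "125% more losses" figure above.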