AMD Radeon RX 480 8GB Review


truegenius

Distinguished
BANNED
Here is a test I did on an HD 7950 to see the effect of temperature on power consumption. Unless FinFET is designed to run at 90°C (in which case I'll be wrong), this should mirror the RX 480's power consumption situation.
I used the FurMark stress test to find the relation between temperature and power consumption: GPU at stock clock, +20% power limit to stop it from throttling, fan fixed at 25% at the beginning, and when the GPU hit 90°C I upped the fan to 100% to reduce temps.
In the graphs below, you can see the reduction in power drawn by the GPU VRM (4th graph) as the GPU/VRM temperature decreases (3rd and 5th graphs).
At a 90°C GPU temperature, power consumption was 155 W; at 67°C it went down to 110 W. That is 41% more power consumption at a 34% higher temperature relative to 67°C, so we could expect at least 25% less power consumption if the GPU and VRM of the 480 ran around 65°C max.
Graph 1 shows GPU usage, which is constant at about 99% (FurMark extreme burn-in test), and graph 2 shows GPU clock, which is also constant, so the card is not throttling.

[Image: HWiNFO64 logs of GPU usage, GPU clock, GPU temperature, VRM power draw, and VRM temperature]
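For reference, here is the arithmetic behind the quoted percentages as a quick sketch; the extrapolation down to 65°C is a naive linear assumption on my part, not a measurement:

```python
# Quick arithmetic check of the HD 7950 numbers reported above.
# The two (temperature, power) points are the only measured inputs;
# the linear extrapolation to 65 C is an assumption, not a measurement.

p_hot, t_hot = 155.0, 90.0    # watts, degrees C at the hot end
p_cool, t_cool = 110.0, 67.0  # watts, degrees C after upping the fan

power_increase = (p_hot - p_cool) / p_cool * 100  # ~41% more power
temp_increase = (t_hot - t_cool) / t_cool * 100   # ~34% higher temp
print(f"{power_increase:.0f}% more power at {temp_increase:.0f}% higher temp")

# Naive linear slope between the two points, then extrapolate to 65 C:
slope = (p_hot - p_cool) / (t_hot - t_cool)  # ~1.96 W per degree C
p_at_65 = p_cool + slope * (65.0 - t_cool)   # ~106 W if the trend held
print(f"~{slope:.2f} W/C; extrapolated ~{p_at_65:.0f} W at 65 C")
```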
 

Alpha and Omega

Commendable
There is no problem with the card. 150 W is an approximation, and besides, as of its initial release date there was no new Crimson driver that fully supported it yet...
 

TJ Hooker

Titan
Ambassador

First off, the problem isn't so much the total power draw as it is the power draw through the PCIe slot.

Secondly, AMD has released Crimson 16.6.2, which adds support for the 480, and that's the driver that Tom's used in their review.
 

bit_user

Titan
Ambassador
Well, no.

E = I * R

So, if you decrease resistance (as a consequence of lower temps) while voltage stays fixed, more current flows and I'd expect power consumption to rise (P = E^2/R). Of course, I can imagine all sorts of reasons why that might not hold, like the GPU reducing its voltage at lower temps.
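Whether that expectation holds depends entirely on what is held constant; here's a toy sketch of the two limiting cases (the numbers are illustrative only, not real GPU rail figures):

```python
# What lower resistance does to power depends on what is held constant.
# Numbers below are illustrative only, not real GPU rail figures.

def power_const_voltage(v, r):
    """P = V^2 / R: at fixed voltage, lower R means MORE power."""
    return v ** 2 / r

def power_const_current(i, r):
    """P = I^2 * R: at fixed current, lower R means LESS power."""
    return i ** 2 * r

v, i = 1.0, 100.0
for r in (0.010, 0.008):  # resistance dropping as temperature falls
    print(f"R={r:.3f} ohm: P(V fixed)={power_const_voltage(v, r):5.1f} W, "
          f"P(I fixed)={power_const_current(i, r):5.1f} W")
```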
 

truegenius

Distinguished
BANNED
There is a CHiL IC on HD 7950 cards that supports power consumption monitoring; the graph comes from that, via the HWiNFO64 software. I also cross-checked it using a wall meter that measures total system power consumption (something like a Kill A Watt). I usually use it to make sure my 500 W power supply won't blow from an excessive overclock on the 1090T and HD 7950 :whistle:
 

TJ Hooker

Titan
Ambassador

I don't understand how Ohm's law ties into what you're saying.

I'm not certain, but here's an educated guess on why lower resistance means less power.
Switching transistors involves charging and discharging their gate capacitance. This means current must flow through a trace. Lowering the resistance of the trace reduces how long it takes for the capacitance to charge for a given voltage, meaning that with less resistance voltage could be lowered and still keep the same switching speed. Alternatively, keeping the same voltage would make the capacitance charge faster, which would be like driving the transistor harder, reducing rise/fall times. This would reduce the time during which both transistors in a CMOS pair are conducting, which would reduce power consumption.
Finally, a point related to temperature rather than resistance: IIRC, higher temps make it easier for electrons to tunnel through the gate dielectric, increasing leakage current and therefore power consumption.

Again, I'm no expert, just doing some semi-educated speculation.
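A back-of-the-envelope version of that two-part picture, dynamic switching power plus temperature-dependent leakage, might look like the sketch below. Every constant here is invented for illustration; only the general shape of each term is the point:

```python
# Toy CMOS power model: dynamic switching power plus leakage that grows
# roughly exponentially with temperature. All constants are invented;
# only the general shape of each term is the point.

def dynamic_power(alpha, c_eff, v, f):
    """Switching power: activity * effective capacitance * V^2 * frequency."""
    return alpha * c_eff * v ** 2 * f

def leakage_power(p_ref, t_c, t_ref=25.0, doubling=20.0):
    """Leakage that roughly doubles every `doubling` degrees above t_ref."""
    return p_ref * 2 ** ((t_c - t_ref) / doubling)

v, f = 1.0, 1.12e9  # voltage is a guess; clock is roughly RX 480-like
p_dyn = dynamic_power(alpha=0.2, c_eff=5e-7, v=v, f=f)  # ~112 W, fixed

for t in (67.0, 90.0):
    p_lk = leakage_power(p_ref=10.0, t_c=t)
    print(f"{t:.0f} C: dynamic {p_dyn:.0f} W + leakage {p_lk:.0f} W "
          f"= {p_dyn + p_lk:.0f} W")
```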
 

truegenius

Distinguished
BANNED
@bit & TJ: Ohm's law applies to conductors, but in semiconductors it applies only if temps are constant. In simple terms, when the GPU is at 100% utilization the chip behaves like a resistor with constant resistance R, and we need constant current I to keep the transistors toggling their states, thus we need a constant V equal to IR. If temps are constant and the load on the GPU is reduced, it becomes like a high-resistance resistor, with a lower current requirement and lower power consumption; a close analogy is a DC motor at stall load versus no load.
But unlike a DC motor (in which higher temps increase the coil's resistance and thus decrease power consumption), when temps rise in semiconductor chips, electron leakage increases because of the tunneling effect (as TJ stated), due to which the chip consumes more power. I think this tunneling effect is why we are seeing the excessive power consumption of the RX 480. AMD cheaped out on the heatsink, when they should actually have used fewer VRM phases and lower-capacity VRMs. The engineers behind the architecture did a great job, and we can expect 6k cores at clocks between 1 and 1.3 GHz under 250 W with HBM2 (maybe 8k too, under 300 W :D), but the card designer was probably drunk at work :p
But all of this holds only if the RX 480 behaves like my HD 7950 at 90°C (well, it should, as they are both semiconductors), which is why we need tests on this by professionals with NASA's equipment.
 
Semiconductors tend to experience thermal runaway, some (e.g. germanium) to a much greater extent than others (e.g. silicon): the hotter they get, the more current flows through them, and the more current flows through them, the hotter they get. Good cooling can keep this somewhat under control, but there's a reason germanium is not used in high-power circuits.
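That feedback loop is easy to sketch. Here's a toy simulation (all numbers invented) showing how the same leakage curve either settles or runs away depending on how much heat the cooler can remove per degree above ambient:

```python
# Toy thermal-runaway loop: leakage adds heat, heat adds leakage.
# All numbers are invented; the point is the feedback, not the values.

def simulate(cooling_w_per_c, steps=300):
    t = 25.0          # die temperature, degrees C
    p_fixed = 100.0   # temperature-independent power, W
    for _ in range(steps):
        p_leak = 10.0 * 2 ** ((t - 25.0) / 20.0)    # doubles every 20 C
        p_removed = cooling_w_per_c * (t - 25.0)    # cooler vs. ambient
        t += 0.05 * (p_fixed + p_leak - p_removed)  # crude thermal mass
        if t > 150.0:
            return "runs away past 150 C"
    return f"settles near {t:.0f} C"

for cooling in (3.0, 1.5):  # watts removed per degree above ambient
    print(f"cooling {cooling} W/C -> {simulate(cooling)}")
```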
 

joebrann

Commendable
It seems like a lot of Nvidia fanboys are just trying to shoot down AMD. I'm not saying this is a top-of-the-line card, but it's much better than what a lot of these comments are saying.

If anyone knows about Bitcoin/alt-coin mining: I have the cards set at stock, with the only exception being that I set the max temperature to 73°. The fan has yet to pass 60%, and I can barely hear it at all sitting next to it. I am posting 2 pics of how they're running.

I will also test my 2 cards on a few games and put it in a video on YouTube. Here are the pics... :) enjoy and decide for yourself.

http://imgur.com/yx8Qesi
http://imgur.com/VN3UwG0
 


I would even say it is better to just wait for the AiB boards anyway.
 

InvalidError

Titan
Moderator

If the VRM phases are really split four on PCIe and two on AUX, as they said in their follow-up, I would really worry about how much hotter those two AUX phases are going to get if their load gets increased by 10-15%.
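To put rough numbers on that worry: the 4/2 split and the ~150 W board power come from this thread, the ~75 W slot ceiling is the PCIe spec figure, and the even per-phase sharing is my simplification.

```python
# Rough per-phase arithmetic for the 4-on-PCIe / 2-on-AUX split above.
# The ~150 W board power and the split come from this thread; the 75 W
# slot ceiling is the PCIe spec; even sharing per phase is a simplification.

board_power = 150.0
pcie_phases, aux_phases = 4, 2
total_phases = pcie_phases + aux_phases

slot_load = board_power * pcie_phases / total_phases  # ~100 W via the slot
aux_load = board_power * aux_phases / total_phases    # ~50 W via the 6-pin
print(f"slot ~{slot_load:.0f} W (spec ceiling ~75 W), AUX ~{aux_load:.0f} W")

# If an overclock (or a fix) pushes 10-15% more load onto the AUX phases:
for bump in (0.10, 0.15):
    per_phase = aux_load * (1 + bump) / aux_phases
    print(f"+{bump:.0%} AUX load -> ~{per_phase:.0f} W per AUX phase")
```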
 


I guess it depends on which AiB you get them from. An Asus Strix or ROG will be a custom PCB design, as will most of Sapphire's, Gigabyte's, and MSI's high-end GPU designs, which should allow them to add VRM phases on the AUX side to help mitigate the increased load from overclocking the VRAM and core.

The stock-clocked ones I imagine they won't touch, clock-speed-wise, since it is evident the card won't be a very happy camper, which is probably why certain boards have been fried (apart from the guy who uses them for Bitcoin, whose board fried at stock speeds or lower).
 
I believe Sapphire is the original AMD factory; I'm not sure if they are building all the reference models of the RX 480 or if AMD has another partner doing it. Whoever is doing it may have some models of their own that are identical to the reference models. I hope the board partners' models come out soon.

Is it just me, or did others notice in the discussions about damaged motherboards on the AMD website that the majority seemed to be ASRock motherboards?
 


That always seems to be the case. Even on the Nvidia side, where the stock GTX 1080 would thermal throttle.
 


Sapphire is the one who designs the PCB and cooler for AMD's reference model, and that is what becomes the reference design, but I would assume AMD has a lot of input on it.

It is possible, but I have seen an MSI and an Asus too. ASRock is used a lot because they normally offer many of the same features as Asus and MSI but are usually a bit cheaper.



I agree, unless people plan to water-cool, as most water blocks are designed with the reference PCB in mind. However, EKWB looks to be coming out with water blocks for custom cards such as the Asus Strix now.
 
ASRock boards are usually very feature-rich at their price points. The issue I always worry about when selling them is that the boards seem a bit thinner than those from the top three brands: ASUS, Gigabyte, and MSI. Anyway, I expect AMD to have the fix out by the end of the week.
 
Some of the ASRock boards I bought a few years ago were indeed rather thin, but the newer ones are much better. Gigabyte and Biostar both have thick, sturdy boards; MSI is somewhere in the middle. The H110 I'm looking at now is a little thin, but not so dangerously thin that it will crack.
 

nate1492

Distinguished
Can we get a new review with compatibility mode enabled? I mean, it's really a bit of a cheat that AMD defaults to a mode that is highly out of spec... But it would be a real shame if no one re-reviews with the actual fix and people are duped into a card that's 5-10% worse.
 

neblogai

Distinguished


Benchmarks so far show a performance loss of about 1-3%, so it's no big deal. But I'm sure this will be properly reviewed somewhere once the AIB model reviews are done. For now, there is this:
http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-RX-480-Power-Consumption-Concerns-Fixed-1671-Driver/Performance-Va
 
I used to sell a lot of MSI motherboards, then I started having issues in the late '90s and early 2000s with their boards breaking when customers tried to upgrade the CPU, so I started pushing ASUS and Gigabyte more. After my RMA nightmare with ASUS, I push Gigabyte more these days. MSI addressed the perception of their boards being fragile with their Military Grade components campaign, but now it seems all manufacturers are copying that advertising.
 


I thought the Asus Sabertooth 55i was the first to use "Military Grade" components:

https://www.asus.com/Motherboards/SABERTOOTH_55i/
 

No, it's real. I don't know if it's truly "Military Grade", but it does indicate upgraded, higher-quality components. I can't link it, but there was an old article where this was tested, and it found there was some truth to the claims.
 