Sapphire Launches A New Flagship: Nitro R9 Fury

"Sapphire said the Nitro R9 Fury’s PCB includes 6-phase circuitry on an 8-layer PCB, which features 2 Oz of copper, enabling Sapphire to push 360 Amps through the GPU. "


Well, 360 Amps in a single GPU is absolutely incredible! Perhaps a little too incredible... :)

Maybe you meant 360 watts.
 

Epsilon_0EVP

From what I remember, the power delivery circuitry on the card steps the voltage down to about 1 V by the time it reaches the GPU, so 360 A is roughly equivalent to 360 W; they're just quoting the figure in amps.
 

Moeseph de tyre

360 amps would be correct. P = VI: 1.2 volts times 360 amps equals 432 watts. So there's headroom for a hungry Fiji when overclocked.
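For anyone who wants to plug the numbers in, here's a minimal Python sketch of that check (the 1.2 V core voltage is the figure assumed in the post above, not an official spec):

core_voltage = 1.2    # volts; assumed GPU core voltage from the post above
rated_current = 360   # amps; Sapphire's quoted VRM rating

# Electric power formula: P = V * I
power = core_voltage * rated_current
print(f"VRM headroom: {power:.0f} W")  # prints 432 W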
 

kcarbotte

Contributing Writer
Editor


It's possible that Sapphire meant watts, but the release clearly states 360 amps.
Just because the PCB can handle that much power doesn't mean it will ever happen in the real world. I believe this is just Sapphire overbuilding the PCB, but you bring up a good point.

I will reach out to Sapphire for clarification.
 
From what I remember, the power delivery circuitry on the card steps the voltage down to about 1 V by the time it reaches the GPU, so 360 A is roughly equivalent to 360 W; they're just quoting the figure in amps.

True, but I don't think they are reducing the voltage and greatly increasing the amperage. That would seem like a bit of unnecessary circuitry.
 
That card's GPU operates at 1.22 Volts when running a 3D load. The 12 Volt input needs to be stepped down to 1.22 Volts by the VRM circuit.

Since that card can draw over 440 Watts when running Furmark, Ohm's Law tells us that the GPU will draw:
440 Watts ÷ 1.22 Volts ≈ 361 Amps.
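Same formula rearranged for current; a minimal Python sketch using the figures quoted above (440 W worst case, 1.22 V core):

power_draw = 440      # watts; worst-case card draw under Furmark, per the post above
core_voltage = 1.22   # volts at the GPU core under a 3D load

# I = P / V
current = power_draw / core_voltage
print(f"GPU current draw: {current:.0f} A")  # prints 361 A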
 

Epsilon_0EVP



That's not Ohm's Law, but the calculations themselves are correct.
 


When I took Electrical Engineering in university, it sure was.
[Image: Ohm's law formula wheel (ohms-law.gif)]
 

Epsilon_0EVP



From the ECE/Physics courses I took, what is now known as Ohm's Law is only the equation V = IR (and any directly related equations). The formula P = VI is the electric power formula. Perhaps it was taught differently where/when you studied.

The diagram you provided only uses the two laws in conjunction to write any one of the four variables in terms of two of the others. Note that no single expression in the diagram has more than two variables on the right-hand side, indicating two degrees of freedom, consistent with two independent equations.
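To make the two-equation point concrete, every spoke of that wheel comes from substituting one relation into the other, for example:

\[ P = VI, \quad V = IR \;\Rightarrow\; P = (IR)\,I = I^2 R, \qquad P = V \cdot \frac{V}{R} = \frac{V^2}{R} \]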
 

Epsilon_0EVP



As explained already, the card's power delivery transforms the 12V input into around 1.22V by the time it gets to the GPU. 360A at that voltage is perfectly reasonable.
 
Historically, that ain't how specs on GPUs - or CPUs for that matter - are reported. Presumably, reading the figure as 360 W, you can compare the power delivery available from the system against what the card requires:

from the PCIe x16 slot: 75 W
from the two 8-pin (6+2 pin) connectors: 2 × 150 W = 300 W

Max system power delivery = 375 W
Card power requirement = 360 W

See how that worked out?
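A minimal Python sketch of that budget (the two 8-pin connectors are the Nitro R9 Fury's actual configuration, which is what makes the 375 W total work out):

slot_power = 75        # watts from the PCIe x16 slot, per the PCIe spec
connector_power = 150  # watts per 8-pin (6+2 pin) connector, per the PCIe spec
num_connectors = 2     # the Nitro R9 Fury carries two 8-pin connectors

max_delivery = slot_power + num_connectors * connector_power
card_requirement = 360  # watts, reading Sapphire's 360 figure as watts
print(f"{max_delivery} W available vs. {card_requirement} W required")  # 375 vs. 360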

 

dwhapham

That card's GPU operates at 1.22 Volts when running a 3D load. The 12 Volt input needs to be stepped down to 1.22 Volts by the VRM circuit.

Since that card can draw over 440 Watts when running Furmark, Ohm's Law tells us that the GPU will draw:
440 Watts ÷ 1.22 Volts ≈ 361 Amps.

Guys - 361 Amps is impossible. Think about it: most of us have our PCs plugged into a 110 V, 15 or 30 amp outlet. So tell me how your video card is going to pull 361 amps from that same outlet? (Spoiler: it's not.)
 


No one said it would be pulling 361 Amps from the wall outlet. The GPU chip may draw up to 361 Amps from the graphics card's VRM circuit at 1.22 Volts DC.

The graphics card itself will draw less than 37 Amps from the PSU's +12V rail, and less than 5 Amps AC from a 115 VAC outlet, assuming a PSU conversion efficiency of 88%, for example.
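All three figures are the same worst-case draw expressed at different voltages; a minimal Python sketch (the 440 W draw and 88% efficiency are the example figures from the posts above):

card_power = 440       # watts; worst-case card draw under Furmark
gpu_voltage = 1.22     # volts at the GPU core
rail_voltage = 12.0    # volts on the PSU's +12V rail
psu_efficiency = 0.88  # example PSU conversion efficiency
wall_voltage = 115.0   # volts AC at the outlet

print(f"At the GPU core: {card_power / gpu_voltage:.0f} A")                  # ~361 A
print(f"On the +12V rail: {card_power / rail_voltage:.1f} A")                # ~36.7 A
print(f"From the wall: {card_power / psu_efficiency / wall_voltage:.1f} A")  # ~4.3 A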
 


If you're basing your example on history, then why didn't you cite the AMD Radeon R9 295X2, which has only two 8-pin PCIe power connectors and can draw up to 646 Watts?

AMD has been known to violate the PCIe specs. For the Radeon R9 295X2 to function, they even state that the PSU needs to be able to provide at least 28 Amps (i.e. 336 Watts) from the +12V rail(s) to each 8-pin PCIe connector.
 
... If you're basing your example on history, then why didn't you cite the AMD Radeon R9 295X2, which has only two 8-pin PCIe power connectors and can draw up to 646 Watts?
You're conflating total system power load, Skippy, with the individual power consumption of a 2-GPU video card.

[Image: R9 295X2 power consumption during gaming, per-card detail (11-R9-295X2-Power-Consumption-Gaming-Detail.png)]


But please feel free to ramble on . . .



 

TJ Hooker


You're right, GPU specs typically mention power, not current. However, in this case they're specifically talking about the PCB and VRMs, where discussing current makes complete sense.

Here's another example: http://www.tomshardware.com/news/amd-fury-x-fiji-preview,29400.html
"The board features a six-phase power design that was engineered to handle up to 400 A of current."




It's still drawing nearly 450 watts, significantly more than the 375 W that the PCIe slot plus two 8-pin connectors are rated to supply. So ko888's point still stands.
 


The Tom's Hardware graph that you've linked to shows the graphics card's power consumption during gaming, not the system's total power consumption. Gaming does not show the card's maximum power draw.

The only one confused is you.

Here's another review of that card where they actually measure the graphics card's maximum worst-case power consumption when running Furmark. It doesn't measure system power consumption, which is useless here, because they're not doing a system review.

[Image: maximum power consumption chart (power_maximum.gif)]


This German review, which measures only the graphics card's power consumption, also backs up what I said. When they ran Furmark, the card drew 630 Watts.
 