Electric Bill: AMD and Nvidia Mathematics Calculated!

We commonly see on the forums and elsewhere: "Nvidia uses less power and puts out less heat." It is a very common argument for why one should choose an Nvidia graphics card. We also commonly see the argument that AMD has better performance per dollar, and others argue back that the electricity bill is higher with AMD, so in the long run Nvidia is worth it. This thread compares common graphics cards and shows how much you can actually save in electricity one way or the other.

United States: Electricity costs in each state: http://www.eia.gov/electricity/state/

The link above lists the electricity cost in each state in cents per kilowatt-hour (kWh). One kilowatt is 1000 watts, so the rate is how many cents you are charged for running a constant 1000W load for one hour. Hawaii is the most expensive at about 34 cents per kWh.
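As a quick sanity check of that definition (using the Hawaii rate mentioned above, with everything else just arithmetic):

```python
# A constant 1000W load running for one hour uses exactly 1 kWh,
# so at Hawaii's roughly 34 cents/kWh it costs about 34 cents.
watts, hours, cents_per_kwh = 1000, 1, 34
kwh_used = watts / 1000 * hours
print(kwh_used * cents_per_kwh)  # 34.0 cents
```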

Nvidia GTX 980Ti vs AMD R9 Fury X
Gaming Conditions
According to Nvidia's website here, the GTX 980Ti is rated at 250W. The Tomshardware review here says it draws an average of 233W under gaming load. The Guru3D review of the 980Ti here also puts it at 250W. The non-reference MSI GTX 980Ti Lightning review here on Guru3D shows it drawing 302W, 52W more than the reference. The Techpowerup review here measures 211W under average gaming load. By contrast, the MSI 980Ti Gaming review here shows it consuming 262W. This averages out to about 252W.

AMD's website does not list power consumption. The Tomshardware review here shows the Fury X drawing 221W on an average gaming load, which is lower than the GTX 980Ti. However, the Fury X does draw more power when stress tested. The Guru3D chart here, on the other hand, shows the Fury X drawing 294W. Those are two very different values; it may be that the GPU was not stressed at 100% in the Tomshardware review, whereas it may have been in Guru3D's. This one requires more research. Techpowerup's review here has the average gaming load at 246W. This averages out to about 254W.

Considering how similar the Fury X and 980Ti averages are, any difference in electricity cost is negligible, and in practice it will depend on which specific 980Ti is being compared. As can be seen, the non-reference MSI 980Ti cards draw far more power than the reference card, so no firm conclusion can be drawn when comparing the two. The Fury X undoubtedly draws more power under a stress test, but under a gaming load the results vary from review to review.

Nvidia GTX 980 vs AMD R9 390X
Gaming Conditions
According to Nvidia's website here, the GTX 980 is labelled as a 165W card. The Tomshardware review here of the Windforce GTX 980 shows it drawing in 186W while gaming. The Guru3D review here shows the reference to be around 171W while the Gigabyte G1 is 191W. This brings it to an average of 178W.

AMD's website does not list power consumption for cards. The Tomshardware review of the MSI R9 390X here shows it drawing in an average 292W while gaming. The Guru3D review here has it at 258W. Considering the large difference there, I dug deeper and found the Guru3D review on the Asus Strix 390X here which labels the card at 289W. PCworld here shows the Asus Strix to draw in 275W of power. The average of these is 279W of power under load.

If electricity is 10 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 980 cost per hour under load: 2.22 cents per hour = $2.03 per month with 3 hours gameplay per day
R9 390X cost per hour under load: 3.49 cents per hour = $3.18 per month with 3 hours gameplay per day

Monthly GTX 980 savings with 3 hours gameplay daily: $1.15 per month = $13.80 savings per year

If electricity is 20 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 980 cost per hour under load: 4.44 cents per hour = $4.06 per month with 3 hours gameplay per day
R9 390X cost per hour under load: 6.98 cents per hour = $6.36 per month with 3 hours gameplay per day

Monthly GTX 980 savings with 3 hours gameplay daily: $2.30 per month = $27.60 savings per year

If electricity is 30 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 980 cost per hour under load: 6.66 cents per hour = $6.09 per month with 3 hours gameplay per day
R9 390X cost per hour under load: 10.47 cents per hour = $9.54 per month with 3 hours gameplay per day

Monthly GTX 980 savings with 3 hours gameplay daily: $3.45 per month = $41.40 savings per year

If you happen to game n hours per day, multiply the monthly or yearly savings by (n/3).
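For anyone who wants to plug in their own card, rate, or hours, this is roughly the arithmetic behind the numbers above: take the card's average gaming wattage, divide by the PSU efficiency to get the draw at the wall, and multiply by your rate, with a month treated as about 30.4 days (365/12). The 178W and 279W figures are the averages from this section.

```python
# Sketch of the cost math used in this thread (it reproduces the figures above):
# assumes ~30.4 days per month and 80% PSU efficiency.
DAYS_PER_MONTH = 365 / 12  # ~30.4

def cost_per_hour_cents(gpu_watts, cents_per_kwh, psu_efficiency=0.80):
    """Cents per hour of gaming, counting PSU losses at the wall."""
    wall_watts = gpu_watts / psu_efficiency
    return wall_watts / 1000 * cents_per_kwh

def cost_per_month_dollars(gpu_watts, cents_per_kwh, hours_per_day=3, psu_efficiency=0.80):
    """Dollars per month for `hours_per_day` of gaming every day."""
    cents = cost_per_hour_cents(gpu_watts, cents_per_kwh, psu_efficiency) * hours_per_day * DAYS_PER_MONTH
    return cents / 100

# GTX 980 (178W average) vs R9 390X (279W average) at 10 cents/kWh:
gtx_980 = cost_per_month_dollars(178, 10)   # ~$2.03 per month
r9_390x = cost_per_month_dollars(279, 10)   # ~$3.18 per month
print(round(gtx_980, 2), round(r9_390x, 2), round(r9_390x - gtx_980, 2))  # ~$1.15/month saved
```

The same math covers the 970/390, 960/380, and 950/370 sections below; only the wattages and rates change.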

Nvidia GTX 970 vs AMD R9 390

Gaming Conditions

According to the official Tomshardware review of the GTX 970 and 980 here, the Windforce GTX 970 consumes about 179W while gaming. Nvidia's reference model is rated at 145W on their website here, so let's say the minimum is 145W and the maximum is 180W; the GTX 970 will operate somewhere within that range. Guru3D has it at 168W here, which is within that vicinity. Averaging these three gives 164W of power consumption while gaming.

The R9 390 power consumption is not labelled on AMD's website. According to Tomshardware's review here on the Sapphire Nitro R9 390, the power consumption is 255W under a gaming load. Unfortunately, I cannot find other reputable sources online for the 390, as I only see 390X reviews, so we'll have to stick with this.

If electricity is 10 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 970 cost per hour under load: 2.05 cents per hour = $1.87 per month with 3 hours gameplay per day
R9 390 cost per hour under load: 3.19 cents per hour = $2.91 per month with 3 hours gameplay per day

Monthly GTX 970 savings with 3 hours gameplay daily: $1.04 per month = $12.48 savings per year

If electricity is 20 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 970 cost per hour under load: 4.10 cents per hour = $3.74 per month with 3 hours gameplay per day
R9 390 cost per hour under load: 6.38 cents per hour = $5.82 per month with 3 hours gameplay per day

Monthly GTX 970 savings with 3 hours gameplay daily: $2.08 per month = $24.96 savings per year

If electricity is 30 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 970 cost per hour under load: 6.15 cents per hour = $5.62 per month with 3 hours gameplay per day
R9 390 cost per hour under load: 9.57 cents per hour = $8.73 per month with 3 hours gameplay per day

Monthly GTX 970 savings with 3 hours gameplay daily: $3.11 per month = $37.32 savings per year

If you happen to game n hours per day, multiply the monthly or yearly savings by (n/3).

Nvidia GTX 960 vs AMD R9 380
Gaming Conditions

According to Nvidia's website here, the GTX 960 is rated at 120W. According to Tomshardware's review here, the average gaming power consumption across various GTX 960 cards is 99.5W. According to Guru3D, the GTX 960 consumes about 123W under load, which is far closer to Nvidia's specification than the Tomshardware figure. Averaging the three gives 114W of power draw under gaming load.

AMD does not include TDP on their site for their cards. According to the Tomshardware review of the MSI R9 380 here, the 380 draws an average of 185W while gaming. According to the Guru3D review of the Asus Strix R9 380 here, the R9 380 consumes 196W under gaming load on average. Averaging these two gives about 191W of power consumption. Proportionally, the gap between the GTX 960 and R9 380 (191/114, roughly 1.7x) is greater than the gap between the GTX 970 and R9 390 (255/164, roughly 1.6x).

If electricity is 10 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 960 cost per hour under load: 1.42 cents per hour = $1.30 per month with 3 hours gameplay per day
R9 380 cost per hour under load: 2.39 cents per hour = $2.18 per month with 3 hours gameplay per day

Monthly GTX 960 savings with 3 hours gameplay daily: $0.88 per month = $10.56 savings per year

If electricity is 20 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 960 cost per hour under load: 2.84 cents per hour = $2.60 per month with 3 hours gameplay per day
R9 380 cost per hour under load: 4.78 cents per hour = $4.36 per month with 3 hours gameplay per day

Monthly GTX 960 savings with 3 hours gameplay daily: $1.76 per month = $21.12 savings per year

If electricity is 30 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 960 cost per hour under load: 4.26 cents per hour = $3.90 per month with 3 hours gameplay per day
R9 380 cost per hour under load: 7.17 cents per hour = $6.54 per month with 3 hours gameplay per day

Monthly GTX 960 savings with 3 hours gameplay daily: $2.64 per month = $31.68 savings per year

If you happen to game n hours per day, multiply the monthly or yearly savings by (n/3).

Nvidia GTX 950 vs AMD R7 370
Gaming Conditions

According to Nvidia's website here, the GTX 950 consumes 90W of power. The Tomshardware review of the Asus Strix GTX 950 here shows it consuming 110W under load. Guru3D's review here of the same card labels it as 114W. The average of these is 105W of power consumption.

AMD does not include power specs on their website. The GTX 950 review on Tomshardware also includes power draw for the XFX Radeon R7 370, which is 122W. The Tomshardware review here of the MSI R7 370 puts it at 107W. Guru3D's review of the Asus Strix 370 shows 143W. The average of these is 124W, only 19 more watts than the GTX 950.

If electricity is 10 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 950 cost per hour under load: 1.31 cents per hour = $1.20 per month with 3 hours gameplay per day
R7 370 cost per hour under load: 1.55 cents per hour = $1.41 per month with 3 hours gameplay per day

Monthly GTX 950 savings with 3 hours gameplay daily: $0.21 per month = $2.52 savings per year

If electricity is 20 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 950 cost per hour under load: 2.62 cents per hour = $2.40 per month with 3 hours gameplay per day
R7 370 cost per hour under load: 3.10 cents per hour = $2.82 per month with 3 hours gameplay per day

Monthly GTX 950 savings with 3 hours gameplay daily: $0.42 per month = $5.04 savings per year

If electricity is 30 cents per kWh where you live, and power supply is at 80% efficiency:

GTX 950 cost per hour under load: 3.93 cents per hour = $3.60 per month with 3 hours gameplay per day
R7 370 cost per hour under load: 4.65 cents per hour = $4.23 per month with 3 hours gameplay per day

Monthly GTX 950 savings with 3 hours gameplay daily: $0.63 per month = $7.56 savings per year

If you happen to game n hours per day, multiply the monthly or yearly savings by (n/3).

Conclusion

In an ideal world, the money saved by choosing Nvidia over AMD would double with a card that is double the performance, but this is not always the case. Choosing the GTX 980 over the R9 390X saves about 30% more money than choosing a GTX 960 over an R9 380 (about $1.15 versus $0.88 per month at 10 cents/kWh), yet the GTX 980 is far more than 30% faster than the 960, and the 390X is far more than 30% faster than the 380. This means the 390X offers better efficiency for the money than the 380, relative to their Nvidia competitors. I personally found the 380 to be the worst in terms of efficiency compared to its competitor, the GTX 960.

The GTX 950 has barely any electric bill savings over the R7 370. The 370 on average uses only about 20 more watts than the 950, so the savings are extremely minimal. Of AMD's 300-series cards, the 370 is undoubtedly the most efficient for its performance. Getting into the higher-end cards such as the R9 390 vs the GTX 970, going with the GTX 970 will save around $25 after two years at 10 cents per kWh, and many people upgrade to a new card after about two years anyway.

The 980Ti compared to the Fury X was the most difficult to find consistent information on: 980Ti power consumption figures ranged widely, from 211W up to 302W depending on the review and the specific card. When everything was averaged out, the Fury X and 980Ti came very close in power consumption. The Fury X does use slightly more power and is slightly less powerful than the 980Ti. However, it still takes the top spot in the list below, as Fiji did a good job of keeping power consumption down.

When comparing the GTX 980 vs 390X matchup to the GTX 970 vs 390 matchup, the results are very similar. Both the 980 and 390X use only a little more power than the cards below them, and considering the 25-30% performance difference between the 980 and 970 and between the 390X and 390, along with only about an 11% increase in money saved ($1.15 versus $1.04 per month at 10 cents/kWh), the 390X is actually one of the better AMD cards in terms of power efficiency against its competitor. The 380 fared the worst. The following ranks the AMD cards (power-wise, relative to their Nvidia counterparts and the performance on offer), best to worst:


1. R9 Fury X
2. R7 370
3. R9 390X
4. R9 390
5. R9 380
 

hpram99

I'm still not sure I understand how electric companies advertise their rates. San Diego is $0.42/kWh; I've never seen these "average" rates in the two states I've lived in over the past 12 years.
Every single watt of idle usage is $3.68/yr.
I gained more savings by going with an 80 Plus 450W PSU. We tend to overpower our systems and waste a lot of idle power. Seeing these numbers makes me happy I went with a GTX 970.

Looking forward to seeing the R9 Nano on the list :)
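For what it's worth, that per-watt idle figure lines up with the quoted San Diego rate; a minimal check, assuming $0.42/kWh:

```python
# One watt of constant idle draw over a year, priced at $0.42/kWh.
hours_per_year = 24 * 365                      # 8760 hours
kwh_per_watt_year = 1 / 1000 * hours_per_year  # 8.76 kWh per watt-year
print(kwh_per_watt_year * 0.42)                # ~$3.68 per year
```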
 


I'm just going by the government website. It could indeed be that electricity costs more in your specific city. At 42 cents per kilowatt-hour, a 970 is definitely worth it over a 390. I made the calculations very simple so anyone can adapt them to their own prices and daily usage.

Efficiency at idle plays a fairly minor role, because the whole computer may only be drawing around 40W, so even if the supply is inefficient at that load it barely affects the bill.
 

hpram99

That's not entirely correct. Efficiency drops dramatically below 20% output; this is common to all power supply designs. 50% efficiency at <10% load is a very real number, so a 40W idle load could mean 80W at the outlet. Assuming the machine sits idle 24x7, avoiding that waste is $147/yr in savings!

Also, I think government websites only list the lowest tier rate. San Diego is $0.11/kWh for Tier 1. Unless you invest in solar power, you're practically guaranteed Tier 4 ($0.42/kWh); many cities use tiered billing that isn't accurately reflected in government reports.

Regardless, I appreciate the simple math; it's a nice quick reference, and I wish this were a more common practice among reviewers.
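That $147 figure follows from the same per-watt math, assuming the whole 40W of PSU overhead counts as avoidable waste and the Tier 4 rate of $0.42/kWh applies:

```python
# 40W of wasted PSU overhead (80W at the wall for a 40W load), running 24x7, at $0.42/kWh.
wasted_watts = 80 - 40
kwh_per_year = wasted_watts / 1000 * 24 * 365  # ~350 kWh per year
print(kwh_per_year * 0.42)                     # ~$147 per year
```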
 

g-unit1111

So what happens when you add CPU power consumption to the mix? Because those might be completely different numbers. The rated TDP for each of these CPUs is:

Intel Core i5-4690K / Intel Core i7-4790K: 88W
Intel Xeon E3-1250: 80W
Intel Core i5-6600K / Intel Core i7-6700K: 91W
Intel Core i7-5820K/i7-5930K: 140W
AMD FX 6300: 95W
AMD FX 6350: 125W
AMD FX 8320/8350/8370: 125W
AMD FX 8320E/8370E: 95W
AMD FX-95XX: 220W

So going by your post plus that information, those numbers could look drastically different. If you plug in, say, a GTX 980 Ti + an i7-5820K, you would be looking at an estimated wattage of 140W + 250W = 390W.

But an i7-6700K + a 980 Ti = 91W + 250W = 341W, which is 49W less than the 5820K setup.

Or with an AMD card, an i7-6700K + an R9 390X = 91W + 275W = 366W.

It might not seem like much, but it will add up on your electric bill over time.
 


The power supply is not going to be under 10% load while gaming, and under 50% efficiency would be quite bad even at low load; I've never seen that in the low-load tests in JonnyGuru reviews. I used 80% as a reasonable average that also covers lower-end power supplies.
 


But this thread is solely for comparing GPUs. Realistically, you could calculate the CPU savings or losses separately and simply combine them with the GPU figures to get a good overall picture.
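As a rough sketch of what that combined estimate could look like, reusing the same cost math as the GPU sections above: the CPU numbers are g-unit1111's TDPs treated as gaming draw (which overstates real-world consumption somewhat), with 80% PSU efficiency, 3 hours per day, and 10 cents/kWh assumed.

```python
# Combined CPU + GPU gaming cost sketch. TDPs are from g-unit1111's post and are
# treated as gaming draw, which is pessimistic; the GPU figure is the 250W GTX 980 Ti rating.
DAYS_PER_MONTH = 365 / 12

def monthly_cost_dollars(total_watts, cents_per_kwh=10, hours_per_day=3, psu_efficiency=0.80):
    wall_watts = total_watts / psu_efficiency
    return wall_watts / 1000 * cents_per_kwh * hours_per_day * DAYS_PER_MONTH / 100

# i7-5820K (140W) + GTX 980 Ti (250W) vs i7-6700K (91W) + GTX 980 Ti (250W)
build_5820k = monthly_cost_dollars(140 + 250)  # ~$4.45 per month
build_6700k = monthly_cost_dollars(91 + 250)   # ~$3.89 per month
print(round(build_5820k - build_6700k, 2))     # ~$0.56/month difference from the CPU alone
```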