Question Will I see much difference in energy cost between a 200 watt and 260 watt GPU?

Adam1998

Distinguished
Dec 26, 2015
This is probably a silly question but I'm looking at a new GPU and torn between a 3060 ti, sapphire 6700 xt and the 6700 xt nitro plus (I can get one grade B at Overclockers)

Will I see a noticeable cost increase between running these two cards? Bear in mind, I may only game for long stretches on a Sunday or something, and other times it's Unreal Engine work.
 

I have no idea how much power those cards will actually use in reality.

However... where I live, I pay 18 cents per kWh.

A 60 watt difference would add about 11 cents a day to my energy bill if that extra 60 watts were drawn 10 hours a day.

About 40 dollars a year.

Adjust depending on your kWh cost, your hours per day, and whatever the real power usage difference between those 2 cards would be.
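If you want to plug in your own numbers, the math is just watts × hours × electricity rate. Here's a quick sketch using the same example figures as above:

```python
# Rough cost of an extra 60 W of draw, using the example figures above.
extra_watts = 60        # power draw difference between the two cards
hours_per_day = 10      # how long that load runs each day
price_per_kwh = 0.18    # electricity price in dollars per kWh

kwh_per_day = extra_watts / 1000 * hours_per_day
cost_per_day = kwh_per_day * price_per_kwh
cost_per_year = cost_per_day * 365
print(f"~${cost_per_day:.2f} per day, ~${cost_per_year:.0f} per year")
# With these figures: about $0.11 a day, or roughly $39-40 a year.
```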
 

Karadjgne

Titan
Ambassador
Also depends highly on your psu. An 80+ (White) is only 80% efficient, so if one card pulls 50w extra inside the pc, that's closer to 63w extra from the wall just for the gpu, which can add up in a hurry for someone like Lafong. If your psu is Platinum rated, that's around 95% efficiency, so that 50w extra would only be about 53w from the wall, which still adds up, just not nearly as fast.

Total, if your pc is using 250w, that's roughly 313w from the wall with a White or 263w with a Platinum, about a 50w saving. With the higher-draw card, 300w becomes roughly 375w or 316w, about a 60w saving, and that extra 50w of card draw actually costs around 62w at the wall on the White versus 53w on the Platinum.

That's one 60w lightbulb turned off, or roughly ten overhead LED lights, every hour.

Over the course of a monthly bill, the energy difference adds up to not much more than a hamburger, ± the cheese.
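If you want to redo that with your own PSU, wall draw is just the DC load divided by the PSU's efficiency. A quick sketch using the same 80% and 95% figures from above (real efficiency varies with load and PSU model, so treat these as ballpark numbers):

```python
# Wall power = DC load / PSU efficiency (ballpark figures from the post above).
def wall_draw(dc_watts, efficiency):
    """Estimate power pulled from the wall for a given DC load."""
    return dc_watts / efficiency

for label, eff in [("80+ White", 0.80), ("Platinum", 0.95)]:
    base = wall_draw(250, eff)     # system with the lower-power card
    higher = wall_draw(300, eff)   # same system with a card drawing 50 W more
    print(f"{label}: {base:.0f} W -> {higher:.0f} W (+{higher - base:.0f} W at the wall)")

# Roughly: 80+ White ~312 W -> ~375 W (about 62 W extra),
#          Platinum  ~263 W -> ~316 W (about 53 W extra).
```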
 
Keep in mind, the official TDPs of cards don't always accurately reflect their typical power draw.

A 3060 Ti has a 200 watt TDP, while a 6700 XT has a 230 watt TDP, and Sapphire advertises their Nitro+ as having a 260 watt TDP. However, judging by the power testing in this TechPowerUp review of the Nitro+, its power draw tends to be almost identical to a reference 6700 XT in most scenarios...
https://www.techpowerup.com/review/sapphire-radeon-rx-6700-xt-nitro/35.html

They show both 6700 XT cards drawing an identical 7 watts for idle power consumption when the system is not under load, which is actually slightly lower than a 3060 Ti's 9 watts at idle, while both drew 21 watts during video playback, slightly higher than the 3060 Ti's 17 watts. So typical desktop power draw between all of these cards is likely to be fairly similar.

They did, however, find the 6700 XT cards to draw more power when idling with two monitors running at different resolutions (a setup that can drive up power draw on some cards), at 33 watts versus the 3060 Ti's 17 watts. That could add up if you leave your computer on most of the time with certain multi-monitor configurations, but it's probably more of a niche scenario.

And under their tested gaming load (running Cyberpunk 2077 at 1440p Ultra without RT), both the reference 6700 XT and the Nitro+ drew 221 watts, while the 3060 Ti drew 199. That's only a 22 watt difference, or about 11% higher power draw, while they showed the 6700 XT to push around 15% higher frame rates than the 3060 Ti under those same conditions. And across their entire test suite of 22 games, they found the 6700 XT cards to be almost 9% faster on average than a 3060 Ti in games without RT enabled at 1440p, so the higher power draw roughly matches the cards' higher performance. It should be noted, however, that in most games with RT effects enabled, the 3060 Ti tends to perform notably better, though they didn't do RT power testing.
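Just to put those two percentages side by side, here's a rough perf-per-watt comparison using the review's Cyberpunk power numbers and its ~15% frame rate gap (the relative-fps values below are illustrative, not exact figures from the review):

```python
# Rough perf-per-watt comparison from the Cyberpunk numbers quoted above.
# The review reports ~15% higher frame rates for the 6700 XT at 1440p Ultra
# without RT, so relative fps is taken as 1.15 vs 1.00 (illustrative values).
power_6700xt, power_3060ti = 221, 199          # measured gaming power draw (W)
rel_fps_6700xt, rel_fps_3060ti = 1.15, 1.00    # assumed relative performance

ppw_6700xt = rel_fps_6700xt / power_6700xt
ppw_3060ti = rel_fps_3060ti / power_3060ti
print(f"6700 XT perf/W ~= {ppw_6700xt / ppw_3060ti:.2f}x the 3060 Ti's")
# With these numbers, that works out to roughly 1.04x, i.e. the extra power
# is more or less matched by the extra performance.
```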

The only part of their testing where the Nitro+ was shown to draw more power than a reference 6700 XT was the FurMark stress test, where the Nitro+ went up to 254 watts, while the reference 6700 XT stayed at 217 watts and the 3060 Ti at 197 watts. So it seems that stress-testing was the only way they got the card to approach its higher power limit. It's possible that some games or other workloads might also manage to draw more power on that card, but judging by the Cyberpunk testing, along with the card's very similar performance to the reference model across their test suite, that's probably not the norm. They did, however, show both 6700 XTs to draw notably more power than the 3060 Ti when enabling 60Hz V-Sync at 1080p in Cyberpunk to cap frame rates at 60fps, with the 6700 XTs drawing 125-130 watts compared to the 3060 Ti's 73 watts. So, in frame-limited, moderately demanding workloads, the 3060 Ti might potentially draw less power, though it's hard to say from testing just that one game under those conditions.
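For a rough sense of what that measured gaming gap would actually cost for the kind of usage described in the question, here's a quick estimate. The hours per week and electricity price are placeholder assumptions (I've used a UK-ish rate since the card is coming from Overclockers), not anything taken from the review:

```python
# Yearly cost of the ~22 W gaming-load gap measured by TechPowerUp
# (221 W for the 6700 XT cards vs 199 W for the 3060 Ti in Cyberpunk 2077).
# Hours and price below are placeholder assumptions; substitute your own.
gaming_gap_watts = 221 - 199       # measured difference under the gaming load
hours_gaming_per_week = 8          # e.g. one long Sunday session (assumption)
price_per_kwh = 0.30               # rough UK-ish price in GBP per kWh (assumption)

kwh_per_year = gaming_gap_watts / 1000 * hours_gaming_per_week * 52
cost_per_year = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.1f} kWh and ~£{cost_per_year:.2f} per year")
# With these assumptions: about 9.2 kWh, or roughly £2.75 a year.
```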

Overall, I don't think the power draw differences between any of these cards will make much of a difference to how much they cost in electricity. On the desktop, power draw should in most cases be similarly low across all of them, with a possible exception being some multi-monitor setups, where the 6700 XTs might draw a little more. In games, the power draw of the 6700 XTs may be a little higher, but performance should be a little higher as well, at least outside of ray tracing. Power draw in something like Unreal Engine is hard to say, and will likely depend on how much demand the editor is placing on the hardware.