News: AMD Says Most Gamers Don't Care About GPU Power Consumption

And almost none are actually going to take the time to try to estimate what the annual cost difference is going to be for them (if they are even paying the power bill).
I agree, that's why I did it for them.
Personally, I care more about the average power, particularly when the PC is on but a game is not being played. Peak power doesn't mean a lot if it only happens 5% of the time.
Actually, at peak power, the difference between an RX 7900 XTX and an RTX 4080 is even smaller than when they're gaming. I don't know why, but that's exactly why I chose TPU's gaming chart: it's where the gap between those cards is greatest, at 52W. The difference in maximums is only 44W. This article is just a fanboy making excuses for why he likes nVidia better. I'm honestly shocked that it got past the editor's desk.

It's funny how articles like this didn't exist when RDNA2 and Ampere were in vogue. I also haven't seen anyone writing special articles slamming Intel 13th-gen CPUs for their power consumption, even though CPU power consumption is a much bigger deal because it's harder to cool and the CPU is always in use. There is no time when a computer is being used that the CPU isn't taxed somewhat, and when you're doing something like gaming, it's a race between the CPU and GPU as to which hits 99% utilisation first. However, when doing 2D tasks, the power-sucking 3D accelerator part of a video card is literally dormant.

Just look at this:
[Chart: power-applications.png — average CPU power draw across applications]

The most powerful current-gen CPU from AMD is the R9 7950X3D, with an average power draw of 128W, while the i9-13900K uses 41W more. This is an average across ALL programs and, unlike a GPU's 3D accelerators, the CPU gets no real down-time as long as the PC is turned on, because the CPU is used for literally everything.
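To put that in perspective, here's a minimal sketch of the comparison being made, using the 41W CPU delta above and the 52W gaming delta from the TPU chart; the daily hours are hypothetical assumptions, purely to illustrate the "always in use vs. only while gaming" point:

```python
# Rough annual-energy comparison: an always-active CPU delta vs a gaming-only GPU delta.
# The 41W and 52W figures come from the numbers quoted in this thread; the daily hours
# below are assumptions made only for illustration.

CPU_DELTA_W = 41      # i9-13900K vs R9 7950X3D, average across applications
GPU_DELTA_W = 52      # RX 7900 XTX vs RTX 4080, while gaming

HOURS_ON_PER_DAY = 8      # assumed time the PC is powered on and in use
HOURS_GAMING_PER_DAY = 2  # assumed time spent gaming

def annual_kwh(delta_watts: float, hours_per_day: float) -> float:
    """Convert a power delta applied for some hours per day into kWh per year."""
    return delta_watts * hours_per_day * 365 / 1000

print(f"CPU delta (always in use): ~{annual_kwh(CPU_DELTA_W, HOURS_ON_PER_DAY):.0f} kWh/year")      # ~120
print(f"GPU delta (gaming only):   ~{annual_kwh(GPU_DELTA_W, HOURS_GAMING_PER_DAY):.0f} kWh/year")  # ~38
```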
For people who have to rely on window ACs in summer, higher power may also require an AC upgrade that isn't necessarily possible - I cannot fit anything much bigger than what I already have in my narrow windows. I doubt a bigger AC would fit within the circuit breaker budget either.
Sure, but a 52W difference (when gaming) won't be enough to cause that. It's less heat than a standard incandescent light bulb and you're not gaming 24/7.
 

Just Curious

Commendable
Sep 16, 2020
3
4
1,515
TLDR I'm not surprised at all.

Outside of laptops, phones, and other mobile electronics, where battery life is a concern, I haven't met a gamer who cared about power consumption. All that mattered was performance. I built computers for people as a side gig when someone was looking to buy or build a new machine, and back when SLI was still a thing, people who could afford it would opt for SLI setups. My personal experience spans the last two decades of meeting new people, going to LAN events, traveling for work, and everywhere else.
 

AgentBirdnest

Respectable
Jun 8, 2022
271
269
2,370
Just being honest: I didn't care. Then I built a "powerful" system that produced loads of heat along with that power. It made the office/game area too hot to sit in during summer, particularly while gaming or using the powerful PC as a work box while another PC (and person) shared the room with me.
We mitigated parts of it with AC adjustments and a well-placed fan or two, but heat output relative to capability is now one of my major concerns.
This, so much!
Upgrading from a 1060 to a 2060, going from 120W to 190W (factory-OC'd model), made my room hot enough that I couldn't game for more than about 90 minutes. And that's with air conditioning. I don't understand how so many people are able to live with PCs that pump out 500W for hours straight while gaming...

I put my CPU in ECO Mode and dropped my GPU power limit to 70% last week because it's getting so hot here, and I'm not gonna lower the thermostat for the entire house just to keep one room cooler.

TL;DR - Power matters a lot to me.
 
I think the bigger issue isn't cost, but that power draw = heat, and heat dissipation = noise.
Also, heat = heat, and a gaming PC can make a poorly air-conditioned room pretty uncomfortable in the summer.
A 500-watt PC is a literal space heater.
Sure, but when the difference is only 52W at most (less than a standard incandescent light bulb), choosing one brand of card over another is not going to make the PC significantly hotter or cooler overall.

It's far more likely that a hotter-running CPU will matter more because the CPU is always active while the power-sucking part of the video card, the 3D accelerator, is not. If you have two PCs only using 2D graphics, the GPUs are probably drawing <20W but the CPUs can be drawing a lot more. If the PC is on, the CPU is active. Hell, if the PC is in sleep mode, the CPU is active. What makes me laugh is people talking about heat output from their PCs when they live in a hot climate and have an i7-13700K. I'm just like "Are you for real right now?". :ROFLMAO:

Now personally, I don't care what the power use is because I live in Canada where it's cold for six months of the year and power is dirt-cheap. The question that must be asked though is "Why don't we see random biased hit pieces like this taking aim at Intel's 13th-gen CPUs?" Kinda makes you wonder, eh?
 

Dr3ams

Reputable
Sep 29, 2021
255
280
5,060
I couldn't care less about the power consumption. Currently I have the system I want and it does what I want. What I pay for electricity is acceptable. My GPU upgrade (in about 6 months) will also include a 1000-watt PSU, which should cover my next two GPU upgrades.
 

ilukey77

Reputable
Jan 30, 2021
833
339
5,290
People don't really have a choice on power consumption. Yes, AMD has been less efficient with their GPUs this gen (not an excuse as such, but MCM is new, so maybe next gen will be better??). Nvidia isn't as bad, but they're still not as efficient as last gen.

AMD's CPUs are worse than last gen, and Intel just loves to see how many watts (I'm certain of it at this point) their CPUs can pull.

This is the price of better computing: if you want better, deal with the heat and power consumption!!
 

punkncat

Polypheme
Ambassador
This, so much!
Upgrading from a 1060 to a 2060, going from 120W to 190W (factory-OC'd model), made my room hot enough that I couldn't game for more than about 90 minutes. And that's with air conditioning. I don't understand how so many people are able to live with PCs that pump out 500W for hours straight while gaming...

I put my CPU in ECO Mode and dropped my GPU power limit to 70% last week because it's getting so hot here, and I'm not gonna lower the thermostat for the entire house just to keep one room cooler.

TL;DR - Power matters a lot to me.


I am on a heat pump, which sucks when the temps go below around 50°F. I have found that a healthy overclock in the winter, turned off in the spring, works fairly well alongside the other mitigations. I actually used leftover parts and an i3 to build a work box low-powered enough that the two-PC, two-person setup works out well for summer use.
 

ilukey77

Reputable
Jan 30, 2021
833
339
5,290
I couldn't care less about the power consumption. Currently I have the system I want and it does what I want. How much I pay for electricity is acceptable. My GPU upgrade (in about 6 months) will also include a 1000-watt PSU, which should cover my next two GPU upgrades.
^^^^ this is exactly why !! :)
 

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
Yeah, of course gamers care about power efficiency only when AMD's GPUs are the most power efficient. It's another example of peak AMD marketing, like the whole charade about VRAM that they keep selling in different flavours at least since the RX 5500 XT days (4GB wasn't enough when the 5500 XT 8GB launched, but 4GB was good enough again when the 6500 XT launched; then 8GB wasn't enough when their old discounted RDNA2 cards started selling below $500, but it was acceptable enough again when the RX 7600 launched...). In Italian we call this having your face like your a**.
 

SeaTech

Prominent
Jun 13, 2023
5
5
515
Responding to the article: Oh, no, no and NO. This is NOT simply about power consumption. In my case, with the 7900 XTX that I have, this is about the chiplet design and unoptimized "gaming scenarios" that should have LONG ago been fixed with drivers by now. A number of games work fine, but the ones that don't create a real headache. Memory junction temperatures, along with power consumption, go through the roof for certain ridiculous things, like staring at smoke in-game for too long... I kid you not... forget RT, that's a pipe dream. Certain particle effects, shadows, and lighting strain the GPU in bizarre ways that create uneven load in-game... you go to one area and everything is fine... in the next it's ramping the fans and cranking temperatures like crazy... despite the GPU supposedly working "fine", this behavior is certainly NOT fine. This was never a big issue in the 6000 series. So far I have seen the same problem in Path of Exile, Diablo 4 during the beta test, and MechWarrior 5 (especially with visual mods... but also a bit in vanilla: when busting up a non-flammable gas tank, try staying in the smoke and torso-twisting... you will notice very low framerates). As a long-time AMD fan, it saddens me that the issue is not being fully addressed.
 
Ok, nitpick on this, they've actually traded blows over the years. Nvidia most certainly has not always had the advantage.


That aside, while my preferences generally run AMD, and they're likely not wrong when it comes to gamer sentiment, my own view is "ehh, come on, guys, this is NOT a good look for you."

The last generation was a great example of this.

RDNA 2 was more power efficient than the RTX 3000 series, due to TSMC's 7nm process versus Samsung's 8nm.

This author seems to have a lot of time on his hands to waste writing a nothingburger article like this. I've been hearing this stupid argument go back and forth for years, and I find it really interesting that the author decided to cover this instead of where power use matters a lot more: CPUs (because CPUs are always active while 3D accelerators are not). Even then, the cost difference isn't much.

Let's look at two top-end cards, since that will exaggerate the power-draw difference the most. The two cards will be the RX 7900 XTX and the RTX 4080. Here are their respective gaming power-draw numbers:
[Chart: power-gaming.png — gaming power draw comparison]

So, we have a 52W delta between them. Now let's see just how much that extra 52W costs, using the energy calculator at Sust-it.net:

In the UK, it would take 50 hours of gaming to cost an extra $1 USD (£0.78).
In the USA, it would take 100 hours of gaming to cost an extra $1 USD.
In Canada, it would take 300 hours of gaming to cost an extra $1 USD ($1.35 CAD).
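As a sanity check, here's a minimal sketch of that arithmetic. The per-kWh rates are simply backed out of the hour counts above (cost = kWh × rate) rather than taken from any official tariff, so treat them as illustrative:

```python
# Back-of-the-envelope check of the "hours of gaming per extra $1" figures above.
# The per-kWh rates are implied by the hour counts quoted in this post, not
# official tariffs, so treat them as illustrative only.

DELTA_KW = 0.052  # 52W extra gaming draw of the RX 7900 XTX over the RTX 4080

implied_rates_usd_per_kwh = {"UK": 0.38, "USA": 0.19, "Canada": 0.064}

for region, rate in implied_rates_usd_per_kwh.items():
    hours_per_dollar = 1 / (DELTA_KW * rate)  # hours until the extra draw costs $1
    print(f"{region}: ~{hours_per_dollar:.0f} hours of gaming per extra $1 "
          f"(at roughly ${rate:.2f}/kWh)")
```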

I don't know what amazes me more: that people argue about this, or the fact that I'm the first one to actually go and check how significant it is. Even using the astronomical electricity costs in the UK, it's not a big deal. Now maybe people can smarten up and worry about important things instead of being distracted by Enquirer-grade "articles" like this.
Great post and example. When you run the numbers for the people who complain about this the most, it's literally a few dollars extra, yet they complain like they'd need a second mortgage.
 

shady28

Distinguished
Jan 29, 2007
447
322
19,090
I don't much care about power consumption while gaming.

But AMD has a problem with its idle power draw. This is actually something I do care about, as my PC runs about 16 hours a day and about 14 hours of that is not gaming. That's a very normal usage pattern; I don't need my GPU sucking down 35W while I'm using Excel or telnet. The one thing I've found annoying about going from an Nvidia 2060 to an AMD 6700 XT is that idle power draw from the GPU has gone up 200-300%.
 

InvalidError

Titan
Moderator
If people were that worried about power consumption, we wouldn't have so many case fans, and especially all that unicorn puke RGB.
A few days ago, I had problems with one of my external HDDs periodically disconnecting and taking the whole USB subsystem down with it. Suspecting that it must be a bad or under-powered 12Vdc brick, I decided to jerry-rig it to my PC's PSU with a homebrew AMP-to-5.5mm jack cable and left my PC's side cover off. Later in the night, I was wondering why it was so damn bright around my PC and realized that Asus randomly re-enabled the motherboard's Aura RGB at some point, likely when I updated the BIOS earlier in the year to see if it would do anything about the A750 crashing my PC multiple times per day.

Yeah, involuntary RGB-puke everywhere is annoying, so is paying for it when you don't want it. BTW, the A750's logo was also obnoxiously bright with no controls whatsoever for it.
 

Dr3ams

Reputable
Sep 29, 2021
255
280
5,060
I don't much care about power consumption while gaming.

But AMD has a problem with its idle power draw. This is actually something I do care about, as my PC runs about 16 hours a day and about 14 hours of that is not gaming. That's a very normal usage pattern; I don't need my GPU sucking down 35W while I'm using Excel or telnet. The one thing I've found annoying about going from an Nvidia 2060 to an AMD 6700 XT is that idle power draw from the GPU has gone up 200-300%.
My Sapphire RX 6700 XT uses between 18 and 20 watts at idle, at 1 to 3% utilization. When using Excel while running a movie in Media Player Classic, it stays at 20 watts and utilization rises to 4 or 5%. This is acceptable.
 

sitehostplus

Honorable
Jan 6, 2018
404
163
10,870
I must be in the minority as I write this.

While I didn't compare power draw on GPUs, I certainly did on CPUs.

Of course, the CPUs in question were pretty much equal performers in my mind, but I picked the one that used about half the power of the other.

So far, I think it's paid off. My electric bill is about $25 cheaper than last year at this same time. And I live in a state where electricity is pretty expensive (New York).

Don't know what else could have done that. Everything at my house is the same as last year, save for the upgrade of my computer.
 

DavidLejdar

Respectable
Sep 11, 2022
286
179
1,860
Sure, but when the difference is only 52W at most (less than a standard incandescent light bulb), choosing one brand of card over another is not going to make the PC significantly hotter or cooler overall.

It's far more likely that a hotter-running CPU will matter more because the CPU is always active while the power-sucking part of the video card, the 3D accelerator, is not. If you have two PCs only using 2D graphics, the GPUs are probably drawing <20W but the CPUs can be drawing a lot more. If the PC is on, the CPU is active. Hell, if the PC is in sleep mode, the CPU is active. What makes me laugh is people talking about heat output from their PCs when they live in a hot climate and have an i7-13700K. I'm just like "Are you for real right now?". :ROFLMAO:

Now personally, I don't care what the power use is because I live in Canada where it's cold for six months of the year and power is dirt-cheap. The question that must be asked though is "Why don't we see random biased hit pieces like this taking aim at Intel's 13th-gen CPUs?" Kinda makes you wonder, eh?
E.g. the RX 7900 XTX reportedly* uses 104 watts at idle (with a high refresh rate, and apparently especially without a VRR monitor/setting). Considering that some people may easily have the computer turned on for 10+ hours per day, such as for work, that's easily a total of 30 kWh per month. That may not be considered much in many a household, but it is still 30 kWh per month more than what the competition draws. And readers in such a situation may be glad to hear that AMD is at least aware of such issues.
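For context, here's a minimal sketch of where that monthly figure comes from, using the reported 104W idle draw and an assumed 10 hours per day of desktop use:

```python
# Quick check of the ~30 kWh/month estimate above.
# 104W is the reported high-refresh idle draw; 10 hours/day and 30 days/month are assumptions.

IDLE_DRAW_W = 104    # reported RX 7900 XTX idle draw at high refresh rates
HOURS_PER_DAY = 10   # assumed non-gaming desktop hours per day
DAYS_PER_MONTH = 30

monthly_kwh = IDLE_DRAW_W * HOURS_PER_DAY * DAYS_PER_MONTH / 1000
print(f"Idle energy: ~{monthly_kwh:.0f} kWh per month")  # ~31 kWh
```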

If there is something amiss here, it is rather the lack of questioning AMD about whether they have considered introducing something like AMD Enduro for desktop computers on AMD-chipset motherboards, where the CPU's iGPU would handle "idle" graphical tasks (aka GPU switching, with integrated motherboard support).

*
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,457
1,002
21,060
E.g. the RX 7900 XTX reportedly* uses 104 watts at idle (with a high refresh rate, and apparently especially without a VRR monitor/setting). Considering that some people may easily have the computer turned on for 10+ hours per day, such as for work, that's easily a total of 30 kWh per month. That may not be considered much in many a household, but it is still 30 kWh per month more than what the competition draws. And readers in such a situation may be glad to hear that AMD is at least aware of such issues.

If there is something amiss here, it is rather the lack of questioning AMD about whether they have considered introducing something like AMD Enduro for desktop computers on AMD-chipset motherboards, where the CPU's iGPU would handle "idle" graphical tasks (aka GPU switching, with integrated motherboard support).

*
Maybe AMD should introduce an "OPTIONAL" physically external mux switch between their iGPU and their own video cards.

It would connect two DisplayPort cable feeds into one box and have a single cable out to your display.

That way people could swap which one they are using as needed.
 

purpleduggy

Prominent
Apr 19, 2023
167
44
610
The reason is that this AMD generation, as well as Nvidia's, is not really a new generation but just an overclock of the previous one with small improvements and extra features. That's why the wattage is so insane. And to be honest, they shouldn't have power-limited the AMD cards, or they would have matched the Nvidia 4090 on raw speed. The hard power limit is the 7900 XT's bottleneck. It was also the bottleneck on the 6900 XT, which could have swept the floor entirely with the 3090 Ti, as opposed to being slightly faster or the same. The RTX 5090/RX 8900 XT might be a new generation.
 

rluker5

Distinguished
Jun 23, 2014
914
595
19,760
Maybe AMD should introduce an "OPTIONAL" physically external mux switch between their iGPU and their own video cards.

It would connect two DisplayPort cable feeds into one box and have a single cable out to your display.

That way people could swap which one they are using as needed.
That "mux switch" might already be there.

Windows is pretty good at managing iGPU/dGPU loads nowadays. I've found that it usually assigns the correct GPU to the task if you hook your video cable up to the iGPU. As in, if I were to play a game, the dGPU would do the work, and the result would be displayed through the iGPU's output. With my setup I think I was losing a few percent of performance. Also, I used an Intel iGPU, but I don't see why it would be any different with AMD.

I think that would bring the idle draw of RDNA3 on a high-refresh display down to its minimum.

And if you want to switch which GPU runs a given task, the setting is under System > Display > Graphics.

Just a free option for people with the hardware for it.
 
Jan 15, 2023
11
5
15
AMD's Scott Herkelman recently claimed that most gamers don't care about power consumption, while answering a question regarding RDNA3's inferior power efficiency compared to Nvidia's RTX 40-series GPUs.

AMD Says Most Gamers Don't Care About GPU Power Consumption : Read more
Most of my friends in Europe care a lot about power consumption while also enjoying gaming. This year almost everyone I know went for a 4090 + 7950X, as both can be undervolted to use half the power, and we're all reducing our power usage.
 