News AMD Says Most Gamers Don't Care About GPU Power Consumption

My Sapphire RX 6700 XT uses between 18 and 20 watts at idle with 1 to 3% utilization. When using Excel while running a movie in Media Player Classic, it stays at 20 watts and utilization rises to 4 to 5%. This is acceptable.

That's because you have a single monitor.

On dual-monitor setups at idle, the 6700 XT has twice the power consumption of the 3060 Ti and 60% more than a 3080. It's literally 2.53x the 2060's.

Here's the thing: I don't need dual monitors for games. I need them for work, which is 90%+ of my time on this PC. I don't even care about the power bill part; it's minuscule. But it heats up my home office. The PC / motherboard averages about 40W, but the GPU runs at 35W. Even undervolted and underclocked, I can't get it below 30W. So that's almost half the power from my PC while working / essentially idle, and it raises my office temp by about 4-5°F.

My fault for not checking my use case in depth. But it's a weakness of AMD cards nonetheless.

[Image: power-multi-monitor.png]
 
Most of my friends in Europe care a lot about power consumption while also enjoying gaming. This year almost everyone I know went for a 4090 + 7950X, as both can be undervolted to use half the power, and we're all reducing our power usage.
They should have picked up a 7950X3D; it uses a lot less power than the regular 7950X.

My Sapphire RX 6700 XT uses between 18 and 20 watts at idle with 1 to 3% utilization. When using Excel while running a movie in Media Player Classic, it stays at 20 watts and utilization rises to 4 to 5%. This is acceptable.
Since the 23.7.1 drivers, I idle at 6 watts on a 7900 XTX with a 34" ultrawide VRR display.

I just installed the 23.8.2 drivers and power consumption seems to have gone down a little during media playback.
 
The reason is that this AMD generation, as well as Nvidia's, is actually not a new generation but just an overclock of the previous one with small improvements and extra features. This is why the wattage is so insane. And to be honest, they shouldn't have power-limited the AMD cards, or they would have matched the Nvidia 4090 on raw speed. The hard power limit is the 7900 XT's bottleneck. It was also the bottleneck on the 6900 XT, which could have swept the floor entirely with the 3090 Ti, as opposed to being slightly faster or the same. The RTX 5090 / RX 8900 XT might be a new generation.
Mmm, no. Neither series is an overclock of last-gen. They are new architectures that took years to develop.
Video cards can't gain 50% more cores, AV1 encoding, or 12x the L2 cache just by overclocking.

The 7900 XTX can indeed match the 4090... if you increase its power limit to 190% (675 W) and drop the 4090's power limit to 60%, then the Radeon card will barely edge it out in TimeSpy. (Source) Otherwise, the idea that it could match the 4090 "if only they wanted to" is a fantasy.
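For context, those percentages line up with reference board power limits of roughly 355 W for the 7900 XTX and 450 W for the 4090; the reference TBPs in this quick sketch are assumptions for illustration, not figures from the linked test:

```python
# Rough sanity check of the quoted power-limit figures.
# The reference board powers below are assumptions for illustration.
XTX_REFERENCE_TBP_W = 355    # assumed 7900 XTX reference board power
RTX_4090_TBP_W = 450         # assumed RTX 4090 reference board power

xtx_at_190_percent = XTX_REFERENCE_TBP_W * 1.90   # ~675 W, matching the quoted figure
rtx_4090_at_60_percent = RTX_4090_TBP_W * 0.60    # ~270 W

print(f"7900 XTX at a 190% power limit: {xtx_at_190_percent:.0f} W")
print(f"RTX 4090 at a 60% power limit:  {rtx_4090_at_60_percent:.0f} W")
print(f"Power ratio at the 'match': {xtx_at_190_percent / rtx_4090_at_60_percent:.1f}x")
```

In other words, that "match" needs the Radeon card drawing roughly 2.5x the power of the throttled 4090.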



edit: Why I am replying is beyond me...
 
I think most gamers don't care about power consumption, but there are a lot of exceptions.

Back in 2015 I had a newer 4K monitor and SLI 780 Tis, and when The Witcher 3 came out and I could get 4K60 at medium-high (if I removed their power limits and ran them at 1250MHz on a modded BIOS), I didn't care if I was sweating with my AC on, ceiling fan on, and the tower fan in my doorway turned all the way up. It was the best gaming experience of my life up to that point. My next GPU purchase was SLI 1080 Tis at 375W apiece. High-end mGPU gaming used to use lots more power than today. My 3080 at 336W doesn't compare.

Nowadays high power use bothers me for two main reasons: 1) noise, 2) I want a dual-slot cooler in my living-room ITX build. If those two problems are addressed, then high power is less of an issue to me than a poorly executed fan-stop implementation.

But power is cheap where I live. And if I could get a pair of 7900 XTs with 2015-era SLI performance scaling, I would be all over that.
 
My fault for not checking my use case in depth. But it's a weakness of AMD cards nonetheless.
My RX 6600 is at 19W idle with 1920x1200 60Hz + 4K 60Hz monitors. I think my GTX 1050 was doing 15W. Not too much of a change for me, and much of the extra few watts is likely from having twice as much, twice-as-fast memory.

In HWiNFO, the memory clock always reports roughly 1740MHz; it looks like AMD's drivers never down-clock the memory even while the GPU clock is all the way down under 50MHz. I'm pretty sure my GTX 1050 was down-clocking memory to somewhere in the 400-500MHz range when idle. That is bound to handicap AMD by a handful of memory-IO watts if all of their GPUs are doing this while Nvidia's aren't.
 
But they do... They've always had better texture-streaming compression, colour compression, and overall memory-related optimizations that AMD/ATI has never had. That being said, even with those general disadvantages, AMD has still had some efficiency design wins over the years, thanks to either manufacturing or radically different design choices (VLIW5 and VLIW4, for instance).

Regards.
Ampere was awful for power at stock.
Though frankly, undervolting to 825-850mV cuts the power on my 3080 to 250W, and performance actually goes up when the card can hold 1850-1890MHz in GPU-heavy scenes (the FE runs at 1800MHz in the most intensive scenarios, like 4K with RT on).
On the other hand, I have a 6800 too, which has a 1025mV limit and does a 2540MHz OC at 230W TGP.
So both are really efficient, imo.
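Since dynamic power scales roughly with voltage squared times frequency, a back-of-the-envelope sketch shows why dropping to ~0.85 V barely costs any performance. The ~1.0 V stock voltage below is an assumption for illustration, not a measured value, and leakage/board power are ignored:

```python
# Back-of-the-envelope estimate: dynamic power P ≈ C * V^2 * f.
# Stock voltage (~1.0 V) is assumed; the 0.85 V undervolt and the clocks
# come from the post above. Leakage and board power are ignored.
def relative_dynamic_power(volts, mhz, ref_volts, ref_mhz):
    """Dynamic power relative to a reference operating point."""
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

stock = (1.00, 1800)        # assumed stock voltage, clock from the post
undervolted = (0.85, 1870)  # undervolt point from the post

ratio = relative_dynamic_power(*undervolted, *stock)
print(f"Estimated dynamic power vs. stock: {ratio:.0%}")  # roughly 75%
```

That's in the same ballpark as the ~250W observed against the 3080 FE's 320W board power limit; the remainder comes from the parts of the card that don't follow the simple V²·f model.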

Meanwhile I'm rocking 1440p/ultra here with an RX 6800 at ~100 watts and an R7 3800X at ~50 watts.
I once managed to run Dying Light 2 (maxed, but all RT off) at 75-80 watts for the GPU with an undervolt, DLSS on, and a 60 FPS cap at 1440p on the 3080. I think I had a 700mV undervolt and under 1500MHz clocks; I don't remember, it was a year ago or more.
 
AMD's most powerful current-gen CPU is the R9 7950X3D, with an average power draw of 128W, while the i9-13900K uses 41W more. This is an average across ALL programs and, unlike with a GPU's 3D accelerators, there is no real "down-time" for the CPU as long as the PC is turned on, because the CPU is used for literally everything.
Well, according to reliable sources from the internet, that would only cost like $1 more for more than 100 hours...
So, we have a 52W delta between them. Now to see just how much that extra 52W costs using the energy calculator at Sust-it.net:

In the UK, it would take 50 hours of gaming to cost an extra $1 USD (£0.78).
In the USA, it would take 100 hours of gaming to cost an extra $1 USD.
In Canada, it would take 300 hours of gaming to cost an extra $1 USD ($1.35 CAD).
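As a rough cross-check of those hour counts, here's a minimal sketch; the per-kWh rates are ballpark assumptions converted to USD, not the exact tariffs behind the Sust-it.net calculator:

```python
# Hours of gaming before a 52 W power delta adds up to $1 USD of electricity.
# The rates below are rough assumptions for illustration only.
DELTA_KW = 52 / 1000  # the 52 W delta expressed in kW

assumed_rates_usd_per_kwh = {
    "UK": 0.38,       # roughly £0.30/kWh
    "USA": 0.19,
    "Canada": 0.065,  # roughly $0.09 CAD/kWh
}

for region, rate in assumed_rates_usd_per_kwh.items():
    hours_per_dollar = 1.0 / (DELTA_KW * rate)
    print(f"{region}: ~{hours_per_dollar:.0f} hours of gaming per extra $1")
```

Which lands close to the 50 / 100 / 300 hour figures above.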
Also, a CPU is always active but not always running at 100% load; at idle, the 7950X3D uses 14% more power and the 7950X uses 18% more power than the 13900K.


In single-threaded work the 13900K is way more efficient as well, as long as it's not overclocked to oblivion.
[Image: efficiency-singlethread.png]
 
When I set out to buy a new GPU last year, I didn't really want to get one that required more power than my Seasonic 750W PSU could supply. That, and I wanted to avoid the power-plug hassle that 40-series cards were going through late last year.
As it stands, I upgraded the PSU and GPU at the same time, so the effort spent trying to avoid going over was wasted.
I wasn't really concerned about efficiency... hence I didn't buy an Nvidia GPU.
 
Ampere was awful for power at stock.
Though frankly, undervolting to 825-850mV cuts the power on my 3080 to 250W, and performance actually goes up when the card can hold 1850-1890MHz in GPU-heavy scenes (the FE runs at 1800MHz in the most intensive scenarios, like 4K with RT on).
On the other hand, I have a 6800 too, which has a 1025mV limit and does a 2540MHz OC at 230W TGP.
So both are really efficient, imo.


I once managed to run Dying Light 2 (maxed, but all RT off) at 75-80 watts for the GPU with an undervolt, DLSS on, and a 60 FPS cap at 1440p on the 3080. I think I had a 700mV undervolt and under 1500MHz clocks; I don't remember, it was a year ago or more.
I do the same with my 3080 for fan-noise reasons. I'd undervolt my 6800 more often as well, but it is a lot less consistent in applying it. But if I could hold 2150MHz for 400W on that 3080, I sometimes would; not that I can.

Bit of a coincidence somebody else has the same Nvidia/AMD GPU combo 😛
 
I'm not the biggest gamer in the world, but I chose components based on power consumption because my AC isn't great and I don't want to deal with the heat. It definitely saves money on the computer's power consumption, but also from not having to set the AC lower to compensate for it. It feels like many people without great AC end up paying double, since you provide more cooling to the rest of the house and not just where you need it.

I set strict power limits on the Ryzen 5600 and RX 6600. It felt like the RX 6600 was the only option below 150 W; the 3050 and 6500 XT just didn't seem like good enough options. Either would have been fine, but I just wish there were better options at the low end. For CPUs, going AMD over Intel this gen was a no-brainer.
 
I do the same with my 3080 for fan-noise reasons. I'd undervolt my 6800 more often as well, but it is a lot less consistent in applying it. But if I could hold 2150MHz for 400W on that 3080, I sometimes would; not that I can.

Bit of a coincidence somebody else has the same Nvidia/AMD GPU combo 😛
Yeah, I bought them for 789 altogether in April; I found local deals in great condition with half the warranty left, and I love them both. They didn't even need repasting, weren't mining cards, and temps are still great on both.
The 6800 is super efficient, has 16GB of VRAM, and I have huge hopes for FSR 3: not just the frame-interpolation part, but the actual upscaler is getting AA now too (shimmer is a huge problem with FSR 2).
I use the RTX 3080 for 1920p DLDSR + DLSS Quality and love it; you just can't get a better IQ-to-performance ratio than this atm. In RDR2 comparison screenshots I made, it blew 4K TAA out of the water in image clarity while running a lot faster (will link if I find it). Now that it's getting the new RT denoiser, I'm glad I have both. More fun to play with. Too little time, though. Can't wait for Alan Wake 2 with PT and the new DLSS, same as playing new AMD-sponsored games on the 6800 with FSR 3. I hope the temporal stability of the new version of the FSR upscaler is at least at XeSS level.
 
And when the GTX 480, the GTX 590, the HD 4870 X2, the R9 295X2, and many other GPUs would suck so much power, it was hilariously dumb to read people trying to justify their existence, efficiency notwithstanding.

As I said: it's about being practical and not cynical.

Regards.
Ah, a classic case of the whataboutism defence!!

You really can't compare any of your examples with Nvidia's 3080- and 3090-series power monsters. Those GPUs unleashed the new and extensive wave of power-hungry GPUs.

Nvidia fans were just giddy with their sizzling toys despite the huge power "suckage"!!

Best Regards
 
I think the bigger issue isn't cost, but that power draw = heat, and heat dissipation = noise.
Also, heat = heat, and a gaming PC can make a poorly air-conditioned room pretty uncomfortable in the summer.
A 500-watt PC is a literal space heater.
Yeah, on a hot day it's best to keep the GPU under 200W if possible.
 
[snip]
So, we have a 52W delta between them. Now to see just how much that extra 52W costs using the energy calculator at Sust-it.net:

In the UK, it would take 50 hours of gaming to cost an extra $1 USD (£0.78).
In the USA, it would take 100 hours of gaming to cost an extra $1 USD.
In Canada, it would take 300 hours of gaming to cost an extra $1 USD ($1.35 CAD).

I don't know what amazes me more: that people argue about this, or the fact that I'm the first one to get the idea to go and see just how significant it is. Even using the astronomical electricity costs in the UK, it's not a big deal. Now maybe people can smarten up and worry about important things instead of being distracted by Enquirer-grade "articles" like this.

The price you pay isn't the true cost of something.

It's crazy that we're talking about *a graphics card* using upwards of 400W while we watch global warming happen outside our windows, which are closed because we have the air conditioning on to remove the insane amount of waste heat generated by all our modern appliances. And not because of important work we might be doing, either... Even worse is crypto-mining, which only generates a virtual good.
 
For example, the RX 7900 XTX reportedly* uses 104 watts at idle (with a high refresh rate, and apparently especially without a VRR monitor/setting). Considering that some may easily have the computer turned on for 10+ hours per day, such as for work, that's easily a total of 30 kWh per month. That may not be considered much in many a household, but it is still 30 kWh per month more than what the competition draws. And readers in such a situation may be glad to hear that AMD is at least aware of such issues.
Yes, you're 100% correct in that regard. The idle power was insane, and yes, AMD does know about it, but they didn't stay silent or try to deflect (like so many do). They got to work on a fix and have been rolling it out:

I can't think of a single company that hasn't had glitches with their products, so to judge them on that is useless. It's how a company handles said problem that is worthy of judgement, and it's clear that they're doing something to fix it. As long as they're handling it correctly (and they are), that's good enough for me.
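For what it's worth, the quoted 30 kWh/month figure checks out; in this quick sketch the electricity rate is an assumption for illustration:

```python
# Monthly energy from the reported ~104 W multi-monitor idle draw of a 7900 XTX.
IDLE_DRAW_W = 104           # reported idle power from the quoted post
HOURS_PER_DAY = 10
DAYS_PER_MONTH = 30
ASSUMED_USD_PER_KWH = 0.19  # electricity price assumed for illustration

kwh_per_month = IDLE_DRAW_W / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
print(f"~{kwh_per_month:.0f} kWh per month")                               # ~31 kWh
print(f"~${kwh_per_month * ASSUMED_USD_PER_KWH:.2f}/month at the assumed rate")
```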
 
Most of my friends in Europe care a lot about power consumption while also enjoying gaming. This year almost everyone I know went for a 4090 + 7950X, as both can be undervolted to use half the power, and we're all reducing our power usage.
Reducing by half is quite a lot. How is this being measured (before and after)?
 
You know, I just realised something that I didn't notice when I was making my original post, and it actually proves AMD correct. I was focused on the RX 7900 XTX and RTX 4090, so there was something that I completely missed:
[Image: power-gaming.png]

If gamers truly cared about power use, then the RTX 3090 Ti would've been a complete flop, but it wasn't. Gamers went crazy for that card, and it draws 126W more than even the RTX 4090!

I don't remember any hit pieces about the RTX 3090 Ti's power use. It sure seems like the tech press only cares about the power use of Radeon cards, eh? Personally, I don't care at all about the power use of a video card. Hell, I ran twin R9 Furies in CrossFire for a few months, and they would peak at ~550W together.

If AMD, Intel, and Nvidia were all taken to task for their power use (like they should be), then this article wouldn't be problematic. Unfortunately, that doesn't happen.
 
Most of my friends in Europe care a lot about power consumption while also enjoying gaming. This year almost everyone I know went for a 4090 + 7950X, as both can be undervolted to use half the power, and we're all reducing our power usage.
There are so many reasons that I can't take this post seriously that I'm going to have to list them:
  • People who care about power consumption don't buy RTX 4090s
  • People who care about power consumption and gaming don't buy CPUs with sixteen cores
  • Undervolting does save power but it doesn't even come close to cutting it in half, ever.
I'm afraid I'm going to have to say "Pics or it didn't happen!" to this post.
 
Reducing by half is quite a lot. How is this being measured (before and after)?
I don't even think that der8auer has ever achieved something like that. You know there's something wrong when a post starts with someone talking about friends who care a lot about power use and a lot about gaming, but who buy the most power-hungry card of this generation and a sixteen-core Ryzen 9 CPU.
 
If gamers truly cared about power use, then the RTX 3090 Ti would've been a complete flop, but it wasn't. Gamers went crazy for that card, and it draws 126W more than even the RTX 4090!

I don't remember any hit pieces about the RTX 3090 Ti's power use. It sure seems like the tech press only cares about the power use of Radeon cards, eh? Personally, I don't care at all about the power use of a video card. Hell, I ran twin R9 Furies in CrossFire for a few months, and they would peak at ~550W together.

If AMD, Intel, and Nvidia were all taken to task for their power use (like they should be), then this article wouldn't be problematic. Unfortunately, that doesn't happen.
Many Gamers with a preference for nVIDIA have a "Double Standard" for nVIDIA.

nVIDIA can usually get away with more BS than AMD.

If nVIDIA has a power-hungry GFX card / GPU, it's ok.
If AMD has one, it's bad, trash, shouldn't even be looked at.
Go look at nVIDIA's power efficiency and lower power consumption.
God forbid that AMD is only 50 watts behind on a "slightly weaker node" & using chiplets to reduce BoM costs for the consumer.

If nVIDIA does something bad, it's ok.
If AMD does the same bad thing, it's the worst thing in the world; they deserve death & destruction.
 
People don't have a choice on power consumption. AMD, yes, has been less efficient with their GPUs this gen (not an excuse as such, but MCM is new, so maybe next gen will be better?). Nvidia is not as bad, but they're still not as efficient as last gen...

AMD's CPUs are worse than last gen
Actually, that's not true (although I understand why you think it is). I thought the exact same thing and was ripping AMD a new one because the 7000-series didn't appear to be nearly as efficient as the 5000-series. I'll use the next part of your post to explain.
and Intel just loves to see how many watts (I'm certain at this point) their CPUs can pull...
Exactly. Intel basically overclocked their 12th-gen models to the moon to get 13th-gen. Intel knows that gamers don't care about power use (no matter how much the author of this article is trying to push a narrative contrary to that) and just wanted to be as high on the review charts as possible. Just look at the i9-13900KS, a perfect example of that.

So, since AMD also knew that being at the top of the charts mattered more than power draw, they also had to OC their models as high as possible to get max performance on the charts and, just like Intel, they had to sacrifice efficiency to do so. The difference is that AMD knew that it was a farce and so they took what would've been the stock settings of the CPUs and called it "Eco-Mode".

Isn't it amazing how the stock frequencies of the X3D CPUs (that can't be overclocked) are so similar to the "Eco-Mode" frequencies of the non-X3D CPUs? In "Eco-Mode" (aka actual stock settings), the 7000-series is definitely more power efficient than the 5000-series. It just took a bit of digging to realise this.
This is the price of better computing: if you want better, deal with the heat and power consumption!!
I couldn't have said it better myself. The first PC I built back in 1988 had a 200W AT-Standard PSU. Now I own two ATX-standard PSUs rated at 1kW EACH!

We have to remember, though, that back when 200W PSUs were commonly used, we were also using 60-100W incandescent light bulbs. Now we're using LED bulbs that draw 8.5-15W. Our lights used to use almost 7x as much power as they do now, while our computers only use about 3x as much power as they did back then, on average.
 
Exactly. Intel basically overclocked their 12th-gen models to the moon to get 13th-gen.
The difference in power draw between the 12900K and the 13900K is a whole 12W... for 8 more E-cores.
Intel knows that gamers don't care about power use (no matter how much the author of this article is trying to push a narrative contrary to that) and just wanted to be as high on the review charts as possible. Just look at the i9-13900KS, a perfect example of that.
The 13900KS has the exact same 253W power limit as the 13900K.
So, since AMD also knew that being at the top of the charts mattered more than power draw, they also had to OC their models as high as possible to get max performance on the charts and, just like Intel, they had to sacrifice efficiency to do so. The difference is that AMD knew that it was a farce and so they took what would've been the stock settings of the CPUs and called it "Eco-Mode".
That's exactly what Processor Base Power and Maximum Turbo Power are for Intel since 12th gen.
PBP is the normal TDP, and MTP is the amount of overclocking they allow under warranty.

The only difference is that Intel has roughly another 30% of headroom above the 253W that people can use for overclocking, and that all the reviewers abuse by going well above it, while AMD needs liquid cooling just to reach the advertised number.
AMD pushed Ryzen to the absolute limit so as not to look completely bad on the charts, while Intel kept their CPUs with ~30% of headroom.
We have to remember, though, that back when 200W PSUs were commonly used, we were also using 60-100W incandescent light bulbs. Now we're using LED bulbs that draw 8.5-15W. Our lights used to use almost 7x as much power as they do now, while our computers only use about 3x as much power as they did back then, on average.
Yeah, but the amount of light the LEDs give out isn't much more than the old bulbs, if they aren't actually dimmer.
CPUs, on the other hand, have multiplied performance many, many times over.
Your 1988 200W PC is much weaker than a Raspberry Pi that draws what, 10-20W?
 