News AMD Says Most Gamers Don't Care About GPU Power Consumption

Status
Not open for further replies.
They're not wrong though...

Even when* they've been more efficient, the overwhelming majority of gamers will still choose what they want first as long as they can power it.

Personally, I do like more efficient designs and nVidia has always had an advantage there, but as long as the TBP is under 300W, it's fine by me provided it hits the performance target I want.

EDIT: Please put the relevant quote as well: "In desktop, however, it matters, but not to everyone. There are some people who are really concerned about power, others don’t care as much. We definitely want to make a better perf-per-watt chip."

It feels disingenuous not to do so, especially when the title is an absolute stretch compared to what he actually says.

EDIT2: Thanks for the update :)

Regards.
 
Last edited:

PBme

Reputable
Dec 12, 2019
63
37
4,560
Reviewers, and the few folks who are dealing with an existing power supply, actually care. Many care in theory. But my guess is that almost all who are buying a card mainly for gaming put the most weight on performance for what they are willing to spend and the least weight on power consumption. I'd wager that the aesthetics of different cards make much more of a difference in buying decisions than their relative watts-to-fps does.
 
  • Like
Reactions: artk2219

PBme

Reputable
Dec 12, 2019
63
37
4,560
I mean... they aren't wrong in states where power is cheap.

The difference between Nvidia and AMD is going to be dollars over a year.

We generally already expect "high" power since we have gaming PCs & A/C to keep the room cool.
And almost none are actually going to take the time to try to estimate what the annual cost difference is going to be for them (if they are even paying the power bill).
 

punkncat

Polypheme
Ambassador
Just being honest. I didn't care. Then I built a "powerful" system that produced loads of heat along with that power. It made the office/game area too hot to sit in during the summertime, particularly while playing, or while running the powerful PC as a work box with another PC (and person) in the room with me.
We mitigated parts of it with AC adjustments and a well-placed fan or two, but one of my major concerns now is the heat a card generates as a consequence of its capability.
 
Also, people. Don't confuse "efficient" and "max power". They're related, but not the same thing.

If you target a certain performance level, two cards can hit it at different power levels, but your choice will be driven by price. If you need to upgrade your PSU, it'll add to the cost of the upgrade and it'll probably be a no-go. If it adds too much to the power bill and you mind, etc...

If you already own a PSU that can power a 300W card, then you can buy whatever GPU is under that, no? The difference will come down to price and performance.

Gotta be practical and not cynical.

Regards.
 

vanadiel007

Distinguished
Oct 21, 2015
381
376
19,060
You buy a video card based on performance, not power consumption unless you are concerned about the power draw and have to include a PSU upgrade to power your new card.
Even then, most just get a new PSU rather than downgrading to a model that consumes less power.
 

King_V

Illustrious
Ambassador
Personally, I do like more efficient designs and nVidia has always had an advantage there,
Ok, nitpick on this, they've actually traded blows over the years. Nvidia most certainly has not always had the advantage.


That aside, while my preferences generally run AMD, and they're likely not wrong when it comes to gamer sentiment, my own view is "ehh, come on, guys, this is NOT a good look for you."

Or: [meme image]
 

InvalidError

Titan
Moderator
You buy a video card based on performance, not power consumption unless you are concerned about the power draw and have to include a PSU upgrade to power your new card.
For people who have to rely on window ACs in summer, higher power may also require an AC upgrade that isn't necessarily possible - I cannot fit anything much bigger than what I already have in my narrow windows. I doubt a bigger AC would fit within the circuit breaker budget either.
 
Ok, nitpick on this, they've actually traded blows over the years. Nvidia most certainly has not always had the advantage.


That aside, while my preferences generally run AMD, and they're likely not wrong when it comes to gamer sentiment, my own view is "ehh, come on, guys, this is NOT a good look for you."

Or: [meme image]
But they do... They've always had better texture-streaming compression, colour compression and overall better memory-related optimizations that AMD/ATI has never had. That said, even with those general disadvantages, AMD has still had some efficiency design wins over the years, thanks either to manufacturing or to radically different design choices (VLIW5 and VLIW4, for instance).

Regards.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,457
1,000
21,060
For people who have to rely on window ACs in summer, higher power may also require an AC upgrade that isn't necessarily possible - I cannot fit anything much bigger than what I already have in my narrow windows. I doubt a bigger AC would fit within the circuit breaker budget either.
Instead of window AC units, have you thought of going with "mini-split" ACs?

They're very common in Asia & the EU, and they're becoming more common in NA now.

NA is finally catching up to the rest of the world in terms of the preferred AC unit style.

Same with going fully tankless on water heaters and having instant hot water units.
 

punkncat

Polypheme
Ambassador
For people who have to rely on window ACs in summer, higher power may also require an AC upgrade that isn't necessarily possible - I cannot fit anything much bigger than what I already have in my narrow windows. I doubt a bigger AC would fit within the circuit breaker budget either.

We lived in a rental for some time that was not only an ancient house, but had the electrical to go along with it. We could run one window unit that was well short of what the area needed, and even that was right at the edge of what roughly a third of the house's electrical could handle without blowing the main. The place was so close on margin that using the gaming PC and watching TV with the stereo on at the same time wasn't possible with the air on.




Sorry @Kamen Rider Blade, I somehow messed up the reply format and am too lazy to fix it ATM... One of my buddies has lived in a mobile home for years. His main unit blew up and was prohibitively expensive to repair. He opted to go with a mini-split heat pump, and not only is it working super well, the efficiency is outstanding.
Instead of window AC units, have you thought of going with "mini-split" ACs?

They're very common in Asia & the EU, and they're becoming more common in NA now.

NA is finally catching up to the rest of the world in terms of the preferred AC unit style.
 
  • Like
Reactions: artk2219
Jun 2, 2023
2
3
15
That's a disappointing, corporate parroting kind of answer.

It's likely there are indeed few gamers out there who game so much that +100 W between cards will feel like a sting on their power bill.

HOWEVER, that's not where power consumption matters the most. Many computers holding a high-end GPU pull double duty as workstation, home/NAS server, gaming station, etc., so the computer is on 24/7. That's where the high idle power consumption of the 7900 XT(X) matters. It's double that of a 4080 when there is any activity at all on the screen (i.e. window (or any) movement; see Tom's news article from Aug 1st). So that's +35 W as long as someone's at the PC or, if you have dual monitors hooked up, +35 W all the time, for some reason.

+35 W 24/7 = +306.6 kWh / year. In my country 300 kWh of power is about €50 in the best case (and up to €1000 in the worst case, as it highly depends on how well your contract of choice fits your power usage patterns). Assuming the best case, a Radeon 7900 XT(X) will cost you €50 extra per year just sitting there, without doing any gaming at all. In 5 years its total initial + running cost will surpass that of a 4080. Not only that, but its depreciation will be MUCH greater, not just because of the NVIDIA name but also because AMD decided in 2021 to end driver support after 6 years while NVIDIA goes up to nearly 10 years.
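For anyone who wants to plug in their own numbers, here's a rough back-of-the-envelope sketch in Python using the figures above; the electricity rate is just the best-case €50 / 300 kWh implied by my own contract, so substitute your own:

```python
# Rough idle-power cost estimate. Assumptions: 35 W extra idle draw,
# PC on 24/7, best-case rate implied by "300 kWh ~= EUR 50".
EXTRA_IDLE_WATTS = 35
HOURS_PER_YEAR = 24 * 365
EUR_PER_KWH = 50 / 300  # ~0.167 EUR/kWh, best case

extra_kwh_per_year = EXTRA_IDLE_WATTS * HOURS_PER_YEAR / 1000
extra_cost_per_year = extra_kwh_per_year * EUR_PER_KWH

print(f"{extra_kwh_per_year:.1f} kWh extra per year")    # ~306.6 kWh
print(f"~{extra_cost_per_year:.0f} EUR extra per year")  # ~51 EUR
```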

So power consumption MATTERS when it clearly makes your product inferior, but you have to actually want to see it.
 

Zaranthos

Distinguished
Apr 9, 2014
34
25
18,560
I've skipped entire CPU and GPU generations because of high TDP values. Maybe I'm not most gamers, but I definitely care about my electric bill, and I have since I became an adult and the parents weren't paying the bills. It matters a lot if you don't constantly think you need the fastest hardware, and power consumption over time means less money for computer upgrades or whatever. The economy is getting worse and electric rates are going up, not down. Those carefree gamers will sing a different tune when they lose their job or move out of mom's basement.
 

ThatMouse

Distinguished
Jan 27, 2014
245
125
18,760
I care more about idle power consumption, but maybe that problem is solved across the board? Seems like PCs are way more efficient than they used to be. Being able to keep hard drives from waking up would be nice; looking at you, Windoze.
 
  • Like
Reactions: artk2219
This author seems to have a lot of time to waste on his hands to write a nothingburger article like this. I've been hearing this stupid argument going back and forth for years and I find it really interesting that this author decides to cover this instead of where power use matters a lot more, CPUs (because CPUs are always active while 3D accelerators are not). Even then, the cost difference isn't much.

Let's look at two top-end cards since that'll exaggerate the power-draw difference the most. The two cards will be the RX 7900 XT and the RTX 4080. Here's their respective gaming power draw numbers:
[chart: gaming power draw, RX 7900 XT vs. RTX 4080]

So, we have a 52W delta between them. Now to see just how much that extra 52W costs using the energy calculator at Sust-it.net:

In the UK, it would take 50 hours of gaming to cost an extra $1 USD (£0.78).
In the USA, it would take 100 hours of gaming to cost an extra $1 USD.
In Canada, it would take 300 hours of gaming to cost an extra $1 USD ($1.35 CAD).
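If you want to sanity-check those numbers against your own electricity rate, here's a quick sketch; the rates below are rough assumptions in USD per kWh, not the exact Sust-it figures:

```python
# Hours of gaming needed for a 52 W power-draw delta to cost an extra $1 USD.
# The rates are rough assumptions (USD/kWh), not exact Sust-it values.
DELTA_WATTS = 52
rates_usd_per_kwh = {"UK": 0.38, "USA": 0.19, "Canada": 0.065}

for region, rate in rates_usd_per_kwh.items():
    extra_cost_per_hour = DELTA_WATTS / 1000 * rate  # USD per hour of gaming
    hours_per_dollar = 1 / extra_cost_per_hour
    print(f"{region}: ~{hours_per_dollar:.0f} hours of gaming per extra $1")
```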

I don't know what amazes me more, that people argue about this or the fact that I'm the first one to get the idea to go and see just how significant it is. Even using the astronomical electrical costs in the UK, it's not a big deal. Now maybe people can smarten up and worry about important things instead of being distracted by Enquirer-grade "articles" like this.
 

danielcoles1989

Distinguished
Aug 11, 2006
24
5
18,515
"In notebooks, it matters greatly. In desktop, however, it matters, but not to everyone. There are some people who are really concerned about power, others don’t care as much. We definitely want to make a better perf-per-watt chip. A better chip makes the overall board more affordable, which enables us to do different things with the pricing and hopefully supply additional performance to the gamer. It’s also good for the environment and your own electricity bill."

Is what he says, not:

"According to Herkleman, AMD believes GPU power consumption is important in the laptop segment. He goes on to note that while some gamers care about power efficiency, most gamers don't care at all."
 

Giroro

Splendid
This author seems to have a lot of time to waste on his hands to write a nothingburger article like this. I've been hearing this stupid argument going back and forth for years and I find it really interesting that this author decides to cover this instead of where power use matters a lot more, CPUs (because CPUs are always active while 3D accelerators are not). Even then, the cost difference isn't much.

Let's look at two top-end cards since that'll exaggerate the power-draw difference the most. The two cards will be the RX 7900 XT and the RTX 4080. Here's their respective gaming power draw numbers:
[chart: gaming power draw, RX 7900 XT vs. RTX 4080]

So, we have a 52W delta between them. Now to see just how much that extra 52W costs using the energy calculator at Sust-it.net:

In the UK, it would take 50 hours of gaming to cost an extra $1 USD (£0.78).
In the USA, it would take 100 hours of gaming to cost an extra $1 USD.
In Canada, it would take 300 hours of gaming to cost an extra $1 USD ($1.35 CAD).

I don't know what amazes me more, that people argue about this or the fact that I'm the first one to get the idea to go and see just how significant it is. Even using the astronomical electrical costs in the UK, it's not a big deal. Now maybe people can smarten up and worry about important things instead of being distracted by Enquirer-grade "articles" like this.
I think the bigger issue isn't cost, but that power draw = heat, and heat dissipation = noise.
Also, heat = heat, and a gaming PC can make a poorly air-conditioned room pretty uncomfortable in the summer.
A 500-watt PC is a literal space heater.
 
So, were you on this planet when Nvidia unleashed their 3080 and 3090 series of GPUs that almost required a nuclear power plant to power them??
And when the GTX 480, the GTX 590, the HD 4870 X2, the R9 295X2 and many other GPUs would suck so much power that it was hilariously dumb to read people trying to justify their existence, efficiency notwithstanding.

As I said: it's about being practical and not cynical.

Regards.
 
  • Like
Reactions: Avro Arrow