Discussion: Thoughts on increasing GPU power usage and global warming

MasterMadBones

Distinguished
Hello everyone. This is something that's been on my mind ever since the launch of the 3090, and now, with the 3090 Ti out and RDNA3 and especially Lovelace on the way, I feel it's something I want to discuss with the community.

In the last couple of years, governing bodies have put regulations in place that aim to reduce the power usage of PC components, particularly at idle, since that is where most energy is "wasted". However, now that competition in both the CPU and the GPU market is intensifying, peak power consumption is growing very quickly. You can make an excuse for CPUs: generally speaking, they rarely hit their peak power level. GPUs, however, are a different story. Most people who buy a GPU intend to use it at or close to its full potential. Lifeless gremlins that we gamers are, that can be for as much as 4 hours or even more every day. With GPUs soon drawing 450-600W, this is a huge increase in energy usage, not just directly, but also from things such as increased PSU losses and AC load for those who like to keep their rooms cool.
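To put a rough number on that (a back-of-the-envelope sketch; the 600W draw, 4 hours per day and €0.30/kWh price are all just assumptions for illustration):

# Rough GPU energy cost estimate; every figure here is an assumption, not a measurement
gpu_power_w = 600        # assumed peak board power of a next-gen flagship
hours_per_day = 4        # assumed daily gaming time
price_per_kwh = 0.30     # assumed electricity price in EUR/kWh

kwh_per_year = gpu_power_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, ~{cost_per_year:.0f} EUR/year")
# prints: 876 kWh/year, ~263 EUR/year, before PSU losses and extra AC load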

Amidst the current energy crisis and global warming, can such power usage be considered acceptable? Shouldn't the likes of AMD, Nvidia and soon Intel be looking to reduce overall power usage? I understand that these companies want to compete with each other at the highest possible level of performance, and a loss of efficiency at the high end is a natural consequence of that, but this is about to get ridiculous. As an SFF enthusiast, huge cases with massive graphics cards have always puzzled me when compared to the compactness and efficiency of consoles, and it's about to get worse. Is bigger and hungrier what the "PC master race" has become?

I think it might be time for regulators to step in and limit not just idle, but also peak power consumption on consumer hardware, GPUs in particular. I don't know where the limit should be yet, but we can't let this problem get out of hand. How much performance do we really need from our graphics cards? Can we accept a little less so we can help the planet?

Let me know what you think. I can't be the only one who is concerned about this.
 
IMO PC gamers make up a very small portion of the market as a whole. Things like mining have proven to be incredibly energy intensive and have caused issues with the grid/supply in many places, some of which have reacted by banning the practice and going after miners.
Even with that said, things like server farms for various internet services, cloud storage, and more use incredible amounts of power, both for the equipment itself and for cooling. I am just not sure that 'gamers' are a large enough percentage of overall PC power usage to be worth worrying about.
 
People who have high-end or top-end PC components represent a small part of the overall computer market. Looking at the Steam Hardware Survey, the vast majority of people still use midrange video cards. The first card on the list that uses more than 200W TBP is in 12th place, and the total share of people who have a modern top-end video card is less than one percent.

If we shift the focus to, say, server farms or whatnot, I think that's a different story. An increase in power consumption may not be a bad thing as long as there's an equivalent, if not better, increase in the amount of work being done. For instance, if we have two servers, one drawing 500W and the other 1000W, it is better in the long run to use the 1000W server if the work it does is more than twice that of the 500W server.
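To put numbers on that (the workloads below are hypothetical, purely for illustration):

# Energy per unit of work for two hypothetical servers; the job rates are assumptions
def wh_per_job(power_w, jobs_per_hour):
    return power_w / jobs_per_hour  # watt-hours of energy per job

small = wh_per_job(500, 100)    # assumed 500W server doing 100 jobs/hour -> 5.0 Wh/job
big = wh_per_job(1000, 250)     # assumed 1000W server doing 250 jobs/hour -> 4.0 Wh/job
print(small, big)               # the 1000W box uses less energy per job despite drawing more power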
 
In addition to the above, you're sort of assuming that all the power used is generated in a way that contributes to global warming. But that depends on where the user lives. In places like India, South Korea and China, which are home to some of the ten largest coal-fired power plants, you could make that argument. That's why China has banned mining. On the other hand, a large amount of coal-generated power is used in China to refine the chemicals needed to make solar panels, which can mean the panels don't pay back their carbon cost until a few years after they're installed. Where I live we have something called the Hoover Dam, plus solar, so 29 percent of our local power doesn't come from fossil fuels. So there's no definitive answer as to what contributes more to global warming, and you can't just go around banning this and that because you think it will help when there's no proof it really will.
 
Even with that said, things like server farms for various internet services, cloud storage, and more use incredible amounts of power, both for the equipment itself and for cooling. I am just not sure that 'gamers' are a large enough percentage of overall PC power usage to be worth worrying about.
If we shift the focus to, say, server farms or whatnot, I think that's a different story. An increase in power consumption may not be a bad thing as long as there's an equivalent, if not better, increase in the amount of work being done. For instance, if we have two servers, one drawing 500W and the other 1000W, it is better in the long run to use the 1000W server if the work it does is more than twice that of the 500W server.
High power draw in servers is fine in my opinion, considering servers are made to do their work as efficiently as possible. It's simply a function of the amount of work that needs to be done. In the case of consumer hardware, efficiency is more and more often thrown out the window in favor of topping the TH/AnandTech/GN charts. I find it hard to make an argument that the level of performance we get from it is something we actually need.
 
In the case of consumer hardware, efficiency is more and more often thrown out the window in favor of topping the TH/AnandTech/GN charts. I find it hard to make an argument that the level of performance we get from it is something we actually need.
If we go down this rabbit hole, we have to ask: do we need computers in our homes at all? How much of our time on them is spent doing something actually productive, whatever qualifies as such? If it's for entertainment purposes, we don't need computers; we could just go outside and play sports, read a book, or whatever.

And then there's the question of what to do when the computer is essentially the only major electronic device someone uses. Is it worth spending $600 on a computer, and all of its associated emissions and whatnot, just to save maybe 25W compared to a common gaming PC doing idle tasks?
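For a rough sense of scale (all figures below are assumptions, not real prices or measurements):

# Payback time for spending $600 to save 25W; every number here is an assumption
watts_saved = 25
hours_per_day = 8            # assumed time the machine is on each day
price_per_kwh = 0.15         # assumed electricity price in USD/kWh
purchase_cost = 600

kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
usd_saved_per_year = kwh_saved_per_year * price_per_kwh
print(f"~{usd_saved_per_year:.0f} USD/year saved, payback in ~{purchase_cost / usd_saved_per_year:.0f} years")
# prints roughly 11 USD/year, i.e. a payback measured in decades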

Or take the idea that we don't need more than, say, 1080p at 60 FPS. Well, I posit we don't need any more than 30 FPS. Maybe even go so far as to say 720p is fine (I mean, some of my Blu-ray videos have just a subtle blur to them if I convert them to 720p instead of 1080p and put them full screen on a 27" 1440p display). And I'll be promptly laughed at and told to stop playing games on a potato.

In any case, let's not focus on the top end; that's small fry. Focus on the midrange, where most people are. And looking at the past 10 or so generations of NVIDIA cards with a 60 in the model number, the power rating has remained at a median of around 150-160W.
 
Or take the idea that we don't need more than, say, 1080p at 60 FPS. Well, I posit we don't need any more than 30 FPS. Maybe even go so far as to say 720p is fine (I mean, some of my Blu-ray videos have just a subtle blur to them if I convert them to 720p instead of 1080p and put them full screen on a 27" 1440p display). And I'll be promptly laughed at and told to stop playing games on a potato.
That is exactly why I steered clear of saying what exactly counts as "enough". The standard is fluid, but what I am sure of is that most people don't need the top-of-the-line graphics card, as evidenced by its small market share. As for whether home computers are needed at all, I think leisure is a good reason to say they are, since it contributes to people's mental health.

I am fully aware that this is a very minor portion of the market, but nevertheless it's concerning in my opinion. Even small things may contribute to an overall improvement, but there is an attitude of "how much of a difference can one person make" that is holding a lot of progress back.

As someone who lives in Europe, I think releasing a 600W GPU amidst skyrocketing energy prices is in poor taste, although that is more of a personal thing.
 
That is exactly why I steered clear of saying what exactly counts as "enough". The standard is fluid, but what I am sure of is that most people don't need the top-of-the-line graphics card, as evidenced by its small market share. As for whether home computers are needed at all, I think leisure is a good reason to say they are, since it contributes to people's mental health.

I am fully aware that this is a very minor portion of the market, but nevertheless it's concerning in my opinion. Even small things may contribute to an overall improvement, but there is an attitude of "how much of a difference can one person make" that is holding a lot of progress back.

As someone who lives in Europe, I think releasing a 600W GPU amidst skyrocketing energy prices is in poor taste, although that is more of a personal thing.
Luckily it's not a government mandate to upgrade your GPU.