The only way to solve this problem is to have a cultural shift in PC enthusiast circles. But good luck trying to tell the random person who gets off to having a million FPS that they need to tone it down. Same goes for the manufacturers.
At one point I wanted to see how much efficiency I could get out of my setup, so I set up three profiles in MSI Afterburner for my 2070 Super (there's a rough scripted equivalent sketched after the list):
- Voltage capped to ~0.925V, with a frequency ceiling of 1950MHz
- Voltage capped to ~0.8V, with a frequency ceiling of 1800MHz
- Voltage capped to ~0.65V, with a frequency ceiling of 1650MHz
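For anyone who'd rather script this than click through Afterburner (say, on Linux, where Afterburner doesn't exist), here's a minimal sketch using nvidia-smi. It can't set a voltage/frequency curve directly, but capping the boost clock and the power limit parks the card at a lower point on its stock V/F curve, which gets you most of the same effect. The clock ceilings mirror my profiles above; the wattages, minimum clock, and GPU index are illustrative guesses, not measured values, and both commands need admin/root:

```python
import subprocess

# Rough analogue of the Afterburner profiles above, using nvidia-smi.
# Assumption: clock ranges and power limits here are illustrative; check
# your card's supported range with `nvidia-smi -q -d POWER,CLOCK`.
PROFILES = {
    "balanced":  {"max_clock_mhz": 1800, "power_limit_w": 160},  # ~profile 2
    "efficient": {"max_clock_mhz": 1650, "power_limit_w": 130},  # ~profile 3
}

def apply_profile(name: str, gpu_index: int = 0) -> None:
    p = PROFILES[name]
    # Lock GPU clocks to a range; the upper bound acts as the frequency ceiling.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "-lgc", f"210,{p['max_clock_mhz']}"],
        check=True,
    )
    # Cap total board power as a second guard rail.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "-pl", str(p["power_limit_w"])],
        check=True,
    )

if __name__ == "__main__":
    apply_profile("efficient")
```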
I usually stick with the second profile because, for the most part, I can hold the card to about 65-70% of its TBP while staying within 95% of stock performance. If I drop to the third profile, I can easily get down to 50% TBP.
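To put rough numbers on why the second profile is the sweet spot: 95% of the performance at ~67% of the power is about 1.4x the performance per watt. A quick back-of-the-envelope is below; the 215W figure is the 2070 Super's reference TBP, and the profile 3 performance number is just my guess since I didn't benchmark that one as carefully:

```python
# Back-of-the-envelope perf-per-watt vs. stock, using my rough figures.
STOCK_TBP_W = 215  # RTX 2070 Super reference TBP

profiles = {
    "stock":     {"rel_perf": 1.00, "tbp_frac": 1.00},
    "profile 2": {"rel_perf": 0.95, "tbp_frac": 0.675},  # ~65-70% TBP
    "profile 3": {"rel_perf": 0.90, "tbp_frac": 0.50},   # perf is a guess
}

for name, p in profiles.items():
    watts = STOCK_TBP_W * p["tbp_frac"]
    perf_per_watt = p["rel_perf"] / p["tbp_frac"]
    print(f"{name}: ~{watts:.0f} W, {perf_per_watt:.2f}x perf/W vs stock")
```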
The funny thing is, I noticed in some games that performance doesn't go up, but the card will still happily run up to its limits. In one game I measured zero performance improvement at 100% TBP compared to dropping the power limit to 75% TBP. In another game it was a similar story, only between the two lower Afterburner profiles.
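If you want to check this on your own card, nvidia-smi can report live board power. Here's a minimal polling sketch (assumes an NVIDIA GPU with nvidia-smi on your PATH) that you'd leave running through a fixed benchmark scene at each power limit, then compare the average draw against the FPS you got:

```python
import subprocess
import time

def sample_power_w(gpu_index: int = 0) -> float:
    """Read the GPU's current board power draw in watts."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip())

# Sample once a second until Ctrl+C, then report the average draw.
samples = []
try:
    while True:
        samples.append(sample_power_w())
        time.sleep(1)
except KeyboardInterrupt:
    if samples:
        avg = sum(samples) / len(samples)
        print(f"avg draw: {avg:.1f} W over {len(samples)} samples")
```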
I've relayed this point a lot, but I found a similar thing with my CPU, a 5600X, during a Handbrake run. At base clock it chewed about 60% of the power it did at full bore, and it only lost about 15-20% performance. When running games it normally doesn't come anywhere near its PPT limit anyway, plus I've undervolted it.
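And if you think in energy per finished encode rather than watts, the base clock run comes out roughly 25-30% ahead even though it takes longer, since energy is power times time:

```python
# Energy per encode = power x time. Losing 15-20% performance means the job
# takes 1/0.85 to 1/0.80 as long, but at ~60% of the power draw the total
# energy per encode still drops noticeably.
BASE_CLOCK_POWER_FRAC = 0.60  # my 5600X at base clock vs. full bore

for rel_perf in (0.80, 0.85):
    time_factor = 1 / rel_perf                      # longer encode
    energy_factor = BASE_CLOCK_POWER_FRAC * time_factor
    print(f"{rel_perf:.0%} perf -> {time_factor:.2f}x time, "
          f"{energy_factor:.2f}x energy per encode")
```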
So as a result of all this, where my computer would've normally sat at 300+W while gaming, it now averages around 200-220W and still handily meets or exceeds my performance requirements.