Discussion: When Will PC Parts Start Optimizing Power for Performance?

Gamefreaknet

Mar 29, 2022
(Skimped a little bit on the title), but there's a general trend as newer CPUs and GPUs have been released, and it's been notable that:
More powerful CPU and/or GPU = higher wattage.
Living in the UK, the highest consumer amperage through a plug (in a home) is 13A, which at 240V allows a maximum of about 3,120W at once. (I'm near certain industrially rated cables carry higher amperages - caravan fuses are 16A and similar - and use better wire gauges for less resistance.)
That said, there are plenty of builds using top-tier components (13900K, 14900K, 4090, etc.), and for multi-GPU setups, the average 4090 draws around 400-480W (without overclocks or overvolts). If you were to do a triple-GPU setup (overkill and not really needed, in my personal opinion) with the most power-hungry GPUs, that's 1,440W (3x 4090) + 125W (13900K/14900K), so around 1,600W total (without voltage tweaks or overclocks).
(I'm not sure how the power system works in other countries - I know America and some others use 120V, but not the full details.) But considering that wattage is already over half of what an average UK consumer plug can take before you're into "industrial/heavy duty" territory, won't components eventually be forced, to some degree, to draw less power while maintaining or improving performance in the next generation of parts?
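For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope tally. The figures (13A at 240V for the UK plug, 480W per 4090, 125W for the CPU, three GPUs) are the rough values from the post above, not measurements:

```python
# Rough headroom check for the hypothetical triple-4090 build described above.
# All figures are approximations taken from the post, not measured values.

UK_PLUG_VOLTS = 240      # nominal UK mains voltage
UK_PLUG_AMPS = 13        # standard BS 1363 plug fuse rating
plug_limit_w = UK_PLUG_VOLTS * UK_PLUG_AMPS   # ~3120 W per socket

GPU_DRAW_W = 480         # high-end RTX 4090 board power (no overclock)
CPU_DRAW_W = 125         # 13900K/14900K base power figure used in the post
NUM_GPUS = 3

build_draw_w = NUM_GPUS * GPU_DRAW_W + CPU_DRAW_W   # ~1565 W

print(f"Plug limit: {plug_limit_w} W")
print(f"Build draw: {build_draw_w} W")
print(f"Headroom:   {plug_limit_w - build_draw_w} W "
      f"({build_draw_w / plug_limit_w:.0%} of the socket's capacity)")
```

Which comes out to roughly 1,565W against a ~3,120W socket limit, i.e. about half the plug's capacity, as stated above.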
 
They won't. The main problems are that the other guy is doing it too, and that the only things that sell to people are how many 3DMark points and FPS a card will get. In addition, if people live somewhere electricity is fairly cheap, they won't really care about power consumption - even more so if their area generates most of its power from clean energy sources, or they have solar, so they don't feel bad about consuming all that power.

In addition to the above, manufacturers have to specify generous defaults so that every product they sell hits the specified targets. My CPU, for instance, doesn't need 1.4-1.45V to hit 4.6GHz; it can do that at ~1.2-1.25V. But that's just my CPU. Someone else's may have slightly more defects, such that it can't hit 4.6GHz unless it's fed at least 1.3V.

Similarly, my video card can hit its stock boost clock of about 2.6GHz at ~0.91V, but by default it wants 1V. Again, that's just my card; someone else's may actually need the full 1V.
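To put rough numbers on why those voltage margins matter: dynamic CMOS power scales roughly with frequency times voltage squared (P ≈ C·V²·f), so dropping the default voltage to what a good chip actually needs cuts power noticeably at the same clock. This is only a toy estimate using the voltages quoted above; the capacitance and frequency terms cancel, so only the voltage ratio matters:

```python
# Toy estimate of dynamic power savings from undervolting at a fixed clock.
# Uses the standard approximation P_dynamic ~ C * V^2 * f; since clock and
# effective capacitance are unchanged, the ratio reduces to (V_new / V_old)^2.
# The voltages are the ones quoted in the post, not guaranteed-safe values.

def undervolt_power_ratio(v_default: float, v_tuned: float) -> float:
    """Fraction of original dynamic power after lowering voltage at the same frequency."""
    return (v_tuned / v_default) ** 2

cpu_ratio = undervolt_power_ratio(1.40, 1.25)   # CPU: 1.40 V default -> 1.25 V tuned
gpu_ratio = undervolt_power_ratio(1.00, 0.91)   # GPU: 1.00 V default -> 0.91 V tuned

print(f"CPU dynamic power: ~{cpu_ratio:.0%} of stock (~{1 - cpu_ratio:.0%} saved)")
print(f"GPU dynamic power: ~{gpu_ratio:.0%} of stock (~{1 - gpu_ratio:.0%} saved)")
```

By that approximation, the CPU example saves roughly 20% and the GPU example roughly 17% of dynamic power at the same clocks, which is why the per-chip voltage margin matters.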

Also, at the end of the day, you can only really design a chip for raw performance or for better efficiency - you can't have both.
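The same toy model illustrates that trade-off: performance scales roughly linearly with clock, but hitting higher clocks usually needs more voltage, so power grows much faster than performance and perf-per-watt falls off. The frequency/voltage pairs below are invented purely for illustration, not real silicon data:

```python
# Toy illustration: perf-per-watt falls as you chase peak clocks, because
# dynamic power ~ f * V^2 while performance ~ f. The (GHz, volts) operating
# points below are made up for illustration only.

operating_points = [
    (3.0, 0.90),   # efficiency-oriented clock
    (4.0, 1.05),
    (5.0, 1.25),
    (5.8, 1.45),   # chasing the last few hundred MHz
]

for freq_ghz, volts in operating_points:
    relative_power = freq_ghz * volts ** 2        # arbitrary units
    perf_per_watt = freq_ghz / relative_power     # simplifies to 1 / V^2
    print(f"{freq_ghz:.1f} GHz @ {volts:.2f} V -> "
          f"power {relative_power:5.2f} (a.u.), perf/W {perf_per_watt:.2f}")
```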
 
Currently, consumer-level PCs fit within the power limits of a typical home.

A 3x 4090 system is NOT a typical use case.

It is not in their interest to produce parts that can't actually run in a typical residence.
So... things are already optimized for this use.
 
"They" already do, AMD with Ryzen emphasizes efficiency and Intel introduced power saving cores on which most mundane tasks run. They all have one common "enemy", HEAT which is seriously in the way of performance and is just wasted energy. They may not be out to save the world but it's in their best interest to improve efficiency.