News: AMD Foresees GPUs With TDPs up to 700W by 2025

Well, AMD is clearly wrong.

nVidia is rumored to release a 600W card already, so another 100W three years from now looks rather small. Plus, I'm sure they'll feel compelled to increase the power if nVidia increases the power, just to get a slight advantage.

Overall, this "MOAR power" tendency from all camps is just bad for everyone. Ugh...

Regards.
 
Part of me is giggling at the cartoonish thought of AMD saying "yeah, let's just say that we expect our GPUs to hit outrageous power consumption, too, then Nvidia will keep going that route, while we make it more efficient, and they'll have no idea what we're up to."

Realistic? Probably not. But, sometimes the cartoonish-absurdity interpretation is fun.
 
What worries me is game developers creating content for these power-hungry GPUs, then effectively forcing all of us to buy them to play their games. If the top GPUs double their consumption, the lower tiers will likely creep up too, and suddenly reasonably powered cards won't handle the extra quality of future games.

I hope it doesn't become a trend, and that users, constrained by cooling, PSU wattage/cost, energy, heat, etc., don't feed this nonsense.
 
Add a higher-end CPU and that will be like running a small toaster oven while gaming.

I prefer PC gaming, but man, this is making me glad that game consoles often have that 150-300W limit. Hopefully that helps keep games playable on a GPU of 200W or less.
 
Well, AMD is clearly wrong.

nVidia is rumored to release a 600W card already, so another 100W three years from now looks rather small. Plus, I'm sure they'll feel compelled to increase the power if nVidia increases the power, just to get a slight advantage.

Overall, this "MOAR power" tendency from all camps is just bad for everyone. Ugh...

Regards.

They are rapidly closing in on the limits of a US wall outlet's maximum power delivery. If the CPU + board take 300W, the GPU takes 700W, and losses take 100W, you're left with only about 400W of headroom.
 
GPUs will be bundled with their own generators by 2030, while Nvidia patents small nuclear batteries.

If I were a company and had to choose between making GPUs or having the capability to develop a safe, clean, small portable nuclear battery that could fit in the confines of a computer and not be a hazard, I would go for the nuclear battery option every time. Not everyone needs a GPU, but holy cow, a safe nuclear battery would revolutionize so many fields. Need 3 kW of continuous power from a device roughly the size of a piece of toast? We've got you covered, please make your checks payable to Jensen Hua... umm ahem, I mean Nvidia Corporation, please.
 
They are rapidly closing in on the limits of a US wall outlet's maximum power delivery. If the CPU + board take 300W, the GPU takes 700W, and losses take 100W, you're left with only about 400W of headroom.

Eh, a standard NEMA 5-15 outlet can handle 1875 watts (125V x 15A), so there's still plenty of headroom as long as you don't have a bunch of other things on the same circuit. Either way, you're not wrong that that's a lot of power for a high-end desktop; that's pulling into straight-up toaster or electric space heater territory.
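For anyone who wants to sanity-check those numbers, here's a rough back-of-the-envelope sketch; the 80% continuous-load derate is the usual NEC guideline, and the component wattages are just the illustrative figures from the quote above:

```python
# Back-of-the-envelope check for a single NEMA 5-15 (15A / 125V) branch circuit.
# Component wattages are just the illustrative figures quoted above.

VOLTS = 125
AMPS = 15
nameplate_watts = VOLTS * AMPS              # 1875W receptacle rating
continuous_watts = nameplate_watts * 0.8    # ~1500W usable for a continuous load (NEC 80% rule)

pc_draw_watts = {
    "CPU + board": 300,
    "GPU": 700,
    "conversion losses": 100,
}

total = sum(pc_draw_watts.values())
print(f"Circuit rating: {nameplate_watts}W, continuous limit: {continuous_watts:.0f}W")
print(f"Hypothetical PC draw: {total}W, headroom left: {continuous_watts - total:.0f}W")
# -> roughly 400W left on that circuit before the breaker starts complaining.
```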
 
If I were a company and had to choose between making GPUs or having the capability to develop a safe, clean, small portable nuclear battery that could fit in the confines of a computer and not be a hazard, I would go for the nuclear battery option every time. Not everyone needs a GPU, but holy cow, a safe nuclear battery would revolutionize so many fields. Need 3 kW of continuous power from a device roughly the size of a piece of toast? We've got you covered, please make your checks payable to Jensen Hua... umm ahem, I mean Nvidia Corporation, please.

I mean, of course the first thing that came to mind was the Fallout universe. And now I'm picturing Vault Boy working away at a nuclear-powered PC and starting to glow as he gets into his gaming...


 
Eh, a standard NEMA 5-15 outlet can handle 1875 watts (125V x 15A), so there's still plenty of headroom as long as you don't have a bunch of other things on the same circuit. Either way, you're not wrong that that's a lot of power for a high-end desktop; that's pulling into straight-up toaster or electric space heater territory.
Folks with those crappy power strips and surge protectors, for example?
 
Well, AMD is clearly wrong.

AMD Foresees GPUs With TDPs up to 700W by 2025

By 2025, as in any time between now and 2025. AMD is probably right.

Three years from now, we'll only be on the refresh cycle of the generation that succeeds the upcoming generation releasing later this year. So it isn't a foregone conclusion we'll hit 700W by then. It will depend on what this generation tops out at and how well it is received by customers. All we have are rumors at this point.
 
Power consumption is certainly out of hand in the GPU space, and I think Ampere is the worst example I've seen. The architecture is very efficient at lower clock speeds, but to maintain margins they've got the cards clocked so high that all of those gains are thrown away. We need a generation like the GTX 600 series, which basically served as a reset on power consumption.
So what you are saying is a GPU called Ampere takes too many amps?

You can't say Nvidia was trying to hide it lol.
 
I mean, you don't have to use the full power headroom of these GPUs. Surely better to have it and not use it, than need it and not have it?

I do think Nvidia and AMD need to do a better job of exposing these knobs to the user, though. It's strange that the best way to manage power/temperature limits and undervolting is a Windows-only third-party program.
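For what it's worth, on the NVIDIA side the driver does expose at least the board power limit through nvidia-smi on any OS. A minimal sketch, assuming nvidia-smi is installed and you have admin rights; the GPU index and the 250W example cap are just placeholders:

```python
# Illustrative sketch only: query and cap an NVIDIA board power limit via nvidia-smi.
# Assumes the NVIDIA driver and nvidia-smi are installed; setting a limit needs admin rights.
# The GPU index and the 250W cap below are placeholder values, not recommendations.
import subprocess

def current_power_limits(gpu_index: int = 0) -> str:
    """Return the current and maximum board power limits reported by the driver."""
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.limit,power.max_limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Cap the board power limit; the driver clamps it to the card's allowed range."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print(current_power_limits())
    # set_power_limit(250)  # e.g. cap a power-hungry card at 250W
```

It doesn't touch undervolting or fan curves, though, which is presumably why everyone ends up in the third-party tools anyway.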
 
What worries me is game developers creating content for these power-hungry GPUs, then effectively forcing all of us to buy them to play their games. If the top GPUs double their consumption, the lower tiers will likely creep up too, and suddenly reasonably powered cards won't handle the extra quality of future games.

I hope it doesn't become a trend, and that users, constrained by cooling, PSU wattage/cost, energy, heat, etc., don't feed this nonsense.
Devs are going to primarily target the massive console market, and by virtue of the economics of that market, plus the likely prominence of Switch-like devices going forward, I doubt the minimum power-consumption requirements will reach extravagant heights.
Whether you feel the need to run these games at 8K/120+fps, of course, is your call.
 
By 2025, as in any time between now and 2025. AMD is probably right.

Three years from now, we'll only be on the refresh cycle of the generation that succeeds the upcoming generation releasing later this year. So it isn't a foregone conclusion we'll hit 700W by then. It will depend on what this generation tops out at and how well it is received by customers. All we have are rumors at this point.
I thought that my writing wasn't that bad or incorrect, but if I write it this way: "Fran Foresees GPUs With TDPs up to 700W by 2023", does that make it better for you? 😊

Regards.
 
Here's the thing, though: will GPUs consume 700W or more by 2025, if not sooner? Well yeah, of course. MCM designs will allow much more performance density, especially on the industrial side where more exotic cooling methods can be used. This is a good thing, and it will increase overall efficiency, much as 64-, 96-, and soon-to-be 128-core server processors have allowed one server to replace half a dozen or more, with power consumption decreasing as a result. I would not be surprised to see high-performance, server-level cards break the kilowatt barrier by 2025 or shortly after, again because MCM lets them break free of the limitations of monolithic designs, same as it did with CPUs, and existing industrial chillers can handle them.

On the consumer side? I don't see that happening outside the ultra-high-end and prosumer level at all, even as we get into angstrom-class process nodes.
 
Really it's no problem.

Just move your PC to the kitchen and use the electric stove/electric range circuit: 240V x 50A = 12,000W. You can have a 2000W PSU for the GPU, another one for the CPU. A 1000W one per additional PCIe slot. A 500W one for the hyper-super-duper gaming mouse and keyboard.

Piece of cake. Still plenty of room in terms of power for the future 16999X Ti Ti Ti XXXX.

You won't be able to use the stove and the PC at the same time but who cares.

Ultimately, to be really future-proof, you can opt for your own solar furnace to generate electricity (1+ megawatt). It's green! https://en.wikipedia.org/wiki/Odeillo_solar_furnace#/media/File:Four_solaire_001.jpg
 
Nope. I've drawn the line. That sort of power demand, where you have to worry about what else is plugged into the outlets in the same room, just isn't going to fly. And let's not even mention adding an air conditioner to the room to offset the heat load. I saw a video the other day of a guy ducting his gaming rig out a window so he didn't braise while gaming. Silly.

At this point, a boring mid-range card gives good 1080p/1440p performance with settings turned up. And it takes a 3080 or better with a big PSU to start doing 4K without turning everything off in some poorly optimized titles.

I'll buy a console or use cloud gaming, or stick to what I already have.

We went through the same thing years ago on the CPU side, when some wanted to stick with single cores and push the GHz through the roof, and then multi-core became the better option. Like the transition from the Pentium 4 to the Core Duo.

It's time for a different, less power-hungry model.

Perhaps we should teach good coding skills to people before they start a game company?