News AMD Foresees GPUs With TDPs up to 700W by 2025

Devs are going to primarily target the massive console market, and by virtue of the economics of that market - and the likely prominence of Switch-like devices going forward - I doubt minimum power consumption requirements will reach extravagant heights.
Whether you feel the need to run these games at 8K/120+fps, of course, is your call.
Exactly, I can't see consoles adopting that kind of power draw any time soon, and developers are not likely to make their games require what will probably be a multi-thousand dollar GPU. There just isn't a big enough market for that. On the consumer side, cards like that will be more for those with money to spare who are willing to pay a large premium to run their games at very high resolutions and frame rates with raytracing cranked up. Buyers of "mid-range" cards should still be able to run the games just fine, only at more mainstream resolutions, frame rates and settings.

A number of the most profitable games these days are even designed to run on mobile phones, allowing them to access as large a market as possible. Games that push the boundaries of graphics hardware have their place, and that can certainly be a selling point, but they won't sell all that well unless they also manage to run reasonably well on common hardware.

Overall, this "MOAR power" tendency from all camps is just bad for everyone. Ugh...
The great thing about multi-chip designs is that they should be able to scale to cover a wide range of hardware. If designed right, you could theoretically have a mainstream $250 card use a single graphics chip at around 100 watts of power draw, a $500 higher-end card use two chips at around 200 watts, and a $1000+ enthusiast-level card use four at 400 watts. But why stop there when there would be a market for something like an 800 watt card that utilizes 8 of the chips at a $2000+ price point? Just because new design methods allow for product lineups to be expanded beyond what is currently practical to make, doesn't necessarily mean the rest of the range will see higher power draw.

Much like we see with Ryzen on the CPU side, the existence of 280 watt Epyc server processors utilizing 8 chiplets for 64 cores doesn't mean that the mainstream models with a fraction of the cores need to draw all that much power.
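As a quick back-of-the-envelope sketch of that scaling idea (the tiers, chiplet counts, prices and wattages below are just the hypothetical figures from this post, not real products):

```python
# Hypothetical chiplet-based GPU lineup, with price and board power scaling
# roughly linearly with the number of graphics chiplets. All numbers are the
# illustrative figures from the post above, not real products.
LINEUP = [
    # (tier,          chiplets, approx. price USD, approx. power W)
    ("mainstream",    1,  250,  100),
    ("high-end",      2,  500,  200),
    ("enthusiast",    4, 1000,  400),
    ("extreme halo",  8, 2000,  800),
]

for tier, chips, price, watts in LINEUP:
    print(f"{tier:>12}: {chips} chiplet(s), ~${price}, ~{watts} W "
          f"({watts / chips:.0f} W per chiplet)")
```

The point is simply that adding halo parts at the top of the stack doesn't drag the per-chiplet power, or the mainstream tiers, upward.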
 
Hm... While I don't dismiss what you say, I do have to disagree: one of the main "pluses" of MCM / chiplets is not efficiency, but effective use of die area at the expense of efficiency. AMD's been managing power very well, but don't mistake that for MCM being inherently efficient - it takes massive effort (and TSMC's lead) to make the Zen cores and I/O dies as efficient as they possibly can to make up for the IF mesh. The extra power a non-monolithic design consumes over a monolithic die is about 10-20%, which is not trivial. The more cores, the more the mesh of the MCM will punish the design in the long run. Intel was forced to use big.LITTLE to reach parity (somewhat), but they still have the "monolithic" advantage via lower overall power for the mesh and lower latencies. If you want me to phrase it differently: if Intel were to make Alder Lake an MCM design, it would be horribly inefficient compared to AMD. I will concede this is "what-ifism" territory and I won't even argue contrary points, but when you move to MCM, concessions need to be made.
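To put rough numbers on that interconnect-overhead argument: for the same package power, a chiplet design spends some share of the budget on the die-to-die fabric that a monolithic die doesn't need. A minimal sketch, where the 10-20% range is the estimate from the post above and the 280 W package is just an example figure:

```python
# Rough illustration of the claimed MCM fabric penalty: some fraction of the
# package power goes to the die-to-die interconnect (IF links, PHYs, etc.)
# instead of the cores. The 10-20% range is the poster's estimate above and
# the 280 W package power is a made-up example, not a measured figure.
def power_left_for_cores(package_power_w: float, fabric_share: float) -> float:
    """Power remaining for compute after the fabric takes its share."""
    return package_power_w * (1.0 - fabric_share)

package_power_w = 280.0
for share in (0.0, 0.10, 0.20):
    usable = power_left_for_cores(package_power_w, share)
    print(f"fabric share {share:4.0%}: ~{usable:.0f} W left for the cores")
```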

No solution is perfect, is what I'm trying to get at. Intel's approach has its pros, much like AMD's, but overall, Intel will have to demonstrate EMIB works well enough to use MCM effectively with Sapphire Rapids.

On the GPU side, Nvidia is stretching their monolithic advantage as far as they can and, from what I can see, after Ada (or Lovelace) they'll have to use some form of MCM for sure. The behemoth die they'll be using here is just... Bananas 😆 I wonder how much bigger of a die they could get away with given yields, but remember: the bigger the die, the less space-efficient it is per wafer, implying fewer working dies and more defects.
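The die-size-versus-yield trade-off is usually sketched with a dies-per-wafer estimate plus a defect-density yield model. A minimal version, assuming a 300 mm wafer, an arbitrary 0.1 defects/cm² density, and the common Poisson yield simplification (none of these are figures from the article):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common gross dies-per-wafer approximation (ignores scribe lines/edge exclusion)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float = 0.1) -> float:
    """Poisson yield model: probability a die has no killer defects."""
    defects_per_die = defect_density_per_cm2 * (die_area_mm2 / 100.0)  # mm^2 -> cm^2
    return math.exp(-defects_per_die)

for area_mm2 in (300, 600):  # a mid-size die vs a near-reticle-limit monolith
    gross = dies_per_wafer(area_mm2)
    good = gross * poisson_yield(area_mm2)
    print(f"{area_mm2} mm^2 die: ~{gross} gross dies/wafer, ~{good:.0f} expected good dies")
```

With those assumed numbers, doubling the die area more than halves the expected good dies per wafer, which is the point being made here.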

Pros and cons as always xP

Regards.
 
My point was not so much that multi-chip cards would improve energy efficiency, but simply that they could open up the option of making really high core-count graphics cards possible, without significantly impacting the power draw of the rest of the lineup.

So, the 700 watt card suggested in the article might not be referring to relatively mainstream cards hitting those power levels, but rather to designs that go well beyond the scope of current models. The more typical sub-$1000 designs might stick to relatively "normal" power levels, while something like a $2000+ card with double the graphics cores might draw significantly more. For ~99% of those buying a graphics card for gaming though, the power draw of those extreme-enthusiast cards won't really matter, since they won't be buying them anyway.
 
I mean, there are already GPUs (if you want to call them that) over 700W for the data center, so I'm taking this in the context of consumer cards. Also, one obvious thing I didn't notice: TDP != TBP. I always forget that stupid little detail; le sigh. Anyway, they can define their TDP metric however they want, but I'll go with TBP instead for that statement, because reasons, haha.
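For anyone else who forgets the distinction: one common way it's drawn is that TDP describes the thermal budget of the GPU silicon itself, while TBP (total board power) covers the whole card. A tiny sketch, where every component figure is a made-up example rather than a real spec:

```python
# TDP vs TBP, roughly: TDP ~ the GPU die's thermal budget, TBP ~ the whole
# board (die + memory + VRM losses + fans etc.). All numbers are hypothetical
# examples, not specs for any real card.
gpu_tdp_w  = 450  # the GPU die (hypothetical)
memory_w   = 60   # GDDR/HBM (hypothetical)
vrm_loss_w = 50   # voltage-regulation losses (hypothetical)
misc_w     = 40   # fans, display I/O, lighting... (hypothetical)

tbp_w = gpu_tdp_w + memory_w + vrm_loss_w + misc_w
print(f"TDP (GPU die only): {gpu_tdp_w} W")
print(f"TBP (whole board):  {tbp_w} W")
```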

Anyway, no matter how you slice it, "moar powah" is not going to be good as a norm unless there are breakthroughs in energy production and storage (batteries). Power grids aren't really saturated everywhere (there are some well-known cases, but they're not the norm), but this isn't a trend I think is really sustainable unless companies start including the power-generation side of the equation in their profit schemes rather soon. That's not even taking heat into account... I don't want to play a game and feel like I'm in a sauna >_<

Regards.
 
Really it's no problem.

Just move your PC to the kitchen and use the electric stove/electric range circuit: 240V x 50A = 12,000W. You can have a 2000W PSU for the GPU, another one for the CPU. A 1000W one per additional PCIe slot. A 500W one for the hyper-super-duper gaming mouse and keyboard.
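And for the record, the budget does check out on paper. A sketch of the math, where the circuit and PSU wattages come from this post and the "two extra PCIe slots" count is just an assumption (the post only says one 1000W PSU per additional slot):

```python
# Tongue-in-cheek budget check for the range-circuit build above. Circuit and
# PSU wattages are the ones from the post; the slot count is an assumption.
circuit_w = 240 * 50  # 12,000 W electric-range circuit

psus_w = {
    "GPU PSU": 2000,
    "CPU PSU": 2000,
    "extra PCIe slot PSUs (assume 2 slots)": 2 * 1000,
    "gaming mouse + keyboard PSU": 500,
}

total_w = sum(psus_w.values())
print(f"circuit capacity: {circuit_w} W")
print(f"PC power budget:  {total_w} W")
print(f"headroom for the future 16999X Ti Ti Ti XXXX: {circuit_w - total_w} W")
```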

Piece of cake. Still plenty of room in terms of power for the future 16999X Ti Ti Ti XXXX.

You won't be able to use the stove and the PC at the same time but who cares.

Ultimately, to be really future-proof, you can opt for your own solar furnace to generate electricity (1+ megawatt) - it's green! https://en.wikipedia.org/wiki/Odeillo_solar_furnace#/media/File:Four_solaire_001.jpg
At that point the PC is the stove 😉
 