News Next-Gen GPUs Will Require More Robust Cooling Solutions

Eximo

Titan
Ambassador
Saw a recent video where they set a power limit on a 3080 Ti, capping it at 220W or so, and it still did about 80% of the job. Just take a look at laptop GPUs. It's not that they aren't more efficient; it's just that the desktop cards are given free rein to use more power, and they tend to be the leakier of the GPUs.

I've been thinking about it myself since it is a little warm to run a 3080Ti in the summer.
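For anyone who wants to try the same cap, here is a minimal sketch. It assumes an NVIDIA card whose driver allows power-limit changes and administrator/root rights; the 220W figure is just the value from the video mentioned above, not a recommendation.

```python
# Sketch: cap GPU 0 to a 220 W board power limit by shelling out to nvidia-smi.
# Requires admin/root rights, and the value must sit inside the range the
# card's BIOS allows (query it first with `nvidia-smi -q -d POWER`).
import subprocess

def set_power_limit(gpu_index: int = 0, watts: int = 220) -> None:
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_power_limit(0, 220)  # set back to the card's default limit to undo
```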
 
  • Like
Reactions: martinch and KananX

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
That's for the base model, and then 90% of people will buy the custom models, which will use even more power than that, just wait for it. That leads to terrible inefficiency, just like with the 3090 Ti and 3090; the whole GA102 lineup is just inefficient.

And then you're comparing your "efficiency metrics" against an inefficient card, so I think your whole point is flawed in the first place.
Most AIB 3090s do not have a TDP of 350W, which is what I used for the calculation. In order to make any sort of halfway intelligent comparison you have to compare like to like, which in this case is the Nvidia FE models that adhere to Nvidia's specs for TDP. If you want to use AIB 3090 Tis as your basis, then we will see 4090s with twice the performance while using the same amount of power, resulting in a huge gain in efficiency.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
The top-end configuration of the just-announced Hopper professional GPUs from Nvidia has a 700W TDP. Plop eight of those into an H100 workstation and you have 5.6kW just for GPUs in a single system. Based on your incorrect definition of efficiency, that is catastrophically worse than any rumored 4000-series gaming GPU.
Interesting, so your argument is so poor that you have to drag in server GPUs, which aren't intended for personal use, to try to make a false point.

This was obviously about PCs, not servers, and about GeForce, not server GPUs that are also used in supercomputers. Insane take you've got going there.

I'm going to explain the most obvious point: workstation users buy Quadro or "A"-series GPUs, which are efficient and actually what I was talking about. Go check the A6000.
 
Last edited:

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
Most AIB 3090s do not have a TDP of 350W, which is what I used for the calculation. In order to make any sort of halfway intelligent comparison you have to compare like to like, which in this case is the Nvidia FE models that adhere to Nvidia's specs for TDP. If you want to use AIB 3090 Tis as your basis, then we will see 4090s with twice the performance while using the same amount of power, resulting in a huge gain in efficiency.
Another ridiculous take. Any further-overclocked big Ada GPU will be extremely inefficient, since at 600W even the Founders Edition (assuming it really is 600W) will already be past the efficient part of the voltage-frequency curve; in other words, overclocked.
 

ien2222

Distinguished
Sure, I only have 25 years of experience with PC tech. I'm going to skip the rest of your arrogant and nonsensical post, good luck. Typical tech forum, with odd people who don't have the slightest respect.

PS. IPC gains have been a regular thing in GPU tech for decades. Nothing special and nothing alien. Ridiculous takes here sometimes.

Sure, I've only had 29 years, but whatever.

I'd say that you are the one without any respect here. If I want to run three 4K monitors, why in the world should I be prohibited from doing so simply because you are bent out of shape over the power draw and think governments should put specific limits on it?

Seriously, why do you specifically think I shouldn't be able to run three 4K monitors for gaming? We aren't talking about a 600W 4060; we're talking about a 600W 4090 halo card. Top of the line.

This is quite literally a physics matter and falls under electrical engineering (PC tech, really?); it doesn't matter what you think it should or should not be.

Besides, both AMD and Nvidia give you tools to underclock and undervolt. Buy a 4090 and limit it to 450W.

Show us where we can look up this huge movement - what do I google search or whatnot - because I have to see this; everyone you know is too small of a sample.

Sorry, that was a bit hyperbolic; I should have been more precise. I would consider resolutions of 3440 x 1440 and higher to be in the 4K range.

You can look at the Steam survey, for one. A straight 2.4% have 4K as their primary monitor; assuming that's roughly the same throughout the active user base, that's hundreds of thousands of people. Another 1.3% are at 3440 x 1440, and 2.3% at various other resolutions, which probably include the ultra-ultrawide aspect ratios, some of which need 4K-ish levels of GPU power.

Then there's the multi-monitor category, where 18% are at 4480 x 1440, with another 23% at various higher resolutions.

As a result, you may be approaching 10% of Steam's active user base playing at 3440 x 1440 or higher. Even at around 7% you're over 1.5 million setups, and this is just Steam.
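To make that back-of-the-envelope arithmetic explicit, here's a rough sketch. The percentages are the survey shares quoted above; the active-user count is an assumed round number added purely for illustration, since the survey doesn't publish one.

```python
# Rough estimate of Steam setups at 3440x1440 or higher, using the survey
# shares quoted above. The active-user count is an assumption for
# illustration only, not a figure from the survey.
shares = [0.024, 0.013, 0.023]        # 4K primary, 3440x1440, other high-res
assumed_active_users = 25_000_000     # assumed, not published by Steam

high_res_share = sum(shares)          # ~6%
setups = high_res_share * assumed_active_users
print(f"~{high_res_share:.1%} of users -> roughly {setups:,.0f} setups")
```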
 

MasterMadBones

Distinguished
I have already called out the European Parliament, AMD, and Nvidia on Twitter, saying they should stop the insanity of ever-higher limits to win their bigger d*** wars, and that the EU should cap the maximum a GPU can draw at about 350W, which is roughly the most that makes sense for a GPU without being terribly inefficient. But of course my tweet was ignored.
I tried to address this in a post here several weeks ago, but the response from the community was that 'it's not that bad'. I agree with you though.
 
  • Like
Reactions: KananX

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
Seems there are a lot of high-level electrical engineers in this thread who should contact Nvidia or AMD directly and tell them how to double the power while using less energy.
Ironically, you're right with your use of the words "high level," which describes someone who understands some things about a topic but not the deeper details, which are usually what only engineers understand. However, you don't have to be an engineer for this thread's discussion, just experienced or knowledgeable enough. Many things I have said are based on the past, on things Nvidia or AMD have already done before. The 2080 Ti was insanely efficient and the 3090 simply wasn't, because it was overclocked to beat or match the 6900XT.
 

TJ Hooker

Titan
Ambassador
You don't understand. At all.

GPUs take a given data set and transform it, grab the next data set and transform it. Massively parallel processing is happening here, and it's actually rather simple in how it works, so to speak.

CPUs are different. They do all sorts of different computations and tasks; to understand increases in IPC, you need to understand how pipelines are built and how prediction works to keep the pipeline full. This doesn't really happen the same way with GPUs.

For GPUs, it requires a certain number of transistors, running at a certain speed, to produce an output at a specified resolution and FPS. Power draw is a function of what's needed to run a specific number of transistors at a given clock speed while keeping errors from happening due to under- or over-voltage. **

Therefore, if you want a higher resolution at the same FPS, or higher FPS at the same resolution, you're either increasing the transistor count or the clock speed; either way it REQUIRES a higher power draw. A GTX 1080 just can't do any meaningful 4K, as it doesn't have the transistor count, nor could you increase the clock speed to what's required without liquid nitrogen.

We're in the 4K gaming era now, and have been for at least a couple of years. The amount of power needed for that is going to be higher because the transistor count demands it for any given FPS. If you are still running 1080p, then you won't need anything more than a 4050 when it comes out. If you're at 1440p, then a 4060 is probably all you'll need, and either way the power usage will be lower than what previous generations needed for the same performance.

**Edit: Given the same node. Having a smaller node will require less power as mentioned in my first post.
Radeon RX 6000 beat RX 5000 in performance and efficiency on the same node. The GeForce 900 series beat the 700 series in performance and efficiency on the same node. E.g., a GTX 980 outperformed a 780 Ti while using a smaller die, with fewer transistors, and drawing less power.

It isn't hard to find examples that contradict your assertion that all performance gains require increased power draw and/or a smaller node.
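For readers who want the relationship behind both posts spelled out: a common first-order model for switching power in CMOS is P ≈ activity × capacitance × V² × f. The sketch below uses made-up coefficient values purely for illustration; it just shows why a leaner design that holds the same clock at a lower voltage can gain performance per watt on the same node.

```python
# First-order CMOS dynamic power model: P ~ activity * capacitance * V^2 * f.
# Absolute values are placeholders; only the ratio between the two cases matters.
def dynamic_power(activity: float, capacitance: float, voltage: float, freq_ghz: float) -> float:
    return activity * capacitance * voltage ** 2 * freq_ghz

old_design = dynamic_power(activity=1.0, capacitance=1.00, voltage=1.10, freq_ghz=1.8)
# Same node, same clock, but a leaner design that can run at a lower voltage:
new_design = dynamic_power(activity=1.0, capacitance=0.90, voltage=1.00, freq_ghz=1.8)

print(f"relative power: {new_design / old_design:.2f}x at the same clock")  # ~0.74x
```

Voltage entering as a square is also why undervolting and the "efficiency curve" mentioned elsewhere in the thread matter so much.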
 
  • Like
Reactions: KananX

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
The 2080 Ti was insanely efficient and the 3090 simply wasn’t, because it was overclocked to beat or match the 6900XT.

Until you learn what efficiency means, please stop posting misinformation on the topic. I'm not going to bother with your other responses, as this one can easily be debunked with a review chart. At 4K, the 3090 is the most efficient GPU we have seen to date, 13% better than the 2080 Ti.


[Chart: performance per watt]


Based on these figures, the RTX 3090 is actually better than the 3080 in terms of performance per watt; no doubt this is due to a binning process that sees the best silicon reserved for the 3090. In terms of efficiency, the 3090 is surprisingly good.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
Until you learn what efficiency means, please stop posting misinformation on the topic. I'm not going to bother with your other responses, as this one can easily be debunked with a review chart. At 4K, the 3090 is the most efficient GPU we have seen to date, 13% better than the 2080 Ti.


[Chart: performance per watt]
https://www.techpowerup.com/review/nvidia-geforce-rtx-3090-ti-founders-edition/38.html

Too bad I got a better link from a better website. You lose hard.

And nothing to say about the many other mistakes you made that I pointed out? Definition of a wannabe right here. Get out of my face.
 
Last edited:

Phaaze88

Titan
Ambassador
You can look at the Steam survey, for one. A straight 2.4% have 4K as their primary monitor; assuming that's roughly the same throughout the active user base, that's hundreds of thousands of people. Another 1.3% are at 3440 x 1440, and 2.3% at various other resolutions, which probably include the ultra-ultrawide aspect ratios, some of which need 4K-ish levels of GPU power.

Then there's the multi-monitor category, where 18% are at 4480 x 1440, with another 23% at various higher resolutions.

As a result, you may be approaching 10% of Steam's active user base playing at 3440 x 1440 or higher. Even at around 7% you're over 1.5 million setups, and this is just Steam.
I'm not seeing a large migration from that.

I know Steam doesn't represent everyone - it's still a pretty large sample - and that laptops are mixed in with those results.
How the survey is conducted is a bit funky, as it randomly selects systems; some users can go a few years without ever seeing the notification to take the survey.

That multi-monitor desktop resolution section... I wonder how that works when I have a 2560x1440 and a 1920x1080 both plugged into a 1080 Ti?
Pixel density isn't additive..?

I would consider resolutions of 3440 x 1440 and higher to be in the 4K range.
But they are not?
4K (3840x2160) has almost 70% more pixels than ultrawide 1440p (3440x1440), whereas ultrawide 1440p has only about 35% more than standard 1440p... how is ultrawide 1440p in the 4K range with that big a gap?
 
While I'd like the new GPUs to be much more reasonable with their power draw, I knew it would get higher. Even with a process node shrink, they are still trying to pack in more transistors than the last generation, and that generally requires more power. So yes, I expected higher power. What is annoying, though, is that instead of something like a 15% bump in power consumption, we will be seeing much higher overall consumption. So even if the new generation is more efficient (perf/watt), I still wish they could have kept the overall power envelope lower.

But hey, knock out cryptomining, and not only would you have a lot more GPUs available, you'd also free up a lot of electricity, which could offset some of the more power-hungry new-gen GPUs. :)
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
https://www.techpowerup.com/review/nvidia-geforce-rtx-3090-ti-founders-edition/38.html

Too bad I got a better link from a better website. You lose hard.

And nothing to say about the many other mistakes you made that I pointed out? Definition of a wannabe right here. Get out of my face.
What part of "at 4K" did you not understand? I own a 3090; I sure didn't buy it to play at 1440p or less. Why not post 720p results next?

Using the results from the site you linked to: at 2160p, the 3090 is on average 44.5% faster (93.4 fps vs 64.6 fps) than a 2080 Ti, while using only 34% more power (355W vs 265W). That makes it more efficient, at the resolution the 3090 is targeted at, than a 2080 Ti. Even at 1440p, the 3090 ekes out better efficiency than the 2080 Ti. So the one game that site used to calculate efficiency (CP2077 at 1440p) was an outlier and not representative of the average game at that resolution. Picking an often CPU-limited game to compare GPU efficiency is a pretty oddball choice. I stand by my original assertion: you are peddling false information.
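Spelled out with those exact figures (a quick sketch; the frame rates and wattages are the ones quoted above, not new measurements):

```python
# Perf-per-watt check using the 4K averages and board powers quoted above.
fps_3090, watts_3090 = 93.4, 355
fps_2080ti, watts_2080ti = 64.6, 265

ppw_3090 = fps_3090 / watts_3090        # ~0.263 fps per watt
ppw_2080ti = fps_2080ti / watts_2080ti  # ~0.244 fps per watt

print(f"3090: {ppw_3090:.3f} fps/W, 2080 Ti: {ppw_2080ti:.3f} fps/W")
print(f"3090 advantage at 4K: {ppw_3090 / ppw_2080ti - 1:.1%}")
```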
 
Last edited:

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
It may have been the most efficient GPU (in Doom Eternal @ 4K/Ultra) when it was released in September 2020, but several cards have come out since that outclass it in that metric.
Probably; those results were from the launch of the 3090. From Nvidia, the 3070 probably surpassed it. Anything below the 3070 I would not consider a 4K card. Some of the top-end AMD cards may have surpassed it as well. That doesn't invalidate the point that the 3090 was more efficient than anything that had come before it, including the 2080 Ti, which KananX claimed was "insanely efficient" while the 3090 wasn't.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
What part of "at 4K" did you not understand? I own a 3090; I sure didn't buy it to play at 1440p or less. Why not post 720p results next?

Using the results from the site you linked to: at 2160p, the 3090 is on average 44.5% faster (93.4 fps vs 64.6 fps) than a 2080 Ti, while using only 34% more power (355W vs 265W). That makes it more efficient, at the resolution the 3090 is targeted at, than a 2080 Ti. Even at 1440p, the 3090 ekes out better efficiency than the 2080 Ti. So the one game that site used to calculate efficiency (CP2077 at 1440p) was an outlier and not representative of the average game at that resolution. Picking an often CPU-limited game to compare GPU efficiency is a pretty oddball choice. I stand by my original assertion: you are peddling false information.
Nice coping you're doing; it doesn't change the fact that the 3090 is less efficient than a 12nm GPU from 2018 and way behind the 3070. The 2080 Ti was ahead of everything else, unlike the 3090, which isn't. But keep posting nonsense; you're a huge fan and defensive about your buying decision, we all get it.

PS. The world doesn't revolve around 4K.
 
Last edited:

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
I think something has to be done about this power consumption issue. We need to remember that these are gaming cards, and gaming isn't productive. It's ridiculous to consume so much power for entertainment purposes. It does nothing good for the environment.
 
  • Like
Reactions: KananX

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Nice coping you’re doing, doesn’t change the fact that 3090 is less efficient than a 12nm GPU from 2018 and way behind the 3070. 2080 Ti was ahead of anything else, unlike the 3090 which isn’t. But keep posting nonsense, you’re a huge fan and defensive about your buying decision, we all get it.

PS. The world doesn’t revolve around 4K.
[GIF: moving the goalposts]
 

watzupken

Reputable
Mar 16, 2020
1,022
516
6,070
In my opinion, there is a limit to how much power consumption can go up for consumer PCs before it becomes a problem. CPUs and GPUs have become a race between the chip makers, instead of improving the user experience as a whole. Current high-end Ampere chips are already good enough to act as heaters for confined rooms, and with power requirements expected to go up with next-gen CPUs and GPUs, they will end up heating bigger rooms. To me, it is not so much about the power bill, or whether you can keep the component cool with a better/more robust cooling solution, since that can be done with fatter/taller air coolers or even water cooling. The problem is that you have 600W of heat being dumped into the atmosphere, and that is not including heat from the CPU and other components like SSDs, the PSU, etc.
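To put the "heater" comparison in familiar units: heat output in BTU/h is just watts times 3.412. A rough sketch follows; the 250W figure for the CPU and the rest of the system is my own assumption for illustration, not a number from the thread.

```python
# Convert sustained system power draw into heat output in BTU/h.
# Essentially all electrical draw ends up as heat in the room: 1 W ~ 3.412 BTU/h.
WATTS_TO_BTU_PER_HOUR = 3.412

gpu_watts = 600              # rumored flagship figure discussed in the thread
rest_of_system_watts = 250   # assumed CPU + other components, for illustration

total_watts = gpu_watts + rest_of_system_watts
print(f"{total_watts} W -> ~{total_watts * WATTS_TO_BTU_PER_HOUR:,.0f} BTU/h")
# ~2,900 BTU/h, in the territory of a small space heater on a low setting.
```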
 
  • Like
Reactions: martinch

watzupken

Reputable
Mar 16, 2020
1,022
516
6,070
Until you learn what efficiency means, please stop posting misinformation on the topic. I'm not going to bother with your other responses, as this one can easily be debunked with a review chart. At 4K, the 3090 is the most efficient GPU we have seen to date, 13% better than the 2080 Ti.


[Chart: performance per watt]
I don't dispute what you mentioned about improving power efficiency. My take is that power efficiency will surely go up with each generation because of the performance bump; the only question is by how much. The graph you pulled shows the RTX 3090 being the most efficient, but the efficiency gains are clearly diminishing compared with previous generations. In addition, a jump from 450W (using the most power-hungry current-gen GPU as a gauge) to 600W is a major leap, even more so if you compare with an RTX 3090 at 350W. If we see that sort of percentage hike in power requirements every two years, at some point it will become untenable. In fact, I've already hit my threshold for GPU power draw with the RTX 3080 at 340W (for the model I use). Again, I am not considering the impact of the power draw on my utility bill; rather, there are many other factors that have pretty much stopped me from considering a GPU with this sort of crazy power requirement.
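The "at some point it will become untenable" worry can be illustrated with a trivial what-if (a sketch only; the per-generation growth factor is just the 450W-to-600W step above carried forward as an assumption, not a prediction):

```python
# What-if: flagship board power keeps growing by the same factor as the
# 450 W -> 600 W step discussed above. Pure extrapolation for illustration.
power_watts = 600.0
growth_per_gen = 600.0 / 450.0   # ~1.33x

for gen in range(1, 5):
    power_watts *= growth_per_gen
    print(f"+{gen} generation(s): ~{power_watts:.0f} W")
# +1: ~800 W, +2: ~1067 W, +3: ~1422 W, +4: ~1896 W
```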
 

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
What part of "at 4K" did you not understand? I own a 3090; I sure didn't buy it to play at 1440p or less. Why not post 720p results next?

Using the results from the site you linked to: at 2160p, the 3090 is on average 44.5% faster (93.4 fps vs 64.6 fps) than a 2080 Ti, while using only 34% more power (355W vs 265W). That makes it more efficient, at the resolution the 3090 is targeted at, than a 2080 Ti. Even at 1440p, the 3090 ekes out better efficiency than the 2080 Ti. So the one game that site used to calculate efficiency (CP2077 at 1440p) was an outlier and not representative of the average game at that resolution. Picking an often CPU-limited game to compare GPU efficiency is a pretty oddball choice. I stand by my original assertion: you are peddling false information.

Who cares whether it's more efficient? The problem is that it consumes 355W, which is a lot of power. Gaming is just personal entertainment; it's not productive at all and a waste of power.

People can game on a mobile phone that uses a fraction of the power, so why would someone need 355W for a gaming card?
 

TheFlash1300

Prominent
Mar 15, 2022
312
7
695
I don't think more robust cooling solutions are needed when the problem comes from a company 'inevitably' settling on throwing power efficiency out the window to keep up with or stay on top of its competition.
Is this supposed to be a bad thing? Competition produces innovation. When two companies compete, each wants its product to be better than the other's, meaning both will try to make a product that is as good as possible.

Thanks to this kind of competition, computational power is growing rapidly.

I have already called out the European Parliament, AMD, and Nvidia on Twitter, saying they should stop the insanity of ever-higher limits to win their bigger d*** wars, and that the EU should cap the maximum a GPU can draw at about 350W, which is roughly the most that makes sense for a GPU without being terribly inefficient. But of course my tweet was ignored.
If they put limits on power consumption, how will CPUs' computing power continue to grow? More computing power requires more energy, and more energy allows for more growth in performance.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
If they put limits on power consumption, how will CPU's power continue to grow? Higher power requires a high amount of energy. More energy will allow for more power growth.
Via higher instructions per clock; in other words, higher efficiency. This has already happened many times. If more power were needed every single time for more performance, CPUs would now sit at 20,000W instead of under 200W (for Ryzen).
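As a one-line model of that point: throughput is roughly IPC times clock, so an architecture that raises IPC gets more performance at the same clock and, to first order, within the same power budget. A minimal sketch with made-up numbers:

```python
# Toy model: performance ~ IPC * clock. An IPC gain improves perf/W without
# raising the power budget. All numbers below are illustrative placeholders.
def performance(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

power_budget_watts = 150.0                  # assumed fixed for both designs
old = performance(ipc=1.00, clock_ghz=4.0)
new = performance(ipc=1.15, clock_ghz=4.0)  # +15% IPC, same clock

print(f"perf gain: {new / old - 1:.0%} within the same ~{power_budget_watts:.0f} W budget")
```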
 
Last edited: