News RTX 4080 and RTX 4070 Power Consumption Dropped By Up to 30%

bigdragon

Distinguished
Oct 19, 2011
Note for the writer/editor: the 4070 watt columns are reversed.

I'm not sure how this is unexpected. Nobody liked the high power consumption numbers rumored for the 40-series. Sales would be hurt if gamers had to buy a new power supply to go with each 40-series card. Nvidia doesn't need a problem wholly within their control scaring away customers. Definitely makes sense that there would be a smaller bump to power consumption instead of the ridiculous jump previously rumored.

The important number for me is VRAM. Hopefully Nvidia isn't stingy about VRAM this time around. I want to see a minimum of 16GB on the 4080 and a minimum of 12GB on the 4070 -- preferably also 16GB there.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
bigdragon said:
Note for the writer/editor: the 4070 watt columns are reversed.

I'm not sure how this is unexpected. Nobody liked the high power consumption numbers rumored for the 40-series. Sales would be hurt if gamers had to buy a new power supply to go with each 40-series card. Nvidia doesn't need a problem wholly within their control scaring away customers. Definitely makes sense that there would be a smaller bump to power consumption instead of the ridiculous jump previously rumored.

The important number for me is VRAM. Hopefully Nvidia isn't stingy about VRAM this time around. I want to see a minimum of 16GB on the 4080 and a minimum of 12GB on the 4070 -- preferably also 16GB there.
Oops, I fixed the 4070 power columns. But yeah, I've long been skeptical of the 450-600W power claims for the 4080 and 4090. Custom cards will probably hit those levels, but I suspect the Founders Edition will be far more reasonable.

Part of me wonders if it wasn't all a disinformation campaign. Get people angry about 600W cards, then announce 450W cards and everyone is happy. Where 3090 Ti was "Wow, this uses a ton of power!" the 4090 will be, "Hey, this only needs 450W, not 600W. Awesome!"
 

thisisaname

Distinguished
Feb 6, 2009
Or their earlier guess was wrong and they've put out a lower one. Put out enough guesses and you're going to be right eventually.

So many leaks and so much disinformation, and they wonder why sales of current cards have fallen so much in the last few months. Who wants to buy now when the new generation is going to be so much greater?
 

PiranhaTech

Prominent
Mar 20, 2021
I lean towards this being good news. It doesn't feel like as much of a technology improvement if the next generation has a huge TDP jump. I'm not crazy about the amount of heat my PC puts out when gaming, and my setup is maybe mid tier.

There more than likely is improved technology, especially with Nvidia, but when the TDP jumps that far, it feels like what AMD had to do with Bulldozer and Jaguar.
 

King_V

Illustrious
Ambassador
Yeah, I think that a disinformation campaign is possible, though it seems kind of risky at best.

That said:
but we can presume from this data that Nvidia's engineers have been desperately trying to get these rumored ultra-high power consumption figures to work on the RTX 4080 and RTX 4090 but found the TSMC 5nm silicon didn't behave as desired at these extreme watts and amps.
Ok, see, I am glad they're bringing the power consumption down, but if it's because they wanted to push the silicon to its limits and simply couldn't, then this isn't exactly virtuous.

"We wanted to consume gobs of power to get the last couple of percent of performance, but we couldn't, so, now we're forced into doing something more responsible" isn't exactly a ringing endorsement of any kind of good intent.

Ok, Nvidia, you're doing better, but grudgingly so, so I'm still giving you the stink-eye.
 

pyrofire95

Distinguished
Sep 19, 2013
Dear all writers,
The more salt you tell us to put in our beliefs of a rumor, the more you're saying we should believe it.
 

warezme

Distinguished
Dec 18, 2006
As someone looking forward to a 4090 upgrade, I wasn't thrilled about the alleged power usage of the 4000-series cards, even though I already have a 1000W PSU and an updated 1060W unit waiting in the wings. I am hoping the power for the 4090 is not as high as it has been rumored.
 
JarredWaltonGPU said:
Oops, I fixed the 4070 power columns. But yeah, I've long been skeptical of the 450-600W power claims for the 4080 and 4090. Custom cards will probably hit those levels, but I suspect the Founders Edition will be far more reasonable.

Part of me wonders if it wasn't all a disinformation campaign. Get people angry about 600W cards, then announce 450W cards and everyone is happy. Where 3090 Ti was "Wow, this uses a ton of power!" the 4090 will be, "Hey, this only needs 450W, not 600W. Awesome!"
Or...(puts on AMD fanboy cynic hat)
NVIDIA realized that they're not gonna catch AMD in raw rasterization power next gen (at least not on initial release) and decided to scale back on the watts they were pushing to catch the RX 7900 XT's rumored performance.
 
Aug 9, 2022
I love how it says that the power consumption was dropped "unexpectedly", as if anyone who isn't legally insane expected Nvidia to release a series of GPUs that would be nigh impossible to air cool, let alone make OEM models for pre-builts.

Also, nobody in their right mind would ever conclude that a physically smaller die on a smaller and more efficient manufacturing process would consume MORE power compared to an older and larger die.
It literally doesn't make sense.
I think people should stop listening to Twitter users with anime profile pics with their "insider knowledge" and instead let logic and critical thinking prevail every now and then.
 
Aug 2, 2022
thisisaname said:
and they wonder why sales of current cards have fallen so much in the last few months
I have to disagree. The most important reason sales dropped drastically is that people are expecting much, much greater performance for the SAME prices, or even a bit lower if we are talking about mid-range cards.
I strongly advise against upgrading a GPU now. 2.5 years, 2.5-year-old cards... Not to mention Nvidia got ridiculously rich selling the RTX 3000 series to miners. I don't wanna help them get even richer by buying their outdated video cards.
 
JarredWaltonGPU said:
Oops, I fixed the 4070 power columns. But yeah, I've long been skeptical of the 450-600W power claims for the 4080 and 4090. Custom cards will probably hit those levels, but I suspect the Founders Edition will be far more reasonable.

Part of me wonders if it wasn't all a disinformation campaign. Get people angry about 600W cards, then announce 450W cards and everyone is happy. Where 3090 Ti was "Wow, this uses a ton of power!" the 4090 will be, "Hey, this only needs 450W, not 600W. Awesome!"
I think, strangely coming from me, that OEMs (like Dell, HP and such) looked at the cooling requirements and the money they would need [to invest in tooling and such*] to cool these cards in their years-old case designs, and just told nVidia to re-think their power consumption strategy or they wouldn't be buying cards from them. It's the most reasonable explanation I can find, outside of wild speculation, which nVidia has been caught doing before, mind you all. Remember the actors in forums? Yeah, Pepperidge Farm remembers XD

The second explanation that I also liked was the "Super" cards one from... Ugh, I can't remember the person's name/nick (sorry!). This is something they did before with the Turing family and, well, have tried before with the Ti series (kind of). EDIT: it was @hannibal !

Regards.
 
To be fair, pushing a card's power draw to the limit often results in diminishing returns in terms of performance. Maybe they could get an extra 5% or more performance out of boosting power draw by 100+ watts, but is it worth it? I wouldn't be surprised if many partner cards do that, though.
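To put rough numbers on that diminishing-returns point (these figures are my own hypothetical illustration, not specs from any leak or review):

```python
# Hypothetical figures for illustration only: a 350W card scoring 100
# in some benchmark, pushed to 450W for a 5% performance gain.
base_power_w, base_score = 350, 100.0
pushed_power_w, pushed_score = 450, 105.0

base_eff = base_score / base_power_w        # points per watt at stock
pushed_eff = pushed_score / pushed_power_w  # points per watt when pushed

drop_pct = (1 - pushed_eff / base_eff) * 100
print(f"Efficiency drops {drop_pct:.0f}% for a 5% performance gain")
```

With those made-up numbers, perf-per-watt falls by roughly 18% to buy that last 5%, which is exactly the trade-off partner cards tend to make.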

bigdragon said:
The important number for me is VRAM. Hopefully Nvidia isn't stingy about VRAM this time around. I want to see a minimum of 16GB on the 4080 and a minimum of 12GB on the 4070 -- preferably also 16GB there.
I would be surprised if the 4070 had less than 12GB, considering even cards like the 3060 and rebooted 2060 got that, despite it being a bit of a waste for cards of that level.
 
I would be surprised if the 4070 had less than 12GB, considering even cards like the 3060 and rebooted 2060 got that, despite it being a bit of a waste for cards of that level.
This is due more to memory architecture than anything else. The memory bus width dictates the amount of memory the GPU supports (at least via multiplier)*.
The 3060 and 2060 would be laughed at if they only had 6GB of VRAM. The next step up is 12GB. NVIDIA counted on some buyers just looking at the amount of VRAM and thinking, "more equals better/faster."

*Not getting into non-uniform bus widths using different-size VRAM modules, or GDDR clamshell mode, as those are outliers and not the norm.
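The multiplier logic works out like this (a sketch of my own, assuming uniform 32-bit GDDR6/GDDR6X channels and the 1GB and 2GB module densities that actually shipped in that era):

```python
def vram_options(bus_width_bits, module_densities_gb=(1, 2)):
    """Each GDDR6/GDDR6X module sits on a 32-bit channel, so an
    N-bit bus carries N/32 modules. Total VRAM is the module count
    times the per-module density, which is why capacities only come
    in fixed steps for a given bus width."""
    modules = bus_width_bits // 32
    return [modules * d for d in module_densities_gb]

# 3060-class 192-bit bus: 6 modules, so 6GB or 12GB with nothing in between
print(vram_options(192))  # [6, 12]
# 3080-class 320-bit bus: 10 modules, so 10GB or 20GB
print(vram_options(320))  # [10, 20]
```

That's why "the next step is 12GB": with six modules on a 192-bit bus you either halve the capacity or double it.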
 
The problem for Nvidia now, is that VRAM has to go up no matter what if the company wants to continue pushing for more ray tracing technology. 8GB is fine, but once games start going into serious RT implementations, that's gonna be problematic.

Unless Nvidia finds another magical boost in memory compression, then perhaps 8GB will live on as the exception.
 
This is due more to memory architecture than anything else. The memory bus width dictates the amount of memory the GPU supports (at least via multiplier)*.
The 3060 and 2060 would be laughed at if they only had 6GB of VRAM. The next step up is 12GB. NVIDIA counted on some buyers just looking at the amount of VRAM and thinking, "more equals better/faster."

*Not getting into non-uniform bus widths using different-size VRAM modules, or GDDR clamshell mode, as those are outliers and not the norm.
The 2060 does offer 6GB of VRAM, at least for the original design, though that came out a few years back and it wouldn't make as much sense for the 3060. And yes, the bus design would limit memory bandwidth if Nvidia gave the 3060 only 8GB of VRAM, at least with the graphics processor as it was designed. We see them do that with the 3050 using the same chip with a third of the processor disabled, but it wouldn't really make sense for a card using nearly all of it.

What may have happened is that Nvidia decided they needed to change their plans for the lineup relatively late in development, after the graphics processors were all designed. Prior to the 30-series coming out, leaks were suggesting that the cards would have significantly increased VRAM over their 20-series counterparts. But that didn't happen.

It's possible that Nvidia caught wind of the fact that AMD was going to be a lot more competitive at the enthusiast level than they originally anticipated. So, they had to be more competitive with how much performance they were offering at a given price level at the high-end. And part of making that happen involved cutting VRAM to allow for card designs to meet lower price points than they would have otherwise. Perhaps memory being more expensive than anticipated played into that as well. Of course, that was all disrupted by the crypto market, and few of the cards were actually sold near those price points.

For example, the 3080 might not have originally been intended as a $700 card, but as something priced higher with more VRAM, possibly marketed as a "3080 Ti". While the card that became the 3070 might have originally been planned as the 3080, again, with more VRAM. Or perhaps they would have utilized different numbers of enabled cores and memory channels depending on how early in production the decision was made.

This left the 3060 in an odd place though, since it was using a graphics chip designed to utilize either 6 or 12GB of VRAM for optimal performance. That would have fit in well with the rest of the lineup if the higher-end cards were all equipped with 12-16GB, but not so much with them using less. Of course, had the chips been designed with the adjusted memory in mind from the start, these mid-range cards could have been built to utilize 8GB without a loss in performance.
 

missingxtension

Distinguished
May 31, 2009
The claimed leaked speeds would have been worth the big wattage. That is expected; a smaller node is not something that just magically makes a processor faster. You can fit more transistors, but when you scale down, leakage and physics behave differently. It's dumb to see process nodes become such a big part of the specs.
 

KyaraM

Notable
Mar 11, 2022
I thought the same "leaker" said the card was going to be 600W+
And before that they projected numbers similar to those "new" ones as well. It's been going up and down. No wonder that guy has a "good track record", as inconsistent as his leaks are. If I constantly projected different numbers, I would very likely be able to yell "told you so!" at the end as well, despite having pulled the stuff out of my butt. That's why I only give a hoot about the final, confirmed specs and maybe leaked benchmarks (though those are kinda shaky, too), and not about leaks from months to half a year before that are pure speculation anyway.
 

watzupken

Respectable
Mar 16, 2020
I love how it says that the power consumption was dropped "unexpectedly" as if anyone who isn't legally insane expected Nvidia to release a series of GPUs that would be nigh impossible to air cool, let alone make OEM models for pre-builts for.

Also, nobody in their right mind would ever conclude that a physically smaller die on a smaller and more efficient manufacturing process would consume MORE power, compared to a older and lager die.
It literally doesn't make sense.
I think people should stop listening to Twitter users with anime profile pics with their "insider knowledge" and instead let logic and critical thinking prevail every now and then.
In my opinion, the power draw numbers may be credible only because the bulk of the power consumption actually comes from the GDDR6X. Even at just 19Gbps, you observe a sizeable jump in power requirement between the GDDR6 used on the RTX 3070 and the GDDR6X used on the RTX 3070 Ti. And don't forget, GDDR6 itself draws quite a lot of power; you can also find a good bump in power requirement between the GDDR5 and GDDR6 versions of the GTX 1650/1660. So with them pushing 21Gbps and 24Gbps, you can only expect power draw to climb steeply.

And for the top-end GPUs, XX102, these are massive chips: on top of the beefy number of CUDA cores, you now also have Tensor and RT cores and a bigger cache to feed, with the chip rumored to be clocked at very high clockspeeds. All this is made possible by an improvement of two or more nodes coming from Samsung's process, where Samsung's node is noticeably less efficient than TSMC's even at supposedly the same nm.

But with competition so heated, I strongly believe each company will push clockspeeds as far as they can, given that the hardware is fixed and no longer a variable they can use to make their products more competitive.
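For reference, those per-pin data rates translate into aggregate bandwidth with the standard formula (my own sketch; the bus widths below are the shipped 3070/3070 Ti configurations):

```python
def bandwidth_gb_s(per_pin_gbps, bus_width_bits):
    """Aggregate memory bandwidth in GB/s: per-pin data rate (Gbps)
    times the number of bus pins, divided by 8 bits per byte."""
    return per_pin_gbps * bus_width_bits / 8

# RTX 3070: 14Gbps GDDR6 on a 256-bit bus -> 448 GB/s
print(bandwidth_gb_s(14, 256))  # 448.0
# RTX 3070 Ti: 19Gbps GDDR6X on the same 256-bit bus -> 608 GB/s
print(bandwidth_gb_s(19, 256))  # 608.0
```

Same bus, ~36% more bandwidth from the faster signaling alone, which is where the extra memory power budget goes.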
 

edzieba

Honorable
Jul 13, 2016
88
68
10,610
0
We don't know why the power expectations have gone down so unexpectedly
Well, we can hazard a pretty good guess: the rumoured rumours of rumoured power consumption were replaced with rumoured rumours of a different rumoured power consumption. All the while the actual design TDP target remains unchanged, and has no particular correlation with any rumoured values.
 

jp7189

Distinguished
Feb 21, 2012
I wouldn't put it past nvidia to gimp the performance of the first wave of 4000 series and create an artificial shortage just to move the remaining 3000 series. Then in the spring they can release super/Ti cards at higher prices and with meaningful performance boosts.
 

GenericUser

Distinguished
Nov 20, 2010
KyaraM said:
And before that they projected numbers similar to those "new" ones as well. It's been going up and down. No wonder that guy has a "good track record", as inconsistent as his leaks are. If I constantly projected different numbers, I would very likely be able to yell "told you so!" at the end as well despite having pulled stuff out of my butt.
"My sources tell me the power draw will be somewhere between 1 and infinity watts"

actual spec gets published

"Bam. Called it."
 
