News RTX 4080 and RTX 4070 Power Consumption Dropped By Up to 30%


BILL1957

Commendable
Sep 8, 2020
59
17
1,535
I thought the last numbers I saw before these were 420W for the vanilla 4080 and 450W for the 4090.
The 600W number was rumored to be for the 3090 Ti, and the extreme 800W figure was supposed to be for some Titan-class monster.

Of course, all these figures are leaks from different phases of testing in different configurations.
Perhaps now they are finding the sweet spot between maximum performance and getting away with a lower wattage draw.

Again, these new numbers could very well be the baseline Founders Edition specs, and the AIBs may still have some SKUs with the power cranked up.
But more choice is good for the consumer: if heat and operating cost are your concern, buy the lower-spec model of the tier you want, and if raw performance is your bag, damn all else, there may be a model in the stack for you as well!
 

thisisaname

Distinguished
Feb 6, 2009
804
439
19,260
I have to disagree. The most important reason sales dropped drastically is that people are expecting much, much greater performance for the SAME prices, or even a bit lower if we are talking about mid-range cards.
I strongly advise against upgrading a GPU now. 2.5 years, 2.5-year-old cards... Not to mention Nvidia got ridiculously rich selling RTX 3000 to miners. I don't wanna help them get even richer by buying their outdated video cards.

I think that is what I said in the second paragraph?

Who wants to buy now when the new generation is going to be so much greater?
 

russell_john

Honorable
Mar 25, 2018
115
81
10,660
Nvidia's RTX 4080 and RTX 4070 have reportedly dropped in estimated power consumption by up to 30%, with the RTX 4080 falling to 320W.

RTX 4080 and RTX 4070 Power Consumption Dropped By Up to 30% : Read more

My assumption is that this guy was dead wrong with his initial estimates and is now perpetrating a little CYA. Leakers always have an excuse for why they were wrong; that's why people shouldn't give them the time of day when all they're looking for is attention.
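For what it's worth, a quick sanity check of the headline "up to 30%" figure against the wattages quoted earlier in the thread. This is just a rough sketch in Python; the 420W and 450W baselines are the earlier rumors mentioned above, and which of them the 30% is measured against is an assumption, not something the article states.

# Rough sanity check of the rumored RTX 4080 power drop.
# Baselines (420 W and 450 W) are earlier rumors quoted in this thread;
# which one the "up to 30%" headline compares against is an assumption.
old_rumors_w = {"earlier 420W rumor": 420, "higher 450W rumor": 450}
new_4080_w = 320

for label, old_w in old_rumors_w.items():
    drop_pct = (old_w - new_4080_w) / old_w * 100
    print(f"{label}: {old_w} W -> {new_4080_w} W = {drop_pct:.1f}% lower")

# Output:
# earlier 420W rumor: 420 W -> 320 W = 23.8% lower
# higher 450W rumor: 450 W -> 320 W = 28.9% lower

So only the ~450W baseline gets close to a 30% drop; against the 420W rumor the reduction is closer to 24%.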
 

husker

Distinguished
Oct 2, 2009
1,209
222
19,670
My assumption is that this guy was dead wrong with his initial estimates and is now perpetrating a little CYA. Leakers always have an excuse for why they were wrong; that's why people shouldn't give them the time of day when all they're looking for is attention.
Or their guess was wrong and they have put out a lower guess; put out enough of them and you're going to be right sometime.

Or possibly during the R&D process things change. What was an accurate leak 2 months ago is no longer accurate. Like, don't read the weather in a 2 month old newspaper, look out the window, and say "Man, that report was way off!"
 

thisisaname

Distinguished
Feb 6, 2009
804
439
19,260
Or possibly during the R&D process things change. What was an accurate leak 2 months ago is no longer accurate. Like, don't read the weather in a 2 month old newspaper, look out the window, and say "Man, that report was way off!"
Which is why I'm getting to dislike these leaks: I want some nice solid facts, not speculation and possibilities based on old information.
 
"My sources tell me the power draw will be somewhere between 1 and infinity watts"

actual spec gets published

"Bam. Called it."
There are definitely "leakers" that take this approach. Kopite7kimi tends to be better than that. At least, he has been in the past. I don't get the point of being a Twitter leaker of information, though. It's not like you get paid for getting lots of retweets. Well, maybe you can get paid a bit, I don't know. Would be curious to know if 20K followers is worth anything at all. Super Follows? ¯\_(ツ)_/¯
 

SethNW

Reputable
Jul 27, 2019
36
20
4,535
That is exactly the problem with leaks: they aren't from final products, just stuff they got in the lab. I have no doubt they have a 600W or whatever prototype there. I am pretty sure nVidia engineers know that going all out on power won't make them look their best either. So stuff has changed and might still change, for better or worse, depending on yields, the state of the market, and so on.

As for buying now vs waiting, I would pretty much say it depends. If you are buying at the 4080 price tier, wait; if you are getting something towards the top end, you might as well get the latest. If you are more in the 4060 tier and you need something now, buy now; those cards will come out later, and even the 4070 should be at least a month later, if not more.

From rumors I heard, board partners were pissed at nVidia over stock and were willing to have their allocation dropped just to make sure nVidia knows it. So I get the feeling nVidia will try to stall anything that competes more directly below the top end, to give board partners time to sell through stock. And it seems like partners are happier now. A bit like what AMD tried to do after the previous mining craze, when the market was flooded with used Polaris cards and leftover stock didn't sell, so AMD only released Vega above the Polaris tier so Polaris would keep selling. Except nVidia will be more successful.

But if you can wait, because you are more in the "I want some change" camp rather than "I can't play the games I want well enough", then waiting is always beneficial, until you have to break the cycle because you need a new card. Hence why I say: don't buy because new stuff came out, buy when you need to, because the old stuff just can't do what you need it to at a good enough performance level.
 

AgentBirdnest

Respectable
Jun 8, 2022
271
269
2,370
I don't get the point of being a Twitter leaker of information, though. It's not like you get paid for getting lots of retweets. Well, maybe you can get paid a bit, I don't know. Would be curious to know if 20K followers is worth anything at all. Super Follows? ¯\_(ツ)_/¯
I've been thinking this same thing lately. What's the incentive here? I don't wanna get all tin-foil-hat, but I can't think of a better explanation: could Nvidia be intentionally leaking things through this Twitter account to build hype for the product? That way, if the final product ends up being worse than the leaks, Nvidia is off the hook.

If the leaks are true, it would make me more attracted to RTX 40, and more likely to save up for one. The difference between 320W and 450W is a big deal for me, in the middle of a hot desert.
But, I don't believe the leaks. Not yet, anyway. I'll believe them if they end up being accurate. :)