News GeForce RTX 4090 May Have 24GB of 21 Gbps GDDR6X VRAM

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
RTX 4090, PG137/139-SKU330, AD102-300, 21Gbps 24G GDDR6X, 600W
RTX 4070, PG141-SKU341, AD104-400, 18Gbps 12G GDDR6, 300W
Ouch and double ouch!

I don't care for stupid 600W GPUs, but 300W for the RTX 4070 I do not like at all!
I'll wait for RDNA3 RX 7700 XT at 250W.
 
Looks like any future GPU review ought to give projections on what the monthly electric bill impact will be.
Even if you play games for 12 hours per day, the difference between a 200W and 300W GPU would be pretty negligible.

100W * 12 hours = 1.2 kWh. If you have super expensive electricity, that might work out to almost $0.50 per day. But if you can afford to play games for 12 hours every day, I can't imagine the $15 per month would be much of a hardship.
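For anyone who wants to plug in their own numbers, here's a quick back-of-the-envelope sketch of that math (the $0.40/kWh figure is just an assumed "expensive electricity" rate, not a quote from anywhere):

```python
# Rough cost of an extra 100W of GPU draw, per the figures above.
extra_watts = 100
hours_per_day = 12
price_per_kwh = 0.40  # assumed "expensive electricity" rate, USD

kwh_per_day = extra_watts / 1000 * hours_per_day    # 1.2 kWh
cost_per_day = kwh_per_day * price_per_kwh          # ~$0.48
cost_per_month = cost_per_day * 30                  # ~$14.40

print(f"{kwh_per_day:.1f} kWh/day, ${cost_per_day:.2f}/day, ${cost_per_month:.2f}/month")
```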
 
Also if you really want to call yourself a "power" user, you would do well to tune your cards for efficiency and not chase after absolute maximum performance. That last 25% or so of the power envelope may just be for squeezing out another 5-10% more performance at best. With undervolting, you could drop the power consumption significantly without losing much performance.
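To illustrate that trade-off, here is a minimal sketch using the ballpark figures above; the hypothetical 300W card, the 25% power cut, and the 7% performance hit are assumptions for illustration, not measurements:

```python
# Illustrative perf-per-watt comparison for a hypothetical 300W card,
# using the ballpark "last ~25% of power buys ~5-10% performance" figure above.
stock_power, stock_perf = 300, 1.00   # watts, normalized performance
tuned_power, tuned_perf = 225, 0.93   # assumed: -25% power, -7% performance

stock_eff = stock_perf / stock_power * 1000   # performance per kW
tuned_eff = tuned_perf / tuned_power * 1000

print(f"stock: {stock_eff:.2f} perf/kW")
print(f"tuned: {tuned_eff:.2f} perf/kW ({(tuned_eff / stock_eff - 1) * 100:+.0f}%)")
```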
 
Last edited:
  • Like
Reactions: renz496

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
Also if you really want to call yourself a "power" user, you would do well to tune your cards for efficiency and not chase after absolute maximum performance. That last 25% or so of the power envelope may just be for squeezing out another 5-10% more performance at best. With undervolting, you could drop the power consumption significantly without losing much performance.
I guess this depends on the cost of your electricity. If you're mining, it makes total sense to optimize power usage. If I were strictly a gamer, this would be a total waste of time for me. Starting with a 300W GPU, if I cut power consumption by 25%, that's 75W. Say I played games 6 hours a day, every day of the year, with the GPU maxed out 100% of the time (all impractical in reality). At my current electricity rate of 9.84c/kWh, that comes out to savings of about 4.5 cents a day, $1.33 a month, and $16.16 a year. Who cares? Then as a bonus, I lose 5-10% of my performance. I understand not everyone has electricity rates that low, but even if you double the electricity cost, you're looking at only $32 for an entire year. The hysteria over the additional cost for electricity for these cards is completely blown out of proportion.
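If you want to check that arithmetic or swap in your own rate, here is a minimal sketch of the savings math; the 300W card, 25% cut, 6 hours/day, and 9.84c/kWh are the figures from this post:

```python
# Savings from a 25% power cut on a 300W GPU, per the scenario above.
gpu_watts = 300
power_cut = 0.25          # 25% reduction -> 75W saved
hours_per_day = 6         # deliberately unrealistic worst case
rate = 0.0984             # USD per kWh

saved_kwh_per_day = gpu_watts * power_cut / 1000 * hours_per_day   # 0.45 kWh
per_day = saved_kwh_per_day * rate
print(f"${per_day:.3f}/day, ${per_day * 30:.2f}/month, ${per_day * 365:.2f}/year")
# -> roughly $0.044/day, $1.33/month, $16.16/year
```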
 
Last edited:

LastStanding

Prominent
May 5, 2022
75
27
560
I think the argument about the extra "power usage" will never go away, and in most advanced territories many will not lose any sleep thinking about it.

But in my opinion, the major concern that SHOULD be focused on, and this outweighs everything else for any electronic component, is... thermals!
 
  • Like
Reactions: Charogne
May 11, 2022
2
1
15
RTX 3070 Ti is already basically at 300W (290W reference, more than that on custom cards). ¯\_(ツ)_/¯
Feels like a stretched point. At least compare generational equivalents; the 3070 is a ~220W card. That 290W just proves an inefficiency you are seemingly happy to swallow for a mere 6-8% more performance. Reckon the 4070 Ti would draw upper-300s at this rate.
 
  • Like
Reactions: VforV

watzupken

Reputable
Mar 16, 2020
1,171
655
6,070
Even if you play games for 12 hours per day, the difference between a 200W and 300W GPU would be pretty negligible.

100W * 12 hours = 1.2 kWh. If you have super expensive electricity, that might work out to almost $0.50 per day. But if you can afford to play games for 12 hours every day, I can't imagine the $15 per month would be much of a hardship.
In my experience, it is not just about the power price you pay for going from a 200W to a 300W GPU. For example, with the previous generations of GPUs I've tested, e.g. GTX 1080 Ti, RTX 2060 Super, etc., I never once experienced my room heating up rapidly. With the RTX 3080, I could feel my room warming up even with the air conditioning on, and also see the rise in temps on the thermometer. The most I logged was a one degree Celsius increase in slightly over 30 minutes of gaming. You can imagine the heat that gets dumped out of a GPU drawing north of 400W is going to be worse. To me, the limit is probably going to be below 350W, so as not to turn my room into a sauna or kill my air conditioning system.
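As a rough sanity check on why a high-power card is so noticeable, here is an upper-bound sketch of how fast gaming heat would warm the air in a sealed room; the room size and total system draw are assumptions, and real rooms shed heat through walls and AC, which is why the observed rise is only about a degree:

```python
# Upper-bound temperature rise of room air from gaming heat, ignoring all losses.
# Room size, system draw, and session length are illustrative assumptions.
room_volume_m3 = 4 * 4 * 2.5   # assumed ~40 m^3 bedroom
air_density = 1.2              # kg/m^3
air_heat_capacity = 1005       # J/(kg*K)
system_watts = 450             # assumed: ~320W RTX 3080 plus the rest of the system
minutes = 30

heat_joules = system_watts * minutes * 60
air_mass_kg = room_volume_m3 * air_density
delta_t = heat_joules / (air_mass_kg * air_heat_capacity)
print(f"~{delta_t:.0f} C rise with zero heat loss")
# -> ~17 C; the ~1 C actually observed shows how much the AC and walls soak up
```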
 

watzupken

Reputable
Mar 16, 2020
1,171
655
6,070
Feels like a stretched point. At least compare generational equivalents; the 3070 is a ~220W card. That 290W just proves an inefficiency you are seemingly happy to swallow for a mere 6-8% more performance. Reckon the 4070 Ti would draw upper-300s at this rate.
Between an RTX 3070 and a 3070 Ti, I would gladly recommend the former. GDDR6X basically ruined the card's efficiency for an incremental improvement in performance. I would rather have more VRAM than "faster" VRAM. While neither really benefits much at the card's target resolution, at least more VRAM makes the card more future-proof, since we are already seeing some games that 8GB cards struggle with.
 
  • Like
Reactions: VforV

edzieba

Distinguished
Jul 13, 2016
578
583
19,760
Remember previous FE cards had notoriously overbuilt power delivery (and vBIOS power limits well below what the hardware was capable of delivering). Having 600W capable VRMs on the board does not necessarily imply a 600W continuous (or even 600W peak) draw.
 
I guess this depends on the cost of your electricity. If you're mining, it makes total sense to optimize power usage. If I were strictly a gamer, this would be a total waste of time for me. Starting with a 300W GPU, if I cut power consumption by 25%, that's 75W. Say I played games 6 hours a day, every day of the year, with the GPU maxed out 100% of the time (all impractical in reality). At my current electricity rate of 9.84c/kWh, that comes out to savings of about 4.5 cents a day, $1.33 a month, and $16.16 a year. Who cares? Then as a bonus, I lose 5-10% of my performance. I understand not everyone has electricity rates that low, but even if you double the electricity cost, you're looking at only $32 for an entire year. The hysteria over the additional cost for electricity for these cards is completely blown out of proportion.
I run a Gigabyte 3080 Gaming OC. I found overclocking gave an insignificant fps increase, detectable in benchmarks but not noticeable in games unless you scrutinise the numbers. I then decided to undervolt. This turned out to be more beneficial: I lost 1-2% performance at most, which again is not actually noticeable in any game, but the reduction in noise and heat is quite noticeable. For these reasons I leave my GPU undervolted; it has nothing to do with running cost. So I can see the benefit of tuning for efficiency to reduce waste heat.
 

Thunder64

Distinguished
Mar 8, 2016
199
282
18,960
In my experience, it is not just about the power price you pay for going from a 200W to a 300W GPU. For example, with the previous generations of GPUs I've tested, e.g. GTX 1080 Ti, RTX 2060 Super, etc., I never once experienced my room heating up rapidly. With the RTX 3080, I could feel my room warming up even with the air conditioning on, and also see the rise in temps on the thermometer. The most I logged was a one degree Celsius increase in slightly over 30 minutes of gaming. You can imagine the heat that gets dumped out of a GPU drawing north of 400W is going to be worse. To me, the limit is probably going to be below 350W, so as not to turn my room into a sauna or kill my air conditioning system.

Agree with you. It's not about the cost, Jarred, it's about the extra heat output.
 

vacavalier

Reputable
Mar 28, 2020
6
1
4,515
Guess you will need a special kind of agreement with the electric company to power a 4090 Ti...
Your statement might hold an element of truth for some who live in progressive states, e.g. California.

California's Energy Consumption Tier 2 regulation.

It may not affect GPUs sold individually, but bundled with a prebuilt/custom-built system... it might be a no-go, depending on the zip code you're having it mailed to.

Good times we live in!
 

Phaaze88

Titan
Ambassador
IMO, the concern with increasing system power consumption is higher room ambient temperatures, leading to less comfortable entertainment or work sessions.
Not everyone lives in a year-round chilly climate, which would be a bit of a blessing here. The subtropical climate I live in has an easily noticeable ~10C gap in room ambient between the cold and hot seasons.
When it's hot here, using AC is a necessity; the idea of using a box fan in the window to push the heated room air outside, with the door open to direct some outside air in, doesn't work when the outside air is usually warmer.
Not everyone has access to AC either, for one reason or another, so "just turn on the AC, or set it lower" isn't always a valid solution.

The biggest contributor to system power consumption depends on the workload:
Games: generally the GPU.
Workstations: not too familiar with them, but it looks to be either, depending on the application.

There are some folks blaming operating thermals for their rooms getting uncomfortably warm... they couldn't be further from the truth.
"No, your 5800X or i9 running at 90C is not contributing that much at 60W of power use... what's the GPU doing? Oh, it's a 3080 Ti pulling around 420W in game, and the core temperature is ~75C? Hmm..."
 

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
I run a Gigabyte 3080 Gaming OC. I found overclocking gave an insignificant fps increase, detectable in benchmarks but not noticeable in games unless you scrutinise the numbers. I then decided to undervolt. This turned out to be more beneficial: I lost 1-2% performance at most, which again is not actually noticeable in any game, but the reduction in noise and heat is quite noticeable. For these reasons I leave my GPU undervolted; it has nothing to do with running cost. So I can see the benefit of tuning for efficiency to reduce waste heat.
Without any actual numbers attached to your claim, I either don't believe its authenticity or your computer is located in a tiny closet. This card isn't that loud even when getting hammered by FurMark:

View: https://www.youtube.com/watch?v=vMdHPl21zIU


GPU manufacturers have gotten noise under control for the most part with titanic modern heatsinks and triple-fan coolers. Stock fan curves are pretty conservative, and you should not be hitting 100% fan speed like the above video during typical gaming. If you want loud, listen to a blower-style cooler from a decade ago. I'm not buying that dropping power usage by 60W or so makes any really tangible heat difference either, unless you game in a closet. 60W is a traditional light bulb, and people were not complaining about one light bulb making a room uncomfortably hot. Will a PC generating 400W+ while gaming heat a room? Absolutely. Will there be a noticeable temperature difference between a 400W PC and a 340W PC in a typically sized room? No.
 

Thunder64

Distinguished
Mar 8, 2016
199
282
18,960
Without any actual numbers attached to your claim, I either don't believe its authenticity or your computer is located in a tiny closet. This card isn't that loud even when getting hammered by FurMark:

View: https://www.youtube.com/watch?v=vMdHPl21zIU


GPU manufacturers have gotten noise under control for the most part with titanic modern heatsinks and triple-fan coolers. Stock fan curves are pretty conservative, and you should not be hitting 100% fan speed like the above video during typical gaming. If you want loud, listen to a blower-style cooler from a decade ago. I'm not buying that dropping power usage by 60W or so makes any really tangible heat difference either, unless you game in a closet. 60W is a traditional light bulb, and people were not complaining about one light bulb making a room uncomfortably hot. Will a PC generating 400W+ while gaming heat a room? Absolutely. Will there be a noticeable temperature difference between a 400W PC and a 340W PC in a typically sized room? No.

Almost nobody has used a traditional incandescent light bulb in over a decade, and yes, those did heat rooms, as they very often used 75 or even 100W. I'd imagine going from a 250-300W GPU to one closer to 600W would make a noticeable difference.
 
Without any actual numbers attached to your claim, I either don't believe its authenticity or your computer is located in a tiny closet. This card isn't that loud even when getting hammered by FurMark:

View: https://www.youtube.com/watch?v=vMdHPl21zIU


GPU manufacturers have gotten noise under control for the most part with titanic modern heatsinks and triple-fan coolers. Stock fan curves are pretty conservative, and you should not be hitting 100% fan speed like the above video during typical gaming. If you want loud, listen to a blower-style cooler from a decade ago. I'm not buying that dropping power usage by 60W or so makes any really tangible heat difference either, unless you game in a closet. 60W is a traditional light bulb, and people were not complaining about one light bulb making a room uncomfortably hot. Will a PC generating 400W+ while gaming heat a room? Absolutely. Will there be a noticeable temperature difference between a 400W PC and a 340W PC in a typically sized room? No.
Firstly, that video has the GPU on a test bench, giving it the best cooling potential. Putting the card in a case will reduce airflow and raise the air temperature after a period of use. It also does not take into account how the hot air the GPU dumps into the case makes the other fans in the system work harder. And it does not need to heat the whole room: the PC sits under the desk with a wall behind it and drawers at either end, so it only has to heat that box of air under the desk for the difference to be noticeable.
 

ikernelpro4

Reputable
BANNED
Aug 4, 2018
162
69
4,670
At my current electricity rate of 9.84c/kWh ... I understand not everyone has electricity rates that low, but even if you double the electricity cost, you're looking at only $32 for an entire year.
There's your problem. No, people don't have a 9.84c/kWh rate, especially in Europe, where prices can range from 20-35 cents(!) per kWh.
Now you're looking at a more realistic perspective. Your electricity cost skyrockets by at least 2-4x, and suddenly that's not "just $32" extra anymore...

The hysteria over the additional cost for electricity for these cards is completely blown out of proportion.
The hysteria is not blown out of proportion. People just don't want this new stupidity of massive power increases for a bit more performance to continue.

With the current energy crisis, climate change and the declining appetite for fossil fuels, price exploitation, inflation, and the burden of paying off Covid debt (both personal and state-level), prices for everything will be rising, and the last thing people want is this energy hike from Nvidia.

Forget the electricity cost, the price gouging, and the massive VRAM increase, which (as we have seen before) leads to lower production capacity:
The elephant in the room, somewhat literally and figuratively, is heat.

Aside from your room heating up like a sauna and double ACs (in America, not in Europe) working full-time, you're now looking at 1kW power supplies, more powerful CPUs to battle bottlenecks, wiring that can withstand the new power hog, a big increase in heat and humidity causing interior deterioration, etc...
 

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
Almost nobody has used a traditional incandescent light bulb in over a decade, and yes, those did heat rooms, as they very often used 75 or even 100W. I'd imagine going from a 250-300W GPU to one closer to 600W would make a noticeable difference.
None of what you said has any real relevance to the conversation. Yes, a 600W GPU is going to make a noticeable difference vs a 250W one. However, I am fairly certain sizzling didn't undervolt his 3080 by 350W while losing 1-2% of his performance.
 

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
Firstly, that video has the GPU on a test bench, giving it the best cooling potential. Putting the card in a case will reduce airflow and raise the air temperature after a period of use. It also does not take into account how the hot air the GPU dumps into the case makes the other fans in the system work harder. And it does not need to heat the whole room: the PC sits under the desk with a wall behind it and drawers at either end, so it only has to heat that box of air under the desk for the difference to be noticeable.
The test was running FurMark, which is a power virus that does not behave like any game; it will easily push the fans to 100% even on a test bench. That's not why I posted the video, though: it was for the noise of the card. Putting it on a test bench will obviously make it noisier than in a case, which dampens the sound. The card is not that loud running full bore on an open test bench, and it is going to be even quieter in a case while gaming.
 

spongiemaster

Admirable
Dec 12, 2019
2,345
1,323
7,560
There's your problem. No, people don't have a 9.84c/kWh rate, especially in Europe, where prices can range from 20-35 cents(!) per kWh.
Now you're looking at a more realistic perspective. Your electricity cost skyrockets by at least 2-4x, and suddenly that's not "just $32" extra anymore...
Pegging your GPU at 100% while gaming for 6 hours a day, 365 days a year, is a pretty unrealistic scenario. Even 3 hours a day, every single day of the year, is a pretty ridiculous amount that almost no one will do. An additional 75W for 3 hours a day, every day of the year, at $0.35/kWh comes out to about $28.74 for an entire year. That's an irrelevant amount of money to someone dropping hundreds of dollars on a gaming GPU.
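To put numbers on how that scales with the electricity price, here is a small sketch sweeping the per-kWh rates mentioned in this thread (75W extra for 3 hours a day is the scenario above):

```python
# Annual cost of an extra 75W for 3 hours/day at a few electricity rates.
extra_kw = 0.075
hours_per_year = 3 * 365

for rate in (0.0984, 0.20, 0.35):   # per-kWh rates quoted in this thread
    cost = extra_kw * hours_per_year * rate
    print(f"{rate:.4f}/kWh -> ${cost:.2f}/year")
# -> about $8, $16, and $29 per year
```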

The hysteria is not blown out of proportion. People just don't want this new stupidity of massive power increases for a bit more performance to continue.

No one is forcing anyone to buy the top of the line halo cards. If it doesn't make sense to you, don't buy it. Easy fix.

These rumored TDPs are nothing new. SLI configurations from a decade or more ago were capable of reaching 1000W, and no one cared then. The only difference now is that we can get that power draw and performance from a single card instead of having to deal with the headaches of multi-card or multi-GPU-on-a-card solutions. You didn't have to buy a quad-Titan configuration years ago, and you won't have to buy whatever over-the-top halo card is sold now. There will be cards with lower requirements.
 
  • Like
Reactions: JarredWaltonGPU