News: Nvidia Reportedly Readies 800W RTX 4090 Ti With 18,176 Cores, 48GB GDDR6X

I could write something snide about the 4090 Ti popping circuit breakers, but realistically those cards only exist to give away to YouTubers in the "have over 10 people on staff" category. All of those people are running commercial power, so it really doesn't matter.

I'm a lot more upset about the idea that their "4070" could be a 300W card. That's significantly higher than even a 2080 Ti. I live in an area with this rare event called "Summer", which happens on a semi-regular basis...

A sub-200 Watt card might not even happen from Nvidia next generation, which is wild to me. If these are the results Nvidia is getting, then maybe they should finally remove the useless Ray Tracing software. Game devs don't want to support it. It's time to move on. Stop adding all that cost and power to gaming cards for tech that is really only useful to Hollywood render farms.
 
Just wait 'til the RTX 5090 Ti comes out....

 
Everything about this "leak" sounds like Nvidia just trolling people or trying to figure out who is leaking stuff on Twitter. If the 4090 has a 450W TDP, how does the 4090 Ti have an 800W TDP? Even Nvidia's upcoming enterprise H100 SXM boards top out at 700W. Based on specs alone, there should not be a 350W difference. I can't see Nvidia releasing a 48GB RTX card either. If this card exists in any form, it will be back to the Titan branding.
 
Interesting to compare the CUDA core counts and memory buses of the 4080 and 4090 against the 3080 and 3090 (if this 40-series info is true), and then to see where the 3080 Ti sits. I'm still scratching my head over why Nvidia even created the 12GB version of the 3080, which just made it effectively a 3080 Ti at the same price point (and I'm a 3080 Ti owner). But what I find most interesting is Nvidia dropping the 4080 to a 256-bit memory bus, leaving a large 50% gap up to the 384-bit bus of the 4090 (see the quick bandwidth sketch after the table). One can only speculate what their plans are to fill it.


GPU           CUDA Cores   Memory Bus        Memory
RTX 4090      16384        384-bit           24GB
RTX 4080      10240        256-bit           16GB
RTX 3090      10496        384-bit           24GB
RTX 3080      8704/8960    320-bit/384-bit   10GB/12GB
RTX 3080 Ti   10240        384-bit           12GB
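
For a sense of why the bus width matters: peak memory bandwidth scales linearly with the bus, roughly (bus width in bits / 8) times the per-pin data rate. A minimal sketch of that relationship; the 21 Gbps GDDR6X data rate is an assumed figure for illustration, not something from the leak:

```python
# Peak memory bandwidth scales linearly with bus width:
# bandwidth (GB/s) = bus_width_bits / 8 * data_rate_gbps.
# The 21 Gbps GDDR6X data rate is an assumption for illustration only.
def peak_bandwidth(bus_width_bits: int, data_rate_gbps: float = 21.0) -> float:
    return bus_width_bits / 8 * data_rate_gbps

for bus in (256, 384):
    print(f"{bus}-bit bus: ~{peak_bandwidth(bus):.0f} GB/s")
# 256-bit -> ~672 GB/s, 384-bit -> ~1008 GB/s: the same 50% gap as the bus widths.
```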
 

"Commercial power"? That’s a fair bit overblown.

800W is 6.67 Amps at 120V. That’s less than 50% of the power my 4-slice toaster uses. Assuming I bought the card and used it around 400 hours a year (8 hours a week), my power cost would be $22.40 a year (at $0.07/kWh). Plus my house has 200 Amp service (code where I live now), which means I could run absolutely everything simultaneously and still not crack 70 Amps.

I really can’t understand why, when people are talking about GPUs that will cost several thousand dollars, $20 of power a year is a concern. I get that your power might cost more, but it’s still a tiny fraction of the overall cost. Agonizing over 100W ($2.80 a year for me) seems absurd.
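
For what it's worth, the arithmetic above does check out. A minimal sketch of it, assuming 120 V household mains and the $0.07/kWh rate quoted:

```python
# Sanity check of the current draw and yearly cost figures above.
watts = 800           # rumored 4090 Ti board power
volts = 120           # assuming 120 V household mains
hours_per_year = 400  # roughly 8 hours of gaming per week
price_per_kwh = 0.07  # the $0.07/kWh rate quoted above

amps = watts / volts                                         # ~6.67 A
yearly_cost = watts / 1000 * hours_per_year * price_per_kwh  # ~$22.40
print(f"{amps:.2f} A draw, ${yearly_cost:.2f} per year")
```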
 
As always, I'll only believe any of this when Nvidia themselves release the specs. This is all speculation at this point. So many of these "leaks" feel like shots in the dark; why does anyone even still read them?
 
Seems to be more about bragging rights, and about seeing how far they can push the limits of the technology, than a practical commercial product. Kind of reminds me of a concept car, but in this case they can sell a few production units as well.
Agreed. Plus it will probably cost $10,000
 

What state is that? Checking the rates here (the table defaults to Business, you have to manually switch it to Residential), the cheapest in the nation is 10.03 cents/kWh... and that's not counting delivery.

It's also not counting the extra heat pouring into your room, and the extra air-conditioning that you need to run to compensate for it.

AND... some of us actually do care to minimize the damage we're doing, and also think efficiency is important in technological development. It's not all about the money.
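
To put a rough number on the air-conditioning point: essentially all of the card's power ends up as heat in the room, and the AC has to spend extra energy pumping it back out. A back-of-the-envelope sketch; the COP of 3 is an assumed typical value, not a measurement:

```python
# Rough estimate of the extra air-conditioning load needed to remove GPU heat.
gpu_watts = 800
cop = 3.0  # assumed coefficient of performance: watts of heat moved per watt consumed

ac_watts = gpu_watts / cop          # ~267 W of extra AC draw while the card is loaded
total_watts = gpu_watts + ac_watts  # ~1067 W pulled from the wall in total
print(f"AC overhead ~{ac_watts:.0f} W, combined draw ~{total_watts:.0f} W")
```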
 
Going beyond your own personal case:

  • many people pay more than that for power;
  • in some places, power pricing is tiered, meaning once you cross a certain usage threshold, the price per kWh increases;
  • that's almost the same power as a microwave oven;
  • if they sell that to 1,000 people, that's an extra 400kW of demand the grid must provide (see the rough numbers sketched after this post); now extrapolate to even more people;
  • consider the above as a trend, and in a few years we'll need new power plants to feed power-hungry GPUs just to play GTA VI;
  • now the card needs water cooling, since air will hardly dissipate 800W of heat inside a closed case;
  • cards will get even bigger and fatter, at least three slots;
  • lastly, the heat that goes into the room must also be dissipated, and many will use air conditioning, which again draws more energy from the grid and increases the monthly bill.

Sorry to hijack your post for this rant, but it made a convenient case for why graphics cards over 400W are absurd and should not become the norm in any way.
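
To put the grid-load bullet in numbers, here is a small sketch; the card counts are hypothetical, purely for scale, with 400 W matching the threshold named above and 800 W the rumored card:

```python
# Aggregate grid demand if many such cards are gaming at the same time.
# Card counts are hypothetical, chosen only to show the scale.
for card_watts in (400, 800):
    for units in (1_000, 100_000, 1_000_000):
        kilowatts = units * card_watts / 1_000
        print(f"{units:>9,} cards at {card_watts} W -> {kilowatts:>9,.0f} kW of demand")
# 1,000 cards at 400 W is the 400 kW above; a million 800 W cards would need 800 MW.
```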
 
maybe they should finally remove the useless Ray Tracing software. Game devs don't want to support it. It's time to move on. Stop adding all that cost and power to gaming cards for tech that is really only useful to Hollywood render farms.

I would strongly disagree with this. Some games, like Dying Light 2 for example, look immensely better with ray tracing. It looks fantastic when implemented well, and while not every implementation looks amazing, it is awesome technology.
Here is an Off/On comparison I took seconds apart:

View: https://imgur.com/a/dJhDK6L
 
They look like they were taken at different times of the day.

Exactly, except they weren't. That's how big the difference is in shaded areas. I took the first shot, turned on RT, restarted the game, appeared in the same place at the same time, and this was the result.

You can see the in-game time of day in the bottom right corner; only 20 game-world minutes had passed between the two. I've seen the same results just glancing at my gf's screen across the room while playing co-op. She plays with RT off for higher fps.
 
So it seems like Nvidia is now trying to make the XX90 the new XX80 by severely cutting the XX80 series' core count and launching it late. So a mainstream flagship that should have cost $600 to $700 will now cost $1,700.
 