News: Nvidia GeForce RTX 4090 Rumored to Feature Just 126 SMs

spongiemaster

Admirable
Pricing: "If you have to ask, you cannot afford it."
Probably around $2000 for the AIB models, with the FE a bit lower. Maybe $1700-1800. Considering how few gamers are using 4k screens, very few gamers need a card twice as fast as a 3090.

The 450W TDP is encouraging compared to the earlier 600W rumors. Guess that leaves room for the full-die 4090 Ti to hit 600W.
 

Deleted member 431422

Guest
To hell with the environment. Let's make a PC heater. Future Nvidia GPUs are a joke with power consumption. AMD isn't any better.
 

InvalidError

Titan
Moderator
Considering how few gamers are using 4k screens, very few gamers need a card twice as fast as a 3090.
Considering how much cheaper 4K TVs and monitors are getting, I'd actually expect the number of people wishing they could get playable frame rates at 4K to grow considerably. Portal 2 was nice to replay on my 4K60 TV: picture captions and other little details that were just blurry gibberish at 1080p became readable without camera trickery to get closer. The added real-pixel sharpness and reduced aliasing are also always nice to have, at least in my book.

It'll be one of those things where people tell themselves they don't care about it while the ticket price is exorbitant, but once it enters their pricing comfort range, most won't want to go back.

I didn't care about 4k until mid-range 50" 4k TVs dropped to $700, then I got one to replace my 24" CFL-lit secondary monitor and 22" tertiary monitor with a dead column.
 
-Fran-

Probably around $2000 for the AIB models, with the FE a bit lower. Maybe $1700-1800. Considering how few gamers are using 4k screens, very few gamers need a card twice as fast as a 3090.

The 450W TDP is encouraging compared to the earlier 600W rumors. Guess that leaves room for the full-die 4090 Ti to hit 600W.
Hey! There are a few of us who use VR and need these behemoths.

Regards :p
 
  • Like
Reactions: jp7189 and deesider

Johnpombrio

Distinguished
Hmmm, if Nvidia got its hands on RDNA3 cards and saw what the competition was bringing to the table, it might have dropped its high-end specs some. Now it has another upgrade path for the highest-end card while getting more use out of the dies that don't make full spec.
 

spongiemaster

Admirable
Considering how much cheaper 4K TVs and monitors are getting, I'd actually expect the number of people wishing they could get playable frame rates at 4K to grow considerably. Portal 2 was nice to replay on my 4K60 TV: picture captions and other little details that were just blurry gibberish at 1080p became readable without camera trickery to get closer. The added real-pixel sharpness and reduced aliasing are also always nice to have, at least in my book.

It'll be one of those things where people tell themselves they don't care about it while the ticket price is exorbitant, but once it enters their pricing comfort range, most won't want to go back.

I didn't care about 4k until mid-range 50" 4k TVs dropped to $700, then I got one to replace my 24" CFL-lit secondary monitor and 22" tertiary monitor with a dead column.
I bought my first 4K monitor in 2014. I'm currently using a 65" 4K screen and looking to upgrade this year. I was interested in a 75" or bigger 8K screen until I started reading about QD-OLED, which looked great on paper until I saw it will only come in 55" and 65" sizes this year. Then I read a Samsung review on Rtings this week reporting that the unusual pixel layout, a large green subpixel above smaller red and blue subpixels, hurts text clarity, making it not great as a PC screen. So now I'm waiting on reviews of this year's 8K screens to see how much they've improved since last year, and which to go with if they're ready for primetime. Definitely going to be in the market for a 4090.

A 4090 is not going to be the baseline needed for 4K gaming this generation. A 4070 or 4080 should be more than enough for anyone with a 4K/60 screen. A 4090 will be for people trying to push 4K/120 or 8K, ray tracing, or, as -Fran- mentioned, high-res VR. That's a pretty small target market.
 

InvalidError

Titan
Moderator
A 4090 will be for people trying to push 4K/120 or 8K, ray tracing, or, as -Fran- mentioned, high-res VR. That's a pretty small target market.
120Hz is another of those things people don't pay much attention to until they can get it for a price they're happy with. At the rate GPUs are improving vs. the rate at which their prices are rising, we may be 10 years away from 4K120 going mainstream.
 

escksu

Reputable
BANNED
I expect their 2x performance figure to be ray-tracing performance. Without it, it's likely 40-50% faster.

The 4080 could be slightly faster than the 3090 (possibly 10-20%), and the 4070 should be on par.
 
  • Like
Reactions: deesider

InvalidError

Titan
Moderator
I expect their 2x performance figure to be ray-tracing performance. Without it, it's likely 40-50% faster.
RT is definitely where the extra oomph is most needed, with Nvidia pushing RT hard and game developers hopping on board now that AMD, and soon Intel, have at least somewhat usable ray tracing too. Many people will avoid RT as long as it still comes with a non-trivial penalty vs. raster-only.
 
Going 4K is like when you went from dial-up to broadband.

"Welp, there's no going back to that old crap now."
Going to 4K compared to what? A 480p screen? : P

At typical screen sizes and viewing distances, most are going to struggle to notice much difference in sharpness between native 4K and something like 1440p when actually playing a game or watching video. If anything, they are probably more likely to notice the hit to framerate that the higher resolution brings. Past a certain point, adding more pixels has diminishing returns.
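As a rough back-of-the-envelope check on that (the 27" panel and 32" viewing distance below are assumed numbers, not anyone's measured setup), the usual pixels-per-degree math looks like this:

```python
import math

def pixels_per_degree(h_res: int, screen_width_in: float, view_dist_in: float) -> float:
    """Approximate horizontal pixels packed into one degree of visual angle."""
    ppi = h_res / screen_width_in
    # physical width subtended by one degree of visual angle at this distance
    inches_per_degree = 2 * view_dist_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Assumed setup: 27" 16:9 panel (~23.5" wide) viewed from ~32"
for name, h_res in (("1440p", 2560), ("4K", 3840)):
    print(name, round(pixels_per_degree(h_res, 23.5, 32)))
# 1440p already lands near the ~60 pixels-per-degree rule of thumb for
# 20/20 vision, so 4K's extra density is genuinely hard to resolve here.
```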

Of course, the improved upscaling options launched in recent years have made 4K screens more viable, especially on hardware that can't manage at least 60fps at native 4K with high settings in demanding games. That said, 4K screens supporting high-refresh-rate inputs still carry a premium disproportionate to their visual improvement over comparable 1440p models, even if the price difference isn't as bad as it once was.
 

InvalidError

Titan
Moderator
At typical screen sizes and viewing distances, most are going to struggle to notice much difference in sharpness between native 4K and something like 1440p when actually playing a game or watching video.
Jaggies and shimmering on cleanly defined lines are definitely noticeable, especially straight ones. And if you are going to run 4x FSAA or higher to smooth those out, you may as well just go native 4K without AA and get the full sharpness back.
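For a rough sense of that trade-off (assuming the FSAA in question is true 4x supersampling, i.e. four shaded samples per output pixel):

```python
# 4x supersampling at 1080p shades as many samples as native 4K has pixels:
ssaa_samples_1080p = 1920 * 1080 * 4
native_4k_pixels = 3840 * 2160
print(ssaa_samples_1080p, native_4k_pixels)  # 8294400 8294400
```

So for roughly the same shading work, you can either average that detail away at 1080p or display it directly at 4K. MSAA and post-process AA are much cheaper, though, so the comparison doesn't hold for those.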
 

jkflipflop98

Distinguished
Going to 4K compared to what? A 480p screen? : P

At typical screen sizes and viewing distances, most are going to struggle to notice much difference in sharpness between native 4K and something like 1440p when actually playing a game or watching video. If anything, they are probably more likely to notice the hit to framerate that the higher resolution brings. Past a certain point, adding more pixels has diminishing returns.

Maybe if you have cataracts. The difference in resolution is clearly apparent.
 
To hell with the environment. Let's make a PC heater. Future Nvidia GPUs are a joke with power consumption. AMD isn't any better.
Let me know when you come up with a solution to get Dennard scaling back on track.
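For anyone unfamiliar: Dennard scaling was the observation that shrinking transistors allowed supply voltage to drop in step, keeping power density roughly constant. Dynamic power goes roughly as

```latex
P_{\text{dyn}} \approx \alpha \, C \, V^{2} f
```

and once voltage stopped scaling with feature size in the mid-2000s, adding more transistors at higher clocks started pushing total chip power up instead of holding it flat, which is a big part of why flagship GPU TDPs keep climbing.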

Going 4K is like when you went from dial-up to broadband.

"Welp, there's no going back to that old crap now."
It depends on the content, I'd argue. If the content isn't infinitely scalable or hasn't been "mastered" for that resolution, then 4K isn't really worthwhile.

I took an initial dive back in 2015 or so on a 27" monitor. For the UI, yes, 4K with 125-150% scaling is leagues better than 100% scaling at any resolution. However, for games, I couldn't really see enough of a difference for it to matter. 1440p looked pretty much like 4K, maybe with a slight fuzziness, but not enough to be bothersome.

I moved to a high-refresh-rate 1440p monitor a few years later. However, I've seen screenshots where 4K provided clear details for distant objects, but at the same time there's the question of how often one cares about the finer details of objects that far away. I might go back and see how today's games look, since 4K should at least be given some consideration.
 
Last edited:
  • Like
Reactions: KyaraM
Games don't need to be "mastered for 4K" to get the benefits of added sharpness. Even emulated PS2/GC/N64 games look considerably better, IMO.
Well sure, going from 480p to UHD at the same screen size is definitely going to be an improvement, but going from 1440p to UHD isn't as dramatic a jump in sharpness.

Also, I take all assets into consideration, including textures. For instance, a given tree isn't going to look much better at 4K than at 1080p if its textures were only designed to look good at 1080p, since tree leaves are usually flat polygons with textures on them.
 
  • Like
Reactions: KyaraM

InvalidError

Titan
Moderator
Also, I take all assets into consideration, including textures. For instance, a given tree isn't going to look much better at 4K than at 1080p if its textures were only designed to look good at 1080p, since tree leaves are usually flat polygons with textures on them.
Yes, it will look at least slightly better: the low-res textures get mapped to screen space with 4-24x finer granularity, so you retain more texture fidelity in any orientation across a wider scaling range. It won't un-pixelate blocky textures blown up to 5x the TV/monitor size they were intended for, but each of those texture pixels will be represented on screen far more accurately, which is still a considerable win in my book.
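To put an illustrative number on "finer granularity" (the texture size and screen coverage below are assumptions, not from the post):

```python
# The same low-res texture spanning 25% of the screen width, at two resolutions:
TEX_WIDTH = 256  # assumed texture width in texels

for name, h_res in (("1080p", 1920), ("4K", 3840)):
    coverage_px = h_res * 0.25              # screen pixels the texture spans
    px_per_texel = coverage_px / TEX_WIDTH  # output samples per texel
    print(f"{name}: {px_per_texel:.2f} screen pixels per texel")
# 1080p: 1.88, 4K: 3.75 -- twice the samples per texel along each axis
# (4x overall), so filtered texels hold their shape better when rotated
# or scaled, even though the source texture itself gains no detail.
```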

Given that GPU manufacturers have added "integer scaling" modes to their drivers to accommodate pixel-art games, I'm guessing there is a fairly significant market of people who want sharp-looking low-res pixelated textures.
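For reference, "integer scaling" is just nearest-neighbour upscaling locked to whole multiples, so every source pixel becomes a crisp square block instead of being smeared by bilinear filtering. A minimal sketch with numpy (illustrative, not any driver's actual implementation):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale by a whole factor: each source pixel
    becomes a factor x factor block, so pixel art stays perfectly sharp."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# e.g. a 640x360 pixel-art frame scaled 6x exactly fills a 4K screen
frame = np.zeros((360, 640, 3), dtype=np.uint8)
print(integer_scale(frame, 6).shape)  # (2160, 3840, 3)
```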