News: Nvidia GeForce RTX 4090 Rumored to Feature Just 126 SMs

Given that GPU manufacturers are including "integer scaling" in their GPUs to accommodate pixel-art games, I'm guessing there is a relatively significant market of people who want sharp-looking, low-res pixelated textures.
Integer scaling doesn't affect texture mapping in 3D games, though; it only affects the output resolution. I believe integer scaling was added mostly because 2D indie games either relied on the GPU to scale the image, or because increasing the resolution also meant increasing how much of the playing field you could see, and I'm sure some people didn't care to see a million miles out when their character was small. It also helps older games that don't support modern resolutions. And, of course, people kept complaining that something "so simple" hadn't been added yet.
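For anyone curious, integer scaling at the output stage is conceptually just nearest-neighbour pixel duplication. Here's a minimal Python/NumPy sketch of the idea (illustrative only, not how any driver actually implements it; the function name is made up):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour integer scaling: every source pixel becomes a
    factor x factor block, so pixel-art edges stay perfectly sharp."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A tiny 2x2 "pixel art" frame scaled 3x becomes a crisp 6x6 image
# with no blending between the original pixels.
frame = np.array([[0, 255],
                  [255, 0]], dtype=np.uint8)
print(integer_scale(frame, 3))
```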

I also had a look at Forza Horizon 5 on my 27" 4K monitor. While it certainly was sharper in stills, I wouldn't really care as much once things got rolling. It might look nicer in a slower-paced game, but for action games, as long as it's not glaringly bad, I stop caring about resolution after a bit.
 

InvalidError

Titan
Moderator
I also had a look at Forza Horizon 5 on my 27" 4K monitor. While it certainly was sharper in stills, I wouldn't really care as much once things got rolling. It might look nicer in a slower-paced game, but for action games, as long as it's not glaringly bad, I stop caring about resolution after a bit.
Action games, sure. Most of my favorite games are more along the puzzle axis, such as Portal 1 & 2.
 
Higher resolution doesn't mean much if there aren't higher-polygon-count models to go with it. You'll just see the cube a bit bigger, with crappier-looking textures. This is very true in VR, where you can bring objects quite literally right up to your eyes and see the texture and model quality very vividly. It requires a whole new approach to building objects and assets. Quite interesting, all in all.

Anyway, you guys should give VR a try and experience that first hand!

Regards.
 

KyaraM

Admirable
"Just" 450 W"...... Wow!!!

Does nGridia expect a pat on the back and cookie for this amazing achievement??
Journalists: make sensationalist statements based on leakers and unofficial information.
Guys like you: crucify the hardware company for the statement made by the journalist.
As long as nVidia hasn't made any statements about max power draw and all the rumors come from leakers, how about you wait for more official information to be released instead of making comments like this? None of this is official. If anything, this is reassuring news for anyone with half a brain, as long as they actually use it. If this is the trend for both companies, then the consumer can only win. It means that power consumption will likely not go up as much as feared across the board. Is it a concerning development? For sure. Is it as bad as thought? Thankfully not. Hopefully both the new nVidia and AMD cards can be tuned for efficiency. But then, I guess some people need to complain about literally anything and everything.
 
Journalists: make sensationalist statements based on leakers and unofficial information.
Guys like you: crucify the hardware company for the statement made by the journalist.
As long as nVidia hasn't made any statements about max power draw and all the rumors come from leakers, how about you wait for more official information to be released instead of making comments like this? None of this is official. If anything, this is reassuring news for anyone with half a brain, as long as they actually use it. If this is the trend for both companies, then the consumer can only win. It means that power consumption will likely not go up as much as feared across the board. Is it a concerning development? For sure. Is it as bad as thought? Thankfully not. Hopefully both the new nVidia and AMD cards can be tuned for efficiency. But then, I guess some people need to complain about literally anything and everything.
The 3090ti is a >450W card that can go as high as ~900W when OC'ing with LN2 and ~550W on WC; most air models cap around 500W before going too high in temps. That is not speculation. This is going to be the baseline of the high end going forward for sure. I hope nVidia changes its course, but I doubt it.

Same-ish for AMD, mind you. Both are going up in power, but AMD's saving grace is that they're still not pushing all their cards to the limit. The 6950XT is the only one that was pushed beyond what it should have been, and it can also hit 500W on air (GamersNexus has OC numbers). So not a good look for either GPU maker. That being said, I do wonder what Intel's plans are, though.

Regards.
 

KyaraM

Admirable
The 3090ti is a >450W card that can go as high as ~900W when OC'ing with LN2 and ~550W on WC; most air models cap around 500W before going too high in temps. That is not speculation. This is going to be the baseline of the high end going forward for sure. I hope nVidia changes its course, but I doubt it.

Same-ish for AMD, mind you. Both are going up in power, but AMD's saving grace is that they're still not pushing all their cards to the limit. The 6950XT is the only one that was pushed beyond what it should have been, and it can also hit 500W on air (GamersNexus has OC numbers). So not a good look for either GPU maker. That being said, I do wonder what Intel's plans are, though.

Regards.
This is about the 4090, though, not the 3090Ti, so the next-gen card that was initially rumored to have 600W base consumption. So from what you said, there is even less reason to complain about nVidia, unless you also complain about AMD, which the guy I replied to didn't; he only talked about the nVidia leaks for the new 4090 card. It is a valid complaint that cards get ever more power hungry, mind you, especially with rising energy costs. It just annoys me when the complaints are only ever leveled at one but not the other, when it has already been stated that both will go up next gen.

Edit: btw, I got pretty good results from undervolting my 3070Ti, even with a simultaneous overclock. 60W below max consumption at over 200MHz more (considering my card is essentially a reference design with better cooling; it goes up to 2025MHz with a stock boost of 1770MHz) is pretty nice. So at least you can still decrease consumption a bit, most of the time...
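For anyone who wants to sanity-check a similar undervolt on their own card, here's a rough Python sketch that just polls nvidia-smi for power draw, clock, and temperature while a game or benchmark runs. It assumes an NVIDIA GPU with nvidia-smi on the PATH, and it only monitors; the actual undervolt/overclock would still be set in a tool like Afterburner.

```python
import subprocess
import time

# Query power draw (W), graphics clock (MHz), and temperature (C)
# once per second via nvidia-smi (assumes it is on the PATH).
QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,clocks.gr,temperature.gpu",
         "--format=csv,noheader,nounits"]

for _ in range(30):  # sample for roughly 30 seconds under load
    line = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    watts, mhz, temp = [field.strip() for field in line.split(",")]
    print(f"{watts} W  {mhz} MHz  {temp} C")
    time.sleep(1)
```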
 
This is about the 4090, though, not the 3090Ti, so the next-gen card that was initially rumored to have 600W base consumption. So from what you said, there is even less reason to complain about nVidia, unless you also complain about AMD, which the guy I replied to didn't; he only talked about the nVidia leaks for the new 4090 card. It is a valid complaint that cards get ever more power hungry, mind you, especially with rising energy costs. It just annoys me when the complaints are only ever leveled at one but not the other, when it has already been stated that both will go up next gen.

Edit: btw, I got pretty good results from undervolting my 3070Ti, even with a simultaneous overclock. 60W below max consumption at over 200MHz more (considering my card is essentially a reference design with better cooling; it goes up to 2025MHz with a stock boost of 1770MHz) is pretty nice. So at least you can still decrease consumption a bit, most of the time...
Rumours put the 4090 at 450W base and 600W TBP (total board power). That is perfectly believable looking at the 3090ti. That was my point. Looking at the 4090 with potentially faster RAM (about 20-30 more W with that alone, since GDDR6X sucks a lot of power) and the GPU being just a tad more power hungry, the consumption numbers will most likely look very similar. Performance-wise, it will be better still, so "efficiency" (yes, quote marks there) is still improved.

Regards.
 

KyaraM

Admirable
Rumours put the 4090 at 450W base and 600W TBP (total board power). That is perfectly believable looking at the 3090ti. That was my point. Looking at the 4090 with potentially faster RAM (about 20-30 more W with that alone, since GDDR6X sucks a lot of power) and the GPU being just a tad more power hungry, the consumption numbers will most likely look very similar. Performance-wise, it will be better still, so "efficiency" (yes, quote marks there) is still improved.

Regards.
Then we are kinda talking along the same lines here :)
I was just confused since you only mentioned the 3090Ti and never the 4090 above, but it makes sense like this. Yeah, I also think that power consumption has to be put in relation to the raw performance (in terms of FPS in this case). This is a rather steep increase right there, and if they keep TGP increases within 100W it's... not really great, but better than before for what you get, I guess. That's not meant as a free pass, but man, on the plus side, imagine what a 4060Ti could do with around the consumption of a 3070 (my guess, pure assumption). A far more reasonable card anyway, and if they manage to repeat what they did with the 3060Ti...
 
Then we are kinda talking along the same lines here :)
I was just confused since you only mentioned the 3090Ti and never the 4090 above, but it makes sense like this. Yeah, I also think that power consumption has to be put in relation to the raw performance (in terms of FPS in this case). This is a rather steep increase right there, and if they keep TGP increases within 100W it's... not really great, but better than before for what you get, I guess. That's not meant as a free pass, but man, on the plus side, imagine what a 4060Ti could do with around the consumption of a 3070 (my guess, pure assumption). A far more reasonable card anyway, and if they manage to repeat what they did with the 3060Ti...
Ah, yes; I forgot to give context. Apologies.

As for the "mid range": the 4070 is rumoured to be 300W TBP, so another bump compared to current and previous gens of xx70 cards from nVidia. The current 3070, I believe, is around 250W TBP? Maybe a tad more.

Regards.
 

KyaraM

Admirable
Ah, yes; I forgot to give context. Apologies.

As for the "mid range": the 4070 is rumoured to be 300W TBP, so another bump compared to current and previous gens of xx70 cards from nVidia. The current 3070, I believe, is around 250W TBP? Maybe a tad more.

Regards.
The 3070Ti is 290W; I think the 3070 is 220W. It always depends on the manufacturer, of course, but the same goes for the 4000 cards. So that is a not-insignificant increase, but final judgment goes out when they release final specs.
 
The 3070Ti is 290W; I think the 3070 is 220W. It always depends on the manufacturer, of course, but the same goes for the 4000 cards. So that is a not-insignificant increase, but final judgment goes out when they release final specs.
Ah, right; I forget the 3070Ti exists... There's a caveat, though: the 3070Ti uses GDDR6X and a fully enabled GA104 (the same die as the 3070), so it sucks more power due to those two combined, and so I'll agree that it's a good comparison for what the new 4070 may consume as a baseline. Which, again, means the rumours are reasonable there. A small bump for more performance, still. I still dislike the trend, though.

Regards.
 

KyaraM

Admirable
Ah, right; I forget the 3070Ti exists... There's a caveat, though: the 3070Ti uses GDDR6X and a fully enabled GA104 (the same die as the 3070), so it sucks more power due to those two combined, and so I'll agree that it's a good comparison for what the new 4070 may consume as a baseline. Which, again, means the rumours are reasonable there. A small bump for more performance, still. I still dislike the trend, though.

Regards.
Oh, definitely... I work in the environmental field and I don't like the trend at all, which is why I trimmed down the energy consumption of even the one indulgence I grant myself, a high-end gaming computer, as much as possible; though I could likely do some more by underclocking the GPU, since it wouldn't cost much performance and I seem to be rather lucky in the silicon lottery.

Honestly, unpopular opinion maybe, but I could live with a GPU generation that wasn't 20% more powerful, but instead stayed at the same performance level with at least 20% lower consumption. That's why I mentioned the 4060Ti. IIRC, it has insanely good price/performance.
 
Oh, definitely... I work in the environmental field and I don't like the trend at all, which is why I trimmed down the energy consumption of even the one indulgence I grant myself, a high-end gaming computer, as much as possible; though I could likely do some more by underclocking the GPU, since it wouldn't cost much performance and I seem to be rather lucky in the silicon lottery.

Honestly, unpopular opinion maybe, but I could live with a GPU generation that wasn't 20% more powerful, but instead stayed at the same performance level with at least 20% lower consumption. That's why I mentioned the 4060Ti. IIRC, it has insanely good price/performance.
This is a good video that explains how the TBP moves around a bit. As you can imagine and probably know, partners can alter those numbers slightly, as long as AMD and nVidia allow them to.

https://www.youtube.com/watch?v=H2E2swBDml4


My point there is that the quoted TBP numbers in rumours come from partners' basic designs, but special-edition cards can, and more than likely will, use more power, maybe a lot more. EVGA's Kingpin cards are a good example of "overkill".

Regards.
 

InvalidError

Titan
Moderator
Higher resolution doesn't mean much if there aren't higher-polygon-count models to go with it. You'll just see the cube a bit bigger, with crappier-looking textures.
If you take a texture of a 4x4 black and white square and rotate it 45 degrees, you end up with a multi-sampled blended blob that almost looks like a circle because there aren't enough screen pixels to draw it correctly. Bump resolution by 4X or so and the rotated square still looks like the square it was meant to be, rotated 45 degrees. Having 4X the screen resolution also greatly reduces staircasing and shimmering along polygon edges.
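If you want to play with that example, here's a small Python/NumPy toy that point-samples a 4x4 checker texture rotated 45 degrees at two output resolutions. It's a crude nearest-neighbour sketch (no real GPU filtering), but it shows how a low-res render loses the square's shape while a higher-res one keeps it:

```python
import numpy as np

def render_rotated_checker(out_px: int, angle_deg: float = 45.0) -> np.ndarray:
    """Point-sample a 4x4 black/white checker texture, rotated by angle_deg,
    onto an out_px x out_px grid (nearest-neighbour, no filtering)."""
    tex = np.indices((4, 4)).sum(axis=0) % 2                 # 4x4 checkerboard
    a = np.radians(angle_deg)
    ys, xs = np.meshgrid(np.linspace(-1, 1, out_px),
                         np.linspace(-1, 1, out_px), indexing="ij")
    # Rotate the sample coordinates back into texture space.
    u = xs * np.cos(a) + ys * np.sin(a)
    v = -xs * np.sin(a) + ys * np.cos(a)
    inside = (np.abs(u) <= 0.5) & (np.abs(v) <= 0.5)         # square spans [-0.5, 0.5]
    ti = np.clip(((u + 0.5) * 4).astype(int), 0, 3)
    tj = np.clip(((v + 0.5) * 4).astype(int), 0, 3)
    return np.where(inside, tex[tj, ti], 9)                  # 9 marks background

print(render_rotated_checker(12))   # ragged blob, the pattern is barely recognizable
print(render_rotated_checker(48))   # clearly a checkered square rotated 45 degrees
```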
 
If you take a texture of a 4x4 black and white square and rotate it 45 degrees, you end up with a multi-sampled blended blob that almost looks like a circle because there aren't enough screen pixels to draw it correctly. Bump resolution by 4X or so and the rotated square still looks like the square it was meant to be, rotated 45 degrees. Having 4X the screen resolution also greatly reduces staircasing and shimmering along polygon edges.
I guess you missed the point. If the polygon or mesh design is not meant for high resolution, then aside from removing the weird artifacts you'd see at a lower-than-intended target resolution, a higher resolution will not improve the model itself. Most games and their art teams use a fixed poly count based on their baseline graphics power (SquareEnix gave a few interesting presentations on this when talking about the FF7 remake, IIRC), so just moving to a higher resolution with old models is not enough and they will just look worse overall. It's like the example of a beautiful lady shot on a lower-MP camera vs. one where you can zoom in to the point where you can see all the pores and imperfections.

So, overall even taking into account what you say, my point still stands.

Regards.
 

InvalidError

Titan
Moderator
I guess you missed the point. If the polygon or mesh design is not meant for high resolution, then aside from removing the weird artifacts you'd see at a lower-than-intended target resolution, a higher resolution will not improve the model itself.
I personally find absolutely no issue with sharper, cleaner-looking low-poly models with low-res textures. Jagged shimmering polygon edges are likely the thing that annoys me the most after outright visual glitches.
 
I personally find absolutely no issue with sharper, cleaner-looking low-poly models with low-res textures. Jagged shimmering polygon edges are likely the thing that annoys me the most after outright visual glitches.
Your point reminded me of the debacle around "scan lines" in a few remasters of old 2D games, like Chrono Trigger. As you say, it is really a preference thing, but the vast majority of people choose the art style as originally conceived, even when upscaled to a higher resolution (keeping proportions and, to a point, the same "tricks" to hide the low res).

All in all, no matter how much "modern" filtering you try to apply to low-poly models and low-resolution textures, you can't push their quality past a certain point. Even going down the scale has its problems. I can see that very well in VR, as I mentioned above. Again, I seriously recommend everyone give it a try and just compare a game at 4K vs. how you perceive it in VR. It's SO different.

Regards.
 

InvalidError

Titan
Moderator
Your point reminded me of the debacle around "scan lines" in a few remasters of old 2D games, like Chrono Trigger. As you say, it is really a preference thing, but the vast majority of people choose the art style as originally conceived, even when upscaled to a higher resolution (keeping proportions and, to a point, the same "tricks" to hide the low res).
2D games are a whole other can of worms, since the visuals and animation sit on a fixed grid that cannot be altered short of rewriting the game, plus visual trickery that only works under specific circumstances, unlike 3D graphics where most things are vectors that can be scaled to any resolution just fine. One of my biggest disappointments with my 50" TV is how horrible Gran Turismo 4 on PS2 looks, even using component output at 480p/1080i. I bet it'd look leagues better emulated at 4K.

I haven't looked at SNES emulation in several years. I could imagine mode-7 effects using super-resolution rendering would look quite nice if someone bothered doing some HLE magic there to make it happen.
 

TJ Hooker

Titan
Ambassador
Maybe if you have cataracts. The difference in resolution is clearly apparent.
As cryoburner alluded to, it depends on viewing distance and screen size. Talking about resolution on its own is largely meaningless.

For a 27" screen, a person with 20/20 vision (~1 arc minute minimum angle of resolution) would need to be viewing the screen from less than 2.6' to start seeing benefit from a resolution greater than 1440p. If I'm leaning back in my chair a bit (as I tend to do), I'm probably around that distance away, so there wouldn't be much benefit to moving to 4K (unless I also increase the screen size).

At 1.75' viewing distance, you get the full benefit of 2160p.
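For anyone who wants to check those distances, here's a quick Python sketch of the geometry behind them, assuming a 16:9 panel and a 1 arc-minute acuity limit; the function name and defaults are just illustrative.

```python
import math

def max_useful_distance_ft(diag_in: float, horiz_px: int,
                           aspect=(16, 9), arcmin: float = 1.0) -> float:
    """Farthest viewing distance (in feet) at which an eye that resolves
    `arcmin` arc-minutes still benefits from the given horizontal resolution."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # physical screen width (inches)
    pitch_in = width_in / horiz_px              # pixel pitch (inches)
    angle_rad = math.radians(arcmin / 60.0)     # acuity limit in radians
    return pitch_in / math.tan(angle_rad) / 12.0

print(round(max_useful_distance_ft(27, 2560), 2))  # ~2.6 ft for 1440p on a 27" screen
print(round(max_useful_distance_ft(27, 3840), 2))  # ~1.75 ft for 2160p (4K)
```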

Edit: I may be misusing the concept of angular resolution here.
 

InvalidError

Titan
Moderator
For a 27" screen, a person with 20/20 vision (~1 arc minute minimum angle of resolution) would need to be viewing the screen from less than 2.6' to start seeing benefit from a resolution greater than 1440p.
Eh, hell no.

Resolution is about distinctly resolvable details, e.g. still being able to positively tell that the single-pixel dot above an 'i' is separated by a single pixel from the rest of said 'i'. Once you are at that limit, you need to double the resolution one more time before pixels become too small to be individually perceivable and you enter the territory of rapidly diminishing returns for sharpness.

This is similar to how "the human eye cannot perceive more than 24 frames per second": you may struggle to ingest details of unique images with less than 1/24s of exposure time but you can still perceive motion smoothness beyond 200fps.
 
As I read somewhere, the human eye doesn't have very good "resolution" (still stupid high), but it has magnificent motion detection due to evolution (hunting, remember?).

That being said, I use glasses, so I just shrug at the resolutions and obscene DPI claims from some marketing agencies. Anything above 160DPI is lost on me beyond ~10 inches away, haha. Glasses help a lot, but they introduce distortion, so it's even worse. I play VR without glasses for games like Beat Saber, and I don't need motion blur since everything already looks smeared, so I have to, quite literally, walk into the text and stuff XD

Regards.
 
I also had a look at Forza Horizon 5 on my 27" 4K monitor. While it certainly was sharper in stills, I wouldn't really care as much once things got rolling. It might look nicer in a slower-paced game, but for action games, as long as it's not glaringly bad, I stop caring about resolution after a bit.
That's the thing: for the most part, your brain should do a pretty decent job of ignoring minor differences in resolution when you're not specifically looking for them. After a few minutes, you will tend to be focused on the content of the game, not the sharpness of the image or a few perceptible pixels here and there.

A far more reasonable card anyway, and if they manage to repeat what they did with the 3060Ti...
I certainly hope we don't see a repeat of what happened with the 3060 Ti. What was billed as a $400 successor to cards like the 2060 SUPER launched for closer to $500, then skyrocketed in price to around double its MSRP due to the crypto-induced shortages. Even now, when graphics cards are beginning to pile up at retailers, the starting price for a 3060 Ti is still around $600, a 50% markup. The card had the potential to be a good value, but not so much at a price point closer to prior x80 cards. I'm not sure it's ever going to reach MSRP before Nvidia's next generation of cards launches, unless Intel releases a card around its price point within the coming months that makes it look bad.
 

InvalidError

Titan
Moderator
That's the thing: for the most part, your brain should do a pretty decent job of ignoring minor differences in resolution when you're not specifically looking for them. After a few minutes, you will tend to be focused on the content of the game, not the sharpness of the image or a few perceptible pixels here and there.
If you play slower-paced games like puzzles, exploration, sightseeing, etc., watching out for details is often the point of the game, or at least a source of easter eggs for people who are looking. There are plenty of chances to notice polygon-edge artifacts while paused to read text or figure out a pattern.
 

KyaraM

Admirable
I certainly hope we don't see a repeat of what happened with the 3060 Ti. What was billed as a $400 successor to cards like the 2060 SUPER launched for closer to $500, then skyrocketed in price to around double its MSRP due to the crypto-induced shortages. Even now, when graphics cards are beginning to pile up at retailers, the starting price for a 3060 Ti is still around $600, a 50% markup. The card had the potential to be a good value, but not so much at a price point closer to prior x80 cards. I'm not sure it's ever going to reach MSRP before Nvidia's next generation of cards launches, unless Intel releases a card around its price point within the coming months that makes it look bad.
Considering I'm purely talking about performance inside the card's market niche, as well as power draw, not the prices retailers want for it, which aren't fully in nVidia's control (nor in AMD's, for that matter), I certainly do hope for a repeat, especially since it would make the card an awesome replacement for my current one if I even switch yet; I will most likely wait another gen or two, as always. There is no denying that this card is an awesome performer inside its niche.
 
