News: OEM-Exclusive RTX 3050 Confirmed With Cut-Down Specs

King_V

Illustrious
Ambassador
Oh, for crap's sake, Nvidia... when you do this, GIVE IT A DIFFERENT MODEL NUMBER!

Or at least call it a (whatever model) LE or LT or something.

Also looking at you, AMD, but at least you haven't pulled this crap since the RX 550 and RX 560 variants, unless I've forgotten something.
 

InvalidError

Titan
Moderator
The original RTX 3050 was already a 30+% cut-down GA106; I find it unlikely there are specimens so badly defective that they need one extra SM cut out to make the grade.

It would make more sense if this were the cost-optimized RTX 3050 based on GA107, likely to be followed by a Ti variant with all 20 GA107 SMs available.
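For what it's worth, the "30+%" figure checks out if you assume the commonly cited die configurations: a full GA106 has 30 SMs, the desktop RTX 3050 enables 20 of them, and 20 SMs also happens to be GA107's full configuration, which is why the GA107 theory fits. A rough sketch:

Code:
# Rough sanity check of the "30+% cut down" figure, assuming a full GA106
# has 30 SMs and the desktop RTX 3050 enables 20 of them (which is also
# GA107's full configuration).
ga106_total_sms = 30
rtx3050_sms = 20
disabled = 1 - rtx3050_sms / ga106_total_sms
print(f"RTX 3050 on GA106: {disabled:.0%} of the SMs disabled")  # ~33%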
 

Giroro

Splendid
How hard is it to just add an extra letter or 2 to the name of the product?
Or, if this is just about cost-cutting, why not make it an RTX 3050 4GB? The 3050 can't play 4k games, so it doesn't need the VRAM required to support 4k "ultra" textures.
4096x4096 textures take 4x the VRAM of 2048x2048 textures, even though the difference is unviewable on a 1920x1080 monitor. Even the situations where 2048x2048 has a resolvable difference from 1024x1024 are limited to when a single texture covers over half the screen, and how often does that actually happen in games, when there are usually dozens, if not hundreds, of textured objects on screen at once? New games might be coming out with huge, impractical textures, but what does it matter for FHD gaming when they exceed the resolution of the display? Am I completely off base with how pixels work?
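To put rough numbers on that 4x claim: a quick sketch, assuming uncompressed RGBA8 textures (4 bytes per texel) with a full mip chain. Real games mostly use block compression (e.g. BC7 at 1 byte per texel), which shrinks everything by about 4x, but the jump per resolution step is the same either way.

Code:
# Per-texture VRAM estimate, assuming uncompressed RGBA8 (4 bytes/texel)
# and a full mip chain (which adds roughly a third on top of the base level).
def texture_vram_mib(size, bytes_per_texel=4, mips=True):
    base = size * size * bytes_per_texel      # base mip level, in bytes
    total = base * 4 / 3 if mips else base    # full mip chain ~= +33%
    return total / (1024 ** 2)                # bytes -> MiB

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_vram_mib(size):.0f} MiB")
# prints roughly: 5 MiB, 21 MiB, 85 MiB

So one 4096x4096 texture costs about as much VRAM as sixteen 1024x1024 ones, and none of that extra detail is resolvable unless the texture fills a big chunk of a 1080p screen.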
If people are unhappy that they can only play a game at medium, then just do the TV thing and rename the texture settings to resolutions: something like "HD, FHD, QHD, UHD," etc.

I get that the 8GB selling point can help move a product off the shelf compared to a less expensive, better-performing RTX 2060 6GB, but since when did OEMs bother advertising how much VRAM is in their low-end gaming systems?
 

InvalidError

Titan
Moderator
Am I completely off base with how pixels work?
4k gaming and 4k textures are two different and only loosely related things. I've done some 4k gaming on a GTX 1050; it works great in older games like Portal 2, though it's not quite there for 4k PS2 emulation. I don't remember trying N64 emulation since getting a 4k TV; might be worth a look.

Regardless, most games will readily blow through 4GB of VRAM with anything beyond lowest-everything settings, so 4GB definitely isn't viable for anything pitched to actual gamers who want to be able to play stuff at some reasonable approximation of how it is meant to look, instead of rock-bottom details to squeeze everything into 4GB.
 

King_V

Illustrious
Ambassador
4k gaming and 4k textures are two different and only loosely related things. I've done some 4k gaming on a GTX 1050; it works great in older games like Portal 2, though it's not quite there for 4k PS2 emulation. I don't remember trying N64 emulation since getting a 4k TV; might be worth a look.

Regardless, most games will readily blow through 4GB of VRAM with anything beyond lowest-everything settings, so 4GB definitely isn't viable for anything pitched to actual gamers who want to be able to play stuff at some reasonable approximation of how it is meant to look, instead of rock-bottom details to squeeze everything into 4GB.

Slight digression, but you might get a kick out of this guy doing 8K gaming with a giant screen... and an RX 6400.

Some of it went hilariously badly, as you might expect, but there were some really pleasant surprises.

 

InvalidError

Titan
Moderator
Slight digression, but you might get a kick out of this guy doing 8K gaming with a giant screen... and an RX 6400.

Some of it went hilariously badly, as you might expect, but there were some really pleasant surprises.
280fps at 8k on an RX6400... nice :)

And yeah, I really like the extra sharpness in older games on 4k for what little stuff is playable on 4k with a 1050.
 

King_V

Illustrious
Ambassador
280fps at 8k on an RX6400... nice :)

And yeah, I really like the extra sharpness in older games on 4k for what little stuff is playable on 4k with a 1050.

Yeah! That bit with Half-Life 2, looking at the crowbar, and his comment about how 8K made the old graphics look kind of weirdly real - it was nuts!

This whole thing now makes me want to game on a gigantic TV... uh, except my TV, while being a 60-incher, is an older one... 1920x1080. My eyes aren't good enough to complain about that low resolution at TV-viewing distance, though.