News Nvidia Hierarchy Shows the RTX 3050 Can't Keep Up With the Old RTX 2060

InvalidError

Titan
Moderator
I mean...they literally had a shareholder meeting slide gloating that they're charging the customer more for a GPU, soooo.

Supply and demand babyyyy!!!
The question is whether inflated prices will survive GPU mining dying down once ETH2 launches and the mass sell-offs begin. The beyond-MSRP situation appears to be well on its way to sorting itself out, as long as there isn't another industry-wide supply disruption to fudge things up again.
 

Karadjgne

Titan
Ambassador
What that video/Nvidia doesn't say is the % difference. For all anyone knows the 2060 is only averaging 1% better than the 3050, which in reality makes them the same card and fits perfectly with Nvidia's drop-a-tier-per-generation numbering scheme. The same applies to the 3060/2070, which trade blows consistently, or close enough to be within margin of error.

My personal opinion is the 3050 is better than the 2060: better DLSS, better RTX, and a better 1080p gaming experience, regardless of a couple fps of difference I can't see or tell exists in anything except a benchmark.

For general gaming, and for all intents and purposes, a 1070, 1660 Ti, 2060, and 3050 all land in roughly the same place, the fps differences being so minor as to not realistically mean anything.

Making that hierarchy chart useless to anyone but a benchmark chaser. Imho.
 
Not sure who hasn't seen a review of the 3050, but...
[Benchmark charts from the RTX 3050 review]
 

watzupken

Reputable
Mar 16, 2020
I think there are already a lot of reviews out there, whether written or video, that show the RTX 3050 is generally behind the RTX 2060. The only time it can catch up with the RTX 2060 is when that extra 2GB of VRAM comes into play. But the fact is that it is generally not that far off from the RTX 2060. So given a choice of RTX 2060 6GB vs RTX 3050 8GB, I feel the 3050 may still be a viable option, as long as it is cheap enough to make up for the performance loss.
 

Karadjgne

Titan
Ambassador
$389.99 for the Asus Dual 3050, 130 W
$389.98 for the Gigabyte 2060 Gaming X, 215 W

According to PCPartPicker.com as of this reply.

Considering the measly 3-5 fps difference, which honestly is within margin of error and close enough that an overclocked aftermarket version could easily make up the gap, the 3050 8GB and 2060 6GB are effectively the same card, except one has lower power consumption, which could be a benefit.
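
To put that power difference in rough numbers, here's a quick back-of-the-envelope sketch. The board-power figures are from the listings above; the fps averages, daily gaming hours, and electricity rate are purely assumed for illustration:

```python
# Rough efficiency / running-cost comparison for the two listings above.
# Board power comes from the listings quoted in this post; the fps values,
# gaming hours, and electricity rate are assumptions for illustration only.
fps = {"RTX 3050 8GB": 60, "RTX 2060 6GB": 64}        # assumed 1080p averages
watts = {"RTX 3050 8GB": 130, "RTX 2060 6GB": 215}    # from the listings above
hours_per_day, price_per_kwh = 3, 0.15                # assumed usage and rate

for card in fps:
    efficiency = fps[card] / watts[card]
    yearly_cost = watts[card] / 1000 * hours_per_day * 365 * price_per_kwh
    print(f"{card}: {efficiency:.2f} fps/W, ~${yearly_cost:.0f}/yr for the GPU alone")
```

With those assumed numbers the 3050 comes out roughly 50% ahead in frames per watt, though the actual dollar difference over a year of moderate gaming is fairly small.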
 
My personal opinion is the 3050 is better than the 2060: better DLSS, better RTX, and a better 1080p gaming experience, regardless of a couple fps of difference I can't see or tell exists in anything except a benchmark.
Better RTX and DLSS? Maybe, if they hadn't drastically reduced the Tensor and RT core counts. The original 2060 had 30 RT cores and 240 Tensor cores, while the 3050 drops those to 20 RT cores and 80 Tensor cores. So any per-core performance gains are negated by the reduction in core count.
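
Just to illustrate that last point with some rough math (a sketch only; the 2x per-core factor for Ampere is an assumption, not a measured spec):

```python
# Back-of-the-envelope Tensor throughput comparison (illustrative only).
# per_core_gain is an assumed uplift factor, not a measured specification.
cards = {
    "RTX 2060": {"tensor_cores": 240, "per_core_gain": 1.0},
    "RTX 3050": {"tensor_cores": 80,  "per_core_gain": 2.0},  # assumed 2x per core
}

for name, c in cards.items():
    total = c["tensor_cores"] * c["per_core_gain"]
    print(f"{name}: {total:.0f} relative Tensor throughput units")

# Even granting an assumed 2x per-core gain, 80 * 2 = 160 still trails
# the 2060's 240, so the total throughput deficit remains.
```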

In terms of real-world performance, Techpowerup's benchmarks show the 2060 6GB performing 13% faster on average across 25 games at 1080p, 14% faster at 1440p, and 15% faster at 4K, despite the VRAM deficiency...
https://www.techpowerup.com/review/evga-geforce-rtx-3050-xc-black/31.html

OC cards only improve the 3050's performance by around 1-2%. And since they only gave it 8 PCIe lanes, it loses another 1-2% of its performance when installed in a PCIe 3.0 motherboard. The card's performance is closer to that of the 1660 SUPER than the 2060, at least outside RT and DLSS, which the 16-series lacks.

As for RTX and DLSS, they showed the performance hit of enabling raytracing to be relatively similar for both cards. The 2060 still performed slightly better on average with RT enabled, outside of DOOM Eternal, which was the only example where the 2060's 6GB of VRAM resulted in a significant performance hit with raytracing turned on. And both saw a relatively similar performance uplift with DLSS enabled as well. The 3050 might manage a slightly better performance increase when upscaling with DLSS, and slightly less of a hit from RT, but generally still not enough to take the lead over the 2060's DLSS and RT performance.

That's not to say the 3050 is a bad card, but as far as performance goes, it's pretty much a slightly faster 1660 SUPER with more VRAM, DLSS, and some limited RT capabilities. Even at its $250 MSRP, it felt a bit underwhelming to me, considering the 1660 SUPER was offering similar performance in most games at a $230 price point more than two years prior to its release. For its higher price, there's no reason why its performance shouldn't have at least matched or exceeded that of the 2060, which came out over three years ago. And current market pricing has made it a significantly worse value than what prior cards offered, though that's been the case for all cards over the last year or so.

For general gaming, and for all intents and purposes, a 1070, 1660 Ti, 2060, and 3050 all land in roughly the same place, the fps differences being so minor as to not realistically mean anything.
I would generally agree, and the 3050 and 2060 get the added benefit of RT and DLSS, which arguably gives them an edge above the rest. But the 1660 SUPER also performs very similarly to the 1660 Ti and 1070, and launched a couple of years prior to the 3050 at a lower suggested price. So, even at the cards' intended MSRPs, the 3050 wasn't really bringing more performance to the table than what one could have bought two years prior, outside the features that trickled down from the 2060. And while that level of RT hardware is better than none at all, it's still a bit questionable how useful it will be going forward, as the 2060 was already only borderline usable for RT, and the 3050 doesn't really improve on that, despite coming out a few years later.

As for the power draw difference, the stock 2060 was officially a 160 watt card, while the stock 3050 is a 130 watt card, so it's not exactly a huge difference, and OC versions of either will likely drive power levels higher, despite not improving performance to any perceptible degree. Though the power listed for that particular overclocked 2060 seems inaccurate, since Tom's only found it to draw around 180-190 watts in their review.
 

Karadjgne

Titan
Ambassador
Yeah, I noticed the power draw discrepancies: some are saying close to 200 W, some less, some even as low as 140 W. AnandTech puts it at 160 W for the FE, so with its boost clocks bumped I'd imagine the Strix EVO is somewhat higher.

Afaict Nvidia hasn't really changed anything in years. New-gen cards land one model tier lower for roughly the same performance; it's only the top-line cards that see any real gains. 3050/2060/1070, 690/780/970, etc. Even the 1080 Ti, 2070 Super, and 3060 are all relatively close.

As far as core counts go, that's an iffy thing. Nvidia was pushing higher core counts all the way up to the 700 series, then dropped the Maxwell 750/750 Ti and threw everything out the window. The GTX 660 had 960 CUDA cores, the GTX 750 Ti had 640, and they traded blows. So I can see different architectures having a decent impact on core effectiveness. As gimped as the 3050 is, it still stands toe to toe with the older gens. I'm half expecting Nvidia to drop a 3050 Super or 3050 Ti (which already exists in laptops) that's far less gimped and puts it ahead of the 2060.
 

InvalidError

Titan
Moderator
I'm half expecting Nvidia to drop a 3050 Super or 3050 Ti (which already exists in laptops) that's far less gimped and puts it ahead of the 2060.
Rumors I have seen say GA106 RTX 3050s are gimped down to practically the same specs as top-end GA107, so Nvidia can seamlessly transition the boards over to the cheaper chips later on. In that case, we are more likely to see lower-end parts than higher-end ones.
 

spongiemaster

Admirable
Dec 12, 2019
$389.99 for the Asus Dual 3050, 130 W
$389.98 for the Gigabyte 2060 Gaming X, 215 W

According to PCPartPicker.com as of this reply.

Considering the measly 3-5 fps difference, which honestly is within margin of error and close enough that an overclocked aftermarket version could easily make up the gap, the 3050 8GB and 2060 6GB are effectively the same card, except one has lower power consumption, which could be a benefit.
There are quite a few 3050s in stock at Newegg for less than $390. This one is $330. That would give a significant price advantage to the 3050. The cheapest 2060 in stock at Newegg is $390.
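
For what it's worth, a rough sketch of how that price gap stacks up against the ~13% 1080p performance gap from the TechPowerUp numbers cited earlier (prices are the Newegg listings mentioned in this post):

```python
# Performance-per-dollar sketch using the Newegg prices in this post and the
# ~13% average 1080p gap from the TechPowerUp review linked earlier.
price_3050, price_2060 = 330, 390     # Newegg prices from this post
perf_3050, perf_2060 = 1.00, 1.13     # 2060 ~13% faster at 1080p on average

print(f"3050: {perf_3050 / price_3050 * 1000:.2f} perf per $1000")
print(f"2060: {perf_2060 / price_2060 * 1000:.2f} perf per $1000")

# At $330 vs $390 the 3050 is roughly 15% cheaper, which about offsets
# its ~13% performance deficit in performance-per-dollar terms.
```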
 

spongiemaster

Admirable
Dec 12, 2019
For general gaming, and for all intents and purposes, a 1070, 1660 Ti, 2060, and 3050 all land in roughly the same place, the fps differences being so minor as to not realistically mean anything.
Until you factor in DLSS 2.0. Then the 2060 and 3050 separate themselves from the other two cards. DLSS is mainly being implemented in current AAA games that actually need the extra performance boost, so as time moves on, the 2060 and 3050 will age much better.
 

warezme

Distinguished
Dec 18, 2006
I'm not sure how accurate this whole thing is. In gaming I usually see the 2080 Ti still falling between the 3070 and the 3080, if I recall. I'm not surprised the 2060 is still strong, as it was really strong for a 60-series card when it was released and was only improved upon by different versions.
 
Until you factor in DLSS 2.0. Then the 2060 and 3050 separate themselves from the other two cards. DLSS is mainly being implemented in current AAA games that actually need the extra performance boost, so as time moves on, the 2060 and 3050 will age much better.
While I agree that it's a nice feature to have, other recently released upscaling solutions that don't require special hardware are also fairly competitive with it. DLSS might improve framerates a little more at a given quality level, or alternatively allow for a slightly sharper image at a similar framerate, but not enough that I would say those cards will age "much better" as a result of it.

The raytracing hardware might also provide some additional benefit as time goes on, though it's unclear how well those cards will run RT going forward. And I doubt many games will require RT hardware for quite a while still, seeing as the current generation of consoles don't seem to handle raytracing well enough to make that standard.