The GTX 1050 2GB, worth it?

Rafael Mestdag

I previously had a GTX 750 1GB and I've seen an improvement of about 25 fps over it. As for the price difference, it cost about 30% more than the 750. It's not a revolution in terms of performance, but it's a significant evolution. It's a good card for the money (about 120 US dollars).

Do you think it was worth it? Would you do the same?
 
In my opinion, no, unless you were on an extreme budget, especially with games these days using over 2GB of VRAM even at 1080p. If I were going to upgrade in that GPU market segment, I'd go for a $190 4GB AMD RX 570. If you can afford $120, then you can afford $190. A lot more performance for your dollar. Unfortunately the 6GB GTX 1060 is just priced too high for how it performs compared to the AMD RX 570.

Better to spend a little more, even if you have to save up for it, and be happy with what you have for a few years than to deal with buyer's remorse when a new game comes out that you can't enjoy at the high settings it was made for. I have only once owned an AMD card (an HD 4870 from circa 2009) and have otherwise used Nvidia exclusively, so I do not make that recommendation lightly. AMD has the best performance per dollar in the sub-$200 market segment. Unconditionally.

Here's an AMD RX 570 example (hard to keep in stock as a hot seller):

https://www.newegg.com/Product/Product.aspx?Item=N82E16814126189&cm_re=RX_570-_-14-126-189-_-Product


 
Yes, for the budget, sure.
Asking what WE would do is rather pointless. You know the performance increase you have, AND you know what you paid to get that.

Uh, and saying "If you can afford $120, then you can afford $190..." does suggest a problem with basic mathematics. Seriously though, saying "spend more" is a bit silly, IMO.

AND the card has already been purchased, so suggesting he spend over $200 (with tax etc.) is even less of a value proposition.

Really, if you are HAPPY with the experience then great. There are plenty of games that a GTX 1050 does really well in (the rest of your system matters too), so it's hard to comment in detail.

Heck, I'm playing several OLDER games on a very high-end rig. That includes Burnout Paradise, CnC3, Torchlight, and the newer Stardew Valley. There are plenty of great games with low hardware requirements.
 
Solution
It depends. 15fps is not playable, to me. Gaining 25fps, so that now I'm at 40fps, that's 'worth it' because the game went from unplayable to playable.

If I was at 45fps and the new card got me to 70fps, that's not worth it to me. I paid $120 and only went from playable to playable.

So it's not how much you gained; it's what difference the gain made.
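
To put that logic in code form, here's a tiny Python sketch; the 30 FPS threshold is just a placeholder for whatever minimum you personally consider playable:

```python
# Toy illustration of the point above: an upgrade mostly "matters" when it
# pushes a game across your own playability threshold, not by raw FPS gained.
PLAYABLE_FPS = 30  # hypothetical threshold; pick whatever feels playable to you

def upgrade_verdict(old_fps: float, new_fps: float) -> str:
    if old_fps < PLAYABLE_FPS <= new_fps:
        return "worth it: went from unplayable to playable"
    if new_fps < PLAYABLE_FPS:
        return "still unplayable"
    return "playable before and after, so the gain is nice but not essential"

print(upgrade_verdict(15, 40))   # worth it: went from unplayable to playable
print(upgrade_verdict(45, 70))   # playable before and after
```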
 


Exactly. In Euro Truck Simulator 2, for example, there's a part of the map where the fps used to drop to 16 fps (totally unplayable); now it should be about 40 fps (totally playable). That's an example of the big gain for me.

Plus it scored over 3000 points on Valley Benchmark 1.0 whereas the old 750 only managed a bit over 2000.

For me, taking these numbers into account at least, it was worth it.
 


He asked our opinion. I gave mine. All I am saying is that if I'm going to upgrade, I'm going to do a serious upgrade, not a step upgrade. Speaking of math, let's do it just based on one game benchmark at 1080p, Battlefield 1:

GTX 1050:
$120 for 41 FPS = $2.93 per FPS

RX 570:
$190 for 91 FPS = $2.09 per FPS

And keep in mind my point was about thinking about the future.
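
If you want to redo that math with your own prices or benchmark numbers, here's a quick Python sketch of the same cost-per-FPS arithmetic (the prices and FPS figures below are just the Battlefield 1 examples quoted above, not new data):

```python
# Cost-per-frame calculator using the Battlefield 1 1080p figures quoted above.
# Swap in your own prices and average FPS numbers.
cards = {
    "GTX 1050": {"price_usd": 120, "avg_fps": 41},
    "RX 570":   {"price_usd": 190, "avg_fps": 91},
}

for name, card in cards.items():
    cost_per_fps = card["price_usd"] / card["avg_fps"]
    print(f"{name}: ${card['price_usd']} for {card['avg_fps']} FPS "
          f"= ${cost_per_fps:.2f} per FPS")
```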
 


Don't take it too seriously. Read the quote again: it just doesn't make sense literally, though of course I got your POINT, which is why I also said "seriously" after. Also, FPS per dollar isn't so simple, because it assumes a good CPU; otherwise the numbers change (a high-end GPU is less likely to show its full benefit on average).

As I said, it's hard to comment anyway because it depends on the GAMES he plays etc.

OTHER:
*You may want to learn how to use ADAPTIVE VSYNC. For example, you are currently running either:
a) VSYNC ON - which adds stuttering if the FPS can't meet the Hz rate (i.e. solid 60FPS for 60Hz monitor), or
b) VSYNC OFF - which causes screen tearing.

You might want to investigate tweaking to 60FPS (likely) using Adaptive VSYNC, at least in games where it benefits most. I would turn VSYNC OFF first, then tweak the settings so that you hit 60FPS about 90% of the time.
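
If you want to check how close you are to that 90% target, one rough way is to log frame times with a tool like FRAPS and run a quick script over them. This Python sketch assumes a simple log with one frame time in milliseconds per line, and the filename is just an example; adjust the parsing to whatever your capture tool actually writes:

```python
# Rough sketch: estimate how often you hold 60 FPS from a per-frame log.
# Assumes one frame time in milliseconds per line; adapt to your tool's format.
TARGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame for a 60 FPS target

def share_at_60fps(path: str) -> float:
    times_ms = []
    with open(path) as f:
        for line in f:
            try:
                times_ms.append(float(line.strip().split(",")[-1]))
            except ValueError:
                continue  # skip headers or malformed lines
    if not times_ms:
        return 0.0
    return sum(t <= TARGET_MS for t in times_ms) / len(times_ms)

print(f"{share_at_60fps('frametimes.csv'):.0%} of frames hit the 60 FPS target")
```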

Then force on Adaptive VSYNC for the game. This should lock to 60FPS with VSYNC ON, but if you can't output 60FPS it turns VSYNC OFF automatically, thus giving you some screen tearing instead of added STUTTER.

(A couple of games like Max Payne 3 also drop from 60FPS to 30FPS if you can't maintain 60FPS, which is really jarring as it gets much more sluggish. I forced on Adaptive VSYNC, and when I briefly drop into the 50s I only get slight screen tearing.)

Here's how:
NVIDIA Control Panel,
"Manage 3D Settings",
"Program Settings",
add the game,
set the vertical sync option to "Adaptive", then SAVE the settings, then

TEST that it works (screen tearing when you drop below 60FPS is the proof). You can use FRAPS or the Steam FPS indicator.

(I use this for some of the Assassin's Creed games since I find it hard to keep a solid frame rate and the stuttering was pretty bad at times, while VSYNC OFF showed a lot of screen tearing, so this was ideal... at least until I buy a G-SYNC monitor, but those are expensive.)

OTHER: Adaptive VSYNC's main drawback is that cut scenes often have screen tearing. If it's a 30FPS pre-rendered video it will tear, because Adaptive VSYNC just monitors the FPS output, and if it's lower than the target (i.e. 60FPS) it automatically turns VSYNC OFF. I'd like them to disable that for VIDEOS in a game.
 
