[SOLVED] Wait for 30 series with my specs, or would it require an upgrade?

Mar 13, 2020
43
0
30
Hi guys, here's what I'm running on my machine right now.

CPU: AMD Ryzen 7 2700x
MB: B450 Tomahawk
PSU: Corsair RMX 650W
RAM: 16 GB DDR4-3000 Corsair Vengeance
GPU: MSI 1660ti Ventus X Gaming

I know it's a generational leap, so I'm wondering whether I'd have to upgrade the rest of my machine when the 30 series comes out. I really want a 2070 Super at the moment, but I see many people saying to wait for the 30 series. I'd rather not upgrade the rest of my machine if I can help it, since I'm satisfied overall aside from the GPU. It was built at a time when I needed something on a budget, but now I can definitely afford a 2070 Super. Still, based on speculation about what the specs will be for the 30 series (Ampere, right?), would I need to upgrade more?
 

King_V

Illustrious
Ambassador
Likely not - but this all really depends on what your monitor's resolution and refresh rate are, and if it has GSync, FreeSync, or neither, and what specific games you're playing.

I suppose it's possible that Nvidia makes some bizarre change where the 3000 series won't work with older motherboards, but that doesn't seem likely.

How's your current card falling short?
 
Honestly, I just want to run stuff on max settings at 1080p and 60 FPS, and while some games can handle it, I notice drops a lot. Nothing too significant, although the biggest performance issues I had were with Jedi: Fallen Order. I also want to max out games such as Cyberpunk and Halo Infinite when they launch, and I don't think my card will be capable of that. I think a 2070 should be able to sustain my needs for quite a few years, while I don't see the 1660 Ti holding up much longer for the games I want to play.

Also, I just noticed that Nvidia is likely delaying the launch due to GTC! So, wow.
 

King_V

Illustrious
Ambassador
I don't know what games you're playing, but most games manage a minimum of 60 fps, with a couple of exceptions.

Which ones are you currently having trouble with?

Thus far, though, my suggestion would be to stick with the 1660 Ti and, instead of buying in anticipation of games that aren't out yet, worry about it after buying those games, if they give you much trouble.

After all, in the future, you generally get more GPU horsepower per dollar than you can in the present.
 
Well, there were quite a few spots in Jedi: Fallen Order that were bad, such as Kashyyyk. Otherwise, I can run Borderlands 3 and The Division 2 pretty well for the most part at 1080p max settings, but I run into areas and instances where it can dip to 50 frames or so. That can get really irritating for me, and I question how long my card will be able to handle more modern games. I also make some money from playing video games as a hobby, so I do want to have good stuff, you know?
 

King_V

Illustrious
Ambassador
I guess your monitor doesn't have FreeSync or GSync, then?

I do know that some games do have issues where they simply aren't optimized well, but I don't know if the games you mention are among them.

It might not hurt to run a program (I know GPU-Z is one) that will graph CPU and GPU usage over time. Run your games and see whether either the CPU or the GPU is spiking and hitting 100% utilization.

Also, if you use CPU-Z, it can confirm whether your memory is running in single vs dual channel mode.

This may seem a bit tedious, but I want to make sure something else isn't causing the issue, which would result in you spending money on a more powerful card and still having frame-rate dips.
 
I am using dual-channel memory, I've confirmed. I almost always enable V-Sync when gaming so I maintain a steady 60 (or try to). As far as CPU and GPU usage, my GPU has maxed out at 100%, like on Fallen Order. My CPU works as it should, too.
 

King_V

Illustrious
Ambassador
Ok, so the GPU is maxed at 100%, but the CPU is NOT maxed? The only other CPU issue I can think of would be if the game didn't do threading well - for example, say you had an old i5 Haswell, 4 cores/4 threads. If the game didn't thread, it would max out one core while the others sat near idle. You'd get dips and lags, but CPU utilization would only show 25% (100% on only 1 of the 4 cores).

I don't recall if GPU-Z can show usage on a per-core basis or not. I forgot to consider this when I posted last night.

This is a little confusing, though, because the 1660Ti should be more than enough for 1080p.

If you lower graphics settings and/or resolution, do you see performance improvement?
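As an aside, here's why a poorly threaded game can stutter while the headline CPU number looks low: the "CPU usage" figure most tools report is just the average across all cores. A quick sketch of that arithmetic (the per-core readings are made up for illustration):

```python
# Hypothetical per-core utilization readings (percent) on a 4-core/4-thread
# CPU running a game that only loads a single core.
cores = [100, 5, 7, 4]

# The headline "CPU usage" figure is the average across all cores.
aggregate = sum(cores) / len(cores)

# One core pinned near 100% while the average stays low suggests a
# single-thread bottleneck rather than the whole CPU running out of steam.
bottlenecked = max(cores) >= 95 and aggregate < 50

print(f"aggregate: {aggregate:.0f}%")  # prints "aggregate: 29%"
print("likely single-core bottleneck" if bottlenecked else "no obvious CPU bottleneck")
```

So a game can be CPU-limited even though the overall utilization graph never gets anywhere near 100%.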
 
I apologize for not making my point clearer. My card's not terrible by any means, and of course, when I lower the settings I get better performance. I think I'm just very picky about performance, so if I see dips it annoys me. The primary reason I want a 2070 Super is future-proofing: I want something I can use for several years down the line. With the new generation coming, I don't see the 1660 Ti being the card that'll last me as long as I'd like, and I'd rather get it now than later. I hope that makes sense.
 

King_V

Illustrious
Ambassador
Oh, I agree - I was just trying to make sure to eliminate any and all other possibilities, just in case.

I know that Nvidia was supposed to make some kind of announcement about the 3000 series this month - though with the pandemic, that's apparently not happening now.

Also, it depends on what kind of future-proofing you're talking about. If you are sticking with 1920x1080 @ 60Hz, then the 2070 Super is so far into overkill territory that it's a waste. A 2060 Super would more than handle anything you could throw at it at 60fps at that resolution. I doubt the minimums would dip below 60 (I think Metro and AotS are the GPU killers, but even the high end cards suffer when those games are thrown at them at max settings).

Now, if you also want to use ray tracing, then that throws all the math and assumptions about performance completely out the window. Ray tracing is very GPU intensive.

I tend to be a little cautious, maybe overly so, so I would wait to see what the 3000 series brings. I can't imagine that Nvidia would delay the announcement too long, even if they can't do it at a live event, if they were originally ready to announce it this month. Either one of the new 3000 cards is what you'll go for, or they'll drive the 2000 cards' prices down, and you'll be able to snag a deal.
 
Solution
Thank you for your advice! I saw that delay. Unfortunate but necessary. Personally, I'd really like to invest in a new monitor, since I've been wanting a dual-monitor setup for a while, and I would definitely go for 1440p. Do you think the 2070 Super would be a good fit for that?
 

King_V

Illustrious
Ambassador
Could be. I would say definitely try to determine what your new monitor's specs will be PRIOR to choosing a new video card.

Firstly - whatever you do, go for a FreeSync monitor with LFC (low-framerate compensation). If you do get dips in some games, at least the FreeSync will make things smooth by adapting the refresh rate on the fly. Skip out on GSync - that's Nvidia only, and you have to pay a premium on that. There's no point in doing so since Nvidia 10-, 16-, and 20- series cards all support FreeSync.

Figure out what exact resolution you're looking for. And, this is my own preference: consider ultra-wide. I like the extra field of view that the 21:9 aspect ratio gives, and I used to be one of those guys who said "Nah, ultra-wide is just a worthless gimmick!"

Now, my son's monitor has a FreeSync range of 50-144Hz (though the LFC allows it to adapt down to 25, if need be). We have no intention of playing at 144Hz because his video card isn't anywhere near capable of that, but it does pretty well averaging 60fps, and when there are dips, while sometimes noticeable, it's at least smooth.
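To illustrate roughly what LFC is doing there - this is a simplified sketch of the general idea, not any particular monitor's firmware, and the 50-144Hz range is just my example above:

```python
def lfc_refresh(fps: float, vrr_min: int = 50, vrr_max: int = 144) -> float:
    """Simplified model of low-framerate compensation: when the game's fps
    falls below the monitor's variable-refresh floor, frames are repeated
    (2x, 3x, ...) until the multiplied rate lands back inside the range."""
    if fps >= vrr_max:
        return vrr_max  # capped at the panel's maximum refresh rate
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(60))  # 60 -> already within range, used as-is
print(lfc_refresh(25))  # 50 -> doubled back into the 50-144 window
print(lfc_refresh(40))  # 80 -> also doubled
```

The upshot is the panel stays inside its adaptive-sync range, and the picture stays smooth, even when the game dips well below the nominal 50Hz floor.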

BUT: when you have your resolution, also figure out what you're looking for in terms of frames/sec. If you have a 144Hz monitor, do you want to actually try to push that framerate all the time, or do you just want to make sure it never goes below 60?

I know I'm kind of making this sound a little complicated, but the choice of GPU and monitor, in my opinion, should really be thought of as a, well, maybe the term "single system" fits it.
 

larsv8

Distinguished
To me, it doesn't make sense to upgrade from a 1660 Ti to a 2070 Super to maximize 1080p 60Hz performance.

Now, if you jumped to a 1080p 144Hz display, a 2070 Super could make more sense, or a 1440p monitor.

I would normally say let's wait until the 22nd and see what the NVIDIA tech conference brings, but it has now been cancelled, so who knows when the new cards are coming.

I really don't like the idea of future-proofing. Why pay a premium now for something you may not even be able to use? I advocate buying the best bang-for-the-buck items.

What's your budget?
 
I have never thought of monitors this way. My budget... I’m not really sure what’s reasonable for a decent monitor. I haven’t looked extensively at prices, but maybe $200?
 

larsv8

Distinguished
I always start with the monitor when I am looking at performance because it is the "decoder of performance".

One of the big issues right now in cost/performance is that GSync is expensive and FreeSync is not. These are the Nvidia and AMD anti-tearing technologies in monitors. You always want to pair them to get the most out of your system.

Short answer: I would not upgrade if you are keeping your monitor.

Longer answer, if you are really looking to upgrade and are okay sticking with 1080p for a bit, I would consider:

Monitor: XG27VQ $272
GPU: 5700xt $380-420


If you had $500 for a Super and $200 for a monitor, I am going to assume you had an all-in budget of $700.

The 5700 XT, in my honest opinion, is the best high-end bang-for-your-buck card right now. This monitor, while 1080p, can handle up to 144 fps, which the 5700 XT can easily provide now and going forward. You will also be able to take advantage of FreeSync. You would likely have no problems with future titles for quite some time, and if you did, 3-4 years down the line, we would be talking about 110-120 fps instead of the full 144, which would hardly be noticeable, if at all.