[SOLVED] Is a new system build a waste of money at this point? (Current build listed in thread.)

Dec 13, 2020
i7-7700K
2400MHz DDR4
1660Ti
650W PSU
1080p/60Hz

So, until Cyberpunk 2077 released I'd never really had a need to upgrade my system for 1080p gaming. There have been one or two games I've had to skip due to CPU limitations, most notably RDR2, but for the most part my system can play almost anything at 1080p as long as it isn't CPU bound. 2077 being 2077, this was the year I was going to upgrade all of my hardware, but for obvious reasons we've all pretty much been left in the <Mod Edit> when it comes to GPUs.

But, with the 1660Ti still being a pretty decent card, I thought that perhaps I'd upgrade to Zen 3 with a 5600X and approach the GPU issue at a later date, but that was going to cost upwards of £700 for the platform, which is a lot of money. So I considered cheaper options, something like the 10600K, which is essentially a rebranded 8700K and only a generation above my current CPU; that was going to cost a more moderate £500 for the platform.

But now, given how bad silicon availability is at the minute, I'm starting to think it's not even worth doing anything in 2021, and I wondered if anyone agrees with this notion. The chances of seeing more 30-series GPUs, or anything else in 2021, seem bleak as hell unless you buy from a scalper, and it just makes me think this is a total waste of money, a crap time to do anything, and that saving and waiting for 2022 is the best thing to do.

I had even considered just buying a PS5 or XBSX as a sort of next-gen interlude for 2021 instead of doing anything with my PC, since AMD will be rolling out their new FidelityFX, and with me being locked to a 1660Ti by the market, I don't feel like I'm getting any benefit when I could be waiting and skipping this year entirely.

What are your thoughts?
 
Last edited by a moderator:
How have you confirmed that the 7700K is holding you back?
Both RDR2 and Cyberpunk are really GPU demanding titles, so a 4 core 8 thread cpu should really struggle, especially if its a 7700K.
What resolution and settings are you gaming at?
 

Zerk2012

Your processor is not bad for either of those games; your video card matters most.

Cyberpunk uses about 60% of my 10600K
RDR2, with about the same hardware you have, is very playable with a mix of high/mid settings.

View: https://www.youtube.com/watch?v=csx3AHVcPeg
 
How have you confirmed that the 7700K is holding you back?
Both RDR2 and Cyberpunk are really GPU demanding titles, so a 4 core 8 thread cpu should really struggle, especially if its a 7700K.
What resolution and settings are you gaming at?
By monitoring in MSI Afterburner over a 150-hour period, so I've watched both my GPU and CPU usage in all environments. It's mostly areas with lighting and shadows that stress the GPU, and populated areas that stress the CPU.
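For anyone wanting to do the same kind of check from a log afterwards rather than watching the overlay, here's a minimal sketch. It assumes a plain `cpu,gpu` percent CSV that I made up for illustration; a real MSI Afterburner log file has its own layout and would need a proper parser:

```python
import csv
import io

def summarize_usage(csv_text):
    """Average and peak utilisation from a simple 'cpu,gpu' percent log.

    Assumption: plain CSV with 'cpu' and 'gpu' columns, values in percent.
    MSI Afterburner's own log format is different and isn't parsed here.
    """
    cpu, gpu = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        cpu.append(float(row["cpu"]))
        gpu.append(float(row["gpu"]))
    return {
        "cpu_avg": sum(cpu) / len(cpu), "cpu_peak": max(cpu),
        "gpu_avg": sum(gpu) / len(gpu), "gpu_peak": max(gpu),
    }

# Example: two crowded-area samples in between two quiet ones
log = "cpu,gpu\n72,96\n99,61\n98,58\n74,95\n"
print(summarize_usage(log))
```

A high CPU peak alongside a modest GPU average in the same window is the tell-tale pattern for a CPU-limited game.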

With RDR2 the case isn't as bad, but yeah, with 2077 my CPU frequently peaks at 99% in crowded areas with NPCs, as well as when driving in city areas unless I use low population density. With med-low GPU settings my card is fine, but whenever it comes to CPU-heavy work it tanks. In more open environments the 7700K is fine for the most part, hovering between 70% and 75%, but as soon as I enter a market or some other high-population area I get dips below 40fps. I can play it like that, but it's not ideal, and not a game I really want to waste on a mediocre experience.

The only realistic thing I can do is a clean install of Windows, no other software running, Windows Defender disabled, and playing offline, to squeeze any more performance out of this CPU.
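The pattern described here, CPU pinned while the GPU still has headroom, is exactly what a CPU bottleneck looks like in a monitoring overlay. As a rough sketch, the same judgment can be written as a heuristic; the thresholds below are illustrative guesses, not anything official:

```python
def looks_cpu_bound(cpu_samples, gpu_samples, cpu_hi=95.0, gpu_lo=85.0):
    """Rough heuristic: CPU pinned while the GPU has headroom => CPU limit.

    cpu_samples/gpu_samples are utilisation percentages from the same
    window; the 95%/85% thresholds are illustrative, not measured.
    """
    majority = len(cpu_samples) // 2 + 1
    cpu_pinned = sum(1 for c in cpu_samples if c >= cpu_hi)
    gpu_slack = sum(1 for g in gpu_samples if g < gpu_lo)
    return cpu_pinned >= majority and gpu_slack >= majority

# Crowded-market readings: CPU saturated, GPU well below its limit
print(looks_cpu_bound([99, 98, 97, 99], [60, 55, 65, 58]))  # True

# Open-world readings: the GPU is the limit instead
print(looks_cpu_bound([72, 75, 70, 74], [97, 98, 96, 99]))  # False
```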
 
How have you confirmed that the 7700K is holding you back?
Both RDR2 and Cyberpunk are really GPU demanding titles, so a 4 core 8 thread cpu should shouldn't really struggle, especially if its a 7700K.
What resolution and settings are you gaming at?
I suspect that (shouldn't) is what you meant? :)

It would be hard to imagine a 7700K not effectively saturating (or nearly so) a 1660Ti even at 1080p, as long as no streaming is involved (the faster GPUs, of course, benefit more at 1080p from faster CPUs). Naturally, some games scale better than others beyond 8-10 threads. (BF5 seemed to do well, scaling upward with even up to 20 threads.)

My own 7700K at 4.7GHz (MCE, all-core) still seems more than adequate for my BF1 needs anyway, although I'm sure the 3600X/5600 and all the newer Intel CPUs have higher minimum and average FPS as well...
 

Zerk2012

Turn your game settings down a bit and try it again.
 
If you're happy with the current frame rates/playability of your rig, I'd stand pat....

Until you take up streaming or get a much faster GPU, perhaps striving for 100/144 FPS minimums for a higher-refresh-rate monitor, you might eke out another generation of waiting... (If Anandtech's review of the 11700K is accurate, 11th gen is not looking good for gaming, as their results showed it slower than 10th gen, in gaming anyway. Let's hope their BIOS gets updated; I've seen other comparisons that showed the 11700K faster, but unknown sources on YouTube are hardly worthy of any conclusions. Results will be out by 1 April by the hundreds.)
 
Turn your game settings down a bit and try it again.
I've tried everything. With everything at low the game runs without issues, but I can't find a happy balance where the FPS hit is more acceptable than the loss in graphics quality. Mostly I just use the medium preset and turn down stressful settings like lighting, shadows, volumetrics, screen space reflections, etc., with the only things left on medium being the likes of LOD, mirror quality and perhaps local shadow quality. It's practically as low as it can get while still having decent texture quality.

The hit comes from crowd density, which is what spikes my CPU. Which is self-explanatory, really: low has less of a hit, medium more, and high even more so. I tend to just play at medium and accept the hit, but realistically I wanted to play this game at 1440p with variable refresh rate.

The only OTHER thing I can do is turn off V-Sync, but then you have screen tearing, so you can kind of see the dilemma here. Either way I have to upgrade something, and either way it kind of seems like a waste of money when next year will hopefully bring newer and better things.
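For what it's worth, the dilemma falls straight out of frame-time arithmetic: a 60Hz display gives each frame a ~16.7ms budget, while a 40fps dip means 25ms frames, so with V-Sync on the display waits for whole refresh intervals and the dip reads as stutter rather than a smooth slowdown. The numbers:

```python
def frame_time_ms(fps):
    """Frame-time budget in milliseconds for a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))  # 16.7 ms budget at 60Hz
print(round(frame_time_ms(40), 1))  # 25.0 ms per frame during a 40fps dip
```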
 
It's just that, if AMD's 5nm launch is as successful as the past two generations on 7nm have been, 2021 seems like a rough year to buy in general. I do kind of feel it's worth holding out, as really I want the RT experience and I'm not going to get that either way, regardless of a few frame dips in a handful of games. The CPU benefits from waiting are almost as good as the GPU benefits from a price perspective; save now and buy later just seems like the better financial decision.
 

g-unit1111

The only sucky part about upgrading right now is the extreme shortage of GPUs. Otherwise the rest of the components are there to be had at a reasonable price.
Yeah I know it's maddening. I went to Micro Center last week and there was a line like out the door and around the street corner of people trying to get GPUs. I hope things will start correcting themselves soon and more GPUs will start coming into the markets. This so far has been worse than the 2018 GPU shortage, at least from what I have seen.
 
