Question: Is it worth upgrading from an i7-2600 to an i7-3770/K?

Aug 22, 2019
Hey there, I've recently bought a GTX 1070. My current CPU bottlenecks the freak out of it. I was wondering if I should get an i7-3770/K, or would it be a bottleneck too?

Thank you in advance.
 
Oct 13, 2018
It would still be a bottleneck. Get a new mobo, perhaps the budget B360, or if you can spend more money a Z370, and get an i5-8600 or the 9th-gen one. I think an i5 9th gen plus a B360 mobo and 8GB of 2666MHz RAM will come to around 250 bucks. Idk about AMD, they have a good choice of processors too, but if you use it for gaming and want zero bottlenecks, get Intel.
 
Aug 22, 2019
It would still be a bottleneck. Get a new mobo, perhaps the budget B360, or if you can spend more money a Z370, and get an i5-8600 or the 9th-gen one. I think an i5 9th gen plus a B360 mobo and 8GB of 2666MHz RAM will come to around 250 bucks. Idk about AMD, they have a good choice of processors too, but if you use it for gaming and want zero bottlenecks, get Intel.
I was thinking about getting a Ryzen chip, either an R5 1600 or R5 2600. It seems cheaper to get a Ryzen chip and mobo. RAM is pretty cheap second hand; 2x4GB of DDR4-2400 can cost around 34 USD.
 

Third-Eye

Distinguished
Jun 26, 2011
Upgrading to a 3770/K would not be worth the trouble, since it's only something like 7-10% faster than a 2600/K. That means if you already get 60fps, you would only get up to ~5fps more, which you are unlikely to notice while playing.

Also, the GTX 1070 is not as much of a bottleneck for the i7-2600K or 3770K as people would have you believe. Even a GTX 1080 Ti will still improve fps in games that are not heavily CPU-bound. Newer games also need faster memory for higher fps, so the next real bottleneck on a DDR3 system is memory slower than DDR3-2133. Battlefield 1/V and Fallout 4 are perfect examples of memory speed directly affecting fps, by as much as 20fps depending on configuration.
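To put the percentage claim in numbers, here is a back-of-the-envelope sketch (my own arithmetic, assuming fps scales roughly linearly with CPU speed in a fully CPU-bound scenario, which is the worst case):

```python
# Rough fps gain from a CPU that is 7-10% faster, assuming the game is
# fully CPU-bound and fps scales linearly with CPU speed.
base_fps = 60
low, high = base_fps * 0.07, base_fps * 0.10
print(f"{low:.1f} to {high:.1f} extra fps")  # 4.2 to 6.0 extra fps
```

In practice the gain would be even smaller, since few games are fully CPU-bound.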
 
Aug 22, 2019
Upgrading to a 3770/K would not be worth the trouble, since it's only something like 7-10% faster than a 2600/K. That means if you already get 60fps, you would only get up to ~5fps more, which you are unlikely to notice while playing.

Also, the GTX 1070 is not as much of a bottleneck for the i7-2600K or 3770K as people would have you believe. Even a GTX 1080 Ti will still improve fps in games that are not heavily CPU-bound. Newer games also need faster memory for higher fps, so the next real bottleneck on a DDR3 system is memory slower than DDR3-2133. Battlefield 1/V and Fallout 4 are perfect examples of memory speed directly affecting fps, by as much as 20fps depending on configuration.
Yeah, the main games I play are Fortnite and Overwatch, and I'm getting into Destiny 2. Fortnite is the big culprit for my system bottlenecking.

Going to save up for a Ryzen chip
B450 & Ryzen 5 2600 for $230 USD
DDR4-2400 2x4GB $38 USD

Is this set-up any good?
 

Third-Eye

Well, what GPU do you have right now? You might not even have to buy a whole new system if you just need to upgrade the GPU to something like a GTX 1050 Ti or 1650, or the GTX 1070-equivalent GTX 1660 Ti. A Ryzen 5 2600 is a good start, but I'd get a 2x4GB DDR4-3000 CL16 kit for $6-12 more. If you can afford it, get a 2x8GB kit for $74-85.
 
Aug 22, 2019
Well, what GPU do you have right now? You might not even have to buy a whole new system if you just need to upgrade the GPU to something like a GTX 1050 Ti or 1650, or the GTX 1070-equivalent GTX 1660 Ti. A Ryzen 5 2600 is a good start, but I'd get a 2x4GB DDR4-3000 CL16 kit for $6-12 more. If you can afford it, get a 2x8GB kit for $74-85.
GTX 1070
 

Third-Eye

OK, well, what exactly do you mean by bottlenecking then? You should be getting 80-100fps in Fortnite with a 2600/K and GTX 1070. Edit: Actually, the fps should be well over 120 at 1080p and max settings.
 
Aug 22, 2019
OK, well, what exactly do you mean by bottlenecking then? You should be getting 80-100fps in Fortnite with a 2600/K and GTX 1070. Edit: Actually, the fps should be well over 120 at 1080p and max settings.
I have a 240Hz monitor and get horrible frame skips and drops. Maybe it's just the game itself. I capped it at 144fps and got the same results.

And this is on all low settings too.
 

Third-Eye

I have a 240Hz monitor and get horrible frame skips and drops. Maybe it's just the game itself. I capped it at 144fps and got the same results.

And this is on all low settings too.
What are your full system specs? You may have RAM that is too slow, or you may be running a single 8GB module, which is forced into single-channel mode. You may also be running the game off an old 5400rpm drive. Stutters, skips, and frame drops are usually caused by slow memory or a slow hard drive when the game is streaming data.
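For context on why single-channel mode hurts, a rough sketch using theoretical peak numbers (real-world throughput is lower, but the 2x ratio holds):

```python
# Theoretical peak bandwidth for DDR memory:
# transfer rate (MT/s) x 8 bytes per transfer x number of channels.
def peak_gb_per_s(transfers_mt_s, channels):
    return transfers_mt_s * 8 * channels / 1000

print(peak_gb_per_s(1600, 1))  # 12.8 -> a single stick of DDR3-1600
print(peak_gb_per_s(1600, 2))  # 25.6 -> the same memory in dual channel
```

So a lone 8GB module halves the bandwidth available to the CPU, which shows up as stutter in memory-hungry games.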
 


Third-Eye

It looks like your RAM is running at 1333MHz instead of that kit's advertised 1600MHz. Since you are running a Z77 board, you should be able to set that kit to 1600 manually without XMP and just leave the timings on auto, which should set them to 9-9-9-24. It might be just enough to stop the problem, but the stutters or skips may just be the game. Your CPU is also quite hot at 53C idle; you might want to get an aftermarket cooler for it. I don't play Fortnite, so I can't tell you personally how it runs on my 2600K. Does any other game have the same issue?
 
Aug 22, 2019
It looks like your RAM is running at 1333MHz instead of that kit's advertised 1600MHz. Since you are running a Z77 board, you should be able to set that kit to 1600 manually without XMP and just leave the timings on auto, which should set them to 9-9-9-24. It might be just enough to stop the problem, but the stutters or skips may just be the game. Your CPU is also quite hot at 53C idle; you might want to get an aftermarket cooler for it. I don't play Fortnite, so I can't tell you personally how it runs on my 2600K. Does any other game have the same issue?
Hey, sorry, I was running a game in the background. The CPU idles around 29-30C; the max is less than 60C.

I do have XMP enabled.
 

mitch074

Distinguished
Mar 17, 2006
It looks like your ram is running at 1333mhz instead of that kits 1600 advertised speed. Since you are running a z77 board you should be able to set that kit to 1600 manually without XMP and just set the timings to auto, which should set to it 9-9-9-24. It might be just enough to stop the problem, but the stutter or skips may just be the game. Your CPU is also quite hot at 53c idle, you might want to get an after market cooler for that. I don't play fortnite, so I can't tell you personally how it runs on my 2600k. Does any other game have the same issue?
About the RAM: I don't agree - CPU-Z does report 800MHz, doubling to 1600 - so that's not it.
As for bad cooling causing throttling, that could be it. @Third-Eye: have you updated your BIOS? It could be an incompatibility between your motherboard and your GPU.
 
Aug 22, 2019
About the RAM: I don't agree - CPU-Z does report 800MHz, doubling to 1600 - so that's not it.
As for bad cooling causing throttling, that could be it. @Third-Eye: have you updated your BIOS? It could be an incompatibility between your motherboard and your GPU.
Yup, I've updated my BIOS to the latest version.

The CPU temps were recorded while a game was open.
 

Third-Eye

About the RAM: I don't agree - CPU-Z does report 800MHz, doubling to 1600 - so that's not it.
As for bad cooling causing throttling, that could be it. @Third-Eye: have you updated your BIOS? It could be an incompatibility between your motherboard and your GPU.
I had a quick look at the info, so I messed up when reading it. I was going by:
Slot #1 Module Corsair 4096 MB (DDR3-1337) - XMP 1.2 - P/N: CMX16GX3M4A1600C9
which is why I suggested IBeats's memory might be defaulting to 1333MHz rather than the XMP or manual 1600MHz. I was wrong, though; it actually is running at 1600MHz:
Frequency 800.7 MHz (DDR3-1602) - Ratio 1:6
Timings 9-9-9-24-2 (tCAS-tRC-tRP-tRAS-tCR)
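The CPU-Z readout is easy to misread: DDR ("double data rate") memory transfers on both clock edges, so the reported clock is half the effective rate. A quick sanity check (my own arithmetic, not from the thread):

```python
# CPU-Z reports the real memory clock; DDR transfers twice per clock,
# so the effective rating is double the reported frequency.
reported_mhz = 800.7
effective_rate = reported_mhz * 2
print(round(effective_rate))  # 1601 -> the kit is at its rated DDR3-1600
```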
 

Third-Eye

Atm I don't have another CPU-intensive game. All the other games, like Overwatch and Destiny 2, run fine. It might be the game itself.
Other people with similar systems have reported the same stutter issue with Fortnite, so it might be something about the game or these older systems. The only game I know that stutters on my 2600K is Far Cry 5, but that is an issue with the game itself from some patch a while back. I didn't find out about it until I bought the game during the last Steam sale.
 

digitalgriffin

Distinguished
Jan 29, 2008
Hey there, I've recently bought a GTX 1070. My current CPU bottlenecks the freak out of it. I was wondering if I should get an i7-3770/K, or would it be a bottleneck too?

Thank you in advance.
It all depends on the rest of your system and which games you play. Yes, the 3770 will be faster; how much faster depends on a number of factors.

Paying a super premium ($300+ used) for a 3770K is pointless if you can't overclock it. You'll need a Z68 or Z77 chipset board to do so.

If you run above 1080p, your GPU will become the bottleneck. I'm running 1440p, and my 3770K chugs along just fine with an RX 580 in a lot of games. But I'm running with a 60Hz cap and a 4.4GHz overclock.

Most reviewers agree that the more modern CPUs (e.g. 9700K, 2700X, 3700X) will deliver a smoother experience (less frame-time variance).
 
At 1080p with a 1070, my overclocked 3770K held me back in a handful of games - BFV and Shadow of the Tomb Raider specifically.

It was still pretty capable with most modern games, though. It was beginning to feel its age in Windows/multitasking.
 

Karadjgne

Titan
Herald
It's simply amazing how many people get things twisted around.

The CPU sets fps. Period. It pre-renders all frames according to the game code. Those pre-rendered frames are then sent to the GPU, which finish-renders them according to resolution and detail settings.

A CPU can't bottleneck a GPU. Ever. The CPU sets the frame limit; the GPU either lives up to that number or doesn't. A bottleneck is something that slows down the flow of data. Since the CPU is basically the source of the frames, it's not slowing anything down; it is what it is.

If fps drops a lot between low and ultra, the issue isn't the CPU, it's the GPU. If changing to ultra doesn't really affect fps, then the CPU is giving all it can, and whether the GPU is capable of more is moot.
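However you phrase it, the observable behavior matches a simple "slowest stage wins" model. A toy sketch of my own (not anyone's benchmark data):

```python
# Toy model: the displayed frame rate is capped by the slower of the two
# stages. Lowering detail settings raises gpu_fps but not cpu_fps, which
# is why fps stops improving once the CPU is the limiting stage.
def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

print(effective_fps(140, 90))   # 90  -> GPU-limited (e.g. ultra settings)
print(effective_fps(140, 250))  # 140 -> CPU-limited; a faster GPU won't help
```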

Fortnite is a pain. There have been several patches that fix one issue only to create more, so buggy play can be normal behavior even on a PC that is already maximized and optimized. That means updating all motherboard chipset drivers, not just the GPU's. It means turning off the Windows Xbox DVR and Game Bar stuff. It means turning off services, or setting them to run later, not during gaming. Get rid of Cortana and Windows Store updates, and stop indexing. In NVCP, turn maximum pre-rendered frames down from 3 to 1. Turn down grass detail. Make sure PhysX is on the GPU, not the CPU.
Check online for speed tips, as there's a bunch of stuff, like clouds, that has almost no visible impact on your view but a massive physics impact on the CPU. Make sure you've done a good CCleaner/registry clean (say yes to backups!).

Under normal usage, that i7 should have no issues with Fortnite, but it will need to be fine-tuned. After all that, if it's still buggy, chalk it up to the game code/patches, not the PC.
 

mitch074

It's simply amazing how many people get things twisted around.

The CPU sets fps. Period. It pre-renders all frames according to the game code. Those pre-rendered frames are then sent to the GPU, which finish-renders them according to resolution and detail settings.

A CPU can't bottleneck a GPU. Ever. The CPU sets the frame limit; the GPU either lives up to that number or doesn't. A bottleneck is something that slows down the flow of data. Since the CPU is basically the source of the frames, it's not slowing anything down; it is what it is.

If fps drops a lot between low and ultra, the issue isn't the CPU, it's the GPU. If changing to ultra doesn't really affect fps, then the CPU is giving all it can, and whether the GPU is capable of more is moot.

Fortnite is a pain. There have been several patches that fix one issue only to create more, so buggy play can be normal behavior even on a PC that is already maximized and optimized. That means updating all motherboard chipset drivers, not just the GPU's. It means turning off the Windows Xbox DVR and Game Bar stuff. It means turning off services, or setting them to run later, not during gaming. Get rid of Cortana and Windows Store updates, and stop indexing. In NVCP, turn maximum pre-rendered frames down from 3 to 1. Turn down grass detail. Make sure PhysX is on the GPU, not the CPU.
Check online for speed tips, as there's a bunch of stuff, like clouds, that has almost no visible impact on your view but a massive physics impact on the CPU. Make sure you've done a good CCleaner/registry clean (say yes to backups!).

Under normal usage, that i7 should have no issues with Fortnite, but it will need to be fine-tuned. After all that, if it's still buggy, chalk it up to the game code/patches, not the PC.
I don't disagree with your explanation of how graphics cards process whatever the CPU sends them, but if the graphics card "stalls" waiting for the CPU to feed it, that can be considered a bottleneck.
As for why Fortnite could be CPU-hungry, the fact that it's badly optimized and full of bugs is probably not worth discussing - I don't play it. I wouldn't be surprised, though, if it uses AVX2 with a badly optimized FPU+SSE fallback somewhere, making it slow on anything older than a Haswell CPU but fast on anything Haswell or newer.
Freeing up as many resources as possible is a workaround - it should be done, yes, but that's valid even on recent machines. Considering how much crap Windows 10 runs in the background, and how many apps install a resident "helper", you can easily free the equivalent of 0.5-1GHz of single-core CPU time by decrapifying a system.
 
