I am sorry if this is not the right thread; I wasn't sure whether it should go in the CPU, RAM, or graphics section...
So long story short:
PC built in 2013 had an FX 4130 and a GTX 650 with 8 GB of 1333 MHz RAM.
In 2016 I upgraded the PSU, the mobo, the CPU to an FX 8370E, the GPU to a GTX 1070, and added two SSDs. Same RAM, but I bought another 8 GB, so now 16 GB. That's when I saw the CPU bottleneck.
A couple of months ago I decided to go Intel, but I didn't have that much money, so I wanted to at least keep the RAM I had bought. I found a very cheap i7 4770 (non-K). The new motherboard is an unused ROG Maximus Ranger VI.
From the stock 3.4 GHz with 3.9 GHz turbo on one core, I have now got it, thanks to the brilliant mobo, to 3.9 GHz locked on all cores all the time at 1.184 V (and even that is 0.040 V less than what the mobo wanted to give it). If I give it any less voltage it is still stable but gives slightly lower Geekbench results. During summer it runs at 35 degrees Celsius (95 Fahrenheit) at idle and 48 degrees (118 F) while gaming on a BF1 64-player server, which is by far the most CPU-intensive game I play, so everything else I do on my PC stays under that temperature. I am on air with a Scythe Mugen cooler and 9 fans making a storm down there. The CPU isn't delidded yet, but I am thinking about doing it.
Thanks to the mobo I can OC further via the bus clock (I got to 4.2 GHz locked on all cores easily), and of course Geekbench and the other tests showed higher results, and even then it didn't exceed 55 degrees. THE PROBLEM is that starting at 4.0 GHz, the PCIe 3.0 link drops to 2.0 x16 (the bus OC raises the PCIe clock as well, which is presumably why the board falls back a generation). I did some benchmarking and the results are inconclusive: most scenes in GTA 5 are better with the CPU at 4.2 GHz and PCIe 2.0, but one scene was a lot worse (PCIe bottleneck). In BF1 I don't really notice a difference (it's hard to compare online matches), although I expected to see the biggest difference there because it's the only game that brings this CPU to 100%. So at both 4.2 GHz and 3.9 GHz I get frame drops and dips (75 fps max, locked to the monitor refresh rate) into the 50s, but nothing like the FX 8370E dropping into the 30s or even 20s.
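For what it's worth, here is a rough Python sketch of how the PCIe link state and GPU load could be logged while the game is running, to make the gen 3 vs gen 2 comparison less guesswork (this assumes nvidia-smi is on the PATH; it is just an illustration, not something from my actual testing):

import subprocess
import time

# Poll nvidia-smi once per second and print the current PCIe link generation
# and width next to GPU load and temperature, so a drop from gen 3 to gen 2
# (or a GPU stuck at 40-60% load) is easy to spot during gameplay.
QUERY = "pcie.link.gen.current,pcie.link.width.current,utilization.gpu,temperature.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    gen, width, load, temp = [field.strip() for field in out.split(",")]
    print(f"PCIe gen {gen} x{width} | GPU load {load} | {temp} C")
    time.sleep(1)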
My GPU usage in BF1 on ultra, maxed out, on a 1080p monitor is between 40-60% at 50 degrees Celsius. So:
1) Is it worth it to OC above 3.9 GHz and lose PCIe 3.0? Also, will the CPU die faster at 4.2 GHz even if I stay below 1.2 V and under 55 degrees (e.g. electromigration at low temperatures)?
2) Is my G.Skill 16 GB dual-channel 1333 MHz 9-9-9-24 DDR3 RAM (the only component I didn't change from the original 2013 PC) maybe my bottleneck? Would I see massive improvements from changing to DDR4 2400 or above?
Note: the RAM is only partially in dual channel. The mobo only supports dual channel, but I have one 8 GB stick and two 4 GB sticks, because I couldn't find the exact model/latency when I bought the second 8 GB. CPU-Z says everything is fine and lists dual channel.
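For question 2 and the note above, a very rough way to check whether the mixed sticks are costing memory bandwidth would be something like the NumPy sketch below (not a proper benchmark, just for comparing the 8+4+4 GB mix against the matched pair alone; the numbers only mean anything relative to each other):

import time
import numpy as np

# Very rough memory bandwidth check: copy a buffer far larger than the CPU
# caches a few times and report GB/s. Run it with the current 8+4+4 GB mix,
# then again with only the matched sticks installed, and compare the numbers.
N = 128 * 1024 * 1024          # 128M doubles = 1 GB per buffer
src = np.ones(N, dtype=np.float64)
dst = np.empty_like(src)

best = float("inf")
for _ in range(5):
    start = time.perf_counter()
    np.copyto(dst, src)
    best = min(best, time.perf_counter() - start)

# Each copy reads the source and writes the destination once, so count bytes twice.
print(f"~{2 * src.nbytes / best / 1e9:.1f} GB/s")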