Question: Literally no difference after upgrading CPU

Mar 18, 2019
I just upgraded to an i7 4770K from an i5 4430. I had run some benchmarks on the i5 before the upgrade just to see the expected performance gains and temps.

The i5 got a max of 62°C in AIDA64 (overall core temp) using a Thermaltake Gravity i2 CPU cooler, stressing CPU, FPU, and cache. In Superposition, I got a score of 9057 paired with my 780 Ti and 16GB of RAM.

I did the upgrade to the 4770K and ran AIDA64 for about five minutes, and the cores got to around 90°C. The temps were varying wildly though, sitting in the mid 80s for a second and then jumping back up into the low 90s, with some cores being 10°C under the others at times. I also looked at the CPU temp reading, which stays around the high 60s, but I have no reference for that because I only looked at core temps on the i5. The Superposition score was lower for some reason, at 9050.

I also ran a couple of games: Minecraft and Forza Motorsport 7. No FPS difference whatsoever.

Other things to note:
I have 16GB of RAM
My power plan is Ultimate Performance
The GPU is not throttling
The CPU SEEMS to not be throttling
The cooler was mounted properly and has been repasted
 

PC Tailor

Dignified
Herald
Why did you upgrade the CPU in the first place (just to understand background)?
The games you've listed are more dependent on the GPU than the CPU, so more CPU power won't be used if it's not needed.

Also, what graphical settings, refresh rate, and resolution are you playing at?

  • Firstly you'll want to make sure all MB/Chipset and GPU drivers are all up to date (not using a third party driver update software).
  • Also do you have latest BIOS installed?
  • What is the CPU and GPU temps under normal load (in game, not Aida for example)?
I wouldn't lean on benchmarks too much really, as they're not that reliable to go off.
 
Mar 18, 2019
I upgraded the CPU without much justification; I felt like it was upgrade time and I finally had the money to do it.

I use a Sceptre E255B-1658A at 165Hz, set to 165Hz in the Nvidia Control Panel. In Minecraft (1.14.3) at 1080p with 22 chunks, everything on Fancy, Optifine, and no AA, typical FPS is 150-350. Modded Minecraft (1.12.2) is 22 chunks with everything on Fancy and no Optifine, typical FPS 40-160.

In FM7 I use the medium preset and get a very stable 80 FPS all day. All the frame rates I listed were the same as with the i5 in both Minecraft and FM7.

1. GPU drivers are from Nvidia's website. Mobo drivers are from Asus' website (the board is a ROG Maximus VI Formula). I'm not dumb enough to fall for driver update ads.

2. BIOS is 2.10.1208. This BIOS isn't listed on their website, and BIOS Flashback on the motherboard simply doesn't work when I try to update it.

3. I never ran the GPU in AIDA64. However, in Superposition the GPU was pegged at 100% and stabilized around 82-83°C. This temp is unrealistic for actual gaming because my GPU is never maxed out, and my CPU usage was overall higher than GPU usage when gaming before the upgrade.

Quick question, too. Why do you say that benchmarks are not good to go on?
 

PC Tailor

Dignified
Herald
I upgraded the CPU without much justification; I felt like it was upgrade time and I finally had the money to do it.
Well, it only really needs to be upgraded if it's currently limiting you in some way. If it wasn't the limitation before, upgrading the CPU won't make a difference in those applications.

Based on the fact that you're getting the same frame rates, I'd be inclined to think the CPU was never the limitation, so frames probably won't increase. The 4770K does have more threads and higher single-core performance, though, so I'd suspect you'll see benefits in other applications.

I'm not dumb enough to fall for driver update ads.
Apologies, I meant no offense and wasn't suggesting you were; it's just that 9/10 people we advise are using driver update software :).

This temp is unrealistic for actual gaming because my gpu is never maxed out and my cpu usage was overall higher than gpu when gaming before the upgrade.
Exactly, but even then it's not that unrealistic, because 83 degrees under the highest stress means you'll definitely be below that under normal loads.

Quick question, too. Why do you say that benchmarks are not good to go on?
Because in reality they don't reflect real-world performance. Just as you said the Superposition temperature isn't representative because you'll never reach that level of stress, benchmarks are the same. The classic example is NVMe M.2 drives: insane differences in speed in benchmarks, hardly any difference in the real world.
 
Mar 18, 2019
I understand. But I don't understand why there was no difference AT ALL. It was as if I never upgraded the CPU when I ran Superposition; the score was the same within the margin of error. Even if the i5 wasn't holding the GPU back, I would expect some gains considering the higher clocks and Hyperthreading on the i7.
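For what it's worth, the two Superposition scores quoted in this thread really are within run-to-run noise. A quick back-of-the-envelope check (using only the numbers posted above):

```python
# Percentage difference between the two Superposition scores from this thread.
i5_score = 9057  # score with the i5 4430
i7_score = 9050  # score with the i7 4770K
diff_pct = abs(i5_score - i7_score) / i5_score * 100
print(f"difference: {diff_pct:.2f}%")  # well under 1%, i.e. run-to-run noise
```

A GPU benchmark can easily vary by more than that between two runs on identical hardware, so the two scores are effectively a tie.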
 

PC Tailor

Dignified
Herald
I understand. But I don't understand why there was no difference AT ALL. It was as if I never upgraded the CPU when I ran Superposition; the score was the same within the margin of error. Even if the i5 wasn't holding the GPU back, I would expect some gains considering the higher clocks and Hyperthreading on the i7.
I may be mistaken, as Superposition isn't software I use, but my understanding is there wouldn't be a difference, because it's pushing your GPU more than the CPU. If the previous CPU was never struggling to prepare frames for the GPU in the last benchmark, then the results will be the same.

If it doesn't need to use the CPU any more, then the result will be the same.
 
Considering Haswell and older chips were hit the hardest by the Spectre/Meltdown mitigations, with double-digit performance decreases on Hyperthreaded chips, it's hard to imagine that adding Hyperthreading would help much nowadays. Remember, Hyperthreading improved performance by 20% at best back when those chips were new.

If the old CPU wasn't pegged at 100% before, then the new one will just run at even lower utilization now. Since your GPU was pegged at 100% during the test, it is the bottleneck.
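That bottleneck logic can be sketched with a toy frame-time model. All the millisecond figures below are made up for illustration (real render pipelines are more complicated), but the shape of the result is the point:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: with the CPU and GPU working in a pipeline, frame time is
    set by the slower stage, so throughput ~= 1000 / max(cpu_ms, gpu_ms)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: the GPU needs 12.5 ms per frame (an 80 fps cap).
print(fps(cpu_ms=8.0, gpu_ms=12.5))  # old CPU: GPU-bound, 80.0 fps
print(fps(cpu_ms=6.0, gpu_ms=12.5))  # faster CPU, same GPU: still 80.0 fps
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # only a CPU-bound case improves: 125.0 fps
```

As long as the GPU stage is the slower one, shrinking the CPU stage changes nothing, which matches the identical frame rates and benchmark scores reported above.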
 
