Upgraded from a 1060 6GB to a 2070 Super, barely notice the difference in FPS?

Dec 13, 2024
I got a 2070 Super for myself this Christmas and swapped out my old 1060 6GB believing it'd be a huge upgrade, but I can hardly tell the difference in most games I play.


My most played game currently is Helldivers 2, where I was told I could run medium settings at 60 FPS just fine, but it struggles to reach even 50 on most planets, and in extreme cases drops to 40.


What's the issue?
 
I installed new drivers using the NVIDIA App (used the custom installation option to reset all profiles and such), which I'm pretty sure used to be called GeForce Experience or something like that.

CPU: i7-4790
GPU: Zotac Mini RTX 2070 Super
PSU: Corsair RM750e (legit just bought it off Amazon)
Storage: 1TB SATA SSD, plus a 1TB HDD for games I don't play too often or that don't take too long to load

Not sure what GE is, but all I know is that I swapped out my GPU and performance is just about the same as before.
 
Oh my DAYSSSSSS. I just used the MSI Afterburner overlay to look at my CPU usage, and the temps are at 100°C...

No wonder I don't get any frames ;-;
Thanks for the help though, guess I'll be investing in a proper CPU cooler (I have the stock Intel one installed, with 5 kg of dust)
 
Clean it and check if the temps go down; there's probably no need to buy a new one.
 
Just looked at it, and it's not a stock Intel cooler. Just so y'all know, my build is a Frankenstein of new parts slapped into an old office PC. I'll list all the specs I can here:

Case & Original Specs: EliteDesk 800 G1 TWR
CPU: i7-4790
GPU: Zotac Mini 2070 Super
Storage: 1TB SSD & 1TB HDD
Motherboard: HP 18E4
RAM: 4x 8GB DDR3 (two different sets: one a recent upgrade, one from the original PC)
PSU: RM750e

Also, I've never taken out the motherboard or CPU cooler, and I'm not sure how to safely uninstall and reinstall them.

I'd like to put a screenshot of HWiNFO in here, but it won't let me paste ;-;
Also, I saw that my RAM speed is 800 MHz in HWiNFO, but Task Manager shows 1600 MHz, which is what it's supposed to be.
 
Different utilities read RAM speed differently. Windows reports the double data rate, while others like HWiNFO and CPU-Z report the single data rate (the actual I/O clock). Both are correct.
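As a quick sanity check, the two readings are consistent: DDR ("double data rate") memory transfers data on both the rising and falling edges of its I/O clock, so the effective rate is simply twice what HWiNFO shows. A minimal sketch (the function name is just for illustration):

```python
# DDR transfers data twice per clock cycle, so the effective rate
# (what Task Manager and memory vendors quote) is 2x the real
# I/O clock that HWiNFO and CPU-Z report.
def effective_ddr_rate(io_clock_mhz: float) -> float:
    """Return the effective MT/s for a given DDR I/O clock in MHz."""
    return 2 * io_clock_mhz

print(effective_ddr_rate(800))  # HWiNFO's 800 MHz -> 1600, matching Task Manager
```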

So with an overheating CPU, your previous card wouldn't have been performing well either.
Sure didn't, that's kinda why I bought a new one...

I'm gonna take increasing the CPU fan RPM as a temporary solution, as I'll be buying a new motherboard along with a new CPU (+ cooler) and RAM next month anyway.

Welp, my gratitude upon thee for helping me find my problem.
 
The 100% safe way is to buy a can of compressed air (is that what it's called? sorry for my poor English) and blast it onto the cooler to blow the dust away. It's usually powerful enough.
 
Your CPU is most likely bottlenecking that GPU. I have a machine with an i7-6700 and a 2060 Super that originally had a GTX 1080 in it, and the improvement was less than spectacular. I did it for the feature-set upgrade: the GTX 1080 takes too much of a performance hit with RT on and can't do DLSS at all. A 2060 Super, or a 2070 non-Super, is really all you need with the CPU you have. A 12GB RTX 3060 is also in the same performance class.

If you're buying a new motherboard, I wouldn't count on being able to reuse the case with it. Pre-builts use a lot of proprietary parts just to prevent you from upgrading them later. They would rather their old machines become e-waste than stay useful for one day longer than they intended. I have an older Asus machine with a motherboard that *looks* like a standard micro-ATX but is actually narrower, and the case is just wide enough to fit the intended board. A standard ATX board is too wide to get in there.
 

The way I read that, it seems like you think the i7-6700 was what kept you from gaining more performance coming from a 1080? The 2060 Super, with dedicated RT cores, will of course do ray tracing better; that's a given, since as you know GTX cards have no RT cores and aren't officially supported. Ray tracing aside, the 2060 Super's baseline performance is only slightly faster; it's more of a side-grade really.

OP's CPU is more than capable of driving a 2070 Super; it's just that he noticed the CPU running hot, and that is most likely the problem. Has to be. The reason I think the 4790 is adequate is that I used to run a 2600K at 4.5 GHz with a 1080 Ti, and that ran pretty well at either 1080p or 1440p, only 10 or so FPS less than an 8700K in the same benchmarks.
 
If the CPU is overheating, then performance should be absolutely terrible as the CPU throttles, but 60 FPS sometimes dropping to 40 FPS may not actually be that bad depending on the game and settings.
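To make the throttling point concrete: Haswell chips like the i7-4790 start dropping clocks at TjMax (100°C), which is exactly what OP saw in Afterburner. A hedged sketch of checking a logged temperature series against that limit (the function name and 5°C margin are illustrative, not from any particular tool):

```python
# Flag samples at or near TjMax, where Haswell-era CPUs start dropping
# clocks to protect themselves. 100 C is Haswell's TjMax; the 5-degree
# margin is an arbitrary "getting close" buffer for illustration.
TJ_MAX_C = 100

def throttling_samples(temps_c, margin_c=5):
    """Return the samples within `margin_c` of TjMax."""
    return [t for t in temps_c if t >= TJ_MAX_C - margin_c]

log = [72, 88, 97, 100, 100, 94]  # e.g. values read off an Afterburner log
hot = throttling_samples(log)
print(f"{len(hot)}/{len(log)} samples at/near TjMax: {hot}")
```

If most samples sit at TjMax during gameplay, the CPU is spending the whole session throttled, which matches the "new GPU, same FPS" symptom.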

Haswell isn't unusable, and with a little extra cache the Broadwell 5775C was even competitive all the way up to 11th gen with fast DDR4, because there was only so much Intel could do on 14nm; that's why there was such a big jump with 12th gen when they finally moved to the "Intel 7" node:

Not that much difference between a 4790K and a 6700K, is there? Intel only claimed ~5% improvement per generation during that era, and that looks about right.
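That ~5% figure compounds, so even a couple of generations apart the gap stays small. A quick back-of-the-envelope check (the 5% per generation is the era's rough marketing claim, not a measured number):

```python
# Rough compound uplift at a claimed ~5% gain per CPU generation.
def compound_uplift(per_gen: float, generations: int) -> float:
    """Total multiplier after `generations` steps of `per_gen` gain each."""
    return (1 + per_gen) ** generations

# 4790K (4th gen) -> 6700K (6th gen): two generational steps
print(round(compound_uplift(0.05, 2), 4))  # -> 1.1025, i.e. ~10% on paper
```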

However, it is well known that Nvidia GPUs need CPU performance to perform well, because their driver does things in software that AMD normally does in hardware, a legacy of the weak CPUs in consoles (both the PS4 and Xbox One used AMD Jaguar-family CPUs). It depends on the game and API, and here's a comparison using a 4790K (note that, as you'd expect, a 3060 performs about like a 2070 or 1080, and while the 1060 is clearly a class below those, it can sometimes be surprisingly close with only a 4790K):
View: https://youtu.be/G03fzsYUNDU?si=hnj_MaPpMLLpbe3G&t=728

This is why Nvidia cards sent to reviewers always ship in a complete high-end system: with enough CPU it's actually faster to do those tasks in software, plus they can then claim lower power consumption, since some of the work (and thus heat) is moved from the GPU to the CPU.