[SOLVED] Is my i5 7500 causing low frame rates at 1440p ultrawide?

Mar 3, 2019
System:
i5 7500
MSI H270M Mortar Arctic
Corsair LPX 3000MHz 2x8GB
EVGA GTX 1070ti Black
860 EVO 500GB SSD
CM MasterLiquid 240 AIO
CX650M
Samsung CF791

EDIT: I'm asking because I have read that at 1440p+ the GPU becomes the major player, but I've also read that the i5 7500 is less than stellar and that the 1070ti is supposed to be great for 1440p gaming.

My question is three-part:
Should I be getting better than 30-40 FPS at 3440x1440 on high-ultra settings in games like Mass Effect: Andromeda and Anthem?
If not, is the 7500 holding my FPS back in this setup, or is it the 1070ti?
If it's the 1070ti, will the 7500 become a major bottleneck if I upgrade to a 2080ti?
Thanks in advance!
 
1070 Ti should handle 1440p OK.
You could always try turning the settings down a step and see if it still does it.

I can tell you from experience that my 6600K @ 4.7GHz is horribly thread-bound in a lot of newer games. The cores are fast, but there just aren't enough of them; I've seen a few games show 100% on all 4 cores and sit there pegged while the GPU utilization drops off. For example:
[Screenshots: in-game overlay showing all 4 CPU cores at 100% utilization while GPU utilization drops off]

Granted, this game is very demanding in the first place but it was the only example I had on-hand at the moment.
 
In reality, the GTX 1070Ti is on the weak side for 1440p, especially with the newer games, and on top of that your CPU is an entry-level i5. The 1070Ti was never that great, really; it was overblown by the reviewers, and people believed them. 🙄

It was the most obvious case of reviewers cooking the results to make the 1070Ti look better than it actually was, and by a lot. A lot of us were screaming about it and how obvious it was; it was absolutely dishonest.

They are doing the exact same thing now with the GTX 1660Ti, and it is blatantly obvious. 🙄

Yes, your CPU will become a massive bottleneck for an RTX 2080Ti.
 
So, in Anthem I did turn the settings down from the "High" preset to "Medium", and it seemed to have no effect on my FPS; it just made the graphics noticeably worse. I also tried tweaking and/or turning off some settings in ME:A that are supposed to be GPU-intensive, and that made no difference either. Lowering the resolution to 1080p UW did increase the FPS, but it looked like hell, so I opted for the lower FPS and prettier image.

Does that mean that in my case I'm limited by the CPU more than the GPU, but that both are really the overall problem?

What program are you using for that overlay? That seems like it would be really helpful for me.
 
Before saying for certain that it is your CPU, I suggest bringing up a performance tracker like GPU-Z's Sensors tab and playing those games for a bit. If the GPU is not 98-100% utilized most of the time, then your CPU is likely holding back the 1070 Ti.

That overlay is RivaTuner Statistics Server, but it's the one bundled with, and configured by, MSI Afterburner. I'm also using HWiNFO64 and feeding its sensor data to Afterburner so that I get more specific information; the basic overlay setup doesn't show that much. All you need is per-core CPU utilization and GPU utilization, along with a frametime graph (there's a drop-down for text/graph/combo -> pick combo) and an FPS counter. The other FPS counter is from Fraps, but you don't need that.
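
If you'd rather log the numbers than eyeball the overlay, here's a rough Python sketch of the same check. It assumes an NVIDIA card with nvidia-smi on the PATH and the psutil package installed, and the gpu_utilization helper plus the 95% cutoffs are just my own illustrative picks:

```python
# Rough bottleneck logger: a stand-in for eyeballing the Afterburner/RTSS
# overlay, not a replacement for it.
# Assumptions: NVIDIA GPU with nvidia-smi on the PATH, psutil installed
# (pip install psutil); the 95% cutoffs are arbitrary rules of thumb.
import subprocess

import psutil

def gpu_utilization() -> int:
    """Illustrative helper: ask nvidia-smi for current GPU load in percent."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

while True:
    # Per-core CPU load, sampled over one second.
    cores = psutil.cpu_percent(interval=1.0, percpu=True)
    gpu = gpu_utilization()
    print(f"cores={cores} gpu={gpu}%")
    # Same rule of thumb as above: all cores pegged while the GPU sits
    # below ~95% points at a CPU limit.
    if all(c >= 95 for c in cores) and gpu < 95:
        print("-> looks CPU-bound")
```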
 
Solution

Honestly, I did buy it because of the rave reviews and the massive price drop from the 1080ti at the time (I was impatient and upgraded from a 970 while the crypto buzz was still in effect). I have pretty much wished I had gone with the 1080ti ever since.

Then I read some things about how the CPU could be the issue. In your opinion, since it sounds like I have to upgrade both to truly be where I want to be (3440x1440 at 100Hz), should I upgrade the CPU first or the GPU first? I will have to save up between upgrades, but I would love to see some improvement soon.

For reference, I have my eye on a final setup of the i7 9700k, MSI Z390M Gaming Edge, and EVGA 2080ti XC Hybrid. Will the i7 9700k be enough processor for the 2080ti?
 
Screenshots with OSD. Vsync on/off.

I got that OSD working and took some screenshots. I tried with Andromeda as well, with similar results on usage. This basically tells me that my CPU is more of a problem than my GPU, but that both will need to be upgraded to get to my 100 FPS goal. Is that about right?

Basically, the CPU is maxed out virtually the entire time I am playing. With Vsync on, the GPU usage is relatively low. With it off, the CPU usage stays maxed even more consistently, and the GPU usage shoots up to 98%+ and stays there, but the FPS goes from 30-ish to 45-ish.
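
Writing down the rule I think I'm applying to those numbers as a quick Python sketch (the 95%/98% cutoffs are just my guesses, not from any tool):

```python
# Classify one monitoring sample; the 95%/98% thresholds are rough guesses.
def classify(core_loads: list[float], gpu_load: float) -> str:
    cores_pegged = all(c >= 95 for c in core_loads)
    if cores_pegged and gpu_load < 95:
        return "CPU-bound"    # the Vsync-on case: CPU maxed, GPU waiting
    if cores_pegged:
        return "both maxed"   # the Vsync-off case: everything pegged
    if gpu_load >= 98:
        return "GPU-bound"
    return "no clear limit"

# The two situations described above:
print(classify([100, 100, 100, 100], 60))  # Vsync on  -> CPU-bound
print(classify([100, 100, 100, 100], 98))  # Vsync off -> both maxed
```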

Along those lines, am I right in thinking that I should upgrade the CPU first, and see where that gets me while I save up for a 2080ti?
If that is the case, will the i7 9700k be enough processor to take advantage of the 2080ti, or do I need to blow the whole load and go for the i9 9900k?
 
For reference, I have my eye on a final setup of the i7 9700k, MSI Z390M Gaming Edge, and EVGA 2080ti XC Hybrid. Will the i7 9700k be enough processor for the 2080ti?
Looks OK. I don't know about the MSI board, though. If it were me, I'd save money on the 2080 Ti and get the Black or XC instead, and put the difference toward a different board (I prefer ATX boards), more RAM (I prefer to have more than 16GB), or just pocket it. That's just me, though; with what you posted you will have great performance.
Along those lines, am I right in thinking that I should upgrade the CPU first, and see where that gets me while I save up for a 2080ti?
Yes, CPU first because you are clearly bottlenecked by it just like I am.
If that is the case, will the i7 9700k be enough processor to take advantage of the 2080ti or do I need to blow the whole load and go for the i9 9900k?
The 9900K gains ~20% from Hyper-Threading, so by comparison the 9700K's an OK deal. The extra ~$110 for the 9900K might be justifiable if you really want to squeeze everything out of that 2080 Ti. I'd personally stick with the 9700K, but this is your decision to make. 😉
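
To put rough numbers on that trade-off: the ~$110 gap and ~20% gain are from above, but the base price is my ballpark assumption, not a quote.

```python
# Back-of-envelope price/performance: only the ~$110 gap and ~20% gain
# come from this thread; the base price is an assumed ballpark figure.
price_9700k = 410.0                 # assumed street price, not a quote
price_9900k = price_9700k + 110.0   # ~$110 more, as noted above
gain = 0.20                         # ~20% from Hyper-Threading

premium = (price_9900k - price_9700k) / price_9700k
print(f"+{premium:.0%} more money for ~{gain:.0%} more performance")
# -> roughly +27% more money for ~20% more performance, which is why
#    the 9700K looks like the better value unless you want every frame.
```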
 

Thanks, I really appreciate your input on this stuff. :)

My reasoning was that I wanted to use the mATX board because I like the mini case I have, and I wanted the XC Hybrid because the 1070ti SC Black that I have now is pretty loud (I can hear it over my headset) and I have heard that the 20xx cards are even louder. I can add 2 more sticks of memory to the MSI board (and I plan to) and get up to 32GB, which should be fine.

I have had 2 MSI boards already and I really like the quality and the BIOS so far, so I trust them. The Asus Z390 boards all have terrible reviews because they don't ship with the BIOS update that is required to run a 9th-gen chip, and the only way to get the update is to get the board to POST with an 8th-gen chip that you're supposed to just have lying around. The other Asus boards I have used have been very nice, but they all seem to have dumb, unnecessary setup workarounds like that, so I've pretty much decided they aren't worth the hassle.
 

If you want to be at that resolution, then do the system upgrade first, then get the GPU once that is all done.
 