GTX 750 Ti low GPU usage while gaming

Soarim

Reputable
Nov 13, 2015
So, I've been dealing with this problem for a while. My GPU usage drops under 60% while playing games, I lose FPS, and the game lags. I monitored the temps with AIDA64: the GPU sits around 60~65°C, the CPU stays at 50°C, and the motherboard at 30~32°C. I also monitored the clocks of both the CPU and GPU, and they're at their max when gaming. The usage drops happen in GTA V, BF4, and BF1. I noticed that in BF1 singleplayer the GPU usage always stays above 90%, but in multiplayer, especially on Conquest, where there are a lot of explosions, it often drops to 50% and I lose a lot of FPS. I can't find anything that helps me solve this; all my hardware seems to be working perfectly except the GPU. I even tried overclocking the CPU, but it didn't help. If anyone can give me a clue about what's happening, I'd appreciate it.

PC specs:
GPU: GTX 750 Ti 2GB
CPU: AMD FX-8150
RAM: 8 GB
Mobo: Asus M5A88-M
PSU: Thermaltake 750W
 


Soarim

Reputable
Nov 13, 2015


Yeah, I totally forgot to mention that I have an FX-8150. Even so, I don't think it's CPU bound. What do you say?
 
Run a non-game benchmark, like 3DMark. What do your scores look like compared to other people with a similar setup? It could be that you are CPU limited, and that during the moments the FPS drops, the GPU usage drops because the CPU can't keep up. A benchmark run would verify that your system is performing as expected, meaning bad game performance isn't the fault of the video card malfunctioning.
 

Soarim

Reputable
Nov 13, 2015

I did the benchmarks with 3DMark, and the result is very close to similar setups. Some people have higher scores because their CPU is overclocked, but overall my result is normal compared to other people's.

 

amtseung

Honorable
Jan 28, 2014
You have an FX CPU, and you're playing games (BF4, BF1, GTA V) that rely very heavily on single-core strength, which is the very thing FX CPUs are known to be worst at. In single player, where the CPU isn't nearly as stressed because it doesn't have to deal with other players' network, positional, and firing data, you don't see much of an FPS drop from particle physics. In multiplayer, your CPU is literally dying under the load and is unable to cope, resulting in usage drops across the board.

You're CPU bound, and pretty heavily at that.
 

Soarim

Reputable
Nov 13, 2015

So it's impossible for me to fix this without changing my CPU? And do you have any tips to minimize these usage drops?
 

amtseung

Honorable
Jan 28, 2014


The only real "fix" is to get a better CPU, and probably a full system overhaul while you're at it. A temporary solution would be to overclock your CPU to 5GHz or so, but not on that motherboard, and buying a new motherboard just to overclock that CPU isn't worth it. "Intel for gaming," they say, and for good reason. If you want me to try to come up with a parts list, I can, but I suck at these things and usually let other forum members here do the dirty work. ;)

For now, look for settings that work the CPU very hard and minimize them as much as possible. They include, but are not limited to: shadows, render distance, particles and effects (especially explosions, clouds, and fog), ambient occlusion, lighting effects, motion blur, and foliage. If possible, turn them off; otherwise, set them as low as possible. You will still have massive FPS dips, but hopefully you'll be starting from a high enough FPS that the dips don't make it unplayable. Also, look into killing as many background processes and services as possible, and see if you can run BF1 in DX12 mode (are you on Windows 10? Wait, I don't think a 750 Ti will let you, but you may as well try).
 

Soarim

Reputable
Nov 13, 2015


Thanks man, you were very helpful.
 

Soarim

Reputable
Nov 13, 2015


Hey man, I'm getting a GTX 1060 and will upgrade the CPU in the future. Do you have any idea how hard the bottleneck will be? It's still the FX-8150. I heard some people talking about it and saw some benchmarks of this setup (8150 + 1060), and it didn't show a notable loss of FPS in most games. If you can give me a hint on it, I'd appreciate it.
 

amtseung

Honorable
Jan 28, 2014


If you're upgrading from a 750ti to a 1060, you won't lose any frames. You'll probably gain like... 1.

In general, if you're upgrading your graphics card when you're already seriously CPU bound, you won't gain anything, because your CPU is still the limiting factor. You won't lose anything either. You may have to tweak power settings with your new 1060 because of how GPU Boost 3.0 works, and how your card may actually drop to idle power loads because the CPU is holding it back so much.

A good example of this is my Planetside 2 gameplay. I upgraded from an R9 380X to a GTX 1060 about a month ago (I wanted the power savings of the Pascal architecture, mainly) and gained exactly 0 fps. As you might suspect, my i5 is what's holding back my framerates, not my graphics card. Therefore, upgrading graphics cards will net you no extra fps, but you won't lose any either. At least now I can use adaptive vsync or fast sync whenever I want, options that were unavailable or oddly buggy on AMD. And gaining more framerate wasn't particularly important to me anyway, since I play on completely potato graphics settings because it's hard to see through smoke and explosions and fog, and I'm usually at 120fps+ anyway.

I'm still going to say this: go Intel. Get a new motherboard and CPU, and ditch the POS that is the AMD FX lineup.
 

Soarim

Reputable
Nov 13, 2015


I will upgrade the CPU and mobo soon, but I really wanted to know if the bottleneck is going to be so bad that I won't get any better performance at all. I've seen some videos with this setup (1060/RX 480 + FX-8150), and the FPS were pretty decent, most games getting 50+ FPS on high/ultra settings. I just don't get what you say: if the CPU holds back the 750 Ti in a game like BF1, where on low I get 60 FPS with 80%~99% GPU usage, wouldn't 80% usage of the 1060 still be better than 80% usage of the 750 Ti? Or will it bottleneck so hard that the card won't even reach 50% usage or something like that? Thanks for the reply anyway. I'm really thinking about getting the mobo and Intel CPU now, before the GPU upgrade.
 

amtseung

Honorable
Jan 28, 2014


If the CPU bottleneck already exists with the 750 Ti, it won't magically disappear when you put the 1060 in. If the 750 Ti is at 80% now, the 1060 will probably be, at most, somewhere around 35%.

Here's my personal example: I had an Athlon 760K at 4.9GHz. Paired with an R7 240 playing Planetside 2, which is one of the most CPU-heavy games in existence, the R7 sat around 75-85% usage at about 35-40 fps, with dips into the seconds-per-frame category. Paired with an HD 7970 (which is still really powerful today), it was used at most around 35%, with absolutely no change in fps. When I paired the HD 7970 with my current i5 4460, its usage was pinned at 100% all the time, and my fps went up to about 85. Paired the i5 with my new GTX 1060, and now I get about 115-135 fps. I'm now CPU bottlenecked again.

I know the FX-8150 isn't as weak as the 760K, but it isn't far off. The FX architecture itself just prevents the CPU from doing its job efficiently. Potato graphics settings are almost mandatory.

80% of the 750 Ti is like 20% of the 1060. Good luck.
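To put rough numbers on that scaling, here's a back-of-the-envelope sketch. The ~2.5x raw-throughput advantage of a GTX 1060 over a 750 Ti is my ballpark assumption, not a measured figure:

```python
# When the frame rate is capped by the CPU, the GPU renders the same number
# of frames per second no matter how fast the card is, so a faster card
# simply finishes each frame sooner and idles longer. Roughly, utilization
# scales down with the card's raw speed advantage.

def estimated_utilization(old_util_pct: float, relative_speedup: float) -> float:
    """New card's utilization, assuming the CPU caps FPS at the same level."""
    return old_util_pct / relative_speedup

# Ballpark assumption: a GTX 1060 is roughly 2.5x a 750 Ti in raw throughput.
print(estimated_utilization(80, 2.5))  # prints 32.0
```

Which is why the 750 Ti at 80% today translates to a 1060 loafing along in the 20-35% range on the same CPU.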

I'll also say this: usages aren't that important. Just because something is only being used 50% doesn't mean it's always a bad thing. It just means that 50% is all it needs to get it done as quickly as possible.

After all, it's physically impossible to build a system that doesn't bottleneck anything at all. If it were possible, we'd have colonized the universe by now.
 
