88%usage on 1050ti old games like bfbc2

Bannora

Prominent
Aug 8, 2017
I see that the usage of my graphics card is high, but how does a 1050 Ti have such high usage in BFBC2 (with V-sync on, too)? I want to find out quickly whether there is a problem, so I can use the warranty before it ends ^^
 

Dunlop0078

Titan
Ambassador
It's supposed to; 100% GPU usage would be ideal, assuming you have V-sync turned off.

It doesn't matter that it's an older, low-demand game: GPU usage should ideally go right to 100% in any game, assuming the game engine has no FPS cap, V-sync is turned off, and you have no CPU bottleneck.
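As a rough way to check this yourself, you can log GPU utilization while the game runs. This is just a sketch, not an official tool: it assumes an NVIDIA card with `nvidia-smi` on the PATH, and the 97% threshold is an illustrative rule of thumb, not an official figure.

```python
# Sketch: check whether the GPU is actually being kept busy.
# Assumes an NVIDIA card with nvidia-smi on the PATH; the 97%
# threshold is an illustrative rule of thumb, not an official figure.
import subprocess
import time

def parse_utilization(line: str) -> int:
    """Parse a value like '88 %' from nvidia-smi's CSV output."""
    return int(line.strip().rstrip("%").strip())

def classify(gpu_util: int) -> str:
    """Rough read: a GPU well below 100%, with V-sync off and no FPS
    cap, usually means the CPU can't feed it fast enough."""
    if gpu_util >= 97:
        return "GPU-bound (normal under load)"
    return "GPU not saturated - likely CPU bottleneck or an FPS cap"

def poll(samples: int = 10, interval: float = 1.0) -> None:
    """Print GPU utilization once per interval while the game runs."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
        util = parse_utilization(out)
        print(f"{util:3d}%  {classify(util)}")
        time.sleep(interval)
```

Run `poll()` in a second window while playing: if the reading sits well below 100% with V-sync off and no frame cap, the CPU (or a cap) is the limiter rather than the card.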
 

Eximo

Titan
Ambassador
BC2, despite its age, still uses the Frostbite engine which can take a lot to run.

Without resolution and settings it is hard to give you an answer.

Have you tried a more standardized benchmark to compare your card's performance to others?
 

Bannora

Prominent
Aug 8, 2017

V-sync is on
 

Eximo

Titan
Ambassador
I assume by drops you mean low frame rates?

The FX-4320 is a budget processor, so I am not surprised it struggles when your GPU is running at 88%. That actually explains why the GPU isn't at 100%: the CPU isn't fast enough to keep it fed. And running BC2 at whatever resolution and settings you're using is too much for the whole system, so V-sync should probably be off for smoother gameplay.
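One reason V-sync off can feel smoother: with classic double-buffered V-sync on a 60 Hz monitor, every frame is held for a whole number of refresh intervals, so a system rendering, say, 55 FPS snaps down to an effective 30. A minimal sketch of that arithmetic (assuming a 60 Hz display and double buffering, not adaptive sync or triple buffering):

```python
import math

def vsync_fps(render_fps: float, refresh_hz: int = 60) -> float:
    """Effective frame rate with double-buffered V-sync: each frame is
    displayed for a whole number of refresh intervals, so the rate snaps
    to refresh_hz / n for the smallest n that covers the render time."""
    frame_time = 1.0 / render_fps        # seconds to render one frame
    refresh_interval = 1.0 / refresh_hz  # seconds per monitor refresh
    intervals = math.ceil(frame_time / refresh_interval)
    return refresh_hz / intervals

# Rendering at 55 FPS just misses the 16.7 ms deadline, so V-sync
# drops it to 30 FPS; 45 FPS also snaps to 30; 120 FPS is capped at 60.
```

So a machine hovering just under 60 FPS keeps bouncing between 60 and 30 with V-sync on, which reads as stutter; with it off you'd simply get ~55 FPS (with some tearing).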
 
Solution

rgd1101

Don't
Moderator
MERGED QUESTION
Question from Bannora : "finally know there is a cpu proplem!!"



At what resolution/detail settings? And don't believe everything you see on YouTube.
 

Eximo

Titan
Ambassador
The trouble with YouTube videos is that they were recorded, uploaded, compressed, and then played back at a constant rate. So even if the numbers look good, the experience the uploader had and what you're watching won't always match up.

Some games require more resources than others. FPS shooters often employ hitbox registration on various body parts, so there is a lot of calculation involved in computing damage figures and, in multiplayer, sending that information off to other players. Add on top of that the data the GPU needs to process every frame, and some CPUs will be pushed to the limit.

Assassin's Creed III is another title entirely, with its own game engine, and it will have different requirements. Given its single-player nature, I suspect less CPU is required to maintain adequate FPS. I don't know a lot about the engine, but it's pretty much exclusive to the AC series and a very few others. (I didn't realize Rainbow Six Siege also uses it; I'll make a note of that.)

Some very old benchmarks showed an FX-4100, paired with a ludicrous GTX 690, averaging 33 FPS at 1920x1080. At 900p, and with the slightly faster FX-4320, I would expect those 30-60 FPS numbers, but not consistently.