
Question High usage of CPU (i5-6500) while playing video games

Jun 20, 2020
Hello everyone! I believe I have a CPU problem (an i5-6500 running with a GTX 1070). Every time I play video games, the CPU usage spikes really high, sometimes up to maximum even at the lowest settings. Here are some examples (all games run at 1920×1080, some in full-screen, some in borderless):

Tomb Raider 2013 in highest settings.

Borderlands 2 in lowest settings (spiked up to maximum), while in highest settings it spiked up to maximum for a bit and then dropped down to ~65%.

Left 4 Dead 2 in the highest settings.

I also tested a game named "10 Second Ninja X", a platform video game, which eats around 35% of the CPU power.

Shadow of the Tomb Raider in lowest settings spiked up to maximum, while in highest settings the usage ran around 60%–85%.

Same with GTA V in lowest settings, while in very high settings it runs stable at ~65%–70%.

I have checked the temperatures at maximum usage; the cores all sit around 40°C–47°C. I have updated all the drivers with Snappy Driver as well, but the BIOS is still on a 2016 version (which I'm not sure is the culprit). The CPU is used and was given to me by a friend. Before this, I was using an AMD FX-6300, and I never encountered this problem with older video games such as Borderlands 2, Tomb Raider 2013, or L4D2.

I would greatly appreciate it if anyone could help me shed some light on this problem, since I don't want to damage the CPU.

Thank you.
A 4-thread cpu will be limiting in some games, especially at lower resolutions and settings. Running low settings makes it worse: at low settings the gpu is far more likely to push more FPS than the cpu can keep up with, creating what many like to call a bottleneck.
Jun 20, 2020
So the CPU is also limited in older games as well? Such as L4D2? I don't recall L4D2 pushing more than 50%–60% usage on the AMD FX-6300, while here the i5-6500 runs around 70% usage.
Jun 20, 2020
When running the FX-6300 I used the same GPU, but different RAM and motherboard. The settings for L4D2/Borderlands 2 were the same (highest), but a bit different for newer video games.


You aren't getting what usage really is, or what's cpu work and what's gpu work.

Usage is not how much of the processor is used, but how much the processor has to use. There's a difference.

The cpu pre-renders all the game code. It'll do so at 100% of its ability. The amount of resources (bandwidth, threads, cache and other things) it has to use to pre-render a frame is usage. The number of frames it can pre-render in one second is the fps limit. That number is determined by the complexity of the game code and the IPC, clock speeds and ability of the cpu.

That limit is sent to the gpu, which finishes rendering each frame into a picture. The number of frames it can finish in one second is the fps you see onscreen and in counters. The number of frames the gpu can put onscreen is determined by the resolution and game detail settings.

So let's say a game allows the cpu to pre-render 100fps. That's what gets sent to the gpu. At ultra 1080p, the gpu might only paint 80fps. Lowering settings means the gpu can finish more frames, so fps onscreen goes up, to 100. It cannot get more than that because the cpu only supplied a 100-frame limit. Or, changing the resolution to 1440p, the gpu might only get 50fps onscreen at ultra; lowering details from ultra to low gets 80fps.
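The relationship above boils down to a min(): the screen shows whichever side completes fewer frames per second. A minimal sketch, using the hypothetical numbers from this example (not measurements from any real system):

```python
def fps_onscreen(cpu_frame_limit: int, gpu_fps_at_settings: int) -> int:
    """Onscreen fps is capped by whichever side finishes fewer frames."""
    return min(cpu_frame_limit, gpu_fps_at_settings)

cpu_limit = 100  # frames/sec the cpu can pre-render for this game

print(fps_onscreen(cpu_limit, 80))   # 1080p ultra: gpu-limited, 80 onscreen
print(fps_onscreen(cpu_limit, 120))  # 1080p low: gpu could do more, cpu caps it at 100
print(fps_onscreen(cpu_limit, 50))   # 1440p ultra: gpu-limited, 50 onscreen
```

Note that raising the gpu's side past 100 changes nothing onscreen; only a faster cpu raises that cap.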

You are not increasing fps by lowering settings; the fps is already at a maximum of 100. The only thing you increase is the number of frames the gpu is able to complete. If you already get 100fps at ultra 1080p, lowering settings does nothing: the cpu capped it at 100.

What can make a difference, especially in multiplayer, is disabling cpu-bound settings, like name tags, floating damage numbers etc., because they are part of the AI the cpu has to pre-render. They aren't actual graphics detail settings like the amount of hair detail, the sharpness of edges, blends of color or shadows etc. Those are gpu-bound.

If fps changes because you raise/lower detail settings, that's all gpu, not cpu. If usage is high, it's because the cpu is using a lot of resources, for example running only 4 threads in a game like Tomb Raider that's optimised for 8 threads. Graphics details or resolution won't change that; it's all cpu, and usage happens long before that info ever makes it to the gpu.

The only way to lower usage in such games is to raise the ability of the cpu: an i7 with Hyper-Threading, overclocking, or a new platform with a stronger-IPC cpu.

Usage doesn't change fps ability until the cpu hits right close to 100%, which it can do on a core-by-core basis. Many older, slower cpus will hit 100% usage in CSGO on just 2 cores while the other 2 cores sit at 30ish%. Task Manager averages the 4 cores into a single number, around 65% usage, but in reality the cpu is maxed out, because CSGO only uses 2 threads and doesn't touch the remaining 2; that's how the game code was written.

You are having usage issues now that you didn't have before because, even though you moved to a faster, stronger cpu than the FX-6300, that cpu had 6 threads, so the games showed a lower overall usage even when 4 of the threads were maxed or close: 100 × 4 + 30 × 2 = 460, / 6 ≈ 76% in Task Manager. With the i5, even with the stronger IPC, the cores are still going to get close to 100% (at a higher fps limit): 4 × 100 = 400, / 4 = 100% usage.

Usage is the amount of resources being used, not the ability of the cpu to use them.