Why overclock for gaming?

CubsWin

Distinguished
Apr 20, 2012
I am in the process of building a new PC and plan to install a new 3770k Ivy Bridge CPU.

I have been reading countless forums and benchmarks regarding the overclockability of the 3770k, but the question that comes to mind is why would anyone really need to overclock a modern CPU for gaming? My understanding is that most games are graphics card dependent, so how much difference could it really make if you are running at 5 GHz instead of 3.5 GHz? I understand that it is a huge difference in benchmarking, but I fail to see the real world benefit when it comes to gaming.

Would anyone care to explain the impact that overclocking an already powerful CPU can have on gaming? I apologize for this rather general question, but I really would like to understand how an extreme overclock can benefit gaming when the graphics card is usually the bottleneck.
 

rglaredo

Honorable
Mar 26, 2012
I was thinking the same thing the other day. I've read that some games are CPU dependent, so maybe that's why. Or the system is older than the game and needs a boost. I'm happy in the 50 to 70 FPS area; I don't see the point of going over that.
 

djscribbles

Honorable
Apr 6, 2012
I totally agree with you; I never really got into overclocking. However, the point is really more about the hardware than the user experience in games.

Some people own a car to get from point a to point b, and some people own a car to drive it.
 

catatafish

Distinguished
Feb 6, 2012
Here's an actual example. I have a widget open at all times on one of my two screens that shows CPU and RAM utilization. When I load BF3, that widget pegs at 100% for a good 15 seconds while everything is loading. Then, when I join the server, it works pretty darn hard too. I'm overclocked to 4.5 GHz. I have 8 GB of RAM, and that pretty much hovers at the 50% utilization mark during loads as well as during gameplay.

So I can't speak to how many FPS you'll get from overclocking (probably not many more), but I'd hate to think how long I'd have to wait to play while the thing loads if I weren't rockin' 4.5. Twenty seconds is a long time when you're trying to join and get to Ticket Hall before Russia caps it and camps you below the escalators.

Plus, with how easy it is to overclock moderately (most boards give you an "EZ" button now), I just don't know why you wouldn't do it.
 

djscribbles

Honorable
Apr 6, 2012
I don't know for sure, but my guess is that a load screen isn't really utilizing the processing power anyway. More than likely it's in a tight loop checking the status of the work being done - loading from the hard drive and feeding the GPU - probably multiple jobs in parallel, since you're at 100%.

There are lots of ways to write code that uses 100% of a processor without providing any tangible benefit at higher clock speeds, and my gut feeling is that your load screen is one of them.
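
Something like this toy C++ sketch (totally made up, not from any real engine) is what I have in mind - it pegs one core at 100% no matter the clock:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> load_done{false};

int main() {
    // Stand-in for the disk finishing its work after two seconds.
    std::thread io([] {
        std::this_thread::sleep_for(std::chrono::seconds(2));
        load_done = true;
    });

    // The "load screen": spin until the work is finished. This sits
    // at 100% on one core whether the CPU runs at 1 GHz or 5 GHz;
    // a faster clock just asks "are we done yet?" more often.
    while (!load_done.load()) { /* burn cycles */ }

    io.join();
}
```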
 
For CPUs, there isn't much point. It can help bring the minimum framerate up and alleviate some stutters, but really, it's pretty minimal.

However, in 2 or 3 years, if you still have that same CPU, the games of that time might be more demanding, so it's good to have the option of squeezing out more performance by overclocking.

Regardless, it's more of an enthusiast/hobby thing. Just for shits and benchmarks. Plus, it's so damn easy now that there's kind of a "why not?" attitude.
 
Solution

catatafish

Distinguished
Feb 6, 2012
@scribbles and laredo

So you're saying that CPU usage would be 100% whether I was at 1 GHz or 10,000 GHz? If that's true, wouldn't it affect the length of time it spends there? I admit I have no clue what you mean by the tight loops and stuff, but I do know that 100% of a larger number is a larger number :). So wouldn't an overclocked CPU help things load faster when you open the game and connect to the server?

@wolfram

Yeah exactly.

So to the OP, and along that theme... let's say the 2500K was only 1.1 GHz. Would you still be like, "meh, it'll still be good for gaming, I don't see why 3.4 GHz would be any better"?
 

djscribbles

Honorable
Apr 6, 2012


Hypothetically (I'm not a game programmer, so I'm speculating a bit): when you are loading a level, the game spawns a number of threads for different tasks - a thread to load textures into your GPU, a thread to load the skeleton for the level, a thread to load the character models, a thread to communicate with the server. Each of these threads focuses on its task. To ensure the fastest load possible, it asks the HDD to load the files it needs and starts talking to the server; but once it makes those requests, it can either go to sleep for a period of time (which lets the OS schedule other jobs on that core until the thread wakes back up), or it can continuously check whether the file it was loading has finished or a server response has arrived.

The first method lets your CPU utilization fall back to a lower percentage, but the thread may sleep through the moment the file finishes loading and won't start on the next one until it wakes up again. The latter keeps a core at 100% per thread (so with 4 such threads on a quad core, you're at 100% overall) and can start loading the next file the instant the first one finishes - but the only thing a higher clock speed buys you is a faster "are we done yet?".

Since it's just a load screen, the programmers likely go with the easier method (which is also probably slightly faster), and that's the one that chomps your whole CPU.
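
If I had to sketch it in C++ (made-up task names; a real engine is obviously far more involved), it might look something like this - four busy-polling threads on a quad core is exactly the 100% reading your widget shows:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// One "ready" flag per loading task, flipped by the I/O side when done.
std::atomic<bool> ready[4];  // textures, level, models, server data

void wait_for(std::atomic<bool>& flag) {
    // Busy-poll: reacts the instant the flag flips, but pins one core
    // at 100%. The "sleepy" variant would put a short
    // std::this_thread::sleep_for(...) inside this loop instead,
    // letting utilization drop at the cost of waking up a bit late.
    while (!flag.load()) { /* spin */ }
}

int main() {
    for (auto& f : ready) f = false;

    // One waiting thread per task: four spinning threads on a quad
    // core shows up as 100% total CPU in a utilization widget.
    std::thread workers[4];
    for (int i = 0; i < 4; ++i)
        workers[i] = std::thread(wait_for, std::ref(ready[i]));

    // Pretend the disk/network finishes each task half a second apart.
    for (auto& f : ready) {
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
        f = true;
    }
    for (auto& w : workers) w.join();
}
```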
 


4.5? Hell, 4.0!

Actually, I think with the turbo boost stuff, around 3.5 GHz across a quad with a single thread boosted to around 4.2 GHz would probably be more than enough.

Of course, it's always possible to purposely create a CPU bottleneck to see some gains, like running at 1024x768 with minimal graphics settings. But realistically I think even 4 GHz is more than sufficient. I mean, I'm on an i5 750, and it almost never sees max usage, nor can I say it's bottlenecking the GPUs - and SB/IB chips are of course faster per clock.
 

teh_gerbil

Honorable
Apr 9, 2012
Yes, no matter what, unless hit by other bottlenecks, your CPU will sit at 100% for a period of time during an intense load, regardless of the frequency.

However, with the more powerful CPUs (i7s, etc.) you start hitting other bottlenecks, like graphics card speed, RAM speed, etc., which means you're underutilizing your CPU.

This is how I figure it works:

Say you have 1,000 jobs:

1 core @ 100 MHz can do them in 10 seconds, which makes it the bottleneck; the RAM/GPU have to wait for it to pass them information.

Alternatively, 4 cores at 50 MHz give double the aggregate throughput, saturate the RAM/GPU pipelines, and can do it in 5 seconds.

So if you have an older single-core CPU, you get a much greater gain from overclocking than you do from an already fast CPU that is coming close to saturating (or already being bottlenecked by) the other components.
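
If you want to sanity-check the numbers, here's that toy model as a few lines of C++ (made-up units, and assuming the work splits perfectly across cores):

```cpp
#include <cstdio>

int main() {
    const double jobs = 1000.0;

    // Toy model: a core retires jobs at a rate equal to its clock,
    // so 1 core at "100" does 100 jobs per second.
    double one_fast  = jobs / (1 * 100.0);  // 10 s
    double four_slow = jobs / (4 * 50.0);   // 5 s: double the
                                            // aggregate throughput

    std::printf("1 core  @ 100: %.1f s\n", one_fast);
    std::printf("4 cores @  50: %.1f s\n", four_slow);
}
```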

I rewrote this a few times, hope it makes sense!
 

Probably not. If you are doing online gaming, the primary bottleneck is the internet connection.

As far as "Why overclock?", my gaming machine is a Q9550 OC'd to 3.6 GHz. Yes, it is getting kind of old, but it is still pretty capable. And I have been able to bypass two generations of upgrades.
 
HDDs are easily the biggest bottleneck when it comes to loading stuff. Anyone with an SSD for Windows and programs can tell you their PC feels a lot snappier compared to an HDD. That's my experience: I went from two 7,200 RPM 500 GB Seagates in RAID 0 to an Intel X25-M 80 GB, which isn't even a fast SSD. My 120 GB is much faster, but it's strictly for gaming ;)
 

raj072

Distinguished
Sep 8, 2009
Well, the truth of the matter is that many of these people have OCD. They don't really need the OC; they just have some need to know it's faster than stock. Most of them just like to say, "I'm cranking at 4.8 on water, temps are yada yada yada..." They just like tinkering with CPUs and running benchmarks. They spend more time tinkering with the OC than they ever save from the OC.

Not saying it's a bad thing - it's a hobby - just calling a spade a spade.

Overclocks are not free: more power draw, more heat, and a shorter lifespan. But if you're rendering 24/7 with 3ds Max and can cut 1 hour for every 12 (2 hours a day), then an OC might be of use. For most, though, it's just a hobby thing with little real-world benefit.

 

Mephic

Distinguished
Apr 1, 2011
Well.. bollocks.

OC is needed most when you can't afford to buy something more expensive but still want that level of performance.

Got two good examples:

When I was an AMD user, I only had money for a Phenom II X3 720 (the X4s looked beautiful - I just couldn't afford one). It seemed like the lower clock and the missing fourth core would be a problem, and in some games it was. Then I clocked it to 3.7 GHz and my FPS in many games became more stable. Same with the GPU - I can't afford a 7970 GHz Edition or GTX 680 right now, so I bought a 7950, tested it, took the core to 1200 MHz (still have a lot of margin, so I'll try to beat that today... yada yada yada ;)) and got myself better-than-7970-GHz-Edition performance.

Sometimes you also need to OC because of bottlenecks - you bought a nice card, but your CPU can't keep up, so your FPS ends up much lower.

I'd say OC is very much needed. IMO, GPUs are ahead of CPUs now, so you often need to OC your CPU a bit to get the best performance and avoid a bottleneck. And then you can also OC your GPU to get the performance of a card that costs 100 bucks more.