The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2

Status
Not open for further replies.

howiejcee

Distinguished
Nov 14, 2008
34
0
18,530
WTF? I'm not sure why a 4 GHz quad-core setup performs worse in SC2 than when its CPU is downclocked to 3 GHz...

Higher clock => more overhead? That still doesn't really make sense, though.
 

funtastic

Distinguished
Nov 26, 2009
4
0
18,510
How many agree with me on the title?
From: The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2

To: The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2 of 20.
or
To: 20 Game Rundown: OC'd Core i5-750 (1-4cores) on gtx460 (768MB)

Really? I mean, c'mon, really?
Sniff* =0( C'mon, Tom's. Don't let your intelligent readers jump ship! I rarely comment, but this is enough! I speak on behalf of millions of readers (and the ones with no brains): really try to find what the bottleneck is for each game. Many great points were made before this post, so I'll try not to repeat them.
I know it's a BIG job to do but (said from the bearded leader in the movie 300) THIS.....IS.....TOOOOOOOM'z! But to redeem is to redux!

At least tell us which game got you so addicted that you couldn't sleep the night you thought of the title? I want some of that!!!
More proof below (at time of this writing):
Part 1 of this article is located in the >Special>Buyers Guide section and,
Part 2 of this article is located in the >HowTo>Tweaking&Tuning Guide section.
LAN PARTY AT TOM'S!
 

LORD_ORION

Distinguished
Sep 12, 2007
814
0
18,980
[citation][nom]howiejcee[/nom]wtf? I'm not sure why a 4GHz 4core setup performs worse than when its CPU is downclocked to 3GHz in SC2...Higher clock => more overhead? still doesn't really make sense though[/citation]

It does make sense if the CPU is not really the bottleneck. A high CPU utilization percentage does not mean the CPU is overtaxed. There are a variety of other places that can suddenly start to bottleneck once the CPU is fast enough to saturate them.

E.g., here is one possible explanation out of the two dozen performance counters worth checking: clock the CPU higher while you are already getting context switches because there is not enough RAM. Now you have even more context switches piling up to process, because there is still not enough RAM. You simply have more unnecessary overhead.
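To make that concrete, here is a minimal Python sketch (my own illustration, not anything from the article's toolchain; Unix-only, via the stdlib `resource` module) that counts the context switches a process accrues while running a workload — the kind of counter worth watching before blaming the CPU itself:

```python
import resource
import time

def ctx_switches_during(workload):
    """Run workload() and return the (voluntary, involuntary) context
    switches this process accrued while it ran."""
    before = resource.getrusage(resource.RUSAGE_SELF)
    workload()
    after = resource.getrusage(resource.RUSAGE_SELF)
    return (after.ru_nvcsw - before.ru_nvcsw,
            after.ru_nivcsw - before.ru_nivcsw)

def busy_loop(seconds=0.1):
    # Pure CPU spin: any involuntary switches here come from the
    # scheduler preempting us, not from the work itself.
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

vol, invol = ctx_switches_during(busy_loop)
```

On a RAM-starved machine you would expect the involuntary count to climb as paging forces the scheduler's hand, even while raw CPU utilization looks high.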
 

Fokissed

Distinguished
Feb 10, 2010
392
0
18,810
[citation][nom]ntrceptr[/nom]Question: On the titles that appear GPU limited how do we know they are now suffering from the the Starcraft II problem where simple scenes are redrawing so quickly that it taxes your graphics card? Is there a way to prove/check this?[/citation]
The StarCraft II problem was the menu being rendered at thousands of frames per second. The menu is simple to render and needs few cache/VRAM accesses, so the frame rate could skyrocket because there was no FPS cap.
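For what it's worth, the fix for that class of problem is just a frame cap. Here is a toy sketch of the idea (the `render` stub and numbers are my own illustration, not Blizzard's actual code): sleep out whatever is left of each frame's time budget so a trivially cheap scene can't spin the GPU flat out.

```python
import time

def run_capped(render, target_fps=60, duration=0.5):
    """Render in a loop, sleeping out the rest of each frame budget so the
    frame rate never climbs far above target_fps. Returns achieved FPS."""
    frame_budget = 1.0 / target_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        t0 = time.perf_counter()
        render()                              # draw the scene (stubbed here)
        spent = time.perf_counter() - t0
        if spent < frame_budget:
            time.sleep(frame_budget - spent)  # burn the leftover budget
        frames += 1
    return frames / (time.perf_counter() - start)

fps = run_capped(lambda: None)  # a trivial "menu-like" scene
```

Without the sleep, a trivial scene renders as fast as the hardware allows, which is exactly the menu-melting behavior described above.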
 

Fokissed

Distinguished
Feb 10, 2010
392
0
18,810
[citation][nom]coldmast[/nom]I'm intrigued by the StarCraft II benchmark, is it just the way the processor is handling the request, or is it specially using Core 2 (& 4) instead of Core 1; thereby leaving Core 1 to handle miscellaneous process requests. Or is this simply intel speed-step in action?[/citation]
Threads are assigned to cores fairly arbitrarily by the OS scheduler. It seems that in SC2, cores 2 and 4 are getting the heavy threads (physics and AI?) while the other cores get the less taxing threads, such as preemptive loading and GC (possibly dozens of threads).
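As an aside, that placement really is an OS decision rather than something the game controls by default. A Linux-only Python sketch (my example, nothing to do with SC2's internals) showing that you can override the scheduler by pinning a process to one CPU:

```python
import os

def pin_to_one_cpu():
    """Restrict this process to a single CPU from its currently allowed
    set and return the new affinity mask."""
    allowed = os.sched_getaffinity(0)   # CPUs the scheduler may use for us
    target = min(allowed)               # pick one arbitrarily
    os.sched_setaffinity(0, {target})   # pin every thread of this process
    return os.sched_getaffinity(0)

mask = pin_to_one_cpu()
```

So the per-core graphs in the article reflect wherever the scheduler happened to put the heavy threads on that particular run.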
 

mapesdhs

Distinguished
jtt283 writes:
> At high resolutions, I'm not sure my eyes are even good enough
> to notice whether or not AA is on.

This is why, with my older 6000+/8800GT setup, I found better
visual results from running 2048x1536 with no AA instead of, say,
1600x1200 with AA (Oblivion/Stalker). I always used 16X AF, though.

Ian.

 

Kodiack

Distinguished
Oct 8, 2010
24
0
18,520
This is a great article. It goes to show that software development is still well behind hardware, especially when you look at CPU utilization. Most of these games handle hardware a lot better than some of the games I've played. The Sims 2 and World of Warcraft, for example, will chug on the latest and greatest hardware simply because they utilize processors so poorly. My i7 sits at around 20% utilization max, yet it's holding the games back because they lack the parallelism needed to produce a smooth framerate.

You can tell when developers create games with the future in mind. Games that scale well always make me very happy. Crysis, Grand Theft Auto IV, Resident Evil 5, and virtually any game running on the Source engine are shining examples of this. Whenever I upgrade a piece of hardware, my FPS in such games reflects it. I would love to see more developers make games with scalability in mind. I'm always extremely disappointed when clock speed is the only factor that pushes FPS up.

Once again, this was a great article! Fascinating read. I sincerely hope that game developers - or rather, all software developers - will start pushing more for improvements to utilize resources efficiently. Hardware development has skyrocketed, and without software capable of parallel processing, a lot of that potential power is going to waste.
 

mapesdhs

Distinguished
When I was a sysadmin at a university in the late 1990s, where the course was
in "Computing" (with an emphasis on more practical material), parallel
programming was not delved into much, because it was a subject even the
lecturers didn't understand. By contrast, the research-oriented university where I
obtained my degree (Computer Science) covered the field in great detail, since
it impacted so many other areas of the course: databases, AI, gfx, etc.

The problem IMO is that students leave education now with a rather poor
knowledge of parallel programming techniques, so it's no wonder games companies
find it hard to write appropriate code.

The sad part as well is that I expect companies do not share what they've learned,
so there's probably a lot of reinventing of the wheel.

Ian.

 

Graham_71

Distinguished
Jul 30, 2010
72
0
18,630
Great article, Tom's; I hope we see many more like this in the future.

Also, I'd like to know: what software is used to measure CPU/GPU and VRAM utilization?
 

L0g1c

Distinguished
Oct 1, 2010
11
0
18,510
Well-written and concise conclusion: Dual/Quad, then GPU all the way for the foreseeable future.
 

elcentral

Distinguished
Apr 19, 2010
459
0
18,790
Well, I actually never tested my rig for bottlenecks, but I did a test today for fun.

Test programs used: RivaTuner for my graphics card and CPU-Z/Core Temp for my CPU. Running an EVGA GTX 295 and an ASUS board with an AMD quad at 3.4 GHz and 4 GB of RAM at 800 MHz.
I did the tests in dual-screen mode so I had an on-the-spot view.

1st bench is Lost Planet 2, all settings max, no AA. It seemed I had almost no CPU load, 30-40% at best on all cores.

The next one is Bad Company 2, everything max, 8x CSAA, 8x anisotropic. It seems my computer is almost perfect for this one: 87-95% load on all cores, so it can't bottleneck too much, at least not in this game.

Last one, Napoleon: Total War: all settings max, 0x AA, 2x anisotropic, quick battle 2v2, low funds. Well, this one is a hard one: core 0 at 90-100%, cores 1 and 2 around 30%, core 3 at 40-70%. Why the different stresses? Dunno, but it seems this game likes to eat graphics, and how glorious it looks.

I guess it's all up to the games to get the bottleneck out of the way.
 

WarraWarra

Distinguished
Aug 19, 2007
252
0
18,790
The basic home PC would soon be considered slow; 2x CPU sockets and 6/8/12/24 cores will be average for the more serious mid/hardcore gamers.

Multi-GPU used to be the default back in the NV 6800 GTX days; surely it should continue to be a trend where budgets permit.
PS: How would an old dual-core Intel 965 at 3.7 GHz stack up against the system tested above, provided it had a basic, weak ATI 5850 headless-server video card installed, just for the joke of testing it?
Power consumption would be bad, but what about gaming performance?
(Intel SL9AN)

Show us how CPU "re-branding / activating extra cores" has actually made a difference to our lives, gaming-wise.
 
[citation][nom]mapesdhs[/nom]The problem IMO is that students leave education now with a rather poorknowledge of parallel programming techniques, so it's no wonder games companiesfind it hard to write appropriate code.Sad part aswell is that I expect companies do not share their learned knowledge,so there's probably a lot of reinventing of the wheel.Ian.[/citation]

On the education front, that is quite possible. As far as I know, they don't teach much about creating threads either.

As far as sharing... too many MBAs and Lawyers are involved in protecting company IP/assets for them to share anything outside of a paid license. The day of freely shared discovery and collaboration outside a single "corporate" entity died decades ago.
 

cable4

Distinguished
Aug 12, 2010
5
0
18,510
So Starcraft 2.....

I must first say I am not great at it, but I find the gameplay amazing. Now, the testing done here was so unclear about how the CPU and GPU were taxed that it should be redone.

Try this

1v1, 2v2, 3v3, and 4v4. My system handles 1v1 just fine with a slow single-core 2.4 GHz CPU and an 8800 GTX. As the on-screen units grow and the total action off-screen grows, my FPS goes down, down, down until, in some battles, I am looking at 4 FPS. Please give this game another look. It's hugely popular and shouldn't be sold short in a simple test. This extra bit of testing would take maybe an hour or two and provide the hundreds of people who read it with sounder advice for their next upgrades. I am thinking of getting the 460 1GB version and the 760 CPU. I only use 1680x1050, so this should work, and I can grab another 460 if I want to hook up to my 55-inch TV.
 

thelaw

Distinguished
Oct 3, 2010
3
0
18,510
LOL, dual cores are so... well, Socket 775. Too bad dual cores aren't available
for the newer sockets.
 

dallaswits

Distinguished
Sep 16, 2010
77
0
18,630
Well, the article was pretty clear about needing a dual core for SC2, and a triple core or more makes things even better, though the game doesn't "fully utilize" the cores beyond two.
[citation][nom]cable4[/nom]So Starcraft 2.....I first must say I am not great at it but I find the gameplay amazing. Now the testing done here was soo unclear about how they taxed the CPU and GPU that it should be redone. Try this1v1 2v2 3v3 and 4v4. My system handles 1v1 just fine with a slow single core 2.4Ghz and 8800gtx. As the onscreen units grow and the total action offscreen grows my FPS goes down down down until, in some battles I am looking at 4FPS. Please give this game another look. It's hugely popular and shouldn't be sold short in a simple test. This extra bit of testing would take maybe an hour or 2 and provide hundreds of people who read it more sound advice for thier next upgrades. I am thinking of getting the 460 1GB version and the 760 cpu. I only use 1680x1050 so this should work and I can grab another 460 if I want to hook to my 55 inch tv.[/citation]
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
I still remember when I figured out that a 500 MHz CPU was the limit, and then the graphics processor bottlenecked when using a Voodoo II card :p I haven't tried to find bottlenecks since then.
 

CptTripps

Distinguished
Oct 25, 2006
361
0
18,780
[citation][nom]nativeson8803[/nom]It's disappointing to see that devs still aren't taking advantage of multiple cores like they could.[/citation]

Agreed. I was very happy to see all 4 cores being used when playing BC2; DICE seems to be ahead of the game in this area. I hope others start better utilizing more cores soon.
 