The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2

Status
Not open for further replies.
AMD/ATI runs these differently, and I would like to know more about why...
BC2 will peg 100% CPU usage on an X3 or X4 with a good ATI video card.
It seems funny that if you want a gaming rig, you should stay away from AMD. With the number of driver issues, you cannot even imagine how weird Civ5 is on a 5850, which can't render all the textures...
Are there any games that play nicely with this hardware at all???
 
[citation][nom]nativeson8803[/nom]It's disappointing to see that devs still aren't taking advantage of multiple cores like they could.[/citation]

Take a look at Steam's hardware survey. Slightly more than half of users are sporting dual-core machines, and only a quarter are using quad cores. As far as most developers are concerned, a second core is near useless, as it is already clogged up with background programs. Until quad/hex cores represent the majority, you are never going to see many developers willing to put in the extra time to take advantage of them.
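That survey argument is exactly the gating logic a developer might use before spending effort on a threaded path. A minimal sketch of the idea (a hypothetical heuristic, Python used purely for illustration — no shipping engine works exactly like this):

```python
import os

def pick_worker_count(reserved_cores=1):
    """Hypothetical heuristic: leave a core free for the OS and
    background programs, and use the rest for game work."""
    total = os.cpu_count() or 1  # os.cpu_count() can return None
    return max(1, total - reserved_cores)

# A game could enable its multithreaded path only when this is > 1:
print(pick_worker_count())
```

On the dual-core machines that dominate the survey, this leaves a single worker, which is why the extra engineering time often isn't spent.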

I'm waiting for the day that a secondary lightweight app can run on the extra cores to control the AI for bots in games. They could be treated like other players logged into a multiplayer game. Co-op games would become that much more interesting, as each player could bring an extra 2-4 cores to help boost the AI. Instead of using cheats to ramp up the AI players, they could actually have some smarts!
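The co-op AI idea can be sketched as a worker pool sized to the spare cores. This is a toy illustration only (Python threads won't truly parallelize CPU-bound work because of the GIL; a real engine would use native threads or processes), with `think()` standing in for an expensive bot-AI evaluation:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def think(bot_id, world_state):
    # Stand-in for expensive AI work (pathfinding, planning, ...)
    return (bot_id, max(world_state) - min(world_state))

def run_ai_tick(bots, world_state, spare_cores):
    # One AI tick: farm out each bot's decision to the worker pool.
    with ThreadPoolExecutor(max_workers=max(1, spare_cores)) as pool:
        futures = [pool.submit(think, b, world_state) for b in bots]
        return [f.result() for f in futures]

spare = max(1, (os.cpu_count() or 2) - 1)  # reserve one core for rendering
decisions = run_ai_tick(bots=[0, 1, 2, 3], world_state=[3, 9, 1], spare_cores=spare)
print(decisions)  # [(0, 8), (1, 8), (2, 8), (3, 8)]
```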
 
Awesome piece, Tom's, thank you.

Now can we get some developer interviews on the concept of software optimization? So many forum trolls complain that games don't run well because they are made for consoles first and not optimized well for PC.

Also, how about a piece looking at how games that exist on both consoles and PC compare in visual quality and FPS performance? Maybe even look at how much of the Xbox 360's hardware is being used.

Is there any truth to games being made for the lowest common hardware denominator... the consoles?

 
This series of articles is the best Tom's has had in quite some time. These are the kinds of things I like to read about. It would be interesting to see how much difference the CPU makes in SLI/CrossFire. I've heard that in order to run SLI/CrossFire effectively you need a quad core; however, in most benchmarks I have not seen that to be the case.
 
Good article. All the major review sites including this one recommend the 1GB GTX 460, yet when I look at all the articles they are always using the 768MB version. I wish everyone would use the 1GB GTX 460 like they recommend, but oh well.
 
Question: on the titles that appear GPU-limited, how do we know they are not suffering from the StarCraft II problem, where simple scenes redraw so quickly that they tax your graphics card? Is there a way to prove/check this?
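One common fix for exactly this problem is a frame-rate cap, so a trivial scene (like a menu) can't redraw thousands of times per second. A rough sketch of how such a limiter works (illustrative Python, not anything from StarCraft II's actual code):

```python
import time

def frame_limiter(target_fps=120):
    """Yield once per frame, sleeping whenever the scene renders
    faster than target_fps so simple scenes don't peg the GPU."""
    frame_time = 1.0 / target_fps
    last = time.perf_counter()
    while True:
        yield
        elapsed = time.perf_counter() - last
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)
        last = time.perf_counter()

limiter = frame_limiter(target_fps=120)
for _ in range(3):   # the main loop would call this every frame
    next(limiter)
    # render_frame() would go here
```

As for checking: simply log FPS per scene. If a menu or an empty vista reports an absurdly high frame rate, the card is doing throwaway work.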
 
Ummm, what's wrong with this picture? You're trying to find bottlenecks, right? CPU core count only matters up to a point; most of your latency is going to come from your memory interface, the FSB, and GPU bandwidth/speed. You should look at the HT/QPI links and what bearing their speed has on the game. Compare different memory speeds and latencies, and the rest of your I/O system. Test at what bandwidth the GPU reaches its peak. Then do your CPU core count, and then combine all the data to find the optimum combination for today's games (without going overboard).
 
I am glad to know that my upgrade following a failed motherboard wasn't a "waste" in the DX11 environment, as many posters (even here) said. (Failed MB in a system with a C2D E6750 running at 3GHz and an Nvidia 9800 GT 512MB; I replaced the MB and upgraded to a Radeon 5770 1GB.)

The fact of the matter is this: in the current recession/depression economic environment & with gaming slowly entering the land of "consoles are what people are wanting again," game developers are not "advancing" system requirements as quickly as they used to. (People are not upgrading computers as quickly as they did in the 90s and early 00's.)

As another poster stated: a large majority of Steam users are only using single- and dual-core CPUs. Why write a game that won't perform well (i.e., sell well) on anything but a quad core or better? From a company's bottom-line standpoint, the game would be an economic failure... or at least not sell as well as it could have if it had been written for a mainstream computer. (E.g., the first Crysis.)
 
And, as for the people that are saying "test memory bandwidth, latency, .... and anything more complex"... that wasn't the point of the article.

The typical computer user knows where the power button is and how to install and play a game. An "advanced" user will know that they have a cpu running at xGhz with 2 cores and that they have a X video card. (They may even know they have X GB of ram.)

For the "mainstream" environment, very few people know their FSB speed, GPU bandwidth, ram speed and settings...
 
[citation][nom]terr281[/nom]And, as for the people that are saying "test memory bandwidth, latency, .... and anything more complex"... that wasn't the point of the article.The typical computer user knows where the power button is and how to install and play a game. An "advanced" user will know that they have a cpu running at xGhz with 2 cores and that they have a X video card. (They may even know they have X GB of ram.)For the "mainstream" environment, very few people know their FSB speed, GPU bandwidth, ram speed and settings...[/citation]

So what you are saying is that it's OK to use voodoo magic instead of proper testing methodology to back up these claims?

These claims might be right, but the problem is that the testing is flawed at a fundamental level. The results could just as easily be completely wrong without proper performance counters, analysis, and a wider GPU/CPU range running the same tests, and then also checking cache, memory and drive performance.

eg: CPU queue length and context switches might explain some things with this setup's results, and then those problems go away with more memory or different CPU + GPU and you get a radically different result.

The testing is bad, so these interpretations are bad.
 
The numbers for the StarCraft FPS look high; I wonder if the purpose-built benchmarking scenario Tom's made a few weeks back was used. I tested it on my setup and it made my computer look like a POS.
 
Metro 2033: Of course, you do realize testing a GTX 460 @ 1920x1200 with max settings and DX11 turned on is not a great way to test CPU limitations. You've created an instant GPU limitation by using those graphic settings.

This would have been a more interesting benchmark (Metro 2033 in particular, not the others) if you'd tried it with different settings to see if the CPU could BECOME a limitation.
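That suggestion is the standard technique: re-run the benchmark at a much lower resolution/detail level and see whether FPS scales. A minimal sketch of the decision rule (the FPS numbers below are hypothetical):

```python
def likely_bottleneck(fps_high_res, fps_low_res, tolerance=0.10):
    """If dropping the resolution barely changes FPS, the GPU wasn't
    the limiter -- the CPU (or something else upstream) is."""
    if fps_low_res <= fps_high_res * (1 + tolerance):
        return "cpu"
    return "gpu"

print(likely_bottleneck(42, 44))   # -> cpu (FPS didn't budge)
print(likely_bottleneck(42, 95))   # -> gpu (FPS scaled with resolution)
```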
 
Is it just me, or did I not see what I was expecting to see: where the bottleneck actually sits between GPU and CPU?

I only saw some comparisons about the number of cores...

It's simple: take the strongest CPU and pair it with different graphics solutions to test the results.
Then do the same with the strongest graphics solution and change the CPU. If you do it with both an LGA1156 and an AM3 platform, it will be possible.

What we want to know is the maximum price/performance ratio we can achieve with a certain CPU.
 
So essentially what these two reports have shown us (like it says in the conclusion) is that single-core gaming is pretty much dead and dual core is the norm. Quad-core gamers are still future-proofed, and their bottleneck is the GPU. Thanks for the info, but you could have done this in a grid format in one report and been done with it.
 



10x7, low detail, DX9 mode? That's how I tested the difference between my Core 2 and my i7, taking the video card all but completely out of the equation.
 
I knew it!! A graphics card with less than 1GB of memory is no longer a viable option for a hardcore gamer who likes to crank up the graphics settings. I also said in part one that the minimum core count is a true triple-core CPU (something AMD already has). Even a dual core has to struggle when a player decides to up the ante, which means ALL SINGLE-CORE CPUs AND SUB-1GB GRAPHICS CARDS MUST BE THROWN OUT THE WINDOW if anyone wants to improve their gaming experience.
 
I'm intrigued by the StarCraft II benchmark. Is it just the way the processor is handling the requests, or is it specifically using cores 2 (and 4) instead of core 1, thereby leaving core 1 to handle miscellaneous process requests? Or is this simply Intel SpeedStep in action?
 
Perhaps I missed it, but are these average frame rates? I'm hoping they are minimums. I get an average of 45 fps in TF2 on this Athlon-based PC, but sometimes my FPS dips to 24 or so while playing, which really ruins the experience for me. Minimum rates are everything.
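The average-vs-minimum point is easy to show with numbers. Given per-frame render times, the average FPS can look smooth while a single long frame produces a visible stutter (the frame times below are made up for illustration):

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, worst-frame FPS) from per-frame times."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)  # slowest single frame
    return avg_fps, min_fps

# Mostly-smooth 16 ms frames with one 42 ms hitch:
avg, worst = fps_stats([16, 16, 17, 16, 42, 16, 16])
print(f"avg {avg:.0f} fps, worst {worst:.0f} fps")  # avg 50 fps, worst 24 fps
```

That is exactly the "45 fps average but dips to 24" pattern described above.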
 


You still aren't even comprehending the REASON for these tests - they are for what people have out there NOW (on avg) and where they can upgrade to get better game performance.

They aren't for the guy who can afford to spend $10k on his rig. (If you're THAT Guy, why are you even here for this article, this is LOW end stuff if you have that much to spend...)

They are for the guy/gal that has ~$300 in their pocket and wants to upgrade their gaming experience. What should he/she upgrade FIRST? It's like you go right to the tests and don't read the first couple of pages of the article on WHY they picked the particular CPU, GPU, memory and all that - they explain it right there IN the story.
 
I thought this article was great, you take a mainstream ~$200 processor and a mainstream ~$200 graphics card which will get most people up and running quite happily.

I wonder, though, why there weren't any SLI benchmarks with the GTX 260; though I guess we already know about the wicked scaling from other articles (thanks, Tom's). It appears that an extra graphics card is the obvious upgrade path for people sporting the extra power and the right motherboard.

I would appreciate an article taking an i7-980 as the minimum CPU bottleneck and going through various graphics cards to isolate specific performance, and another article taking a set of GTX 480s in SLI and going through various processors: i5-750, i7-870, i7-980. Maybe ASUS ARES cards in CrossFire X while going through AMD's lineup as well. Of course, that will still leave people wondering about every possible pairing in between...
 