The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2

Three years ago I bought an Intel Pentium E2140 (1.6 GHz, overclocked to 3.2 GHz) and an XFX 8800 GT 512 MB.
Three weeks ago I bought an Intel Core 2 Quad Q6600 (2.4 GHz, overclocked to 3.2 GHz) and kept the same graphics card.
My CPU power improved a great deal, but did performance in games?
No!
Or only by 4-6 FPS.
PS: I have played many games; my hard disks total 3x 1.5 TB, and all of it is games.
 
Am I the only person on the planet who hates the way AA looks? I could understand the point of AA back in the days when people were playing on CRTs at 1024x768 or 1280x1024, but once you got to about 1600x1200 you basically had to look for jaggies. At "modern" resolutions like 1920x1200, AA actually makes things look worse (it makes edges slightly fuzzy), in my opinion, especially in games where sharpness is important, like an FPS.
 
Are there any RAM, PCI-E, or hard-drive bottlenecks? Sounds like we hit the financial bottleneck before we hit the technology limit.
 
Ever played on a 60-inch plasma at 1920x1080 with no AA? If the screen gets bigger you will always see jaggies, no matter what resolution you're running (unless you sit far enough back that it looks the size of a 22-inch screen, in which case what is the point of using a larger one?).

On my 22-inch at 1680x1050 I don't play with AA. I can easily notice the jaggies, and it looks a lot better with AA on, but I leave it off so the FPS doesn't drop too low in certain situations, and it still doesn't look that bad without it.

On my 46-inch plasma, though, I either have AA on or just don't play a game whose frame rate drops too low with AA on, because without AA it looks completely crap.

If the screen is small enough and the resolution high enough that you don't see any jaggies, then AA will look worse, and in that case you'd be stupid to have it on. However, it is unlikely that most people sit far enough away, or have a screen small enough, that they see no jaggies at whatever resolution they're running. Even at 1920x1200 on a 19" screen, if you sit at a normal viewing distance you can easily see the jaggies, and AA looks much better.

And with all monitors now 16:9 full HD to match TVs, the resolution is pretty much capped at 1920x1080, so as the screen gets bigger the jaggies just get more and more pronounced, and it looks so much worse than with AA enabled.
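
For what it's worth, you can put rough numbers on the viewing-distance argument. A back-of-envelope Python sketch (my own figures, not the poster's; it assumes 20/20 vision resolves roughly 60 pixels per degree):

import math

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    # Panel pixel density in pixels per inch.
    ppi = math.hypot(res_w, res_h) / diag_in
    # Width of one degree of visual angle at this distance, in inches.
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inch_per_degree

# 22" 1680x1050 desktop monitor at ~24 inches:
print(pixels_per_degree(22, 1680, 1050, 24))  # ~38 px/deg: jaggies visible
# 60" 1920x1080 plasma at an ~8-foot couch distance:
print(pixels_per_degree(60, 1920, 1080, 96))  # ~62 px/deg: near the acuity limit

Anything well under ~60 pixels per degree and aliasing is easy to see, which lines up with the experience described above.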
 
So, upgrading my E8400 dual-core, which I have overclocked to 3.7 GHz, wouldn't make a hell of a difference. I have a 5850, so that should suffice for now until Bulldozer and Sandy Bridge come out. And I'll get a second 5850 when prices drop off with the new AMD 6000 series.
 
I think a good way to show off these types of studies would be a chart that matches a video card to a processor. Such as:
If you have processor X, get video card Y or something close to it (see the sketch below).
That way people can go in knowing there won't be a bottleneck.
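
Purely as a sketch of how such a chart could be encoded in Python (the pairings below are placeholders, not benchmark results; they would need to be filled in from charts like the ones in this series):

# Hypothetical CPU-tier -> GPU-tier pairing table.
PAIRINGS = {
    "budget dual-core":  "mainstream GPU",
    "fast dual-core":    "upper mid-range GPU",
    "stock quad-core":   "high-end single GPU",
    "overclocked quad":  "high-end GPU or CrossFire/SLI",
}

def suggested_gpu(cpu_tier):
    return PAIRINGS.get(cpu_tier, "no data for this tier")

print(suggested_gpu("stock quad-core"))  # -> high-end single GPU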
 
If it hasn't already been pointed out: there is probably a major difference between the 4 GHz machine and the 3 GHz machine, namely the memory clock speed. Hitting 4 GHz usually requires multiplier and base-clock settings that force slower memory. So the "sweet spot" at 3 GHz in Mass Effect 2 probably just reflects the 3 GHz configuration's faster memory bus being more beneficial than the faster core clock.
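
To make that concrete with illustrative numbers (hypothetical ratios, not the article's actual BIOS settings): core clock is base clock times the CPU multiplier, memory clock is base clock times a memory ratio, and a high base clock often forces a lower memory ratio to keep the RAM in spec.

def clocks(base_mhz, cpu_mult, mem_ratio):
    # Returns (core clock, effective memory clock) in MHz.
    return base_mhz * cpu_mult, base_mhz * mem_ratio

# 3 GHz config: a modest base clock leaves room for a high memory ratio.
print(clocks(333, 9, 4.0))   # -> (2997, 1332): ~3.0 GHz core, ~DDR3-1333
# 4 GHz config: higher base clock, but a lower ratio to keep the RAM stable.
print(clocks(445, 9, 2.4))   # -> (4005, 1068): ~4.0 GHz core, slower memory bus

So the 4 GHz machine can end up with a slower memory bus than the 3 GHz one, which would explain a counterintuitive sweet spot like the one in Mass Effect 2.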
 
I really hope this testing can be done on Guild Wars 2 when it launches. It's hard to get it running well without an overkill setup.
 
What really pisses me off is that there are some games that really need to use multiple cores but don't (e.g., Rift and many others).

If you are getting less than 60 FPS in a game, the GPU usage is around 50%, and the CPU usage shows either one core maxed out or two cores at 50%, then something is wrong.
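
That check can even be scripted. A rough sketch in Python (my own, not from the article; it assumes an NVIDIA card so GPU load can be read with the stock nvidia-smi tool, and an FPS figure measured by your overlay of choice, e.g. Fraps):

import subprocess
import psutil  # third-party: pip install psutil

def gpu_utilization():
    # Ask nvidia-smi for the current GPU load as a bare number.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return float(out.decode().splitlines()[0])

def diagnose(fps):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % per core
    gpu = gpu_utilization()
    if fps < 60 and gpu < 60 and max(per_core) > 90:
        print("Classic single-thread CPU bottleneck: one core at %.0f%% "
              "while the GPU sits at %.0f%%." % (max(per_core), gpu))
    else:
        print("GPU %.0f%%, busiest core %.0f%%." % (gpu, max(per_core)))

diagnose(fps=45)  # plug in the frame rate you actually measured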



PS: Why not try some classic games? E.g., will a Core i7-3770K run Outlaws (the PC game from 1997) if I restrict it to just one core and use a GTX 680? :)
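
Restricting a game to one core is easy enough to try. On Windows you can launch it with "start /affinity 1 outlaws.exe" (mask 1 = core 0 only), or script it; a minimal sketch using the third-party psutil module (the executable name is just the example from the post above):

import subprocess
import psutil  # third-party: pip install psutil

# Launch the game, then pin the process to core 0 only.
game = subprocess.Popen(["outlaws.exe"])
psutil.Process(game.pid).cpu_affinity([0])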



 
[citation][nom]Anonymous[/nom]I'd like to see them test the first Supreme Commander. It still taxes the most powerful systems.[/citation]
That's the truth. When I built my Sandy Bridge i3 system, I was amazed that game could bring my rig to its knees.
 
[citation][nom]KingArcher[/nom]Would there be any performance difference between Windows 7 32-bit and 64-bit, assuming you use the same amount of RAM [4GB]?[/citation]

Perhaps if a game comes with an x64 executable; otherwise, I cannot see there being any difference. The question is: since Windows 8 is supposed to manage system resources better than Windows 7, would there be a noticeable difference there?
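
One way to answer the "does the game come with an x64 executable" part is to read the PE header of the .exe directly. A small Python sketch (this follows the standard PE format; the path at the bottom is hypothetical):

import struct

MACHINE = {0x014C: "x86 (32-bit)", 0x8664: "x64 (64-bit)"}

def exe_arch(path):
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":
            return "not a Windows executable"
        f.seek(0x3C)                          # pointer to the PE header
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset + 4)                 # skip the "PE\0\0" signature
        machine = struct.unpack("<H", f.read(2))[0]
    return MACHINE.get(machine, "unknown (0x%04X)" % machine)

print(exe_arch(r"C:\Games\SomeGame\game.exe"))  # hypothetical path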
 