The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2


leishushu

Distinguished
Oct 12, 2010
Three years ago, I bought an Intel Core E2140 (1.6 GHz, overclocked to 3.2) and an XFX 8800 GT 512 MB.
Three weeks ago, I bought an Intel Core 2 Quad Q6600 (2.4 GHz, overclocked to 3.2) and kept the same graphics card.
My CPU power improved a lot, and what happened to game performance?
Nothing!
Or just 4-6 FPS higher.
PS: I have played many games; my hard disks are 3 x 1.5 TB, and all of them are full of games.
 

kutark

Distinguished
Jan 10, 2007
Am I the only person on the planet who hates the way AA looks? I could understand the point of AA back in the days when people were playing on CRTs at 1024x768 or 1280x1024, but once you got to about 1600x1200 you basically had to hunt for jaggies. At "modern" resolutions like 1920x1200, AA actually makes things look worse (edges get slightly fuzzy), in my opinion, especially in games where sharpness matters, like an FPS.
 

tygrus

Distinguished
Nov 24, 2005
Are there any RAM, PCI-E, or HD bottlenecks? Sounds like we hit the financial bottleneck before we hit the technology limit.
 

unwanted

Distinguished
Feb 7, 2010
Ever played on a 60-inch plasma at 1920 x 1080 with no AA? As the screen gets bigger you will always see jaggies, no matter what resolution you're running (unless you sit far enough back that it looks the size of a 22-inch screen, in which case what is the point of using a larger one?).

On my 22-inch at 1680x1050 I don't play with AA. I can easily notice the jaggies, and with AA on it looks a lot better, but I leave it off so the FPS doesn't drop too low in certain situations, and it still doesn't look that bad without it.

On my 46-inch plasma, though, I either have AA on or I just don't play the game if it drops too low with AA enabled, because it looks completely crap without AA.

If the screen is small enough and the resolution high enough that you don't see any jaggies, then AA will look worse, and in that case you'd be stupid to have it on. However, it is unlikely that most people sit far enough away, or have a screen small enough, that they see no jaggies at whatever resolution they are running. Even at 1920x1200 on a 19" screen, if you sit at a normal viewing distance you can easily see the jaggies, and AA looks much better. A rough way to put numbers on this is sketched below.

And with all monitors now 16:9 full HD to match TVs, the resolution is pretty much capped at 1920x1080, so as the screen gets bigger the jaggies just get more and more pronounced, and it looks so much worse than if you had AA enabled.
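To put rough numbers on the viewing-distance point, here is a little Python sketch that estimates how many pixels one degree of your vision covers on a given screen at a given distance. The "~60 px/degree" visibility threshold and the example distances are just common rules of thumb I'm assuming, not measurements from the article:

[code]
import math

# Estimate pixels per degree of vision for a given screen and distance.
# Assumption: below roughly 60 px/degree, jaggies become visible.
def pixels_per_degree(diag_in, horiz_res, distance_in, aspect=(16, 9)):
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # physical panel width
    pixel_pitch = width_in / horiz_res          # inches per pixel
    one_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return one_degree / pixel_pitch

# 22" 1680x1050 (16:10) at ~28" desk distance -> ~44 px/deg, jaggies visible
print(pixels_per_degree(22, 1680, 28, aspect=(16, 10)))
# 60" 1080p plasma from 6 feet -> ~46 px/deg, still visible
print(pixels_per_degree(60, 1920, 72))
[/code]

By this rule of thumb, both setups land below 60 px/degree, which matches the experience above: bigger screen at the same resolution means more visible jaggies unless you sit much further back.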
 

ern88

Distinguished
Jun 8, 2009
So, upgrading my E8400 dual-core, which I have overclocked to 3.7 GHz, wouldn't make a hell of a difference. I have a 5850, so that should suffice for now until Bulldozer and Sandy Bridge come out. And I'll grab a second 5850 when prices drop thanks to the new AMD 6000 series.
 

tuch92

Distinguished
Jul 12, 2009
I think a good way to present these studies would be a chart that matches a video card to a processor, such as: "If you have processor X, get video card Y (or something close to it)."
That way people can go in knowing there won't be a bottleneck. Something like the little lookup sketched below.
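As a toy Python sketch of that chart idea; the pairings here are made-up placeholders, not the article's actual benchmark results:

[code]
# Pairing-chart idea: CPU class -> GPU class it can feed without a big
# bottleneck. Entries are illustrative placeholders only.
cpu_to_gpu = {
    "dual-core ~3 GHz": "mainstream GPU",
    "quad-core ~3 GHz": "performance GPU",
    "quad-core ~4 GHz": "high-end GPU or dual-GPU setup",
}

def recommended_gpu(cpu_class):
    return cpu_to_gpu.get(cpu_class, "no data for this CPU class")

print(recommended_gpu("quad-core ~3 GHz"))  # performance GPU
[/code]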
 

zerxezz

Distinguished
Dec 23, 2009
If it hasn't already been pointed out: there is probably a major difference between the 4 GHz machine and the 3 GHz machine, namely the memory clock speed. Usually the multiplier and base-clock combinations needed to hit 4 GHz force slower memory settings. So the "sweet spot" at 3 GHz in Mass Effect 2 probably just reflects the faster memory bus of the 3 GHz configuration being more beneficial than the faster core clock.
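To put hypothetical numbers on that (assuming a base-clock-overclocked Core i5 with a fixed 20x multiplier; the ratios below are illustrative, not the article's actual settings):

[code]
# Core clock = base clock x CPU multiplier; DDR3 rate = base clock x memory ratio.
def clocks(bclk_mhz, cpu_mult, mem_ratio):
    return bclk_mhz * cpu_mult, bclk_mhz * mem_ratio  # (core MHz, DDR3 MT/s)

# 3 GHz config: 150 MHz x 20, memory ratio 10 -> DDR3-1500
print(clocks(150, 20, 10))  # (3000, 1500)
# 4 GHz config: 200 MHz x 20, ratio dropped to 6 to keep the RAM stable -> DDR3-1200
print(clocks(200, 20, 6))   # (4000, 1200)
[/code]

So the 4 GHz machine can actually end up with slower memory than the 3 GHz one.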
 

payne_ksharp

Distinguished
Oct 2, 2010
I am really pleased with this review, but I would definitely like to see a similar one with less powerful GPUs and lower-resolution bottlenecks.
 
Guest
I really hope this testing can be done on Guild Wars 2 when it launches. It's hard to get it running well without an overkill setup.
 

razor512

Distinguished
Jun 16, 2007
What really pisses me off is that there are some games that really need to use multiple cores but don't (e.g. Rift and many others).

If you are getting less than 60 FPS in a game while GPU usage is around 50% and CPU usage shows either one core maxed out or two cores at 50%, then something is wrong.
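A quick sanity check along those lines, as a Python sketch; it assumes you've already logged per-core CPU usage and GPU usage from a monitoring tool, and the thresholds are just the rough numbers from this post:

[code]
# Heuristic: low FPS + idle GPU + roughly one core's worth of CPU work
# suggests the game is stuck on too few threads.
def looks_thread_limited(per_core_usage, gpu_usage, fps, fps_target=60):
    if fps >= fps_target or gpu_usage > 80:
        return False  # FPS is fine, or the GPU really is the limit
    # One core maxed out, or two cores at ~50% (one thread bouncing
    # between cores), both sum to about one core's worth of load.
    core_equivalents = sum(per_core_usage) / 100.0
    return core_equivalents <= 1.5

# Example: 45 FPS, GPU at 50%, one core pegged on a quad-core
print(looks_thread_limited([98, 15, 10, 5], gpu_usage=50, fps=45))  # True
[/code]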



PS: Why not try some classic games? E.g., will a Core i7-3770K run Outlaws for the PC (the game from 1997) if I restrict it to just one core and use a GTX 680? :)
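If anyone wants to try that single-core experiment, here is one way to pin a process to one core with the psutil library; the executable name is just a stand-in:

[code]
import subprocess
import psutil  # pip install psutil

proc = subprocess.Popen(["outlaws.exe"])    # stand-in path to the game
psutil.Process(proc.pid).cpu_affinity([0])  # restrict it to core 0 only
[/code]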



 

TheinsanegamerN

Distinguished
Jul 19, 2011
[citation][nom]Anonymous[/nom]I'd like to see them test the first Supreme Commander. It still taxes the most powerful systems[/citation]That's the truth. When I built my Sandy Bridge i3 system, I was amazed that game could bring my rig to its knees.
 

Christopher1

Distinguished
Aug 29, 2006
[citation][nom]KingArcher[/nom]Would there be any performance difference between Windows 7 32-bit and 64-bit? Assuming you use the same amount of RAM [4GB].[/citation]

Perhaps if a game comes with an x64 executable. Otherwise, I cannot see there being any difference. The question is: since Windows 8 uses system resources so much better than Windows 7, would there be a noticeable difference there?
 