The 9600 GT is being severely bottlenecked.

Wow, this is an interesting thread. It sounds like he had a few software issues costing him a few points; he did a clean install, gained a few more, and is now on par for his hardware, or at least within a reasonable margin of it.


I have an old system, so I know how you feel. I gave up trying to squeeze more out of less. Then again, my PC is even older and slower than yours.

There is your answer to the problem at hand: again, it's the CPU. Would you put 87 octane in a Ferrari? You could, but I can guarantee it won't run to its full potential. This is why I suggested spending a little money that will go a long way, with a nice upgrade path (Socket 775/45nm in the future, once it's cheap). If you wish, you could spend hours on Google just to come back to where you started. BTW, grats on getting somewhat better performance.
 
Ah. I like XP and Vista just the same, but XP is more scalable. Removing XP from mainstream markets and preventing it from being installed on lower-end PCs would drive PC costs through the roof! Not to mention that more games run on XP than on Vista, and DOS support is great on XP, not so great on Vista.

😀


I never thought I would actually sign that petition until today.
 
LOL, look how far XP has come. Anyone remember when XP came out and everyone was talking about how you needed a beefy computer in order to run it? Now it's being kept around longer for low-end devices. I bet the same will be true of Vista eight years from now...
 
The same goes for OS X. There are plenty of old Macs that run 10.4 and earlier fine, but put 10.5 (Leopard) on them and they run like rubbish. It's called software progress, and it always goes backward before it goes forward.
 
To the OP, check this out: http://www.techspot.com/article/73-crysis-performance/page7.html There's very little difference between a 1.6GHz C2D and a 2.66GHz C2D in Crysis, and that test uses a faster card, which should make the CPU the bottleneck for sure; yet the bottleneck hardly exists, because the biggest increase a gamer gets is from the GPU, and the biggest bottleneck also comes from the GPU. I know your CPU can compete quite nicely with a C2D clocked at 1.6GHz. Hope this helps.
 
This only means that the majority of the load goes to the GPU. And it only shows settings people actually play at, not some 800x600 "try to bottleneck the CPU" test at a resolution nobody actually uses. I think that's fair. It just shows that at normal settings, the CPU DOESN'T bottleneck the GPU much at all.
 
In other words, this was done to show that at normal settings, even a severely underclocked CPU WON'T cause any real bottleneck, so there's no need to worry about these things unless another problem is causing it. 😱
 
My 2 cents... I really hope this helps clear some things up.
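
If it helps to picture why the slower C2D barely loses any FPS in those charts, here's a rough back-of-the-envelope model in Python. All the numbers are made up for illustration, not measured; the idea is just that per-frame CPU and GPU work largely overlap, so the frame time is set by whichever side is slower:

```python
# Rough model: the frame time is dominated by the slower of the two sides.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 40.0  # pretend the GPU needs 40 ms per frame at these settings

print(fps(15.0, gpu_ms))               # 2.66GHz-ish CPU: 25.0 fps (GPU-bound)
print(fps(15.0 * 2.66 / 1.6, gpu_ms))  # same chip at 1.6GHz: still 25.0 fps
print(fps(50.0, gpu_ms))               # much slower CPU: 20.0 fps (CPU-bound)
```

That's why the benchmarks show almost no difference at playable resolutions: the GPU's 40 ms (in this made-up example) hides the CPU entirely until the CPU becomes the slower side.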

Well... as much of an AMD "fanboy" as I was, it didn't look like my S939 X2 3800+ @ 2.4GHz with 2.5GB of PC3200 would really pose a performance problem for current games; it hadn't seemed like a problem at all before. That was until I went from a 7800GT 256MB to an 8800GT 512MB, simply because I wanted newer games to play better.

The 8800GT 512MB upgrade, both oddly and sadly, wasn't really any faster than my 7800GT, and no, I'm not talking about 3DMark scores...

It wasn't until I spent $200-300 more to upgrade my CPU, motherboard, and memory that the GPU stopped exhibiting poor performance in current games.

No, the X2 series of CPUs aren't poor performers; even the slowest versions are quite capable. But unless you're running your specific CPU at around 3GHz+, you're going to see subpar gaming performance in current games on average, particularly in games that can take real advantage of more than one CPU core.

Again, no, I'm not an Intel fanboy, but the truth is, a noticeably faster subsystem would do your GPU a lot of good when it comes to gaming in general nowadays.

I'm not disregarding benchmarks; they're what I referred to a lot before making the purchases I did. But the truth is, and what benchmarks often fail to tell you, is that your CPU is what is causing your 9600GT to seem similar to the 6800 you had. No way around it, really, unless you can invest in some better cooling and overclock the heck out of your CPU.

I didn't believe a CPU could matter that much for gaming; it never seemed to before, and the typical benchmark review didn't do much to indicate it would either. But again, that was before I upgraded my GPU. Not anymore.
 
I have an AMD X2 5600 @ 2.8GHz and 2GB RAM, Windows Vista 32-bit. Three days ago I got a 9600GT, and the performance is great. I get 20fps in Crysis at 1200x800 resolution with all settings on Very High except shader detail and shadow detail (or something), which are set to High.
I installed Assassin's Creed last night and got 30fps with max details at 1440x900. They both look awesome.
But the thing is, GPU-Z 0.1.7 shows that I'm running my card at x16 and not x8. The only differences I can find are that my memory is on an 800 bus and my CPU is at 2.8GHz, but I don't think that should be a big deal when it comes to playing games.
I recommend checking whether your PSU is giving the right power; that might slow your games. I don't think the mobo is an issue, because I'm using a crappy mobo too: an MSI K9AGM3.
 
If you can underclock your CPU a lot (such as by setting Vista's power management to stay at the reduced speed regardless of load, e.g. always at ~1GHz, or by using RightMark's CPU management utility to find an in-between CPU speed), you may find performance to be somewhat different on average... such as a much smaller difference between max FPS and min FPS.

The CPU architecture can have a very large impact on FPS consistency... a P4 @ 2.0GHz, for example, is going to be much the same as the situation I gave above, though I'm not sure how many people are still using P4s or Athlon XPs for high-performance purposes.
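
If you do log frame times while testing (some benchmark tools can record them per frame), a quick script like this shows that min/max spread. The numbers here are invented just to illustrate the idea of CPU-bound spikes hiding inside a decent-looking average:

```python
# Hypothetical per-frame times in milliseconds from a short capture.
frame_times_ms = [33, 35, 34, 80, 32, 36, 90, 33, 34, 85]

fps_each = [1000.0 / t for t in frame_times_ms]
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(f"avg {avg_fps:.1f} fps")                 # ~20.3 fps, looks okay
print(f"min {min(fps_each):.1f} fps")           # ~11.1 fps in the spikes
print(f"max {max(fps_each):.1f} fps")           # ~31.3 fps when unloaded
```

A big spread like those 80-90 ms spikes is what the CPU-bound moments feel like in play, even when the average FPS looks fine.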
 
3840x1024 with a TripleHead2Go Digital Edition. It's what prompted me to go for the 8800GT 512MB, since I felt the 7800GT 256MB would still be okay if I reduced the resolution to 1920x480, though I didn't like using that res for long in many games.

So, after increasing the resolution to the max supported, I took more of a performance hit than I cared for, and upgraded accordingly... and then saw more limitations than there should have been: "Why didn't the playability improve, even though I just spent $250 on a new GPU???"
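
For a sense of scale, going from 1920x480 back up to 3840x1024 roughly quadruples the pixels the GPU has to render each frame; a quick sanity check:

```python
low  = 1920 * 480    # 921,600 pixels per frame
high = 3840 * 1024   # 3,932,160 pixels per frame

print(high / low)    # ~4.27x the per-frame pixel load
```

So the GPU side of each frame gets roughly 4x heavier while the CPU side stays the same, which is exactly when a GPU upgrade should pay off, unless the CPU was already the slower side.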

Checking 3DMark scores, they were just as they should have been too (using Futuremark's ORB system), so no problems there; it was at 1280x1024, so it's a low res, just as you're saying... I did find, though, that by overclocking my X2 3800+ to 2.6GHz with the 7800GT, certain scenes in 3DMark06 saw FPS increases in the low-FPS areas (parts of Return to Proxycon, for example). I started finding out more and more after that realization.

Edit: no, the 7800GT wasn't AS fast as the 8800GT at very high resolutions, but there was still enough of an impact for me to exclaim "what the heck!?" and wonder just what was going on. Things improved, but not nearly as much as they should have (such as in Oblivion, a game I had been wanting to play on the TH2G ever since reading THG's article on the TH2G Analog Edition), especially after reading how close the 8800GT 512MB was to the 8800GTX. That made the remaining upgrades well worthwhile, where I really wouldn't have believed so otherwise (512MB is still largely enough at that res, even if the 768MB of a GTX would have been more ideal still).

Although, I did read something a while back that someone mentioned in another forum about the difference in CPU architectures: the Core architecture could maintain a higher FPS in some intensive game situations than what I had been seeing, which had me wondering too. That was before I had upgraded anything, though, so I only vaguely wondered and didn't care all that much either, since what I had was fast enough at the time, and had been for the nearly three years I'd owned it.
 


True. On XP with the Very High cheat, it runs at 20 fps for me (except in large battles, where it can drop to 12 or even 8).