I really think it should be made clear: no matter how high the numbers are in games, and even though they reflect some CPU bottlenecking, no mainstream monitor will show over 160 FPS (160 Hz) at 1024x768. For that you need a 130 kHz horizontal scan rate, which is rare even on professional CRTs. The same monitor will wind up at 120 Hz at 1280x1024 and just over 100 Hz at 1600x1200. For a normal LCD monitor 85-92 kHz is common, which translates into a maximum possible vertical refresh of 100 Hz at 1024x768, and it is capped to 85 Hz on most.
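To make the arithmetic behind those figures explicit, here is a minimal sketch of the relationship: vertical refresh is roughly the horizontal scan rate divided by the total scan lines per frame. The ~5% vertical-blanking overhead used below is my assumption for the example, not a spec.

```python
# Rough CRT timing math: maximum vertical refresh at a given resolution is
# the horizontal scan rate divided by the total lines per frame (visible
# lines plus vertical blanking). The 5% overhead is an assumed figure.

BLANKING_OVERHEAD = 1.05  # assumed ~5% extra lines for vertical blanking

def max_vertical_refresh(h_scan_khz: float, visible_lines: int) -> float:
    """Approximate the highest refresh rate (Hz) a given horizontal scan rate allows."""
    total_lines = visible_lines * BLANKING_OVERHEAD
    return (h_scan_khz * 1000) / total_lines

for khz, lines, label in [(130, 768, "1024x768"), (130, 1024, "1280x1024"),
                          (130, 1200, "1600x1200"), (92, 768, "1024x768")]:
    print(f"{khz} kHz @ {label}: ~{max_vertical_refresh(khz, lines):.0f} Hz")
```

With those assumptions the numbers land close to the figures above: roughly 160 Hz at 1024x768 and just over 100 Hz at 1600x1200 for a 130 kHz monitor.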
So the game benchmarks may be fine and dandy for proving it's the better CPU, but the results are useless in real-life games. Unless extra eye candy is turned on there is hardly a difference, since both the GTS and the GTX will outrun the monitor, not even speaking of what the human eye could possibly notice above 60 FPS.
And when the extra eye candy is turned on, most games are bound to be GPU limited and GTX will win.
When different resolutions are compared on the same system in the Doom and F.E.A.R. benchmarks, it's clear that the old system is CPU bound and the new system is GPU bound, so no matter the net increase, the slower GPU shows. IMHO Doom and F.E.A.R. are getting too old for comparing brand-new hardware, especially without even getting into high (1920 and above) resolutions. Oblivion is the only game in the test that shows the better scaling of the older system (GTX) as the resolution increases.
So to recap, the entire game section (almost) proves nothing. The choice of CPU is hardly a deciding factor in the quality of game play, just as average FPS is a poor quality indicator; minimum FPS and FPS fluctuation have a much greater impact on game play. In that light you managed to effectively lower the real game performance while increasing performance in encoding and other CPU-intensive applications. So it's not a win-win situation or a balanced system: it's called a trade-off.
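To illustrate why average FPS can hide what actually matters, here is a minimal sketch; the frame-time numbers are invented purely for the example:

```python
# Toy frame-time analysis: two runs with nearly the same average FPS can feel
# very different if one has occasional long frames. All numbers are made up.

smooth = [16.7] * 60                  # steady ~60 FPS frame times, in ms
spiky  = [12.0] * 55 + [70.0] * 5     # similar average, but with stutters

def summarize(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000 / max(frame_times_ms)           # worst single frame
    spread  = max(frame_times_ms) - min(frame_times_ms)
    return avg_fps, min_fps, spread

for name, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low, spread = summarize(run)
    print(f"{name}: avg {avg:.0f} FPS, min {low:.0f} FPS, frame-time spread {spread:.0f} ms")
```

Both runs average near 60 FPS, but the spiky one drops to roughly 14 FPS on its worst frames, which is exactly the kind of difference an average-FPS bar chart never shows.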
@randomizer
Do you even know what vsync is? Vertical refresh is how many times the screen is refreshed per second. If you have 100 Hz, the monitor will not display 101 FPS but 100, with or without vsync. Period. Also, you can't assume that if your top FPS is 120 everything else is fine; what's the point of comparing systems then? It's very possible that the GTX will have a higher minimum FPS, which would be low enough on both cards to be noticeable (below your monitor's maximum vertical refresh).
@zenmaster
I only pointed out that the choice of games and resolutions doesn't allow a fair comparison, since both systems are over what a normal monitor will display (in a way, the monitor is the bottleneck), and then there is a claim (at the end) that this system is faster by 18%. Yes it is, but on old games like Doom 3 and F.E.A.R. and at resolutions lower than 1600x1200. You don't buy a brand-new computer and use an over-three-year-old game (Doom 3) to benchmark it.
I said both Doom and F.E.A.R. are CPU bound on the old system, which is logical since the video card is so powerful, but even then the FPS is above the capability of 99% of the monitors sold. The second point is that the current system doesn't scale well in the same game across different resolutions, meaning that if tomorrow I get an HD 2560x1600 display this system will suck. You can argue that hardly anyone will use that resolution; I can argue the opposite. My point is that for gaming it's always more important to be well prepared: spend your money on a better video card. It costs more but it lasts longer, and you can't put a price on smooth game play. I do agree with you on the $500 system.
@cleeve
I only disagree with the final conclusion, which is that this system is 18% faster, and I argue it appears so because of the choice of games and resolutions. I use the benchmarks, assuming they are solid facts, to show you that the games chosen do not show the difference that well, because both systems exceed the refresh capability of most monitors. The "almost" I left in for Oblivion. All in all, my argument is that almost everyone falls for the hype of "higher FPS is better, who cares if you can't see it", which IMHO is pushed by the graphics chip makers.