Avenger_K :
http://www.crysissector.com/info/systemrequirements.php
Link to what the CEO of Crytek said about system requirements. (Note: the min. and max. settings are unconfirmed; the ultra-high requirements are from Cevat Yerli himself.)
Sure, the GamesRadar interview says it will play on Ultra-High, but not at what resolution.
http://www.gamesradar.com/us/pc/gam...9019§ionId=1001&pageId=200706121061280027
So Ultra-High @ 1024x768 is a possibility, but is that really 'Ultra-High', especially when the settings so far seem to end at 'Max'?
Unless he comes out and says something to contradict this interview statement:
Gaming Heaven: Performance, the main question every gamer has. What will be needed to run the game at playable framerates? How about playing with everything maxed? Will current rigs be able to run the game at 1920x1200 without making sacrifices? Do you recommend ATI or Nvidia and have you any performance figures for us for the Nvidia 8800 and ATi 2900?
Cevat: Everything maxed and at that resolution, you will need a seriously high rig of the latest generation available now.
Bear in mind we don’t expose, but have built in scaleability for the upcoming 1-2 years. That will be available as hardware catches up. So when I say maxed, its maxed for now.
http://www.driverheaven.net/gamingreviews/crysisinterview/index.php
then it's far from a done deal. And if there's more on offer than what will currently be exposed, then what does 'Ultra-High' mean, other than that it's still not the absolute maximum? That sounds like it's Uber-Fantasma-Wonder-High if Ultra-High is still held back at 'Max'.
And so far all the other links cite other people's estimates and guesses that they call 'guidelines', nothing official from Crytek itself.
So from what is on the page and actually supported above, Cevat says that the GTX should play Ultra-High (but no idea at what resolution), and that they haven't unlocked everything yet and more is to come (thus Crysis won't even technically play at its maximum upon release, despite having the sliders pushed to 'Max').
Seems pretty straightforward, and while it may or may not offer far tougher settings in the final SP release, its use as a benchmark will likely depend a lot on how it uses the available hardware. As we saw with the difference between COH DX10 and Bioshock DX10, not all games are useful as tests of their relative impacts, and no two games act the same. So even if it's kicking out 150fps @ 1920x1200, it may still have a lot to offer as a benchmark. Oblivion isn't a good benchmark just because it makes systems come to a crawl; it's how it does that, and to which systems, that matters most. If Crysis is wickedly optimized and runs smooth as silk, that doesn't make it less relevant than Oblivion, because it's more 'current' and more in line with the current gaming scene, not last year's.
Anywho, even if it does run at 300 fps, to me it'll be a more important benchmark than the original FartCry, D3, HL2, BF2, etc., because I won't play those much anymore; however, I will likely play Crysis more than all four combined, and likely games based on it as well. That's my thinking on its utility.