TheGreatGrapeApe
Champion
DeAtHrApToR :
Does anyone else besides me remember reading about Crysis having code that current hardware couldn't handle that would be enabled through patches when the hardware existed?
Yeah, see my post:
http://www.tomshardware.com/forum/245048-33-benchmarking-crysis#t1747588
"Bear in mind we don’t expose, but have built in scaleability for the upcoming 1-2 years. That will be available as hardware catches up. So when I say maxed, its maxed for now."
http://www.driverheaven.net/gamingreviews/crysisinterview/index.php
Also, to the others talking about the Beta: if the MP beta was DX9, how could that be anywhere near 'maxed out'? It seems only the single-player beta is DX10.
As for the demos we've seen so far, most have been on projectors at 720P (1280x720/1366x768), and some on monitors at unknown resolutions, but most of the E3 and similar footage has been from 720P gameplay. Mostly we see HD trailers, not HD gameplay. Look at the HUDs and particles, which don't look like HD, and look at the size of the artifacts and tears, and especially the aliasing on angles; that's not high-resolution gameplay. The few times I've seen high-resolution live action were with things like the sandbox demo, and it stuttered like a stepchild.
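To put the 720P-vs-high-res point in perspective, here's a quick back-of-the-envelope pixel-count comparison (my own arithmetic, not anything from Crytek); rendering load scales very roughly with pixel count, so 1920x1200 pushes about 2.5x the pixels of the 720P demo footage:

```python
# Rough pixel-count comparison between the 720P-class demo resolutions
# and a 1920x1200 gaming resolution. The 2.5x figure is just arithmetic,
# not a measured performance number.
resolutions = {
    "720P (1280x720)": 1280 * 720,
    "720P-class (1366x768)": 1366 * 768,
    "WUXGA (1920x1200)": 1920 * 1200,
}
base = resolutions["720P (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 720P)")
# 1920x1200 works out to 2,304,000 pixels, 2.50x the 921,600 of 720P.
```

So even if the demo rigs ran the 720P footage smoothly, that says little about maxed settings at 1920x1200+.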
I'm not saying what it will and won't run fluidly on, but it's guaranteed that nothing currently available can truly max out this game, and I doubt that a single GF8800GTS or HD2900XT will give you fluid gaming with the highest available settings at the highest available resolutions (1920x1200+). However, a cluster of them (2, 3, or 4) might manage it at what's currently exposed to users.