But Can It Run Crysis? 10 Years Later


ledhead11

Reputable
Oct 10, 2014
Thanks for this revisit-review.

This game is still a staple for me, and every time I do an upgrade it's part of my testing. With both of my current rigs I get results very similar to yours. My 1080 SLI setup does slightly better, but I also run that one at 4096x2160. Parts of the game really do continue to look better at high res. AA really takes its toll, though.

Even though they've no reason to, I always hope Crytek will do a new, more optimized version for Windows 10/64-bit. I honestly believe at this point it could really help. Warhead is also incredible in 4K. Crysis 2 and 3 both look great, but comparatively speaking their engines are a bit further along.
 


Yep. Games today, especially AAA games, are developed for consoles (dumbed-down, optimized graphics) and then ported back to the PC. It used to be the other way around. More and more I'm seeing those PC ports fail; the most notorious example is probably Batman: Arkham Knight. It's also a reason we're seeing a decline in multi-GPU optimization.

I love my PS3/PS4, but I keep different games on them and on the PC for a reason: I enjoy the graphics on my 1440p PC more than the dumbed-down console graphics on my 1080p HDTV, racing sims and shooters specifically. I have both the PS4 and PC versions of Project CARS, and the difference in detail between the PC at 1440p with max quality settings and the console at 1080p is astonishing. For those who don't know, Project CARS was developed around the PC first instead of the console first, just like in the good old days.
 


To be fair, Crytek designed Crysis to be a showcase of what their engine could do for future games. Even then, it wasn't normal for games to be released that couldn't be played at their highest settings on any existing hardware. Unless a developer is trying to promote their own game engine, it might not even be in their best interest to include graphics options that no one can run at the time of a game's release. If anything, you're bound to see people complain about a game's performance if they can't run it at 60fps at its highest settings.

And perhaps more importantly, the rate of hardware advancement is no longer quite as rapid as it once was, and the cost of creating a game with cutting-edge graphics has increased, to the point where you see diminishing returns when it comes to pushing games even closer to photorealism. Once scene complexity got high enough that you weren't easily noticing polygons, and lighting and animation became relatively realistic, further improvements tended to be more subtle.

 

alttu

Prominent
Nov 16, 2017
What's missing are captures from the GPU's physical output port to compare whether there are any noticeable visual quality differences. I don't trust Nvidia to look after image quality; they are obsessed with beating whatever benchmark the tech press sets for them, and NONE of the tech press checks whether driver updates boasting 50% improvements have any effect on image quality. If the colors of the exact same frame have changed AT ALL between driver updates (for, let's say, more than 10% of the pixels on screen), that counts as an image quality change. If the tech press doesn't mention this and only reports the % FPS improvement, then the tech press is full of you know what.
 

alttu

Prominent
Nov 16, 2017
What's missing are captures from the GPU's physical output port to compare whether there are any noticeable visual quality differences. I don't trust Nvidia to look after image quality; they are obsessed with beating whatever benchmark the tech press sets for them, and NONE of the tech press checks whether driver updates boasting 50% improvements have any effect on image quality. If the colors of the exact same frame have changed AT ALL between driver updates (for, let's say, more than 10% of the pixels on screen, with a change larger than 0.8% in any of the R, G, B values - so if 255,255,255 becomes 253,255,255 for more than 10% of the pixels in the frame, then % FPS increases can't be quoted, because it's now an apples-to-oranges comparison), that counts as an image quality change.
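
For reference, the test being proposed here boils down to a per-pixel diff of two captures of the same frame, one per driver version. Below is a minimal sketch in Python, assuming two lossless captures grabbed from the GPU's physical output; the file names, and the use of PIL and NumPy, are just illustrative:

# Compare two captures of the exact same frame, one per driver version.
# File names are placeholders; the thresholds follow the post above.
from PIL import Image
import numpy as np

THRESHOLD = 2  # the post's example: 255 -> 253, a change of 2 levels (~0.8% of full scale)

old = np.asarray(Image.open("frame_driver_old.png").convert("RGB"), dtype=np.int16)
new = np.asarray(Image.open("frame_driver_new.png").convert("RGB"), dtype=np.int16)

# A pixel counts as changed if any of its R, G or B channels moved by at least the threshold.
changed = (np.abs(new - old) >= THRESHOLD).any(axis=2)
share = changed.mean()

print(f"{share:.1%} of pixels changed between drivers")
if share > 0.10:
    print("By the rule above, FPS comparisons across these drivers are apples to oranges.")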
 

alttu

Prominent
Nov 16, 2017
I would further add that if GPU hardware is benchmarked between vendors, the tech press has the DUTY to ensure all the GPUs are running the shaders that ship with the game executable - NOT whatever replacement shaders ship with drivers.

In the Crysis case, my theory is that Nvidia has done a few more optimization passes than AMD on the shaders its drivers ship (more than 50% of the Nvidia installer size is likely shaders that replace the ones in games, and I doubt anyone at Tom's or Anand has checked whether these updates shift colors beyond the thresholds specified above: 10% of pixels, 0.8% deviation).

What Nvidia is doing is the FPS equivalent of injecting new shaders with ReShade or ENB. Where ENB and ReShade users try to restore or improve color quality, Nvidia's aim is to beat the competition in benchmarks. The way Nvidia often releases a faster GPU revision days or weeks after an AMD announcement tells me they are so obsessed with sitting at the top of the benchmarks that there is "reasonable doubt" their drivers include shader modifications with potentially visible image quality deterioration. I could be wrong, but given how tedious this is to prove (the measurement has to come from the GPU's external output port, on an exactly matching frame, using different GPUs where one has never been allowed to update or connect to the internet, because in theory drivers can persist updates permanently, so rolling back the driver would not restore the original behavior)... I don't think anyone has done it.

Anyway, the point ultimately is this:
If someone started a new GPU company with hardware as fast as or faster than Nvidia's, Nvidia would still win in the tech press, because they would release a driver update that modifies the shaders, and nobody checks whether image quality changes between these updates. Most people can't notice a 0.8% RGB variance, but those who can feel strongly that if this kind of thing is happening and the tech press isn't mentioning it, then the tech press is, for lack of a better word, ignorant, corrupt, or in the industry's pocket (i.e. not corrupt, but appearing to look out for the consumer's best interest while in reality looking out for the OEMs' best interest by avoiding any investigation of possible "cheats").
 

alttu

Prominent
Nov 16, 2017
BTW, I took a screenshot of my comment showing the site received it. I'll keep checking back from another computer and another IP to see if it's still visible. If it's "gonesky", that should help confirm I'm on the right track about some dirty industry "secrets" (where the secret isn't the shaders shipping in the drivers, but the lack of tech press effort to look into whether the huge FPS improvements come at a hidden cost in image quality).
 

macsquirrel_jedi

Honorable
Nov 17, 2017
All of the benchmarks we’re running today invoke DirectX 10 in a 64-bit environment.

What does that mean? (Win10 x64, Crysis x64) or (Win10 x64, Crysis x86)? Because as far as I know, especially on Steam, the game has to be manually pushed into x64 mode...
 


You must have forgotten about the Far Cry series. FC2 in DX10, maxed out at 1920x1200, brought every high-end rig way down. For example, a GTX 285 and an i7-965 overclocked to 3.7GHz would only average 40FPS in it.
 

I actually had a couple lines mentioning Far Cry in that post, but edited them out prior to posting it since I didn't think they added much to the conversation. : P

Pointing out Far Cry only supports what I was saying though, since the original Far Cry was developed by Crytek as well, again featuring advanced graphics to help promote their engine, and much like Crysis, the original game included the option to enable some advanced features like HDR lighting that helped it continue to look good for years after its release. Crytek sold the Far Cry series to Ubisoft, along with the rights to continue to build off the original CryEngine. So Crytek continued to develop CryEngine into CryEngine 2, while Ubisoft forked their version of the code base into the more open-world focused Dunia Engine. Far Cry 2 and all subsequent games in the series continue to use that modified CryEngine, although the engines have no doubt diverged significantly over the years. Since Crytek's original Far Cry had the most advanced graphics of its time, Ubisoft undoubtedly wanted to continue to have demanding graphics options in the sequel, to not disappoint those expecting shiny new graphics from a Far Cry game.

And really, while you mention Far Cry 2 being a demanding game, that still didn't compare to Crysis. Compare that 40fps at max settings on a high end graphics card at a high resolution for the time, to the 10-15fps that Crysis got on high end hardware at the same resolution when it had launched a year earlier. Far Cry 2 was at least playable at max settings on high end hardware, while Crysis was not. Far Cry 2 might have been relatively demanding, but it wasn't really trying to show off what future games could look like on its engine. The game was also released on consoles at the same time, unlike Crysis, which wasn't modified to run acceptably on consoles until several years after its PC release.
 

almighty151986

Prominent
Nov 20, 2017
You guys should test in DirectX 9 with the hacked .cfg file... you should be able to add an easy 20+ FPS on top of the DirectX 10 results.

The only thing this sacrifices is object-based motion blur. Crytek intentionally gimped DX9 mode to make DX10 look good, and it didn't take the community long to find that changing the values in the .cfg file let people play at 'Very High' settings in DX9 with much higher performance.
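
For anyone curious, the tweak amounted to dropping a config next to the game (or editing the shipped CVarGroups files) so that effects normally locked to 'Very High'/DX10 get switched on under DX9. Here's a rough sketch in Python that writes such a file; the cvar names are recalled from old community tweak guides and should be treated as assumptions rather than the exact settings used in this article:

# Generate an autoexec.cfg that forces on a few "Very High"-style effects
# under DX9. Cvar names are assumptions from memory of old Crysis tweak
# guides, not verified against the game's shipped CVarGroups files.
# Object motion blur stays unavailable in DX9, as noted above.
CVARS = {
    "r_UsePOM": 1,           # parallax occlusion mapping
    "r_SunShafts": 1,        # volumetric sun shafts
    "r_ColorGrading": 1,     # film-style color grading
    "e_water_ocean_fft": 1,  # FFT-driven ocean wave animation
}

with open("autoexec.cfg", "w") as cfg:
    for name, value in CVARS.items():
        cfg.write(f"{name} = {value}\n")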
 

crymsonsunset

Prominent
Nov 23, 2017
Can anyone repro the 1080 TI/1440p/DX10 results? I'm running Win10 (1709), an i9-7900X, Asus STRIX 1080 TI OC, and 32GB RAM. I avg 90fps or so, quite a difference from the 110 tested in the article. All settings are very high and 8X AA is enabled. Here are the results I get:

TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)

!TimeDemo Run 0 Finished.
Play Time: 23.52s, Average FPS: 85.02
Min FPS: 57.21 at frame 1961, Max FPS: 107.86 at frame 1032
Average Tri/Sec: -4215609, Tri/Frame: -49582
Recorded/Played Tris ratio: -18.49

!TimeDemo Run 1 Finished.
Play Time: 22.20s, Average FPS: 90.11
Min FPS: 57.21 at frame 1961, Max FPS: 116.17 at frame 1007
Average Tri/Sec: -3046215, Tri/Frame: -33806
Recorded/Played Tris ratio: -27.11

!TimeDemo Run 2 Finished.
Play Time: 21.99s, Average FPS: 90.95
Min FPS: 57.21 at frame 1961, Max FPS: 116.17 at frame 1007
Average Tri/Sec: -2894560, Tri/Frame: -31826
Recorded/Played Tris ratio: -28.80

!TimeDemo Run 3 Finished.
Play Time: 21.59s, Average FPS: 92.64
Min FPS: 57.21 at frame 1961, Max FPS: 116.17 at frame 1007
Average Tri/Sec: -3118280, Tri/Frame: -33660
Recorded/Played Tris ratio: -27.23

TimeDemo Play Ended, (4 Runs Performed)
 

mapesdhs

Distinguished


There's always the 1GB 9800GT. I have two of them.





Did you OC it? I bagged one way back; still haven't done anything with it yet. Had a devil of a time finding the version with the right stepping.





I wish they'd make some advances in modelling more complex real-world phenomena such as fluids (water, lava, fire, etc.). At the moment all such effects are faked; it kind of bugs me when there's heavy rain in a game but puddles aren't forming, because the rain isn't a real thing in the game world. I'd like to see game tech evolve to where something like a mudslide could be properly modelled, or a flood, or a volcano. :D I'm sure Tomb Raider fans would be hopping with glee.





Alas, this is completely standard practice now. I never used to understand the whole point of game-release drivers; I had no idea that what the release driver was doing was changing how the game engine works. NVIDIA digs into the game and improves on what's there. AMD does the same thing, AFAIK.





In case you're interested, I've been accumulating some misc results here (not had a chance to add many older GPUs, and I have plenty to test):

http://www.sgidepot.co.uk/misc/farcry2.txt

I'm still playing the game. :D





I tried using Crysis as a benchmark at one point (via the supplied tool), but I kept getting weird results, so I gave up.

One thing I've wondered about older GPUs is whether newer drivers slowly cripple their performance. I keep meaning to pick a couple of tests and an old card or two and plot their performance with each driver release, up until the point where driver support ceases, but alas I never seem to have the time. I'd be surprised if such a graph were always a rising slope. I've already noticed hefty performance drops on certain tests running on Quadro cards after a driver update.

Ian.
 

mapesdhs

Distinguished


I think you should be able to do that from the Tracked Threads screen on the .co.uk site.

Ian.


 