Discussion What game (in your opinion) is the new Crysis?

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
I want to know what the community thinks is the new Crysis, both in terms of being unoptimized and in terms of graphics being ahead of their time. A short time ago, I would have said for sure it was Cyberpunk, not because it was unoptimized, but because of the newer path tracing (PT) mode, which struggles even on a 4090; not to mention the bugs Cyberpunk had at launch. But now I would say Alan Wake 2, as the graphics look good but the general performance leaves much to be desired, not to mention that GPUs without mesh shader support perform terribly.
 

Colif

Win 11 Master
Moderator
Crysis was created when everyone thought CPUs would be at 10 GHz within a few years. It was those CPU expectations that made it so hard to run, not the graphics alone.

There shouldn't be any games with the same lofty dreams now. Sure, they're all graphically challenging, but that isn't entirely a CPU stress. Most are made for consoles that don't have a 6 GHz CPU in them.

Alan Wake is a great-looking walking simulator... I don't want all games to look amazing and have no game underneath.
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
Crysis was created when everyone thought CPUs would be at 10 GHz within a few years. It was those CPU expectations that made it so hard to run, not the graphics alone.

There shouldn't be any games with the same lofty dreams now. Sure, they're all graphically challenging, but that isn't entirely a CPU stress. Most are made for consoles that don't have a 6 GHz CPU in them.

Alan Wake is a great-looking walking simulator... I don't want all games to look amazing and have no game underneath.
The thing about Alan Wake 2 is that it uses just over 7 GB of VRAM at 1080p lowest. Somewhat strangely, though, VRAM usage only goes from 10 GB to 12 GB when switching from 1080p max to 4K max. I would have expected more than a 20% increase in VRAM, considering the game is pushing 4x as many pixels at 4K.
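One likely reason, though this is a guess: much of the VRAM budget (textures, geometry) is resolution-independent, and only the render targets scale with pixel count. A quick sketch, with made-up numbers chosen only to line up with the 10 GB to 12 GB observation:

```python
# Rough model of VRAM use: fixed assets plus per-pixel buffers.
# The 9.3 GB fixed pool and 340 bytes/pixel are made-up numbers,
# not measurements from the game.

def vram_estimate(width, height, fixed_gb, bytes_per_pixel):
    """Resolution-independent assets + resolution-dependent buffers."""
    buffers_gb = width * height * bytes_per_pixel / 1e9
    return fixed_gb + buffers_gb

v1080 = vram_estimate(1920, 1080, fixed_gb=9.3, bytes_per_pixel=340)
v4k = vram_estimate(3840, 2160, fixed_gb=9.3, bytes_per_pixel=340)

print(f"1080p: {v1080:.1f} GB, 4K: {v4k:.1f} GB")  # 10.0 GB vs 12.1 GB
print(f"4x the pixels, {v4k / v1080:.2f}x the VRAM")  # ~1.21x
```

So 4x the pixels can plausibly mean only ~20% more VRAM if most of the budget is assets rather than render targets.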
 

Eximo

Titan
Ambassador
A lot of the latest titles seem to have big optimization issues. I'm not sure there is any game in particular that, like Crysis at launch, can't run well even on high-end systems.

When Crysis came out, my old 8800GTS 640MB could handle it, barely; I think around 40 FPS at 1024x768? But even that was the third or fourth step down from the top model of the 8000 series, which had come out only a little before the game. I basically didn't play it much for fear my GPU would melt down, as it regularly bounced to 100C trying to run it.

Keep in mind that back then, 1080p was the equivalent of 4K today.

So basically, a new game that a very high-end system of the time could barely keep up with at 4K? What you are looking for is a game where you can't get 60 FPS at 4K with a 4090 at high or above settings.
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
A lot of the latest titles seem to have big optimization issues. I'm not sure there is any game in particular that, like Crysis at launch, can't run well even on high-end systems.

When Crysis came out, my old 8800GTS 640MB could handle it, barely; I think around 40 FPS at 1024x768? But even that was the third or fourth step down from the top model of the 8000 series, which had come out only a little before the game. I basically didn't play it much for fear my GPU would melt down, as it regularly bounced to 100C trying to run it.

Keep in mind that back then, 1080p was the equivalent of 4K today.

So basically, a new game that a very high-end system of the time could barely keep up with at 4K? What you are looking for is a game where you can't get 60 FPS at 4K with a 4090 at high or above settings.
I have heard that the GTX 260 was the first single GPU that could run Crysis well. I think Jedi Survivor, right when it launched, might have qualified as the new Crysis: even though the graphics are nothing revolutionary (still good, though), the game struggled on a 4090, and changing settings didn't seem to do anything.
 

Eximo

Titan
Ambassador
Reviews of the time still show the GTX 260 struggling at 1920x1200 at max settings: only 38 FPS without AA on. But for the more typical user at 1280x1024, it was probably alright. Crysis was unique in that it was hard to run on the computers available at launch, but it also stayed out of reach as people moved on to higher-resolution monitors.

Almost everything I am finding is games that rely on ray tracing, and without RT or DLSS, all the mentioned games always come up against a CPU limitation.

But if you agree with counting RT, then in terms of the "new Crysis" it looks like Portal RTX, Cyberpunk 2077 with RT, and Metro Exodus with RT are it, though they actually run fairly well on a 4090.
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
Reviews of the time still show the GTX 260 struggling at 1920x1200 at max settings: only 38 FPS without AA on. But for the more typical user at 1280x1024, it was probably alright. Crysis was unique in that it was hard to run on the computers available at launch, but it also stayed out of reach as people moved on to higher-resolution monitors.

Almost everything I am finding is games that rely on ray tracing, and without RT or DLSS, all the mentioned games always come up against a CPU limitation.

But if you agree with counting RT, then in terms of the "new Crysis" it looks like Portal RTX, Cyberpunk 2077 with RT, and Metro Exodus with RT are it, though they actually run fairly well on a 4090.
The only games I play with RT are Minecraft RTX and Jedi Survivor. I agree with counting RT in this context, as modern GPUs have gotten really good at traditional rasterization. Portal RTX runs at 26 fps at 4K native at max settings. I don't think even my 6800 would run Portal RTX at 1080p native very well. Edit: looking at the TechPowerUp benchmark for this game, the 6900 XT, a card that is 22% faster than my 6800, gets 5 fps at 1080p max. Yikes. Even the 7900 XTX gets about 10 fps. I am fully aware that this is an Nvidia-sponsored game and therefore shouldn't be expected to do well on AMD cards, but I am still shocked that a $1000 GPU can't get 30 fps at 1080p max.
 
Last edited:

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
A lot of the latest titles seem to have big optimization issues. I'm not sure there is any game in particular that, like Crysis at launch, can't run well even on high-end systems.

When Crysis came out, my old 8800GTS 640MB could handle it, barely; I think around 40 FPS at 1024x768? But even that was the third or fourth step down from the top model of the 8000 series, which had come out only a little before the game. I basically didn't play it much for fear my GPU would melt down, as it regularly bounced to 100C trying to run it.

Keep in mind that back then, 1080p was the equivalent of 4K today.

So basically, a new game that a very high-end system of the time could barely keep up with at 4K? What you are looking for is a game where you can't get 60 FPS at 4K with a 4090 at high or above settings.
The 8800GTS (according to TechPowerUp) is about 20% faster than the 8800GTX. It is also only 16% slower than a GTX 260.
 
A common thing I see in topics like these is vague ideas of what performance should be.

You can't say something "runs well" or is "poorly optimized" unless you have some clearly defined performance requirements. And even those may be subject to opinion, because if my performance requirements are 4K 240 FPS with maximum details, then every game runs poorly or is poorly optimized.

As an example of having to deal with this, I had to incorporate a new physics model into an in-house flight simulator program. The thing is, we have to prove the software does what we said it would do, so we have to test it. Testing needs requirements; otherwise, how do you know your thing passes? Since the whole purpose of the new physics model was to improve on the old one, how do you write requirements against it? You can't just say "the new physics model shall be an improvement over the old one" because:
  • Technically it passes even if it's 0.00001% better, even though subjectively it's the same.
  • Ultimately, how do you know the physics model is going in the correct direction?
So after doing some research, I decided the physics model's output must be within a certain tolerance of real-world data. The tolerance I chose was whatever the FAA laid out with regard to flight simulators, because frankly, there wasn't enough time to figure out what tolerances we would be comfortable with.
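That kind of tolerance requirement is easy to express as a test. A minimal sketch; the samples and the 3% tolerance here are hypothetical, not the FAA's actual numbers:

```python
# Tolerance-based acceptance test: every model sample must fall
# within a relative tolerance of the recorded real-world value.

def within_tolerance(model_output, reference, rel_tol):
    return all(
        abs(m - r) <= rel_tol * abs(r)
        for m, r in zip(model_output, reference)
    )

# Hypothetical airspeed samples (knots): recorded flight data vs. model.
recorded = [120.0, 135.5, 150.2, 160.8]
modeled = [121.1, 134.9, 151.0, 159.7]

# Require the model within 3% of the recorded data (made-up tolerance).
assert within_tolerance(modeled, recorded, rel_tol=0.03)
```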

But if you want to avoid all that, then we could go with the spirit of what "Can it run Crysis?" is going after: a game that the best video card can't manage even 30 FPS with everything cranked up to maximum. Does the game support 8K? Sure, throw that in.

Oh wait, then that means a lot of games are now the new Crysis.

EDIT: Another sticking point I want to add: sometimes adjusting a setting may crush performance simply because of how much more data was thrown at the problem. It doesn't matter how "optimized" your algorithms are at that point.

For example, let's say on a "normal" setting you calculate value + 1 four times, but the "high" setting calculates value + 1 sixteen times. If you were getting 60 FPS on the first setting, now you're getting 15 FPS, because the calculations take four times as long.

And a lot of data in graphics tends to grow quadratically with resolution, or worse.
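That arithmetic can be sketched directly, assuming (as in the example) that frame time grows linearly with the work done per frame:

```python
# If frame time scales linearly with per-frame work, FPS divides
# by the work multiplier: 4x the calculations -> 1/4 the frame rate.

def fps_after_scaling(base_fps, work_multiplier):
    return base_fps / work_multiplier

# "Normal" runs the calculation 4 times per frame, "high" runs it 16:
print(fps_after_scaling(60.0, 16 / 4))  # -> 15.0
```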
 
Last edited:
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
A common thing I see in topics like these is vague ideas of what performance should be.

You can't say something "runs well" or is "poorly optimized" unless you have some clearly defined performance requirements. And even those may be subject to opinion, because if my performance requirements are 4K 240 FPS with maximum details, then every game runs poorly or is poorly optimized.

As an example of having to deal with this, I had to incorporate a new physics model into an in-house flight simulator program. The thing is, we have to prove the software does what we said it would do, so we have to test it. Testing needs requirements; otherwise, how do you know your thing passes? Since the whole purpose of the new physics model was to improve on the old one, how do you write requirements against it? You can't just say "the new physics model shall be an improvement over the old one" because:
  • Technically it passes even if it's 0.00001% better, even though subjectively it's the same.
  • Ultimately, how do you know the physics model is going in the correct direction?
So after doing some research, I decided the physics model's output must be within a certain tolerance of real-world data. The tolerance I chose was whatever the FAA laid out with regard to flight simulators, because frankly, there wasn't enough time to figure out what tolerances we would be comfortable with.

But if you want to avoid all that, then we could go with the spirit of what "Can it run Crysis?" is going after: a game that the best video card can't manage even 30 FPS with everything cranked up to maximum. Does the game support 8K? Sure, throw that in.

Oh wait, then that means a lot of games are now the new Crysis.

EDIT: Another sticking point I want to add: sometimes adjusting a setting may crush performance simply because of how much more data was thrown at the problem. It doesn't matter how "optimized" your algorithms are at that point.

For example, let's say on a "normal" setting you calculate value + 1 four times, but the "high" setting calculates value + 1 sixteen times. If you were getting 60 FPS on the first setting, now you're getting 15 FPS, because the calculations take four times as long.

And a lot of data in graphics tends to grow quadratically with resolution, or worse.
Portal RTX is the new Crysis for AMD cards. Even at 1080p max, the 6900 XT hits 5 fps, and the 7900 XTX hits less than 15 fps. Again, in case anyone missed my last post, I am fully aware that this game will not perform well on AMD cards. But the best AMD card can't even play this game at 1080p max with playable frame rates, hence why I call it the Crysis for AMD cards.
 

Eximo

Titan
Ambassador
The 8800GTS (according to TechPowerUp) is about 20% faster than the 8800GTX. It is also only 16% slower than a GTX 260.
You are looking at the second release of the 8800GT and GTS, which used the G92 GPU. I had the older 8800GTS on the G80, which was a cut-down 8800GTX (also a G80 GPU).

That generation is well known for the second 8800GT being an amazing value for the money, akin to the 1080Ti or the original-MSRP 3080.
 
  • Like
Reactions: Order 66