A common thing I see in topics like these is a vague idea of what performance should be.
You can't say something "runs well" or is "poorly optimized" unless you have some clearly defined performance requirements. And even those may be subject to opinion, because if my performance requirements are 4K 240FPS with maximum details, then every game runs poorly or is poorly optimized.
As an example of having to deal with this, I had to incorporate a new physics model into an in-house flight simulator program. The thing is, we have to prove the software does what we said it would do. So we have to test it. Testing needs requirements, otherwise how do you know your thing passes? Since the whole purpose of the new physics model was to improve on the old one, how do you write requirements against it? You can't just say "the new physics model shall be an improvement over the old one" because:
- Technically it passes even if it's 0.00001% better, even though subjectively, it's the same.
- Ultimately, how do you know the physics model is going in the correct direction?
So after doing some research, I decided the requirement would be that the physics model's output must be within a certain tolerance of real-world data. The tolerance I chose was whatever the FAA laid out for flight simulators, because frankly there wasn't enough time to figure out what tolerances we would be comfortable with on our own.
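To make that concrete, here's a minimal sketch of what that kind of requirement looks like as a test. The parameter names and tolerance values are placeholders I made up for illustration; the real numbers would come from the applicable FAA qualification guidance, not from this comment.

```python
# Sketch of a tolerance check against reference data. The fields and
# tolerances below are hypothetical, not actual FAA values.

def within_tolerance(simulated, reference, abs_tol, rel_tol):
    """Pass if the simulated value is within abs_tol of the reference,
    or within rel_tol (a fraction) of the reference, whichever is looser."""
    error = abs(simulated - reference)
    return error <= abs_tol or error <= rel_tol * abs(reference)

# Hypothetical test points:
# (parameter, sim output, flight-test value, absolute tol, relative tol)
test_points = [
    ("airspeed_kts",    251.0,  248.0,   5.0, 0.03),
    ("pitch_deg",         4.2,    3.9,   1.5, 0.00),
    ("climb_rate_fpm", 1850.0, 1900.0, 100.0, 0.05),
]

for name, sim, ref, abs_tol, rel_tol in test_points:
    status = "PASS" if within_tolerance(sim, ref, abs_tol, rel_tol) else "FAIL"
    print(f"{name}: sim={sim} ref={ref} -> {status}")
```

The point is that "improvement" becomes something you can actually run and pass or fail, instead of an opinion.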
But if you want to avoid all that, then we could go with the spirit of what "Can it run Crysis?" is going after: a game where even the best video card can't manage 30 FPS with everything cranked up to maximum. Does the game support 8K? Sure, throw that in. Oh wait, then that means a lot of games are now the new Crysis.
EDIT: Another sticking point I want to add is that sometimes adjusting a setting or whatnot may crush performance simply because of how much more data gets thrown at the problem. It doesn't matter how "optimized" your algorithms are at that point.
For example, let's say on a "normal" setting you calculate value + 1 four times per frame, but the "high" setting calculates value + 1 sixteen times. If you were getting 60 FPS on the first setting, now you're getting 15 FPS, because each frame is doing four times as much work. And a lot of data in graphics grows that fast or faster as you turn settings up: doubling the render resolution alone quadruples the number of pixels to shade.
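A quick back-of-the-envelope version of that example, with made-up numbers chosen to land on the 60 FPS and 15 FPS figures above:

```python
# Toy illustration: same per-item cost, four times as many items,
# one quarter the frame rate. Numbers are invented to match the example
# in the text, not taken from any real game.

MS_PER_CALC = 16.67 / 4  # cost of one "value + 1" pass, picked so that
                         # 4 passes fill a 60 FPS frame budget (~16.67 ms)

def fps(num_calcs):
    frame_time_ms = num_calcs * MS_PER_CALC  # work per frame grows linearly
    return 1000.0 / frame_time_ms

print(f'"normal" (4 calcs):  {fps(4):.0f} FPS')   # ~60 FPS
print(f'"high"   (16 calcs): {fps(16):.0f} FPS')  # ~15 FPS
```

No amount of clever code changes the fact that the "high" setting simply asked for four times the work per frame.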