Stalker 2 PC performance testing and settings analysis — another demanding game that uses frame generation and upscaling as a crutch

@1440 what, low, medium, high, or ultra?
I'm curious; I haven't bought the game yet. I don't think my older Ryzen system is going to run it well at 1440p, and I refuse to play at 1080p. That's not what I built this machine for, and I'm not into upgrading every two years, especially to play one game.

They really need to go back to the drawing board and optimize this better. A $2,000 GPU and a high-end CPU can't play it well; that's not acceptable.
High. I couldn't really tell a difference between high and epic, and even medium was hard to tell apart from high.
 
Regarding framegen (and to a lesser extent, upscaling) in Stalker 2, we need to have a frank conversation, because people often try to make it all black or white. It's not that simple, and I probably need to rework the text some in the article to make this clearer.

First, framegen isn't inherently evil and bad and something that should never be used. Anyone pushing that narrative is being disingenuous. However, it's also not sunshine and roses and the best thing since sliced bread. Anyone pushing that narrative is equally delusional. And, just like upscaling, when you compare DLSS3 framegen with FSR3.1 framegen, you are comparing apples and oranges. Getting into the details of which one looks better takes time, but the short summary is that Nvidia almost always does better — both for framegen and upscaling — with the current algorithms.

For AMD GPU owners, I'd almost say you should use XeSS (DP4a) in Ultra Quality mode instead of FSR3 in Quality mode; both render at 67% scale, but FSR3 has some clear and obvious errors. It's especially noticeable at 1080p, less so at 1440p, and probably "good enough" at 4K that most won't notice a major difference. But if you're systematic in the testing, you can and will find plenty of differences where Nvidia DLSS looks best, then Intel XeSS 1.3.1, and last is FSR3.1/3/2 (and FSR1 is even worse than those). TSR, Unreal Engine 5's upscaling tech, ranks below FSR2 and perhaps just barely above FSR1, in my opinion.

Second, and I didn't get into this much in the article: while Nvidia framegen may look nicer, AMD's framegen is almost always faster. I've got more benchmarks and I need to update the charts (and text), but if you want higher performance, you could make a real case for using FSR3.1 framegen with RTX 40-series GPUs — if that were supported (it's not). It's a big part of why AMD is "faster" in the framegen charts. Again, different algorithms, different results, and AMD's choice is more about speed than quality.

Finally, Stalker 2 with framegen (and possibly upscaling, though I'd stick with DLSS or XeSS) runs quite well on recent GPUs. I'd stick to the medium or high preset as well, because epic really just tanks performance for a negligible improvement in image fidelity — and that's true in 95% of games. I enable it mostly to push the GPUs as hard as possible, because today's ultra is tomorrow's high and the day after's medium. We will have games in a couple of years where the medium preset will effectively match whatever Stalker 2 and current games are doing on epic/ultra settings.

How does framegen feel in this game? If you're using FSR3.1 framegen, and getting about a 70~90 percent increase in "fps" (frame smoothing, in other words), I think it's a worthwhile tradeoff. Trying to play Stalker 2 at 30 fps without framegen versus 50 fps with framegen definitely feels better with the latter, at least to me. But you do want to aim for more like 80+ fps with framegen IMO. Some of the lower tier cards I've tested that only manage 35~50 fps with framegen absolutely feel laggy. I turn with the mouse and almost invariably overshoot. It feels to me about as good as using a controller to play a shooter (meaning: not good or precise).
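
To put rough numbers on why that feels laggy (a minimal sketch of my reasoning, assuming a clean 2x interpolation model; the real DLSS3/FSR3 pipelines add their own buffering and pacing overhead): input is only sampled on real rendered frames, so the effective input rate is about half the presented rate.

```python
# Minimal sketch, not the actual framegen pipeline: with 2x frame
# interpolation, input is only sampled on real (rendered) frames.

def underlying_render_fps(presented_fps: float, gen_factor: int = 2) -> float:
    """Effective input-sampling rate behind a framegen-presented fps."""
    return presented_fps / gen_factor

for presented in (35, 50, 80, 100):
    base = underlying_render_fps(presented)
    print(f"{presented:>3} fps presented -> ~{base:>4.1f} fps input rate, "
          f"~{1000 / base:.0f} ms between sampled frames")
```

So "50 fps with framegen" still means roughly 40 ms between input samples, closer to a 25 fps feel, which is why I'd aim for 80+ fps presented.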

If we're trying to match image quality between Nvidia with DLSS and AMD with FSR3? Depending on the resolution, it's probably something like:

1080p: DLSS Quality equals FSR3 Native
1440p: DLSS Quality equals FSR3 90%
4K: DLSS Quality equals FSR3 80%

But that compounds with framegen as well. So DLSS Quality plus DLSS framegen at 1080p probably needs supersampling on AMD to get a similar result — or XeSS UltraQualityPlus with FSR3.1 framegen. For 1440p, I'd say AMD native + FG might equal DLSS Quality + FG, and for 4K it's probably AMD 90% scaling + FG to match DLSS3 + FG.
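
For reference, here's the arithmetic behind those scale factors (a quick sketch; the 67% Quality-mode figure is the one mentioned earlier, while the 90% and 80% FSR3 scales are just my eyeballed equivalences from the list above, nothing official):

```python
# Per-axis upscale factor -> internal render resolution. Note that a
# 67% per-axis scale means only ~45% of the output pixels get rendered.

def render_res(w: int, h: int, scale: float) -> tuple[int, int]:
    """Internal resolution for a given output size and per-axis scale."""
    return round(w * scale), round(h * scale)

targets = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in targets.items():
    for label, scale in (("67% (Quality)", 0.67), ("80%", 0.80), ("90%", 0.90)):
        rw, rh = render_res(w, h, scale)
        print(f"{name} @ {label}: {rw}x{rh} ({rw * rh / (w * h):.0%} of output pixels)")
```

The relationship is quadratic, which is why Quality-mode upscaling buys so much performance: dropping to 67% per axis means shading only about 45% of the pixels.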

This is why I started with the native TAA performance. TAA in Unreal Engine actually sucks rocks: you get a lot of ghosting and it doesn't even remove jaggies all that well. TSR is only a little better, but it's a universal algorithm and about as "fair" a baseline as we're likely to get. Almost everything with upscaling and framegen is about trading image fidelity for higher FPS (with TAA still being an issue). As for DLSS Quality mode, there's no real equivalent other than maybe XeSS; it generally looks better than "native TAA" in my experience. FSR Quality, meanwhile, might have less ghosting and fewer jaggies than native + TAA while showing a ton more artifacting. Walking along and looking at the leaves and branches against the sky quickly shows just how badly FSR3 handles certain scenarios.
 
It's strange that I actually get worse performance with FSR frame gen: from 90-100 fps down to 55-60 fps on my 4070, tested in the first town area. Switching back boosted fps to what it was. I couldn't really tell what was causing the issue; both CPU and GPU utilization were the same with both frame gen options.

Despite the issues, the game does look good, and I have to give credit to the devs for implementing all the upscaling and frame gen options.
 
I need to double check, but FSR3 upscaling runs quite a bit worse than DLSS upscaling on Nvidia RTX GPUs, while FSR3.1 framegen seems to outperform DLSS3 framegen, likely because the fixed-function OFA (Optical Flow Accelerator) in Ada has a maximum level of throughput.

Update: This is actually wild. FSR3 framegen totally works on the RTX 30-series and 20-series, and even the 10-series. Try it on an RTX 40-series GPU? Nothing! It's ignored. I don't know if that's the game code or the drivers, but I suspect Nvidia influenced this in some way. Text in the article is now updated, along with charts.

Basically, you get maybe a ~50 percent performance boost from DLSS framegen compared to just upscaling (depending on the GPU), but FSR3 framegen typically gives about a 70~80 percent boost. They don't look the same, however, so comparing the two just on the resulting performance isn't an accurate depiction of what you get.
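
You can back out what those boosts imply about the cost of generating a frame. This is a toy model and my own assumption, not measured data: it assumes a pure GPU limit and strict 2x generation, and ignores CPU bottlenecks, pacing, and vsync.

```python
# Toy model: with 2x framegen, two frames are presented per
# (render time + generation time), so:
#   boost = 2 * t_render / (t_render + t_gen) - 1
#   =>  t_gen = t_render * (1 - boost) / (1 + boost)

def implied_gen_cost(boost: float) -> float:
    """Generation cost as a fraction of one rendered frame's time,
    given the measured fps boost from 2x frame generation."""
    return (1 - boost) / (1 + boost)

for name, boost in (("DLSS3 FG, ~50% boost", 0.50),
                    ("FSR3 FG, ~75% boost", 0.75)):
    print(f"{name}: generated frame costs ~{implied_gen_cost(boost):.0%} "
          f"of a rendered frame")
```

By that math, a DLSS3 generated frame costs roughly a third of a rendered frame versus around a seventh for FSR3, which would line up with the Ada OFA's fixed throughput being the limiter.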
 
Has GSC indicated whether or not they're going to add hardware ray tracing in the future? From what I understand, the UE5 engine supports it.
 
I doubt GSC will bother. Never say never, but the UE games that have used hardware RT fall into basically two categories: Black Myth Wukong with full RT that looks nice and pretty much requires an Nvidia GPU... and everything else where the RT effects are hit and miss and cause massive stuttering and performance issues.

Maybe that's a bit too stark a viewpoint, but offhand I'm trying to think of one UE4 or UE5 game that uses hybrid ray tracing where the end result is actually good. I can name several where the RT effects add very little and cause stuttering, though! And a bunch of UE5 games without RT that also perform poorly.

Both Star Wars Jedi games have crap ray tracing performance; consistently slow I could live with, but they're super stuttery messes. MechWarrior 5 has the same issue, as do Frostpunk 2, Until Dawn (2024), and now Stalker 2.

Basically, UE5 means a game will need more VRAM and more CPU and probably upscaling and framegen to reach decent performance levels. Unless the game is Fortnite. LOL

In general, ray tracing is good for:
reflections
global illumination
maybe water caustics and transparency effects

RT is often used for ambient occlusion and shadows, but in my experience it ends up adding very little graphically. Many times, I think the non-RT shadows and AO look "better" just because they get the job done in a way I'm used to seeing, without tanking performance.
 
The RT effects in Control are instantly noticeable. Practically everything that is glossy reflects the environment.
In Metro Exodus though, I never really noticed much of a difference with or without ray tracing, except for the drop in FPS.
Stalker 2 will be getting a pass from me though, at least until I upgrade. Too bad, it's the one game I was looking forward to.
 
After a few hours in the game at 1440p:
RAM usage ~30GB (total system);
FG on my 7900 XTX works correctly with my monitor's 100Hz vsync and reduces power consumption (since the game exceeds 100 fps when uncapped).
P.S. The game is excellent.
 
30GB of RAM usage by Stalker 2? R U kidding? Holey frijole.

If Stalker 2 has so many perf issues on PCs, how does it run on the Xbox Series X? Or does DirectStorage make up for the severely underpowered GPU and CPU the Xbox Series X sports?
 
Despite the issues, the game does look good, and I have to give credit to the devs for implementing all the upscaling and frame gen options.
Stop it... you're gonna make me buy it, dammit!
 
As far as Stalker 2 is concerned, it has nothing to do with the coders being crappy and everything to do with these graphics presets simply being that demanding.

Unfortunately, upscaling is a must in today's gaming.
It shouldn't be, though. And sadly, it's setting a really bad precedent for the gaming industry. The most common card, according to Steam, is the RTX 3060. According to the article, an RTX 4060 can't run the game at native 1080p on medium settings and maintain 60 fps. In my opinion, that is very wrong.

While I can't say for certain, nor would I know where to look for the stats, I'm pretty sure that no competitive or hardcore gamers use upscaling technology or fake frame generation. Can you imagine if CoD6 came out with these settings? There would be major backlash.
 
Stalker 2 looks no better than Metro Exodus Enhanced Edition, and Metro Exodus Enhanced Edition actually uses ray tracing hardware and doesn't exhibit any of the massive perf issues Stalker 2 is having. Could this be because, instead of using hardware ray tracing, they're using asynchronous compute shaders, as per the PC Gamer review?

"However, unlike most UE5 games that boast GPU-hogging rays, Stalker 2 uses the software mode for Lumen. Your graphics card will still be doing the bulk of the ray tracing but it's done through asynchronous compute shaders and not via any ray tracing hardware inside the GPU."

So basically Stalker 2 is using a less advanced engine than Metro Exodus Enhanced Edition?
lol, try comparing it to Doom Eternal, with Vulkan AND Ray-Tracing. If you want to talk about a beautiful looking game that runs fantastic, this is a great example. Game developers should really be paying attention to that.
According to Steam, the most common card is the RTX 3060. According to the article, an RTX 4060 can't run the game at native 1080p on medium settings and maintain 60 fps. That is seriously F'd up.

As I mentioned elsewhere, competitive and hardcore gamers don't use upscaling or fake frame generation. Can you imagine if CoD6 came out with these specs? There would be massive backlash.
 
Good write-up, but I disagree with you about the AMD settings. TSR with framegen at 1440p on a Ryzen 7 7700X and 7800 XT looks substantially better than FSR, and it seems to run about the same, at least in my case. Also, people who turn framegen on need to turn on AMD Anti-Lag in the AMD app as well. I honestly can't feel any input lag at all; the game looks fine and runs smooth for me.
 
I noticed some clear ghosting with TSR when I poked around at it, enough so that I didn't want to do any additional testing. It's possible the game just got into a weird state and that this isn't the way TSR normally looks, but while I did notice similar performance to FSR3 upscaling, I'd say which looks better depends on the resolution and upscaling level. TSR looks better on certain areas, worse on others in my experience. Overall, I think it looks worse on average, but that's more subjective than performance measurements. :)
 
Uh, I think he did.

Why does everyone have to be negative on others' posts? Instead of being condescending, why don't you give him some other examples of surveys to go by?
Sounds to me like he is pretty much using Steam, as I said.

Why? Because using Steam does not indicate what is out there. As I just posted elsewhere, I have never had a Steam survey come up, and I have changed my hardware probably 3 or 4 times easily... A few people I know who use Steam daily have never had a survey come up either.

Do you know of any other hardware survey that queries consumer hardware as much as Steam's supposedly does? Because I sure don't...
 
You missed it ....
 
Maybe I did... but it still irks me how people seem to use Steam as the be-all, tell-all for things like this.

For all we know, Steam could be surveying only a small portion of its users, which would leave the data they tout inaccurate.
 
TSR looks better on certain areas, worse on others in my experience. Overall, I think it looks worse on average, but that's more subjective than performance measurements. :)

I agree. Both DLAA and DLSS Quality look much better than TSR.

I noticed some clear ghosting with TSR when I poked around at it, enough so that I didn't want to do any additional testing.

I noticed it as well and it was very annoying. Glad I'm not the only one.
 
But DLAA isn't upscaling at that point; it's native-resolution AA. Funny how the original promise of DLSS as a superior AA solution took a while to materialize and got a rebrand. Still, DLAA is definitely the high-water mark for image quality right now. In terms of AA algorithms, then, it goes like this IMO (based on some limited samples for certain algos):

DLAA > DLSS > XeSS 1.3 > XeSS 1.2 > FSR 3/2 > XeSS 1.1/1.0 > FSR1 > TSR > TAA

Considering FSR 2/3 is open source, there's literally no good reason for TSR if it can't even match that level of quality. (You could maybe swap FSR1 and TSR.)

"Hey, we did a bunch of work for a special upscaling solution... that's worse than AMD's free and open source solution." —Tim Sweeney and Epic
 