News Alan Wake 2 Will Punish Your GPU: Full Path Tracing and Ray Reconstruction, at Launch


Talking about the 3080 as being the bottom for some features, then not testing it and instead opting for a 2080 Ti, is just infuriating. It makes no sense at all, especially in a game that is so feature-specific. Tom's Hardware has skipped the 3080 in several tests recently, and I just don't get it.
 
Talking about the 3080 as being the bottom for some features, then not testing it and instead opting for a 2080 Ti, is just infuriating. It makes no sense at all, especially in a game that is so feature-specific. Tom's Hardware has skipped the 3080 in several tests recently, and I just don't get it.
@JarredWaltonGPU uses the GPUs he has on hand... and the 2080 Ti is a good data point for older hardware.
 
It is quite demanding, for sure. I'm playing on an overclocked 4090 on liquid with an i9-10920X @ 5GHz, and it's very much GPU bound. With FG on, DLSS at Quality, max settings, and max RT/path tracing, I'm getting about the same results as the benchmark above. However, even though I've gotten used to 120fps gaming on the 4090, I honestly don't mind averaging around 60fps for a game like this. It's slow-paced and film-like enough that there's no need to go above 60fps. Even when it dips down to 50fps, I don't mind. It kinda gives it that 24p film experience without that weird soap-opera feel. VRR, FreeSync, or G-Sync are also super handy for a clean, tear-free experience.

The game is gorgeous. Watching the moon's shadows play across the environment through a window blowing in the wind behind me is otherworldly. The immersion is amazing. The world lighting is constantly changing as trees sway, a car drives by, something moves through a light and casts shadows, or a room slowly fills with light as a character approaches with their flashlight. The GI, reflections, shadows, and lighting are spot on.

They also properly implemented HGiG. If you have an HDR10+ or Dolby Vision TV/monitor, especially an OLED with HGiG support, you can finally experience gaming the way HDR10+ was meant to be. The dynamically adjusted tone mapping, brightness, contrast, gamma, etc., looks so good. No color banding in gradients. Shadows are dark and moody. Bright lights blind you at first. I love that.
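To make the HGiG point a bit more concrete, here is a minimal Python sketch of the idea: the game maps scene brightness into the display's reported range, and an HGiG-mode display simply clips at its peak instead of re-tone-mapping. The peak value, knee point, and rolloff curve below are illustrative assumptions, not values from Alan Wake 2.

```python
# Toy model of HGiG-style HDR: the game tone-maps scene luminance to the
# display's reported peak brightness, and the display (in HGiG mode) simply
# hard-clips at that peak instead of applying a second, unknown tone map.
# The peak value, knee point, and rolloff curve here are all illustrative.

DISPLAY_PEAK_NITS = 800.0  # e.g. what the in-game HDR calibration is set to

def game_tonemap(scene_nits: float, peak: float = DISPLAY_PEAK_NITS) -> float:
    """Pass shadows/midtones through untouched, roll highlights off toward peak."""
    knee = 0.7 * peak                      # start compressing at 70% of peak
    if scene_nits <= knee:
        return scene_nits
    excess = scene_nits - knee
    headroom = peak - knee
    return knee + headroom * excess / (excess + headroom)  # asymptotes at peak

def hgig_display(signal_nits: float, peak: float = DISPLAY_PEAK_NITS) -> float:
    """In HGiG mode the display clips at its peak rather than re-tone-mapping."""
    return min(signal_nits, peak)

if __name__ == "__main__":
    for scene in (100, 400, 800, 2000, 4000):
        shown = hgig_display(game_tonemap(scene))
        print(f"scene {scene:5.0f} nits -> displayed {shown:6.1f} nits")
```

Because the game already stays within the display's range, the TV never has to guess at a second tone map, which is what avoids the crushed shadows and banded gradients described above.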

I'm about 10 hours in and haven't experienced any bugs of any kind. No shader compilation issues. It just works on day one. This is a rare thing these days.

Now if only the Epic Games Store would allow us to disable achievement notifications!!! I'm sitting here scared out of my mind, loving everything about it, and then: "DING DING DING!!!!! YOU HAS ACHIEVEMENT!!!!" Immersion ruined. Thanks, Epic. I hate you for having a terrible launcher and for not releasing the game on Steam, where it belongs.
 
I guess I'll be targeting 30 fps on my RTX 2060 6GB. For this type of "cinematic" game, that will hopefully be okay. Paying twice as much as my 4-year-old GPU cost to get twice the performance is simply a non-starter (i.e., I would have to get an RTX 4070 to get that, and it costs twice as much as my RTX 2060 did). I mean, if SLI were still a thing, I could have gotten twice the performance for twice the money 4 years ago (by buying two RTX 2060s).
 
I mean, if SLI were still a thing, I could have gotten twice the performance for twice the money 4 years ago (by buying two RTX 2060s).
Well, that's far from true. SLI usually gave around a 20-60% performance increase, but the tearing and micro-stutters were absolutely terrible. I ran SLI on a couple of different generations (GTX 480 and 680), and even though it was faster, the problems SLI had were very annoying. And the game had to support it as well, which many didn't. DX12 Multi-GPU was supposed to be an alternative to SLI that was more forgiving about mixing different card types and didn't require a bridge, but it never took off and kinda died right after it was introduced back in 2015/2016. It also required support from the game dev. If a true multi-GPU alternative were ever released, I'd pop in another 4090! You know Nvidia wants me to buy another one.
 
Talking about the 3080 as being the bottom for some features then not testing it and instead opting for a 2080 ti is just infuriating. It makes no sense at all especially on a game that is so feature specific. Tomshardware has skipped the 3080 in several tests recently and I just don't get it.
You can't test everything, every time. Look at the 3090 and 4070 and extrapolate from there; the 3080 usually delivers about the same performance as the 4070. The 2080 Ti was included because I wanted to see how the first-generation RTX architecture held up.
 
I'm going to call this game out for a lack of optimization. In one of the first forest scenes after you perform a ritual, where you see the trees lit with red lights, FPS drops to the 40s even on an RTX 4090 with DLSS Quality and Frame Gen on. I don't think that's acceptable, as there's not enough going on to justify 43fps (or whatever the lowest I got in that scene was), especially considering it's with DLSS Quality and Frame Generation on an overclocked 4090 running over 3000MHz with an unlocked power limit.

I had to resort to playing the game with a controller on the couch to cover up the low-fps issues, which become much more noticeable with a mouse. I wasn't willing to drop settings and lower the visuals. 😛 Because the game does look great. But in some areas the visuals just don't justify the cost.
 
You can't test everything, every time. Look at the 3090 and 4070 and extrapolate from there; the 3080 usually delivers about the same performance as the 4070. The 2080 Ti was included because I wanted to see how the first-generation RTX architecture held up.
Sure, but the 3080 isn't "everything"... it was the top end last generation. You mentioned the card; you should test the card. With a game that is this specific about the versions of tech being used, not just raw horsepower, skipping it just seems very odd to me. You tested a whole lot of cards here, and leaving out the 3080 while including the 2080 Ti, the 4090, and the 3090 seems very strange to me. Getting rid of the 4070 and testing the 3080 instead would have fit much better in your scheme, considering you didn't test any other 70-class cards.

Sorry, this just seems weird to me (the site, not you), and you have done it multiple times. To skip over one of the most used cards and then just say "meh, too much to add that"... I just don't get it. Can you test everything? No, but the 3080 is pretty damn mainstream... much more so than the 3090 or 4090, at the very least.
 
@JarredWaltonGPU uses the GPUs he has on hand... and the 2080 Ti is a good data point for older hardware.
I get having the 2080 Ti, but not in place of a 3080, and I don't get the 4070... as the only 70-class card, without a 3080. It's a weird progression that would work better without the 4070 and with the 3080, or without the 2080 Ti. It muddies the water on how the cards progress, given all the versions of hardware acceleration brought into play here, and it just leaves out one of the most popular cards of recent years in favor of a two-generation-old one or the most recent-gen version.

Hell, skip the 3090 and do the 3080, or skip the 3060 and do the 3080... a lot of the choices here seem out of step with comparable setups. I mean, 4090, 4080, 4070, 4060, 3090, 3060? Seems odd to me.
 
I get having the 2080 Ti, but not in place of a 3080, and I don't get the 4070... as the only 70-class card, without a 3080. It's a weird progression that would work better without the 4070 and with the 3080, or without the 2080 Ti. It muddies the water on how the cards progress, given all the versions of hardware acceleration brought into play here, and it just leaves out one of the most popular cards of recent years in favor of a two-generation-old one or the most recent-gen version.

Hell, skip the 3090 and do the 3080, or skip the 3060 and do the 3080... a lot of the choices here seem out of step with comparable setups. I mean, 4090, 4080, 4070, 4060, 3090, 3060? Seems odd to me.

Talk about complaining over nothing. Newer hardware always gets benched more. Also, since when has any 80-series card been one of the "most popular"? There are plenty of other sources you can check for 3080 numbers.
 
120 fps with RT at 1080p on a 4090. Does the word "optimisation" even exist in the minds of devs today?

This makes me think the only proper devs are those who make console-exclusive titles. At least 30 fps at 4K. Or 60.
 
Yeah, I think I'll just wait until the 6000 series comes out to upgrade. I'm still rockin' my top-of-the-line PC from 2014. It still works. For how much longer? Hopefully until then, at least.
 
120 fps with RT at 1080p on a 4090. Does the word "optimisation" even exist in the minds of devs today?

This makes me think the only proper devs are those who make console-exclusive titles. At least 30 fps at 4K. Or 60.
I agree with your main point, but even console devs are cheating. For a while now, consoles have been using the trick of rendering at a lower resolution, upscaling it, and telling you it's a higher resolution. Since you can't actually compare side by side with a native render, there's no easy way to tell how much worse it really is. PC game developers just haven't gotten deceptive enough to remove all the higher options and simply call "low" "normal".
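For a sense of how much work that trick saves, here's a quick back-of-the-envelope Python sketch; the internal resolutions listed are just common examples, not measured figures for any particular console title.

```python
# How many pixels per frame are actually shaded when a "4K" image is really
# rendered at a lower internal resolution and upscaled. The internal
# resolutions listed are common examples, not figures for any specific game.

NATIVE_4K = (3840, 2160)

internal_resolutions = {
    "native 4K":      (3840, 2160),
    "1800p upscaled": (3200, 1800),
    "1440p upscaled": (2560, 1440),
    "1080p upscaled": (1920, 1080),
}

native_pixels = NATIVE_4K[0] * NATIVE_4K[1]

for label, (w, h) in internal_resolutions.items():
    share = (w * h) / native_pixels
    print(f"{label:15s}: {w * h:>9,d} pixels shaded ({share:.0%} of native 4K)")
```

Rendering at 1440p and upscaling, for example, shades well under half the pixels of native 4K, which is exactly the gap that's hard to judge without a side-by-side comparison.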
 
Sure, but the 3080 isn't "everything"... it was the top end last generation. You mentioned the card; you should test the card. With a game that is this specific about the versions of tech being used, not just raw horsepower, skipping it just seems very odd to me. You tested a whole lot of cards here, and leaving out the 3080 while including the 2080 Ti, the 4090, and the 3090 seems very strange to me. Getting rid of the 4070 and testing the 3080 instead would have fit much better in your scheme, considering you didn't test any other 70-class cards.

Sorry, this just seems weird to me (the site, not you), and you have done it multiple times. To skip over one of the most used cards and then just say "meh, too much to add that"... I just don't get it. Can you test everything? No, but the 3080 is pretty damn mainstream... much more so than the 3090 or 4090, at the very least.
Every card tested takes me up to ~30 minutes for this game, as there are 12 primary settings tested on each card (three resolutions, each at Medium, Max, Max + RT Low, and Max + RT High), plus an additional six settings (the RT tests again with RR enabled) on RTX cards. I've tested the 3080 and 4070 enough to know that, other than Frame Generation, there's generally relatively little difference between the two cards. But Alan Wake 2 might be different, so that's why we run some tests.
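A quick Python sanity check of that test-matrix math; the preset labels below are paraphrased, and only the counts come from the paragraph above.

```python
# Per-card test matrix described above: three resolutions times four quality
# presets = 12 primary runs, plus the two RT presets repeated with Ray
# Reconstruction enabled on RTX cards = 6 more. Labels are paraphrased.
from itertools import product

resolutions = ["1080p", "1440p", "4K"]
primary_presets = ["Medium", "Max", "Max + RT Low", "Max + RT High"]
rr_presets = ["Max + RT Low + RR", "Max + RT High + RR"]  # RTX cards only

primary_runs = list(product(resolutions, primary_presets))
rr_runs = list(product(resolutions, rr_presets))

print(f"{len(primary_runs)} primary runs per card")       # 12
print(f"{len(rr_runs)} additional RR runs per RTX card")  # 6
```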

Then I provided some other data points to help with extrapolation. RTX 3090 and RTX 4070 Ti are usually roughly equal in performance. (Note that I didn't test the 4070 Ti or 4060 Ti, because it's easy to extrapolate those results. They fall between the card above and below, within a few percent of the midpoint.) In this case, with RT enabled, the 3090 falls a lot closer to the 4070 than usual, and even below it in some cases (1440p with RT High). It's not VRAM or raw bandwidth causing the difference, which means it's architecture.

Looking at the 3060 and 3090, you can also see that the 3090 is about 2.5X faster at RT High and 2.3X faster at RT Low, though the 3060 isn't really playable in either case. Our GPU hierarchy puts the 3090 2.25X faster at 1080p ultra in RT games. So Alan Wake 2 with RT is maybe a bit more demanding than the typical RT game, but not by much. (Minecraft shows a 2.4X difference at 1440p, which lines up with Alan Wake 2 pretty well.) Basically, we can reference the usual scaling factors, or close enough.

So: 3080 Ti will be just a touch slower than a 3090, 3090 Ti would be just above the 3090, and 3080 would be about 10–15 percent slower than the 3090. Does it matter if it's 10% versus 15%? Not really. That's just splitting hairs. We don't need to retest every card on every game to have an idea where it will land. You can take that or leave it.

The net result is that, because this is an Nvidia tech-heavy game, I tested the relative top of each of the prior two generations to give a starting point. I also added the RTX 3060, which actually is the most popular 30-series card (and the most popular GPU overall). Everything lower down the list for the 30- and 20-series would obviously be slower and can be interpolated easily enough if you really want to.

The bottom line is that if you want to play the game with RT enabled, none of the lower 30-series or 20-series cards are going to deliver a great experience in the game. And none of the AMD or Intel GPUs will be good either. 3080 should manage 1080p RT High at around 35 fps, give or take 2 fps. 3070 Ti and below won't break 30 fps, unless you turn down settings or crank upscaling to performance or even ultra performance mode (and maybe not even then).

And if you don't want to enable RT? Then the game is far less demanding, but the relative positioning of the GPUs won't change. 3080 at 1080p high (not max) should get around 80 fps. And even at 1440p high it can average more than 60 fps. That's with Quality upscaling, though.
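For anyone who wants to do the same kind of extrapolation at home, here is a minimal Python sketch of the arithmetic. It assumes the scaling factors quoted in the post above hold for this game; the 40 fps baseline and the per-card factors are illustrative placeholders, not additional measured results.

```python
# Rough performance extrapolation for cards that weren't benchmarked, using a
# reference card plus the typical scaling factors quoted in the post. The
# 40 fps baseline and the per-card factors below are illustrative, not
# additional measured results.

reference_card = "RTX 3090"
reference_fps = 40.0  # placeholder: 1080p, Max + RT High

# Rough performance relative to the 3090 in RT-heavy games.
relative_to_reference = {
    "RTX 3090 Ti": 1.05,     # "just above the 3090"
    "RTX 3080 Ti": 0.97,     # "just a touch slower than a 3090"
    "RTX 3080":    0.875,    # midpoint of "10-15 percent slower"
    "RTX 3060":    1 / 2.5,  # "3090 is about 2.5X faster at RT High"
}

def estimate_fps(card: str) -> float:
    """Scale the reference card's fps by the card's rough relative factor."""
    return reference_fps * relative_to_reference[card]

for card in relative_to_reference:
    print(f"{card:12s}: ~{estimate_fps(card):4.1f} fps (estimated)")
```

With a 40 fps baseline, that lands the 3080 right around the "roughly 35 fps, give or take" figure mentioned above, which is the whole point of the exercise: the usual scaling factors get you close enough without rerunning every card.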
 
I thank you for doing the benchmarks with the cards you have on hand. Including the lowest-ranked card is also a blessing for the majority of gamers out here.

It's a starting or cut-off point as to whether what you own has any hope of playing the game at all, even without ray tracing and all the bells and whistles.

The whole world just went through this with Cyberpunk. Oh, look at all the eye candy, but only if you had the then-new Nvidia 3000-series cards. The Nvidia 3000s are the old kids on the block now.

Fast forward to Starfield and now Alan Wake, and buckle up and open that hardware wallet to play your $80.00 game.

Thank you, Jarred, for your time and effort doing these benchmarks. It's a great eye-opener as to whether our personal rigs can or cannot hang in there.
 
Sure, but the 3080 isn't "everything"... it was the top end last generation. You mentioned the card; you should test the card. With a game that is this specific about the versions of tech being used, not just raw horsepower, skipping it just seems very odd to me. You tested a whole lot of cards here, and leaving out the 3080 while including the 2080 Ti, the 4090, and the 3090 seems very strange to me. Getting rid of the 4070 and testing the 3080 instead would have fit much better in your scheme, considering you didn't test any other 70-class cards.

Sorry, this just seems weird to me (the site, not you), and you have done it multiple times. To skip over one of the most used cards and then just say "meh, too much to add that"... I just don't get it. Can you test everything? No, but the 3080 is pretty damn mainstream... much more so than the 3090 or 4090, at the very least.
As someone who also has a 3080: you are being pretty whiny, dude.

Take a look at any review of the 4070 and you'll see they are nearly identical in performance. Nvidia basically added 2 GB and cut the price a bit. Most reviews lamented how similar they were. Testing both is a waste of time. And guess which is still available for sale and which is out of stock worldwide?

The "if you mention it, you must test it" argument is a weird one. If it were me, I'd go delete the one sentence that mentioned the 3080 and say, "There, I fixed it for you." But Jarred is nicer than I am.

Look, if it is so important to you, then just download the graphs and change the name in Photoshop from 4070 to 3080. Problem solved. 😎
 
Thanks for the review!

In my debate over upgrading my 3080 to a rumored 4070 Ti Super or waiting for the 5000 series, this game makes waiting seem like the smarter move. If more games go this route, I'll end up wanting to replace my card again before long.
 
I compared the fifth images, which are "Max Quality - no RT" (the highest old-school rendering quality), against the tenth images, which are "Max Quality + RT High + RR" (the highest new-rendering-tech quality). I noticed that "no RT" looks flatter, while "RT + RR" looks more 3D, more defined and refined, and also sharper.
 
It is quite demanding, for sure. I'm playing on an overclocked 4090 on liquid with an i9-10920X @ 5GHz, and it's very much GPU bound. With FG on, DLSS at Quality, max settings, and max RT/path tracing, I'm getting about the same results as the benchmark above. However, even though I've gotten used to 120fps gaming on the 4090, I honestly don't mind averaging around 60fps for a game like this. It's slow-paced and film-like enough that there's no need to go above 60fps. Even when it dips down to 50fps, I don't mind. It kinda gives it that 24p film experience without that weird soap-opera feel. VRR, FreeSync, or G-Sync are also super handy for a clean, tear-free experience.

The game is gorgeous. Watching the moon's shadows play across the environment through a window blowing in the wind behind me is otherworldly. The immersion is amazing. The world lighting is constantly changing as trees sway, a car drives by, something moves through a light and casts shadows, or a room slowly fills with light as a character approaches with their flashlight. The GI, reflections, shadows, and lighting are spot on.

They also properly implemented HGiG. If you have an HDR10+ or Dolby Vision TV/monitor, especially an OLED with HGiG support, you can finally experience gaming the way HDR10+ was meant to be. The dynamically adjusted tone mapping, brightness, contrast, gamma, etc., looks so good. No color banding in gradients. Shadows are dark and moody. Bright lights blind you at first. I love that.

I'm about 10 hours in and haven't experienced any bugs of any kind. No shader compilation issues. It just works on day one. This is a rare thing these days.

Now if only the Epic Games Store would allow us to disable achievement notifications!!! I'm sitting here scared out of my mind, loving everything about it, and then: "DING DING DING!!!!! YOU HAS ACHIEVEMENT!!!!" Immersion ruined. Thanks, Epic. I hate you for having a terrible launcher and for not releasing the game on Steam, where it belongs.
You're the first person I've encountered with a similar setup to mine. I wanted to thank you for this post; it confirms everything I'm experiencing as well, especially the audio. I've got my rig hooked up to the home theater with 7.2.4 Dolby Atmos and an LG G1 display. Cranked up in a dark room, the game's audio alone creates a terrifying experience.

For image quality, I have my G1 set for HGiG with 4:4:4 and 10-bit color. Also, the HDR effect is almost too strong at times, even on my older G1, which has considerably lower peak brightness than the current model.

My only question is whether you settled on running DLAA or opted for the Quality preset. I just started playing AW2, so I'm not that far in (just made it out of the basement). In the first outdoor city area with Alan, I can't notice any discernible image quality improvement with DLAA enabled. Maybe I'm just missing it? I elected to go with Quality, since it allows me to run the game at over 100fps, but I wanted to ask if you're using DLAA. I get around 60fps with it on in this area.
 
You're the first person I've encountered with a similar setup to mine. I wanted to thank you for this post; it confirms everything I'm experiencing as well, especially the audio. I've got my rig hooked up to the home theater with 7.2.4 Dolby Atmos and an LG G1 display. Cranked up in a dark room, the game's audio alone creates a terrifying experience.

For image quality, I have my G1 set for HGiG with 4:4:4 and 10-bit color. Also, the HDR effect is almost too strong at times, even on my older G1, which has considerably lower peak brightness than the current model.

My only question is whether you settled on running DLAA or opted for the Quality preset. I just started playing AW2, so I'm not that far in (just made it out of the basement). In the first outdoor city area with Alan, I can't notice any discernible image quality improvement with DLAA enabled. Maybe I'm just missing it? I elected to go with Quality, since it allows me to run the game at over 100fps, but I wanted to ask if you're using DLAA. I get around 60fps with it on in this area.
I was running DLAA but settled on Quality. There were times with DLAA that I dropped into the 30-ish fps range. Uncommon, but I didn't like it when the FPS went that low. DLAA renders the frame at native resolution and uses DLSS tech to anti-alias the image. It looks great, but the occasionally low framerate wasn't worth it. Since switching to DLSS Quality, I'm always at 60fps or above, and I honestly can't tell the difference in image quality.
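To put some numbers on the DLAA-versus-Quality tradeoff, here is a small Python sketch using commonly cited per-axis DLSS scale factors; the exact factors can vary by game and driver version, so treat them as approximations rather than Alan Wake 2 specifics.

```python
# Internal render resolution for DLAA vs. the DLSS presets at a 4K output,
# using commonly cited per-axis scale factors. Actual factors can vary by
# game/driver version; these are approximations for illustration.

OUTPUT_W, OUTPUT_H = 3840, 2160

PER_AXIS_SCALE = {
    "DLAA":                   1.0,    # native render, DLSS used only for anti-aliasing
    "DLSS Quality":           0.667,
    "DLSS Balanced":          0.58,
    "DLSS Performance":       0.5,
    "DLSS Ultra Performance": 0.333,
}

for mode, scale in PER_AXIS_SCALE.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pixel_share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:22s} renders {w}x{h} ({pixel_share:4.0%} of output pixels)")
```

Quality mode shades well under half the pixels per frame compared to DLAA, which is why it buys back so much performance while often looking nearly indistinguishable after the upscale.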

The HDR effect is too strong with the flashlight, IMO. I agree with you. I spend a lot of time IRL in dark places using flashlights, and when I'm looking at what my flashlight illuminates, I can see just fine. It's dark outside of that, but I just move my headlamp around to see. In AW2, with HDR, HGiG, and proper calibration, the flashlight is blindingly bright and washes out much of the scene directly in front of you, while the scene outside of the flashlight is lit properly. I wish auto-exposure exposed for the flashlight's highlights and dimmed the surrounding area.

I'm curious about your surround sound experience with your Dolby Atmos system. I'm on a Samsung Q910B, which is a 9.1.2 Atmos soundbar system. When Atmos is properly implemented, it sounds fantastic, but in AW2, voices and some other sounds echo in an absolutely terrible way. Other games can do this as well on my audio system, unless they have a properly implemented Atmos solution.
 
As someone who also has a 3080: you are being pretty whiny, dude.

Take a look at any review of the 4070 and you'll see they are nearly identical in performance. Nvidia basically added 2 GB and cut the price a bit. Most reviews lamented how similar they were. Testing both is a waste of time. And guess which is still available for sale and which is out of stock worldwide?

The "if you mention it, you must test it" argument is a weird one. If it were me, I'd go delete the one sentence that mentioned the 3080 and say, "There, I fixed it for you." But Jarred is nicer than I am.

Look, if it is so important to you, then just download the graphs and change the name in Photoshop from 4070 to 3080. Problem solved. 😎
You completely missed my point. The 3080 is the better card to test than the 4070. It is far more widely used, and comparing the 3080 with the 4080 is a better way to show how generational feature disparities affect game performance. Also, that 2GB of VRAM is a big deal nowadays.

It would be better to do both, but if you had to choose one, the 3080 makes much more sense given its market share.

Yes, it would be a better article, but it still wouldn't make sense to choose the 4070 over the 3080. It's even sillier that he mentions it and doesn't test it.

Yeah, it would be even worse to just pass off the 4070 as the 3080. They perform similarly, but the DLSS feature differences matter, and the 4070 outperforms the 3080 in a bevy of tests. Sorry, different cards, different performance, and results are all over the map in benchmarks that directly compare the two.

This idea that they are the same is exactly what bugs me. They just aren't.
 