Sure, but the 3080 is not everything...it was the top end last generation. You mentioned the card, so you should test the card. With a game that is so specific about the versions of tech being used, not just the horsepower, skipping it just seems very odd to me. You tested a whole lot of cards here; not doing the 3080 but doing the 2080 Ti, the 4090, and the 3090 seems very strange to me. Dropping the 4070 and testing the 3080 would have fit much better in your scheme, considering you didn't test any other 70-class cards.
Sorry, this just seems weird to me (the site, not you), and you have done it multiple times. To skip over one of the most-used cards and then just say "meh, too much to add that"...I just don't get it. Can you test everything? No, but the 3080 is pretty damn mainstream...much more so than the 3090 or 4090, at the very least.
Every card tested takes me up to ~30 minutes for this game, as there are 12 primary settings tested on each card (three resolutions, each at medium, max, max + RT low, and max + RT high), plus an additional six settings with Ray Reconstruction (RR) enabled on RTX cards. I've tested the 3080 and 4070 enough to know that, other than Frame Generation, there's generally relatively little difference between the two cards. But Alan Wake 2 might be different, and that's why we run some tests.
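For anyone curious how those numbers multiply out, here's a quick sketch of the per-card test matrix. The labels are shorthand, not the game's exact preset names, and I'm assuming the six RR runs are just the two RT presets repeated at each resolution:

```python
from itertools import product

# Sketch of the per-card test matrix described above. Labels are shorthand,
# and the RR breakdown (two RT presets x three resolutions) is an assumption.
resolutions = ["1080p", "1440p", "4K"]
presets = ["Medium", "Max", "Max + RT Low", "Max + RT High"]

primary = list(product(resolutions, presets))  # 3 x 4 = 12 runs per card
rr_runs = list(product(resolutions, ["Max + RT Low + RR", "Max + RT High + RR"]))  # 6 extra on RTX cards

print(f"{len(primary)} primary runs, plus {len(rr_runs)} RR runs on RTX cards")
```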
Then I provided some other data points to help with extrapolation. RTX 3090 and RTX 4070 Ti are usually roughly equal in performance. (Note that I didn't test the 4070 Ti or 4060 Ti, because it's easy to extrapolate those results. They fall between the card above and below, within a few percent of the midpoint.) In this case, with RT enabled, the 3090 falls a lot closer to the 4070 than usual, and even below it in some cases (1440p with RT High). It's not VRAM or raw bandwidth causing the difference, which means it's architecture.
Looking at the 3060 and 3090, you can also see that the 3090 is about 2.5X faster at RT High and 2.3X faster at RT Low (not that the 3060 is really playable in either case). Our GPU hierarchy puts the 3090 about 2.25X faster than the 3060 at 1080p ultra in RT games. So Alan Wake 2 with RT is maybe a bit more demanding than the typical RT game, but not by much. (Minecraft shows a 2.4X difference at 1440p, which lines up with Alan Wake 2 pretty well.) Basically, we can reference the usual scaling factors, or close enough.
So: the 3080 Ti will land just a touch below the 3090, the 3090 Ti just above it, and the 3080 about 10–15 percent below the 3090. Does it matter if it's 10% versus 15%? Not really; that's just splitting hairs. We don't need to retest every card on every game to have an idea of where it will land. You can take that or leave it.
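If it helps, here's a minimal sketch of that back-of-the-envelope estimate. The 40 fps input is a placeholder, not an actual 3090 measurement; the only point is how a tested card's result plus a relative scaling factor turns into an estimate for an untested card:

```python
def estimate_fps(tested_fps, relative_scale):
    """Estimate an untested card's fps from a tested card's result,
    given the untested card's performance relative to the tested one."""
    return tested_fps * relative_scale

# Placeholder number for illustration only, not an actual benchmark result.
rtx_3090_fps = 40.0

# The 3080 typically lands about 10-15 percent below the 3090.
low_estimate = estimate_fps(rtx_3090_fps, 0.85)   # ~34 fps
high_estimate = estimate_fps(rtx_3090_fps, 0.90)  # ~36 fps
print(f"Estimated RTX 3080: {low_estimate:.0f}-{high_estimate:.0f} fps")
```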
The net result is that, because this is an Nvidia tech-heavy game, I tested the relative top of each of the prior two generations to give a starting point. I also added the RTX 3060, which is actually the most popular 30-series card (and the most popular GPU overall). Everything lower down the list for the 30- and 20-series would obviously be slower and can be interpolated easily enough if you really want to.
The bottom line is that if you want to play with RT enabled, none of the lower 30-series or 20-series cards are going to deliver a great experience, and none of the AMD or Intel GPUs will be good either. The 3080 should manage 1080p RT High at around 35 fps, give or take 2 fps. The 3070 Ti and below won't break 30 fps unless you turn down settings or crank upscaling to performance or even ultra performance mode (and maybe not even then).
And if you don't want to enable RT? Then the game is far less demanding, but the relative positioning of the GPUs won't change. The 3080 at 1080p high (not max) should get around 80 fps, and even at 1440p high it can average more than 60 fps. That's with Quality upscaling, though.