News AMD Polaris GPUs Stumble Over Forspoken’s DX12 Feature Requirements

zecoeco

Prominent
BANNED
Sep 24, 2022
83
113
710
This isn't unexpected for a six-year-old GPU.
It's already made obvious by the game's system requirements, which list the RDNA-based RX 5500 XT as the minimum card for minimum visuals (720p 30fps).
The game itself recommends an RTX 3070/RX 6700 XT for a good experience, and that is already pretty high-end.
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
498
194
18,870
Yep, the game is a GPU/CPU buster. It's hitting the seven-year support ceiling for AAA games; the developers just happened to set the cutoff at feature level 12_1.

If a lot of new games go that route, that might get new GPUs selling a tiny bit.
 
  • Like
Reactions: artk2219

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
In the meantime, Forspoken isn’t receiving the warmest reception from gamers. For example, if you check the Steam user reviews section, many recommend holding off for a few weeks or months until patches have ironed out the launch wrinkles.

The game annoyed me enough that I gave it a thumbs down on Steam and a 0 on Metacritic.

  1. The game ran very poorly on my 3060; I couldn't even get 60fps at 1080p. Ridiculous.
  2. The childish dialogue with constant cursing was horrible.
  3. The gameplay was very clumsy; I hate games where you have to select abilities on a center wheel during combat.
  4. The world is boring and monotonous.
  5. The $70 price for this disaster is ridiculous. Who does Square Enix think they are? Thank God for Steam refunds.
After the Death Stranding disaster, the Sonic Frontiers disaster, and now the Forspoken disaster, I've had about enough of Japanese games for now.

Looking forward to Hogwarts Legacy that actually looks good.
 

InvalidError

Titan
Moderator
Yep, the game is a GPU/CPU buster. It's hitting the seven-year support ceiling for AAA games; the developers just happened to set the cutoff at feature level 12_1.

If a lot of new games go that route, that might get new GPUs selling a tiny bit.
Or it could go the other way, with budget gamers simply skipping anything that requires a newer DX feature level than their GPU supports. On Steam's survey, Polaris (RX 470-590) accounts for 4-5% of the market; not too much of a loss for the developers.
 
  • Like
Reactions: artk2219
The minimum system requirements do not list Polaris GPUs, so this shouldn't come as a surprise, and I'm almost certain the game devs did not code the game for older hardware, hence the feature-level requirement. Though the game should at least have started on Polaris cards, no?

This means that gamers running older Polaris GPUs are unable to run Forspoken at all. The silver lining is that even if they could, the game would hardly run for them, since it demands a ton of performance; even modern GPUs have a hard time with it unless you use some sort of upscaling technique.
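For anyone curious what a feature-level gate actually amounts to: the sketch below is purely my own illustration, not Forspoken's actual code. The GPU-to-feature-level table is a hypothetical lookup (a real launcher would query the Direct3D 12 runtime, e.g. via ID3D12Device::CheckFeatureSupport), but it shows why an ordered comparison locks out 12_0 hardware like Polaris.

```python
# Hypothetical sketch of a feature-level gate; not the game's actual code.
# Feature levels are ordered, so a tuple comparison is enough: 12_0 < 12_1.
MAX_FEATURE_LEVEL = {
    # Illustrative values; real code queries the D3D12 runtime instead.
    "RX 580 (Polaris)": (12, 0),
    "GTX 1060 (Pascal)": (12, 1),
    "RX 5500 XT (RDNA)": (12, 1),
}

REQUIRED_LEVEL = (12, 1)  # what the game demands at startup

def can_launch(gpu: str) -> bool:
    """Return True if the (hypothetical) GPU meets the required feature level."""
    return MAX_FEATURE_LEVEL.get(gpu, (0, 0)) >= REQUIRED_LEVEL

print(can_launch("RX 580 (Polaris)"))   # False: Polaris tops out at 12_0
print(can_launch("GTX 1060 (Pascal)"))  # True
```

The point being that no amount of raw performance helps here: the check is a hard capability comparison, not a speed test.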

Even if you could run the game, the title itself isn't any good, with a lackluster storyline, generic gameplay, and some of the worst dialogue ever put in a AAA production. Nonetheless, another over-hyped AAA game bites the dust! Not to mention the many technical issues currently plaguing it.

If you ask me, graphics-wise Forspoken does not look as impressive as its initial in-engine reveal trailer. Its lighting system, in particular, looks as last-gen as it can get. I don't want to be negative here, but this does not feel like a current-gen-only game.

Don't get me wrong, it doesn't look THAT bad or anything. There are some cool particle effects, for instance, as well as amazing wind effects. However, even though the game does not use a dynamic day/night cycle, its pre-baked lighting looks worse than Final Fantasy XV's (assuming you have played that title).

Hell, it can even look worse than Uncharted 4. And then we have car lights that do NOT cast shadows; the original Watch_Dogs and Grand Theft Auto IV had this feature, but Forspoken is suddenly unable to provide it. Some distant objects/textures also look really blurry, and for a game requiring more than 10GB of VRAM at higher resolutions, that is unacceptable.

Forspoken is a huge technical disappointment. Not only was the game downgraded from its initial reveal, it also has numerous performance/optimization issues. Furthermore, the game does not justify its huge GPU requirements. And don't get me started on its borked RT implementation. Or its frame pacing issues. Or, more importantly, its AUDIO issues.


Luminous Productions will have to improve a LOT of things. Honestly, though, I don’t really believe they’re willing to do so. Nevertheless, I’d like to be proven wrong, so the ball is in Luminous’ and Square Enix’s court!
 
Last edited:

Neilbob

Distinguished
Mar 31, 2014
195
226
19,620
The system requirements these days always strike me as kind of amateurish, almost as if the people involved don't quite know what they're talking about. Most notably, at Ultra the display resolution is listed as 2160p, but at that kind of resolution CPU requirements typically decrease rather than increase, because the heavy lifting all falls to the graphics card. Hopefully some kind of benchmark will show it, but I'd bet the Ryzen 5 1600 in the minimum spec wouldn't be terribly far behind the Ryzen... 5... 5800X in the Ultra spec at 4K.

Not that I've any interest in this or pretty much any other modern game, and I've seen enough reviews of this one to recognise that it should be avoided, at least for a while.
 

Metteec

Distinguished
Jan 12, 2014
42
31
18,560
I would not count out Polaris just yet. It matters not that Forspoken fails on these older chips; it barely runs well on my RTX 3090, averaging 67 fps at 1440p with drops to 18 fps, plus texture pop-in and crashing in some instances. The graphics are blurry even at max settings. I don't think Forspoken represents "next gen" or what to expect from the future. Rather, it is just an example of lazy programming and poor optimization, another game rushed out the door by Square Enix. As another poster mentioned, there is still life in the Polaris GPUs, with other AAA games like Hogwarts Legacy, Dead Space, Star Wars Jedi: Survivor, Dead Island 2, and Starfield that will run well with the right graphics settings. You can bet those games will still look great at reasonable framerates even with a Polaris GPU.
 

brandonjclark

Distinguished
Dec 15, 2008
492
204
19,820
I would not count out Polaris just yet. It matters not that Forspoken fails on these older chips; it barely runs well on my RTX 3090, averaging 67 fps at 1440p with drops to 18 fps, plus texture pop-in and crashing in some instances. The graphics are blurry even at max settings. I don't think Forspoken represents "next gen" or what to expect from the future. Rather, it is just an example of lazy programming and poor optimization, another game rushed out the door by Square Enix. As another poster mentioned, there is still life in the Polaris GPUs, with other AAA games like Hogwarts Legacy, Dead Space, Star Wars Jedi: Survivor, Dead Island 2, and Starfield that will run well with the right graphics settings. You can bet those games will still look great at reasonable framerates even with a Polaris GPU.


Yup. This game is just REALLY poorly optimized. My 3090 runs just about anything at 1440p fast as hell.
 
Apr 1, 2020
1,366
1,019
7,060
Normally things like this wouldn't be a huge problem: Polaris is five years old, it was never anywhere near the top of the market (these were lower- and mid-range cards under $300), and budget cards come with the caveat that they won't last as long as flagships. DirectX 12_1 aside (the game also runs under Vulkan on Linux, and likely will on Windows soon), the GTX 1060 and the RX 590 are basically equal overall, and those cards barely make the minimum spec for 1280x720 at 30fps. The problem is that, thanks to a combination of factors including the crypto boom, inflation, and the pandemic, the last three generations of GPUs (counting RX 7000/RTX 4000) are still out of reach for most people to upgrade to.
 

russell_john

Honorable
Mar 25, 2018
114
81
10,660
It may be time to retire your AMD Polaris GPU as the first AAA games emerge which simply won't run due to DX12 feature level issues. On the other hand, Forspoken is getting a very mixed reception.

AMD Polaris GPUs Stumble Over Forspoken’s DX12 Feature Requirements : Read more

No way am I buying a game that can only do 30 FPS @ 1440p with a 3070... For cripes' sake, The Witcher 3 Next-Gen with broken CPU threading, basically running on only 2 threads, can do better than that with everything maxed out.

When are game engine developers ever going to learn to work with DX12?
 
  • Like
Reactions: phenomiix6

DavidLejdar

Prominent
Sep 11, 2022
236
139
760
No way am I buying a game that can only do 30 FPS @ 1440p with a 3070... For cripes' sake, The Witcher 3 Next-Gen with broken CPU threading, basically running on only 2 threads, can do better than that with everything maxed out.

When are game engine developers ever going to learn to work with DX12?
Then again, the recommended hardware may refer to max settings (with ray tracing), in which case it wouldn't necessarily be that bad. E.g. I recently picked up Cyberpunk 2077, and with an RX 6700 I get around 40 FPS at max settings with ultra ray tracing, at 1440p with FSR. The benchmark shows around 10 more FPS with ray tracing turned off, and 50 FPS with max graphics (albeit without ray tracing), perhaps a few more with 3D V-Cache soon. That's something I can "endure" for max viewing distance, a detailed game world (where NPCs and trees aren't just based on one template), reflections, weather effects, vivid visuals from interactions, and so on. Which isn't to say that optimization isn't a great thing, but there's only so much software can do when games advance to visually richer worlds.

The system requirements...
On the technical side, there is no reason why the CPU load would be lower at a higher resolution; rather the contrary. During the loading phase, for example, the amount of decompression done by the CPU doesn't decrease, and during gameplay the user may actually want to see bigger crowds/more NPCs at 4K (where applicable), which the CPU has to do extra calculations for. The role of a better CPU may not be as significant to the end result as a better GPU, but if you try to run 4K on an i7-3770/R5 1600 (possibly also with PCIe 3.0 for the GPU), I'd say there are likely to be quite some stutters at the least, as even a 4090 would probably spend quite a while waiting for the CPU to do its thing.
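Both positions in this exchange (CPU requirements falling at 4K vs. per-frame CPU work staying constant) show up in a toy frame-time model. This is my own illustration with made-up numbers, not a benchmark: the assumption is that per-frame CPU cost is roughly resolution-independent while GPU cost scales with pixel count, and the slower of the two bounds the frame rate.

```python
# Toy frame-time model with made-up numbers; an illustration, not a benchmark.
# Assumption: per-frame CPU work is roughly resolution-independent, while
# GPU work grows with the number of pixels. The slower side sets the frame rate.
def fps(cpu_ms: float, gpu_ms_per_mpix: float, width: int, height: int) -> float:
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)  # the bottleneck wins
    return 1000.0 / frame_ms

CPU_MS = 8.0           # hypothetical fixed CPU cost per frame
GPU_MS_PER_MPIX = 6.0  # hypothetical GPU cost per megapixel

print(round(fps(CPU_MS, GPU_MS_PER_MPIX, 1920, 1080)))  # 80: GPU-bound already
print(round(fps(CPU_MS, GPU_MS_PER_MPIX, 3840, 2160)))  # 20: CPU has slack each frame
```

In this sketch the CPU's 8 ms per frame never shrinks, so a faster CPU doesn't raise the 4K frame rate; but a CPU that spikes past the GPU's frame time (decompression bursts, big crowds) produces exactly the stutter described above.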
 

kiniku

Distinguished
Mar 27, 2009
244
65
18,760
Disappointing to say, but AMD is still, today, the JV team for GPUs: uncompetitive except on price. I was also hoping Intel's new Arc line could give Nvidia some competition. But not yet.
 

blacknemesist

Distinguished
Oct 18, 2012
482
80
18,890
"Holding off for a few weeks or months"? Do yourself a favor and wait a year; there are enough games out there.
It's a Square Enix game, so it's never going to be fixed by the dev team. The only hope is modding to tamper with the engine so it actually doesn't stutter at 4K on a 4090 with a 7700X. The worst part is that most of the workload is basically clutter that fills the screen with ugly particle effects dialed up to 11.
 
This doesn't sound like a hardware limitation. I've been reading reports on the AMD subreddit of people playing the game on Linux on an RX 480 with Linux (meaning not AMD's) drivers. So either it's not the hardware, or software/driver workarounds are possible. They even report decent enough performance... it's "playable".

As an aside, it seems pretty good on my system: no hiccups, and it breezes through the benchmarks on "ultra" settings at 1440p. I'm running a 5800X and a 6800 XT, so pretty decent, but hardly top of the line. And "only" 16GB of system memory; I had read that wouldn't be enough at ultra settings. I've really only messed around in the benchmarks so far, trying to find a balance of settings I like before starting the actual game. And if the 1-3 second scene load times in the benchmark are an indication of what DirectStorage can do, all I can say is WOW.

I do wonder how much better it will get when (if) AMD releases a game-optimized driver for RDNA 2. It does seem a bit odd, though, since this was one of the free games in the promotion when I bought the 6800 XT.
 
Last edited:
The PC game on everyone’s lips in January 2023
Forspoken is getting terrible reviews, and everyone seems to hate it. : P

Currently, the PC release has only 55% positive reviews on Steam, and a 1.9 user score on Metacritic. Only two professional outlets on Metacritic have bothered to review the PC version so far, with the average Metascore currently at 58, which is about as bad as big releases get. And this isn't just down to issues with the PC version, as the reception has been nearly as bad on the PS5, where Forspoken currently holds a 66 Metascore based on 81 reviews, along with a 3.5 user score. It doesn't seem to be a particularly well-received game on either platform, and the issues appear to go beyond performance.

The game itself recommends an RTX 3070/RX 6700 XT for a good experience, and that is already pretty high-end.
Not quite. They recommend that hardware for a subpar 30fps experience at 1440p. The game is so poorly optimized that it doesn't sound like any of today's high-end systems can ensure a solid 60fps experience, and that's pretty bad for a game that was showcased for its fast parkour movement.

If you ask me, graphics-wise Forspoken does not look as impressive as its initial in-engine reveal trailer. Its lighting system, in particular, looks as last-gen as it can get. I don't want to be negative here, but this does not feel like a current-gen-only game.

Don't get me wrong, it doesn't look THAT bad or anything. There are some cool particle effects, for instance, as well as amazing wind effects. However, even though the game does not use a dynamic day/night cycle, its pre-baked lighting looks worse than Final Fantasy XV's (assuming you have played that title).

Hell, it can even look worse than Uncharted 4. And then we have car lights that do NOT cast shadows; the original Watch_Dogs and Grand Theft Auto IV had this feature, but Forspoken is suddenly unable to provide it. Some distant objects/textures also look really blurry, and for a game requiring more than 10GB of VRAM at higher resolutions, that is unacceptable.

Forspoken is a huge technical disappointment. Not only was the game downgraded from its initial reveal, it also has numerous performance/optimization issues. Furthermore, the game does not justify its huge GPU requirements. And don't get me started on its borked RT implementation. Or its frame pacing issues. Or, more importantly, its AUDIO issues.
In my opinion, just based on the official marketing materials on Steam, the game looks pretty bad for a supposedly AAA release. Some parts of the visuals look decent, like some of the environments, but much of the lighting and many of the details look at times like something straight out of a game from 15+ years ago. Take this screenshot from the game's product page, for example...
https://cdn.cloudflare.steamstatic...._df2bd772883feab597c99364ae1720d77be3d6aa.jpg

That honestly looks like something straight out of The Elder Scrolls IV: Oblivion, an early Xbox 360 title from 2006. And this is an image they felt would showcase the game in a way that would encourage us to buy it. What are they even trying to show us in this unshaded, blurry mess? Nearly all of the images on the product page look little better. Did the marketing department need to upscale the game from 720p low settings just to get it running on their MacBook for the screenshots, or is this how the game is intended to look? Or perhaps they are just trying to manage expectations with these images.
 

KyaraM

Admirable
Mar 11, 2022
1,494
661
6,790
This isn't unexpected for a six-year-old GPU.
It's already made obvious by the game's system requirements, which list the RDNA-based RX 5500 XT as the minimum card for minimum visuals (720p 30fps).
The game itself recommends an RTX 3070/RX 6700 XT for a good experience, and that is already pretty high-end.
At 1440p, though. They recommend a 1060 as the minimum on the NVIDIA side.

@cryoburner it looks quite pretty on PS5. I'll play the benchmark a bit later to give an opinion on the PC side, but the visuals should be quite similar. There are some areas where shading and visuals are distorted on purpose, though, due to the setting the game takes place in. This looks like one of them.
 
In my opinion, just based on the official marketing materials on Steam, the game looks pretty bad for a supposedly AAA release. Some parts of the visuals look decent, like some of the environments, but much of the lighting and many of the details look at times like something straight out of a game from 15+ years ago. Take this screenshot from the game's product page, for example...
https://cdn.cloudflare.steamstatic...._df2bd772883feab597c99364ae1720d77be3d6aa.jpg

That honestly looks like something straight out of The Elder Scrolls IV: Oblivion, an early Xbox 360 title from 2006. And this is an image they felt would showcase the game in a way that would encourage us to buy it. What are they even trying to show us in this unshaded, blurry mess? Nearly all of the images on the product page look little better. Did the marketing department need to upscale the game from 720p low settings just to get it running on their MacBook for the screenshots, or is this how the game is intended to look? Or perhaps they are just trying to manage expectations with these images.

I totally agree with your points. This game doesn't look like a current/next-gen title, and that screenshot is shockingly awkward, lol. I guess there was some bad game design, or the Luminous Engine is not on par with engines like Anvil, 4A Engine, id Tech, Unity, CryEngine, REDengine, or Unreal Engine, just to name a few.

Some of the older Metro, Assassin's Creed, and Far Cry titles still have great graphics to this day, and they look far better than Forspoken. But these days the AAA gaming industry is doomed, and it's getting worse with each passing year.

Some game devs don't even beta test or code their games properly, optimize them for varying hardware, or do a thorough performance analysis in their labs prior to release.
 

cfbcfb

Reputable
Jan 17, 2020
96
58
4,610
This isn't unexpected for a six-year-old GPU.
It's already made obvious by the game's system requirements, which list the RDNA-based RX 5500 XT as the minimum card for minimum visuals (720p 30fps).
The game itself recommends an RTX 3070/RX 6700 XT for a good experience, and that is already pretty high-end.

The 1060 came out a year earlier than the RX 580, and the 580 was at one point being sold against the 1650. Both of those Nvidia cards will run the game.
 
  • Like
Reactions: KyaraM

KyaraM

Admirable
Mar 11, 2022
1,494
661
6,790
I totally agree with your points. This game doesn't look like a current/next-gen title, and that screenshot is shockingly awkward, lol. I guess there was some bad game design, or the Luminous Engine is not on par with engines like Anvil, 4A Engine, id Tech, Unity, CryEngine, REDengine, or Unreal Engine, just to name a few.

Some of the older Metro, Assassin's Creed, and Far Cry titles still have great graphics to this day, and they look far better than Forspoken. But these days the AAA gaming industry is doomed, and it's getting worse with each passing year.

Some game devs don't even beta test or code their games properly, optimize them for varying hardware, or do a thorough performance analysis in their labs prior to release.
At least play the demo before judging from some screenshots, holy cow... it's free and gives a really accurate impression of the full game since it, well, comes from the full game. But I guess bashing is easier than trying.
 
  • Like
Reactions: kyzarvs