News RDNA 4's Unreal Engine 4 ray-tracing stutters may not be AMD-specific

I'm always curious: was this ever tried on Linux? What about Nvidia cards on both platforms? Whoops, the article addresses it.

It is very interesting that this was also affecting Intel's GPUs. That points back to the game itself.
 
It is very interesting that this was also affecting Intel's GPUs. That points back to the game itself.
That was hinted at by it being an Nvidia-specific version of UE4.

Basically, I'm assuming Nvidia paid for it to be made, and because they do ray tracing differently from Intel/AMD, those cards get stuttering from not being as well optimized as Nvidia's own.

It 100% sounds like an Nvidia thing to do, as they want you to NEED their products to play.
 
While I can't speak to the first Hellblade, I can definitely say that The Ascent has been a buggy mess since release. There are plenty of Steam reviews going over this, and it's not just an RT issue. I don't doubt the validity of the claims, but it's a terrible title to use as an example.
 
While I can't speak to the first Hellblade, I can definitely say that The Ascent has been a buggy mess since release. There are plenty of Steam reviews going over this, and it's not just an RT issue. I don't doubt the validity of the claims, but it's a terrible title to use as an example.
When it runs fine until the setting change, it's a valid test case.

Is the B580 /that/ far behind the 9070? Or is it more driver support still catching up?
 
*Uses an Nvidia-specific branch of an engine....

*Nvidia hardware also suffers weirdness, until Nvidia's handicap-tech is turned on....

We are definitely in the dark ages for gaming when the hardware the engine is made for (a gen or two down the line, at that) still needs upscalers and/or frame-gen tech just to get by.
 
When it runs fine until the setting change, it's a valid test case.

Is the B580 /that/ far behind the 9070? Or is it more driver support still catching up?
Weren't Intel's own folks saying at the last tech conference that they're working on something that should finally compete with the ~5060... the same conference where they unveiled the pro variants of what we already have?

If they themselves think the hardware they have out right now doesn't even compare to Nvidia's current bottom card, there are no mental gymnastics to be had trying to put down Radeon or elevate Intel here, architectural oddities aside.
 
Is the B580 /that/ far behind the 9070? Or is it more driver support still catching up?
In terms of raw FP32 compute, the RX 9070 brings 36.1 TFlops to the B580’s 13.6 TFlops (the RTX 5070 has 30.9 TFlops, for reference). Theoretical compute doesn’t translate to actual frame rates 1-to-1, but you can’t optimise away that big of a gap through drivers.
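Those figures pencil out to roughly a 2.7x raw-compute gap. A quick sanity check on the numbers quoted above:

```python
# FP32 throughput figures quoted above, in TFLOPS.
b580 = 13.6     # Intel Arc B580
rx9070 = 36.1   # AMD RX 9070
rtx5070 = 30.9  # Nvidia RTX 5070 (for reference)

# Ratios show why drivers alone can't close the gap.
print(f"RX 9070 vs B580:  {rx9070 / b580:.2f}x")   # ~2.65x
print(f"RTX 5070 vs B580: {rtx5070 / b580:.2f}x")  # ~2.27x
```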
 
How odd. It almost seems as if the programming behind Unreal Engine is undertaken in an occasionally slapdash fashion.

Or developers can't be bothered to perform more than token optimisation for anything other than Nvidia.

</conspiracytheorising>
I mean, this is the same thing as a certain era of games not running on AMD CPUs because they were targeted only at Intel back when it had 95+% of the market for CPUs.

It's the same reason every website targets Chrome. Why care about any standards when you can just target the monopoly player in the market?
 
I mean, this is the same thing as a certain era of games not running on AMD CPUs because they were targeted only at Intel back when it had 95+% of the market for CPUs.

It's the same reason every website targets Chrome. Why care about any standards when you can just target the monopoly player in the market?
Yes, I remember that era quite well. Lazy, and sometimes even biased, developers not helping to balance the market. And in this case, Epic have long been a bit overly moist for Nvidia, which probably doesn't help.

The website thing resonates too, though from the time when I was a web designer and awful, stupid, LOUSY Internet Exploder 6 (*pounds desk with remembered anger*) was the one you had to target for compatibility for years on end. Ugh, that thing was such a nightmare, and it seemed like it would never just go away.
 
The problem is with Unreal Engine itself. Unless the engine is explicitly told to do something like async compute, it won't. The Engine.ini file is the key to unlocking performance in a UE game; the trick is finding the right edits for your system and situation. I find Claude 4 Opus is the best option for finding the right settings for games and OC.
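For anyone who hasn't tried this: Engine.ini edits go under a `[SystemSettings]` section and set engine console variables. A minimal sketch only; which cvars exist (and actually help) varies by engine version and title, so treat these as illustrative rather than a recommended config:

```ini
[SystemSettings]
; Illustrative only -- cvar availability differs per UE4 version/title.
r.RayTracing.Reflections=0   ; turn off ray-traced reflections
r.RayTracing.Shadows=0       ; turn off ray-traced shadows
r.Streaming.PoolSize=3000    ; texture streaming pool size in MB
```

On Windows this file typically lives under `%LOCALAPPDATA%\<GameName>\Saved\Config\WindowsNoEditor\Engine.ini` for UE4 titles, where `<GameName>` is the game's internal project name.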