News Redfall GPU Benchmarks: AMD, Intel and Nvidia Cards Tested

I knew Radeon had great raster performance but didn't see this coming 😳

@JarredWaltonGPU thank you for the benchmarks, I have been hearing this game won't have the easiest time running on the Xbox X, did you get any such sense from the PC performance? Those 1% lows on both 4K and 1080p are a bit rough, innit?
 
So in essence, the AMD sponsored title only features FSR and prefers AMD cards, while the Nvidia sponsored title features all upscalers and does not prefer Nvidia cards. Amazing job with that performance skewing, AMD.

@Elusive Ruse it's a shooter. I'm not really surprised; it seems like the same pattern as in other shooters, which has been going on for a while. Besides, it has been noted before that Nvidia cards drop off less at higher resolutions this generation. The situation changes at 4K. Will many people play at that resolution? Probably not, but it illustrates my point.
 
The game actually got a new patch/update a few hours ago. The size of the update is 65GB, which is ridiculous for a patch. Maybe you can retest some of the game's areas/levels.

Unfortunately, as observed by other gamers (I don't own the game myself), even after applying the patch Redfall still suffers from graphical issues like pop-in, ridiculous T-poses, blocky/pixelated shadows, and traversal stutters.
 
@JarredWaltonGPU thank you for the benchmarks, I have been hearing this game won't have the easiest time running on the Xbox X, did you get any such sense from the PC performance? Those 1% lows on both 4K and 1080p are a bit rough, innit?
This tends to be the way of open world games where you truly can go just about anywhere, any time. You get dips to lower fps when things that were out of range have to load in. It's probably also just a factor of using Unreal Engine at settings that push the hardware reasonably hard. Anyway, for the 6750 XT and above, minimums are above 60 fps, which isn't bad at all IMO.
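For anyone wondering how the 1% lows in these charts are typically derived, here's a minimal Python sketch of one common approach (averaging the slowest 1% of frame times). The function name and methodology here are illustrative assumptions, not necessarily how the article's benchmark tool calculates its numbers.

```python
# Illustrative only: one common way to compute average fps and "1% lows"
# from a list of per-frame render times (in milliseconds). Tools differ in
# exact methodology; this is not the method used for the article's charts.

def fps_metrics(frame_times_ms):
    """Return (average fps, 1% low fps) from per-frame times in ms."""
    if not frame_times_ms:
        raise ValueError("no frames captured")
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    # Take the slowest 1% of frames, average them, then convert to fps.
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 100)
    one_percent_low_fps = 1000.0 / (sum(worst[:count]) / count)
    return avg_fps, one_percent_low_fps

# Example: mostly ~10 ms frames (about 100 fps) with a few 30 ms hitches,
# which is roughly what streaming-related stutter looks like in a capture.
sample = [10.0] * 297 + [30.0] * 3
avg, low = fps_metrics(sample)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

A handful of load-in hitches barely move the average but drag the 1% lows way down, which is why open world streaming shows up in that metric first.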
 
So in essence, the AMD sponsored title only features FSR and prefers AMD cards, while the Nvidia sponsored title features all upscalers and does not prefer Nvidia cards. Amazing job with that performance skewing, AMD.
How is this any different from Nvidia's "The Way It's Meant to Be Played" campaign? From what I could find, this is exactly what that whole thing was about: direct help from Nvidia by way of optimizing the game's code for Nvidia hardware, and Nvidia optimizing its drivers for that game.

This is NO different, if that is the case.
 
How is this any different from Nvidia's "The Way It's Meant to Be Played" campaign?
Yeah, do you remember what HairWorks used to do (or GameWorks in general, for that matter)?
From what I could find, this is exactly what that whole thing was about: direct help from Nvidia by way of optimizing the game's code for Nvidia hardware, and Nvidia optimizing its drivers for that game.

This is NO different, if that is the case.
Never mind them. They either work for nVidia or have some vested interest in them, because their constant defending of everything that nVidia does and their weird attacks on anything that AMD does are clearly driven by emotion, not logic.

Sure, AMD has pulled some crap and I've ripped them a new one each time, but their worst anti-consumer actions have always paled in comparison to the crap that nVidia has pulled.
Never forget about the "GeForce Partner Program"

Meanwhile, in some places where nVidia has anti-consumer practices, AMD has pro-consumer practices. This is even true if the consumer in question owns a GeForce card.

AMD created Mantle, an API designed to lessen the gaming load on the CPU so that gamers' CPUs would stay viable for longer. This wasn't in AMD's best interest because they sell CPUs, but they made it anyway and released it for free as an open-source API. Microsoft used elements from it in DirectX 12. Then the Khronos Group took it and built Vulkan from it so that they could finally retire OpenGL.

Then of course, there's the fact that people who own GeForce RTX 30-series cards are denied DLSS3 and people who own ANY GeForce card beginning with GT or GTX can't use DLSS in any form at all. Meanwhile, those hapless GeForce owners can use AMD FSR without issue because, as was demonstrated with Mantle, AMD has always preferred open standards to locked proprietary solutions. It's also why their Linux drivers are so good and why nVidia's are so bad. Remember what Linus Torvalds, one of the greatest tech minds in history, had to say to nVidia.

We also owe the fact that we're no longer sandbagged with quad-core CPUs to AMD because Intel clearly felt that 8-core CPUs should cost over $1,000 USD and that 10-core CPUs should cost over $1,700 USD.

Let them ramble on about how "wonderful" Intel and nVidia are. Who knows, maybe they secretly own LoserBenchmark! 😉 😆
 
FSR is open source and works on all GPUs; can you say the same about DLSS?
What does that even matter? There is still no excuse not to include other upscalers, especially since each usually works best on its respective vendor's hardware, and there now exists a pipeline that makes implementing every upscaler easier. Which, btw, was developed by Nvidia. And Nvidia-sponsored titles, last I checked, didn't block other upscalers. We are, in fact, commenting on one of them right now.
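To make the abstraction-layer argument concrete (the Nvidia-developed pipeline being referenced is presumably Streamline), here is a purely hypothetical Python sketch, not any real SDK's API, of why exposing several upscalers behind one interface is mostly a registration exercise once such a layer exists. Every name below is invented for illustration.

```python
# Purely hypothetical sketch (not Streamline's real API): once a game talks to
# one upscaler interface, adding DLSS/FSR/XeSS becomes a registration problem.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class UpscaleRequest:
    render_width: int
    render_height: int
    output_width: int
    output_height: int

# Each backend is just a callable; real backends would wrap the vendor SDKs.
Backend = Callable[[UpscaleRequest], str]

BACKENDS: Dict[str, Backend] = {}

def register(name: str):
    """Decorator that adds an upscaler backend to the shared registry."""
    def wrap(fn: Backend) -> Backend:
        BACKENDS[name] = fn
        return fn
    return wrap

@register("fsr2")
def fsr2(req: UpscaleRequest) -> str:
    return f"FSR 2: {req.render_width}x{req.render_height} -> {req.output_width}x{req.output_height}"

@register("dlss")
def dlss(req: UpscaleRequest) -> str:
    return f"DLSS: {req.render_width}x{req.render_height} -> {req.output_width}x{req.output_height}"

def upscale(name: str, req: UpscaleRequest) -> str:
    # The game only ever calls this; which vendors appear in the settings menu
    # is a policy decision, not an engineering constraint.
    if name not in BACKENDS:
        raise KeyError(f"upscaler '{name}' not integrated")
    return BACKENDS[name](req)

print(upscale("fsr2", UpscaleRequest(1280, 720, 2560, 1440)))
```

The point of the sketch is that once the game only talks to something like `upscale()`, leaving a vendor out of the menu is a choice rather than extra engineering work.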

When a company restricts developers in which upscalers they can include, then none of this matters. AMD especially is absolutely notorious for bragging about the dumbest stuff, and FSR being more widespread than DLSS is one of them. They are that desperate to gain market share. However, if you simultaneously block other upscalers in sponsored titles, well, then that supposed dominance means jack. It's bought and nothing else. Anyone else would get massive backlash for this, but it's AMD doing it, so who cares, right?

And about AvroArrow, I can only say this: I have yet to see them criticize anything AMD did. Ever.