Testing GPUs with AMD FSR3 and Avatar: Frontiers of Pandora — 16 graphics cards and hundreds of benchmarks

I feel extremely grateful for what AMD did. We consumers are the main beneficiaries. With an open-source approach and decent quality, expect FSR to become the modders' playground, and maybe even Apple will join the game.
 
Ok, so, yeah....WTH? This game is clearly broken somehow if not even the absurdly-expensive RTX 4090 can manage 60FPS average at 4K. It's like AMD and nVidia are trying to outdo each other with the question being "Who can sponsor a title that breaks video cards better?".

The opening salvo came from nVidia with CP2077 (which crushes video cards with its insane RT implementation) and now AMD answers with AFOP (which crushes video cards with its insane rasterisation requirements). Meanwhile, we as consumers are caught in the crossfire (which is pretty apt because games like this make me really miss Crossfire).

"Does the unobtanium mode look significantly better, though? Not really. We'll compare images below, and there are a few differences, but in general it's just much more demanding for extremely minor gains in image fidelity."

^^^^I'm actually chuckling IRL at this because it's exactly how I describe ray-tracing.^^^^ 😊
 
  • Like
Reactions: LuxZg and vehekos
Ok, so, yeah....WTH? This game is clearly broken somehow if not even the absurdly-expensive RTX 4090 can manage 60FPS average at 4K. It's like AMD and nVidia are trying to outdo each other with the question being "Who can sponsor a title that breaks video cards better?".

The opening salvo came from nVidia with CP2077 (which crushes video cards with its insane RT implementation) and now AMD answers with AFOP (which crushes video cards with its insane rasterisation requirements). Meanwhile, we as consumers are caught in the crossfire (which is pretty apt because games like this make me really miss Crossfire).

"Does the unobtanium mode look significantly better, though? Not really. We'll compare images below, and there are a few differences, but in general it's just much more demanding for extremely minor gains in image fidelity."

^^^^I'm actually chuckling IRL at this because it's exactly how I describe ray-tracing.^^^^ 😊
Pretty sure Unobtanium turns on more ray tracing along with higher resolution textures. Problem is that there's no clear description of the various settings and when they do / don't use ray tracing. Or maybe Unobtanium increases the cutoff distance for RT calculations?

Anyway, I'm a firm believer that RT can improve image quality, when done properly, but doing it properly requires a lot more RT than just doing some shadows, AO, or even reflections. Doing full global illumination (meaning, all the lighting and such calculated via RT, i.e. what Nvidia calls path tracing) probably requires at least 500 rays per pixel, and even that isn't really sufficient. (RT in Hollywood movies is probably closer to ~10K or more rays per pixel, and they're doing that at 8K these days.)

There are lots of things in CP77 where the difference between path tracing and even RT ultra is quite substantial, if you know what you're looking for. It doesn't fundamentally alter the game, but it does look better / more accurate. Same for Alan Wake 2. Avatar isn't doing full RT by any stretch, even at Unobtanium, but it does more shader calculations per pixel for sure. A lot of those calculations are just going to be the same as the faster approximations done at ultra, though.
 
  • Like
Reactions: Avro Arrow
@JarredWaltonGPU

I’m asking this as a 4090 owner, who would love to try this at 4K unobtainium: in your experience, is the frame rate situation even slightly improvable through better versions of drivers and/or updates?
 
Pretty sure Unobtanium turns on more ray tracing along with higher resolution textures. Problem is that there's no clear description of the various settings and when they do / don't use ray tracing. Or maybe Unobtanium increases the cutoff distance for RT calculations?

Anyway, I'm a firm believer that RT can improve image quality, when done properly, but doing it properly requires a lot more RT than just doing some shadows, AO, or even reflections. Doing full global illumination (meaning, all the lighting and such calculated via RT, i.e. what Nvidia calls path tracing) probably requires at least 500 rays per pixel, and even that isn't really sufficient. (RT in Hollywood movies is probably closer to ~10K or more rays per pixel, and they're doing that at 8K these days.)
Oh, there's no question about that. What I've always believed is that while RT and/or PT are the future, PC tech isn't advanced enough to properly use them yet. When it's advanced enough to use RT and/or PT smoothly at 60FPS at 1440p or 2160p, then (and only then) will it be worth it to me to turn it on. This is because I honestly believe that games are beautiful already without it, so it's not like anybody's suffering if RT/PT isn't turned on.

Hell, growing up, I gamed on an Atari, ColecoVision, Intellivision, TRS-80, C64, NES, Genesis, N64, PS2, PS4 and PC. I guess that since I've been gaming so long and have experienced pretty much every level of graphics, what matters most to me is the game's content. I don't think I'm unique in that regard when you consider how popular mobile games and online MMORPGs are despite the fact that they don't even come close to matching the graphical fidelity of modern AAA titles. A game is either fun or it's not, regardless of how pretty the graphics are.
There are lots of things in CP77 where the difference between path tracing and even RT ultra is quite substantial, if you know what you're looking for. It doesn't fundamentally alter the game, but it does look better / more accurate. Same for Alan Wake 2. Avatar isn't doing full RT by any stretch, even at Unobtanium, but it does more shader calculations per pixel for sure. A lot of those calculations are just going to be the same as the faster approximations done at ultra, though.
I absolutely agree with you but it's like I said, our tech isn't there yet so it's horrifically expensive and its performance generally sucks buttocks. Like, just think of how many people foolishly spent over $2,000 for an RTX 2080 Ti mere months before the launch of the RTX 30-series cards. At the time, the RTX 2080 Ti was "the pinnacle of RT performance" but, as we look at it now, it still sucked, despite the insane price.

When the RTX 20-series was first released, Jensen Huang said his (in)famous words:
"EVERYTHING JUST WORKS!"

Here we are, five years later (which, as you know, is an eternity in PC tech) and it's still not true. How he managed to convince so many just blows my mind.
 
  • Like
Reactions: LuxZg and vehekos
I will never play this game, because I'm absolutely creeped out at being forced to play as a woman in first person.

I watched YouTube videos, and there is something in the sound design that provokes a deep feeling of disgust in me.
 
  • Like
Reactions: Avro Arrow
I will never play this game, because I'm absolutely creeped out at being forced to play as a woman in first person.

I watched YouTube videos, and there is something in the sound design that provokes a deep feeling of disgust in me.

Is there a problem as to how women are treated, even in fiction?

This is a rhetorical question.

It's quite insane that the games did not feel sluggish.

I guess that the tech would have its use in slower games.
 
  • Like
Reactions: PEnns
@JarredWaltonGPU

I’m asking this as a 4090 owner, who would love to try this at 4K unobtainium: in your experience, is the frame rate situation even slightly improvable through better versions of drivers and/or updates?
I mean, anything is possible but is it likely? I don't know. It's an AMD-promoted game, so in my experience that means Nvidia is less likely to do heavy optimizations for it, unless the game becomes very popular.

I think the Unobtanium setting right now might be a bit like the shaders per pixel setting in Metro Exodus. If you're not aware, lower settings in that game can do 1 shader per two pixels, ultra is a 1:1 ratio, and extreme is I think two shaders per pixel — and you can manually set that even higher, to like four shaders per pixel or whatever.

This is speculation rather than fact, but I do think a big part of the performance hit is basically just doubling up on calculations. Like super-sampling, that can improve image quality a bit but is a very costly approach. Toss in additional RT calculations and you get the >50% performance drop I measured. But maybe it's something else.
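As a rough illustration of that "doubling up" idea, here's a tiny cost model; the 0.5x/1x/2x/4x ratios come from the Metro Exodus example above, and the assumption that cost scales linearly with the ratio is a simplification:

```python
# Toy cost model for a shaders-per-pixel setting at 4K.
# The 0.5x / 1x / 2x / 4x ratios come from the Metro Exodus example;
# assuming cost scales linearly with the ratio is a simplification.

def shader_invocations(width, height, shaders_per_pixel):
    """Shader evaluations per frame at a given shading rate."""
    return int(width * height * shaders_per_pixel)

ultra = shader_invocations(3840, 2160, 1.0)  # 1:1, the "ultra" baseline
for rate in (0.5, 1.0, 2.0, 4.0):
    work = shader_invocations(3840, 2160, rate)
    print(f"{rate}x shading rate -> {work / ultra:.2f}x the ultra workload")
```

Under that (crude) model, doubling the shading rate alone would roughly halve the frame rate, which lines up with a >50% drop once extra RT work is stacked on top.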

Note: I did snap some comparison images of the settings, so there are three shadow-related increases with Unobtanium, three more reflections-related, and two volumetric fog options. Two of the shadow settings explicitly mention RT, and the same for two of the reflections settings.
 
  • Like
Reactions: valthuer
I will never play this game, because I'm absolutely creeped out at being forced to play as a woman in first person.
I'm not creeped out by it but I do prefer playing a male character since, you know, I'm male. In AC:Odyssey, I only ever played as Alexios because playing as Kassandra would've felt really fake to me.
I watched YouTube videos, and there is something in the sound design that provokes a deep feeling of disgust in me.
I'll have to check that out because I've never heard of something like that before.
 
I will never play this game, because I'm absolutely creeped out at being forced to play as a woman in first person.

I watched YouTube videos, and there is something in the sound design that provokes a deep feeling of disgust in me.
Erm... you can choose to play as a male Na'vi. And you can choose what voice it should use. At least, I'm pretty sure those were options at one point, and I'm pretty sure my character is a male Na'vi.
 
I mean, anything is possible but is it likely? I don't know. It's an AMD-promoted game, so in my experience that means Nvidia is less likely to do heavy optimizations for it, unless the game becomes very popular.

I think the Unobtanium setting right now might be a bit like the shaders per pixel setting in Metro Exodus. If you're not aware, lower settings in that game can do 1 shader per two pixels, ultra is a 1:1 ratio, and extreme is I think two shaders per pixel — and you can manually set that even higher, to like four shaders per pixel or whatever.

This is speculation rather than fact, but I do think a big part of the performance hit is basically just doubling up on calculations. Like super-sampling, that can improve image quality a bit but is a very costly approach. Toss in additional RT calculations and you get the >50% performance drop I measured. But maybe it's something else.

Note: I did snap some comparison images of the settings, so there are three shadow-related increases with Unobtanium, three more reflections-related, and two volumetric fog options. Two of the shadow settings explicitly mention RT, and the same for two of the reflections settings.

Thanks for the extended post! I've always liked graphically impressive games; I'm thinking about buying it, and I was wondering if it's worth it.

P.S. I was always wondering if I'm the only 4090 user getting 24 FPS in Metro Exodus Enhanced, with 4K Ultra RT and a Shading Rate of 4.0 🤣
 
Thanks for the extended post! I've always liked graphically impressive games; I'm thinking about buying it, and I was wondering if it's worth it.

P.S. I was always wondering if I'm the only 4090 user getting 24 FPS in Metro Exodus Enhanced, with 4K Ultra RT and a Shading Rate of 4.0 🤣
So, how you feel about the game is probably going to reflect how you feel about the movies. There's definitely a "humans are bad, they destroy everything, nature is the best!" vibe from what I've played so far. And I will say there's a propensity for the game to hitch while loading new data (which doesn't really show up in my benchmarks as I run the same test multiple times and thus the first run with hitching gets tossed).
 
  • Like
Reactions: valthuer
The movie was bad enough... the thought of a game gives me the heebie-jeebies.

The movie won various awards and earned almost 3 billion dollars at the box office!

Not bad for a "bad" movie. Just because you don't like it doesn't make it bad.

I think the game also looks amazing and justifies a high end card.
 
  • Like
Reactions: Bamda
I will never play this game, because I'm absolutely creeped out at being forced to play as a woman in first person.

I watched YouTube videos, and there is something in the sound design that provokes a deep feeling of disgust in me.
Oddly enough, I always play a female character in games that give me a choice. I played The Witcher even though I'm "forced" to play a male main character. So what's the issue? Andrew Tate syndrome?
 
[Two screenshots comparing the two settings]


Now do an article with several direct IQ comparisons side by side with those two settings, since they're rendering at about the same resolution (12% higher with the AI'd "3840x2160"), and tell people whether they should go with software trickery or native.
 
Why is nobody anywhere talking about how terrible the water looks? Water is the worst-looking aspect of the game. It bothers me every time I see it. It moves like thin oil, flickers, the reflections are strange and too strong, and water borders are shaded strangely. It often ruins the immersion for me. I'm playing on PC with a 4090 and Unobtanium settings. RDR2 did water so much better. Otherwise, it's just an oversaturated game with lots of foliage.

Fix the water and introduce path tracing and I'll really be convinced this is the best-looking game.
 
Ok, so, yeah....WTH? This game is clearly broken somehow if not even the absurdly-expensive RTX 4090 can manage 60FPS average at 4K. It's like AMD and nVidia are trying to outdo each other with the question being "Who can sponsor a title that breaks video cards better?".

The opening salvo came from nVidia with CP2077 (which crushes video cards with its insane RT implementation) and now AMD answers with AFOP (which crushes video cards with its insane rasterisation requirements). Meanwhile, we as consumers are caught in the crossfire (which is pretty apt because games like this make me really miss Crossfire).

"Does the unobtanium mode look significantly better, though? Not really. We'll compare images below, and there are a few differences, but in general it's just much more demanding for extremely minor gains in image fidelity."

^^^^I'm actually chuckling IRL at this because it's exactly how I describe ray-tracing.^^^^ 😊
The 4090 is now over a year old. Yeah, it's expensive, but the developers probably expected the 5xxx series to be out by now. Or some kind of stopgap upgrade.
 
The 4090 is now over a year old. Yeah, it's expensive, but the developers probably expected the 5xxx series to be out by now. Or some kind of stopgap upgrade.

Nah. Nvidia has been known to have a two-year life cycle on its products.

True, they're about to break that by releasing the RTX 50-series in 2025. But that piece of news is six months old:

https://www.tomshardware.com/news/nvidia-ada-lovelace-successor-in-2025

Besides, assuming that Nvidia had stayed true to its original two-year cycle, even the most overoptimistic developer already knew not to expect new GPUs before the fall of 2024.

The 4090 Ti was known to have been cancelled five months ago:

https://www.tomshardware.com/news/n...x-4090-ti-plans-512-bit-bus-next-gen-flagship

So, if not even year-old flagships like the 7900 XTX and 4090 can achieve satisfying framerates, I'm not sure what the rest of the PC users should be looking forward to.

There's just no excuse for Avatar, if you ask me. It's the devs' job to properly optimise the game for the best hardware currently available, instead of relying on FSR and DLSS to do the laundry for them.

It's like this guy correctly pointed out:

Every game developer wants to take the new "but can it play Crysis" meme and they use Ultra to do it.
 
The best thing I got from this review is that FINALLY you can get a low-FPS experience upgraded to high FPS and have it feel good. Case in point: 26 FPS becoming smooth and playable. THAT is the only thing I ever wanted from DLSS/FSR with frame generation. If I hear in the near future that they also fixed the UI rendering bug, and basically you can go from native rendering to frame gen without side effects AND get 26 FPS to feel like 40 FPS and look like almost 60 FPS, I'll be sold on frame gen.

But that will still be just first game.

Most upscaling and frame gen, especially on the DLSS side, already expects a monster GPU, and then you get nice numbers like 100 FPS becoming 200 FPS. THAT approach sucks. I want a 20 FPS experience to *magically* become 40(+) with none of the side effects or sluggishness of 20 FPS. Which maybe Avatar can make come true. But I'll hold my decision until there's more info. And more games doing it, not just one.
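The arithmetic behind that complaint can be sketched with a simplified interpolation-style frame-gen model (the one-frame hold-back for latency is an assumption about how such interpolation typically works, not a measured figure):

```python
# Why frame generation helps smoothness more than responsiveness:
# it doubles presented frames, but input is still sampled at the native
# rate, and interpolation holds back the newest real frame by one interval.
# Simplified model; real pipelines add their own overheads.

def frame_gen_estimate(native_fps):
    native_ms = 1000.0 / native_fps
    presented_fps = native_fps * 2     # one generated frame per real frame
    extra_latency_ms = native_ms       # hold-back for interpolation
    return presented_fps, native_ms, extra_latency_ms

for fps in (20, 26, 100):
    presented, frame_ms, extra = frame_gen_estimate(fps)
    print(f"{fps:>3} FPS native -> {presented:.0f} FPS presented, "
          f"~{extra:.1f} ms added latency")
```

Under this model, 100 -> 200 FPS feels effortless (only ~10 ms of added latency) while 20 -> 40 FPS still carries the ~50 ms feel of the underlying 20 FPS input loop, which is why the low-end case is the harder one to get right.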

As for the conversation above on RT, I similarly wait for a game that does it right. I puke at screenshots where wet asphalt shines and reflects like a perfect mirror; those games get discarded immediately as unrealistic and just a hardware hog. Yes, CP2077 as the prime example. I don't mind upgraded reflections, but that's not it. The most I expect from a good RT implementation is realistic and well-done global illumination. Again, not overdone as in CP2077.

I mean, look at photos of Vegas or NY Times Square in daytime, at night, or in rain, compare that to something like CP2077 (and almost everyone else), and you'll see what I mean. Those real places never shine or reflect as much as games do when promoting RT. Asphalt is a very rough material; concrete, building facades, even much of the metal and metallic paint (e.g. cars) is "bumpy" and imperfect enough that it can't reflect a perfect picture like a mirror. A glass window rarely has a mirror-like reflection in real life! If you look at a building across the street and can snag a photo with more than 10% of the glass surface reflecting like a mirror, you get published in National Geographic as photo of the month. Now try the reflections while flying (swinging) around as Spider-Man, where suddenly the whole of Manhattan is a perfect mirror. Yeah, that's why I hate RT *IMPLEMENTATIONS* in general.
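There is real physics behind that observation: for dielectrics like asphalt and glass, Fresnel reflectance is only a few percent head-on and climbs toward a mirror only at grazing angles, and roughness then blurs (rather than brightens) the reflection. A quick sketch using Schlick's approximation, with F0 = 0.04 as a typical dielectric base reflectance:

```python
import math

# Schlick's approximation to Fresnel reflectance for a dielectric.
# F0 ~= 0.04 is a common base reflectance for materials like asphalt,
# concrete, and glass; surface roughness (not modeled here) then spreads
# the reflected light instead of producing a sharp mirror image.

def schlick(cos_theta, f0=0.04):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

for angle_deg in (0, 45, 75, 89):
    cos_theta = math.cos(math.radians(angle_deg))
    print(f"{angle_deg:>2} deg from normal: "
          f"{schlick(cos_theta) * 100:5.1f}% reflected")
```

Head-on, only about 4% of light reflects, so a wall of windows or a wet street acting like a full mirror at normal viewing angles, as in many RT showcases, overstates what the physics actually predicts.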

So, when frame gen can make a low-end GPU outputting a native 20-ish FPS feel like 40+ FPS, I will be using it. When RT isn't just an advertisement for mirrors and overblown shine and reflections, I will start putting my thumbs up. I guess it'll take 10 years for frame gen and RT to do both of those in tandem on a $200 GPU (or integrated graphics). See you then.