News God of War (2018) PC Performance: DLSS vs. FSR Tested

Makaveli

Splendid
We tested the God of War PC release on a collection of the latest graphics cards to see how it runs, as well as what settings you'll need for a smooth gaming experience.

God of War (2018) PC Performance: DLSS vs. FSR Tested : Read more

What driver was used for AMD?

AMD Radeon Software Adrenalin 22.1.1

God of War
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 16 GB Radeon RX 6900 XT graphics card, versus the previous software driver version 21.12.1.
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 16 GB Radeon RX 6800 XT graphics card, versus the previous software driver version 21.12.1.
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 12 GB Radeon RX 6700 XT graphics card, versus the previous software driver version 21.12.1.
 
And all the benchmarks are in 4k.
Very useful info. It's like everybody had RTX 3080 class graphics in their system. LOL.

How about more realistic 1440p and 1080p benchmarks?
DLSS and FSR testing basically overlaps with testing at 1080p and 1440p. No, it's not the exact same, but at some point I just have to draw the line and get the testing and the article done. God of War, as noted, doesn't support exclusive fullscreen; it only runs in borderless window mode at the desktop resolution. That means to change from running at 4K in borderless window mode to testing 1440p, I have to quit the game, change the desktop resolution, and relaunch the game. Then do that again for 1080p. That would require triple the amount of time to test. Alternatively, I could drop a bunch of the DLSS and FSR testing and just do the resolution stuff. Either way, something was going to get missed, and I decided to focus on DLSS and FSR.

As pointed out in the text, an RX 5600 XT at 4K original quality with FSR quality mode achieved 64 fps average performance. Which means anything faster would be playable at 4K as well — that includes the entire RTX 20-series. Drop to 1440p and original quality and performance would be even higher, because FSR doesn't match native resolution performance exactly. Anyway, quality mode upscales 1440p to 4K, so 1440p is easily in reach of previous generation midrange GPUs, and 1080p would drop the requirements even further.
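For anyone who wants to sanity-check that, here's a minimal sketch of the math. It isn't from the game or the article; it just assumes FSR 1.0's published per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x), and the helper name is purely illustrative:

```cpp
#include <cstdio>

struct Resolution { int width; int height; };

// Hypothetical helper: divide the output resolution by FSR 1.0's per-axis
// scale factor to get the internal render resolution (rounded to nearest).
Resolution fsrInternalResolution(Resolution output, double perAxisScale) {
    return { static_cast<int>(output.width / perAxisScale + 0.5),
             static_cast<int>(output.height / perAxisScale + 0.5) };
}

int main() {
    const Resolution uhd{3840, 2160};  // 4K output
    const struct { const char* mode; double scale; } modes[] = {
        {"Ultra Quality", 1.3}, {"Quality", 1.5},
        {"Balanced", 1.7}, {"Performance", 2.0},
    };
    for (const auto& m : modes) {
        Resolution r = fsrInternalResolution(uhd, m.scale);
        std::printf("%-13s -> %dx%d internal\n", m.mode, r.width, r.height);
    }
    return 0;
}
```

Quality mode at a 4K output lands on 2560x1440 internally, which is why 4K plus FSR Quality roughly tracks a native 1440p GPU load, minus the small cost of the upscale pass itself.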
 
  • Like
Reactions: NP and Makaveli
What driver was used for AMD?

AMD Radeon Software Adrenalin 22.1.1

God of War
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 16 GB Radeon RX 6900 XT graphics card, versus the previous software driver version 21.12.1.
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 16 GB Radeon RX 6800 XT graphics card, versus the previous software driver version 21.12.1.
  • Up to 7% increase in performance in God of War @ 4K Ultra Settings, using Radeon Software Adrenalin 22.1.1 on the 12 GB Radeon RX 6700 XT graphics card, versus the previous software driver version 21.12.1.
The article lists the drivers: 22.1.1 and 497.29.
 
  • Like
Reactions: Makaveli
Ah, I missed that, thank you.

So it looks like more optimizations are in order for AMD. I'm going to be playing it at 3440x1440 with a 6800 XT, so I shouldn't have an issue with performance.
Yeah, RX 6800 should be fine. I suspect AMD performance is low simply because this is a DX11 game and AMD really doesn't do a ton of DX11 optimizations these days. AMD could probably improve performance another 10-20% with sufficient effort.
 
  • Like
Reactions: Makaveli

Makaveli

Splendid
Yeah, RX 6800 should be fine. I suspect AMD performance is low simply because this is a DX11 game and AMD really doesn't do a ton of DX11 optimizations these days. AMD could probably improve performance another 10-20% with sufficient effort.

I think you are right; you generally don't see those big differences in DX12 or Vulkan titles.
 

wifiburger

Distinguished
It's amazing how crap AMD's drivers are: a 20-30 fps difference just because it's a DX11 game.

It's really a roll of the dice when it comes to FPS if it's not a Vulkan or DX12 game.
 
Regarding AMD's performance with DirectX 11 (from this Reddit post):

Let's say you have a bunch of command lists on each CPU core in DX11. You have no idea when each of these command lists will be submitted to the GPU (residency not yet known). But you need to patch each of these lists with GPU addresses before submitting them to the graphics card. So the one single CPU core in DX11 that's performing all of your immediate work with the GPU must stop what it's doing and spend time crawling through the DCLs (deferred command lists) on the other cores. It's a huge hit to performance after more than a few minutes of runtime, though DCLs are very lovely at arbitrarily boosting benchmark scores on tests that run for ~30 seconds.

The best way to do DX11 is from our GCN Performance tip #31: "A dedicated thread solely responsible for making D3D calls is usually the best way to drive the API."

Notes: The best way to drive a high number of draw calls in DirectX 11 is to dedicate a thread to graphics API calls. This thread's sole responsibility should be to make DirectX calls; any other types of work should be moved onto other threads (including processing memory buffer contents). This graphics "producer thread" approach allows the feeding of the driver's "consumer thread" as fast as possible, enabling a high number of API calls to be processed.
The tl;dr: AMD recommends using a single thread to channel all API calls. Granted, RDNA may be able to do things better, but AMD likely didn't change their "best practices," or maybe the best practices for RDNA are the same as for GCN. A rough sketch of that pattern is shown below.
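To make that concrete, here's a minimal sketch of the "producer thread" pattern in plain C++. Nothing here comes from the game or either driver; there are no real D3D11 objects, and the RenderCallQueue class and all of its names are made up for illustration. Worker threads only enqueue closures (stand-ins for ID3D11DeviceContext calls), and one dedicated thread is the only one that ever issues them:

```cpp
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Hypothetical queue: workers produce "API calls" as closures, and a single
// dedicated consumer thread is the only one allowed to execute them.
class RenderCallQueue {
public:
    void submit(std::function<void()> call) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            calls_.push(std::move(call));
        }
        cv_.notify_one();
    }

    void stop() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_one();
    }

    // Runs on the one thread that would own the immediate device context.
    void run() {
        for (;;) {
            std::function<void()> call;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [&] { return done_ || !calls_.empty(); });
                if (calls_.empty()) return;  // stopped and fully drained
                call = std::move(calls_.front());
                calls_.pop();
            }
            call();  // the only place "API calls" are actually issued
        }
    }

private:
    std::queue<std::function<void()>> calls_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    RenderCallQueue queue;
    std::thread apiThread([&] { queue.run(); });

    // "Game logic" threads prepare work but never touch the API themselves.
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) {
        workers.emplace_back([&queue, i] {
            for (int draw = 0; draw < 3; ++draw) {
                queue.submit([i, draw] {
                    // In a real renderer this would be a Draw/Map/SetShader
                    // call on the immediate context.
                    std::printf("draw call %d from worker %d\n", draw, i);
                });
            }
        });
    }
    for (auto& w : workers) w.join();

    queue.stop();
    apiThread.join();
    return 0;
}
```

NVIDIA's driver effectively does a version of this redistribution for you behind the scenes, which is the difference the next quote describes.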

In any case, NVIDIA's solution to this problem of things being shoved in a single core was to add multithreading support for DirectX 11 at the driver level. From this other Reddit post:
Since the NVIDIA scheduler was software based and was running in the OS (and not in the GPU), they could intercept draw calls that weren't sent for command list preparation, then manually batch that work in the intercepting layer and distribute it among the other cores. The monitoring and interception would happen in the main thread (core 1), so it added some overhead, but it would guarantee that even if the game wasn't optimized to use DX11 command lists, it still wouldn't be CPU bottlenecked, since the software scheduler could redistribute the draw call processing across multiple threads/cores. This gives NVIDIA a huge advantage in DX11 titles that aren't properly optimized to distribute game logic across multiple threads/cores.

3DMark used to have a draw call test, but they've gotten rid of it. It would've been nice to see the DirectX 11 multithreaded data for NVIDIA and AMD GPUs.
 

SSGBryan

Reputable
And all the benchmarks are in 4k.
Very useful info. It's like everybody had RTX 3080 class graphics in their system. LOL.

How about more realistic 1440p and 1080p benchmarks?

Tom's Hardware is no different than any tech tuber (LTT, Jayztwocents, Hardware Unboxed, or Gamer's Nexus).

The information is accurate, but it doesn't actually provide anything useful for making buying decisions. They show a theoretical best case scenario.

If they provide 1080p & 1440p benchmarks (and on High settings), you won't feel a need to upgrade your card. At the end of the day, these types of articles are just a sales pitch that the card companies can't be bothered to make.
 
And all the benchmarks are in 4k.
Very useful info. It's like everybody had RTX 3080 class graphics in their system. LOL.

How about more realistic 1440p and 1080p benchmarks?

Tom's Hardware is no different than any tech tuber (LTT, Jayztwocents, Hardware Unboxed, or Gamer's Nexus).

The information is accurate, but it doesn't actually provide anything useful for making buying decisions. They show a theoretical best case scenario.

If they provide 1080p & 1440p benchmarks (and on High settings), you won't feel a need to upgrade your card. At the end of the day, these types of articles are just a sales pitch that the card companies can't be bothered to make.
The problem with lower resolution benchmarks, especially 1080p, is that it shifts the performance burden onto the CPU. Especially with something like DLSS or FSR, where it renders at a lower resolution internally.

If you're trying to benchmark graphics performance, you want to make the video card the bottleneck as much as possible. Otherwise your data is misleading. Besides, as was mentioned, you can glean from whatever card you have on the list that if it can do 60+ FPS at 4K with those settings, then it stands to reason that you'll get 60+ FPS at lower resolutions. But if your "before I even play it" FPS minimum is 120, then I have to wonder how you enjoy anything in life.
 

SSGBryan

Reputable
The problem with lower resolution benchmarks, especially 1080p, is that it shifts the performance burden onto the CPU. Especially with something like DLSS or FSR, where it renders at a lower resolution internally.

If you're trying to benchmark graphics performance, you want to make the video card the bottleneck as much as possible. Otherwise your data is misleading. Besides, as was mentioned, you can glean from whatever card you have on the list that if it can do 60+ FPS at 4K with those settings, then it stands to reason that you'll get 60+ FPS at lower resolutions. But if your "before I even play it" FPS minimum is 120, then I have to wonder how you enjoy anything in life.

Like I said - a theoretical best case scenario; accurate, but not useful.

A gaming rig consists of more than a graphics card. The reason this 4K 60+ FPS nonsense is pushed is to convince people that they need to upgrade. Otherwise, reviewers would take a holistic approach. It doesn't matter how great it runs on a 4K panel if I don't have one.

"Before I even play it"? Where did I write that? Oh, I didn't.
 
The reason this 4K 60+ FPS nonsense is pushed is to convince people that they need to upgrade.
What does this convince you of? That you need to upgrade to a 4K panel? That you need to upgrade to a video card that can do 4K 60+ FPS? If you're swayed by this kind of information, then I don't know what to tell you.

Besides, if a card can do 60+ FPS at 4K, then it logically follows that it can do 60+ FPS at 1440p or 1080p. Since a lot of people's performance target is 60 FPS, there's nothing more to discuss at lower resolutions; they're all just additional data points.

Otherwise, reviewers would take a holistic approach. Doesn't matter how great it runs on a 4k panel, if I don't have one.
Then this review doesn't matter to me either because I don't have any of the cards they tested with. They also didn't test with the CPU I have in my computer. Or the SSD. Or the exact software configuration that I set up in Windows. Why can't reviewers read my mind and test with what I have so I know how it applies to me specifically?

EDIT: In any case, I missed the other point of this review in my earlier post. It isn't so much a graphics benchmark as it is a test of how well DLSS and FSR improve performance when the GPU is the bottleneck. 4K is a good resolution because it's accessible and provides enough of a graphics workload to keep the bottleneck on the GPU. Going down to 1440p or 1080p with DLSS or FSR lowers the rendering resolution to 1080p or below, at which point it's no longer a GPU-bottlenecked workload and the CPU enters the equation.
 
Jan 14, 2022
DLSS and FSR testing basically overlaps with testing at 1080p and 1440p. No, it's not the exact same, but at some point I just have to draw the line and get the testing and the article done. God of War, as noted, doesn't support exclusive fullscreen; it only runs in borderless window mode at the desktop resolution. That means to change from running at 4K in borderless window mode to testing 1440p, I have to quit the game, change the desktop resolution, and relaunch the game. Then do that again for 1080p. That would require triple the amount of time to test. Alternatively, I could drop a bunch of the DLSS and FSR testing and just do the resolution stuff. Either way, something was going to get missed, and I decided to focus on DLSS and FSR.

As pointed out in the text, an RX 5600 XT at 4K original quality with FSR quality mode achieved 64 fps average performance. Which means anything faster would be playable at 4K as well — that includes the entire RTX 20-series. Drop to 1440p and original quality and performance would be even higher, because FSR doesn't match native resolution performance exactly. Anyway, quality mode upscales 1440p to 4K, so 1440p is easily in reach of previous generation midrange GPUs, and 1080p would drop the requirements even further.

I agree with you; it is a fair test.
However, I just want to point out a minor issue, in my opinion. If you make a comparison between two technologies (FSR and DLSS), you should at least include one chart showing the main differences, such as Native vs. FSR vs. DLSS (at least for Quality or Ultra Quality mode); where AMD is not present, you simply don't show the bar. With three different charts it is a bit chaotic, in my opinion.
I do not want to say it is bad work at all; I just want to clarify this.
 

JTWrenn

Distinguished
Why the Ti models and no standard cards? That just seems like a silly set of cards to test given their market penetration, and then also leaving out the 3080 and 3070. Really strange choice.
 

Makaveli

Splendid
Been playing this for the last couple of hours; the game is fun and runs well.



Game looks great.

Set to Ultra on everything except for Shadows and Ambient Occlusion, which are set to High.



A quick frame capture of the first 20 seconds of the game, when you pick up the log and then head to the boat.