[SOLVED] Metro Last Light at 4x SSAA fps drops to 30-40 on 2080 Super?

Oct 20, 2019
My CPU is pretty weak (an i5-7500), but its usage was 45-65% while GPU usage was 90-100% at 1440p, all max settings.

If it's not a CPU limitation, then why can't the 2080 handle 4x SSAA?

When I switch to 3x SSAA the FPS goes up nicely, and at 2x SSAA it goes up even more, but CPU usage increases, obviously.

Is SSAA that demanding?

I played Deus Ex: Mankind Divided on all max/ultra settings with 8x MSAA and my frames were good. So the 2080 is running beautifully.


Thanks
 
Solution
That's kind of an oddly mediocre CPU to pair with a 2080 Super. Also, are you playing the original Last Light, or Redux? Due to the enhanced graphics, Redux has significantly lower FPS.

To answer your question on how taxing SSAA is, there's a GeForce graphics/performance guide Nvidia made for the game. Unfortunately it does not detail the percentage performance impact of the AA settings like most of their guides do, but it does go into other details that point to it being very demanding.

First off, 4A has hard-coded their own custom FXAA in the background, which automatically runs alongside whatever AA you choose. Add to that the fact that the chart in the GeForce guide shows the suggested setting for even the then top-tier GTX Titan being 2x SSAA, and it becomes apparent how resource-hungry 4A's "AAA" is.

https://www.geforce.com/whats-new/guides/metro-last-light-graphics-breakdown-and-performance-guide
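To put rough numbers on why it's so taxing: SSAA renders the scene at a multiple of the output resolution and downsamples, so the GPU shades that many times more pixels every frame. Here's a back-of-the-envelope sketch (assuming the in-game 2x/3x/4x multipliers scale total pixel count, which is what supersampling implies):

```python
# Back-of-the-envelope pixel-count arithmetic for SSAA at 1440p,
# assuming the 2x/3x/4x multipliers scale the total number of
# shaded pixels (standard supersampling behaviour).
BASE_W, BASE_H = 2560, 1440            # 1440p output resolution
NATIVE_4K = 3840 * 2160                # pixels in a native 4K frame

for factor in (1, 2, 3, 4):
    shaded = BASE_W * BASE_H * factor
    print(f"{factor}x SSAA: {shaded:>10,} pixels shaded per frame "
          f"(~{shaded / NATIVE_4K:.2f}x native 4K)")
```

At 4x, that's roughly 14.7 million pixels per frame, nearly 1.8x a native 4K frame, before 4A's always-on FXAA pass is even counted, so 30-40 FPS on a 2080 Super is less surprising than it first looks.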

Now granted, it's been a while since the GTX Titan days, and the 2080 Super has twice the Passmark score, but this could come down to a combination of your CPU bottlenecking the GPU a bit, VRAM limitations, or God knows what. The Super does have faster GDDR6 VRAM, but that doesn't necessarily mean it's equally effective in all games.

The RTX-capable architecture alone in these RTX-series GPUs seems to cause some inconsistencies.
 
I agree; the max GPU you should pair with the i5-7500 is a 1660 Ti, maybe even a 1070. But what is the rest of the setup: motherboard, memory, SSD or HDD, etc.?
 
Oct 20, 2019
I am changing my CPU to an i5-8500 in the coming days. I know this will make a difference, but since my CPU usage was not that high, I thought it wasn't really the CPU causing the FPS drop.
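Worth noting on the usage point: a 45-65% aggregate on a 4-core/4-thread i5-7500 can still hide one core pinned near 100%, which is often the real bottleneck when a game's main thread saturates. A quick hypothetical way to check per-core load (a sketch using Python's psutil, not a tool anyone in this thread mentioned):

```python
# Minimal per-core load check with psutil (pip install psutil).
# An aggregate 45-65% on a 4-core i5-7500 can still mean one core
# (e.g. the game's main/render thread) is sitting near 100%.
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # sample for 1s
for core, pct in enumerate(per_core):
    print(f"core {core}: {pct:5.1f}%")
print(f"aggregate: {sum(per_core) / len(per_core):5.1f}%")
```

If one core sits maxed while the others idle, the CPU is still the limiter even though Task Manager's overall number looks comfortable.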
 
You might be happier with at least an 8700...up the clock speed and thread count! (Naturally, your existing mainboard will not support 8xxx/9xxx-series CPUs, just as a reminder.)

At current prices, you might do better going with an R5 3600 and a B450 that supports BIOS Flashback...(it rivals 8700K performance in most games).
 
Oct 20, 2019

Yeah, I know. I'm getting the 8500 on a B360 motherboard. I saw reviews of the 8400 running with a 2080 and it was great in my games. The 8500 is even faster.

I will be running 1440p on all ultra on a 74Hz overclocked monitor, and maybe getting 4K in the future. I still have to go through Hitman, Assassin's Creed III, Black Flag, and many other older games.

So FPS is not an issue for me as long as it's over 50 and somewhere in the 65-80 range.

I ran Deus Ex: Mankind Divided at 8x MSAA, 1440p. Smoooooth. I struggled to play it on my 1070 at 4x MSAA.

Let's see how my CPU affects Last Light Redux at 4x SSAA.