Question RTX 2080S Underperforming

Jan 1, 2021
A few days ago I bought an Alienware Aurora R11 desktop: 16GB RAM, an overclocked (+145 core / +1500 memory via Afterburner) RTX 2080S, and an i7-10700F CPU.
I ran UserBenchmark and I'm scoring pretty well, usually in the top 30% for my exact setup. But I feel like something is hindering my GPU, because even some not-so-demanding games struggle to stay above 120fps on my 165Hz monitor.
I've seen people with similar if not identical specs running games like COD Warzone at 1080p high at around 150fps, while I'm on the lowest settings at 1080p (downscaled from my monitor's native 1440p) and bouncing between roughly 70 and 110.
I've spent an absurd amount of time watching YouTube videos and reading forums. I've tried almost every little Windows settings tweak and OC'd my card, and even under heavy load I don't get very good GPU usage.
Someone please help! I want to start streaming soon with my girlfriend, and I would love to know if there are any fixes to make this thing what it should be: a mid-to-high-tier gaming PC.
 
Foolproof checks:
  • Are the CPU and GPU temps at or below 80C under sustained load?
  • Is the GPU frequency under load where you'd expect it?
  • Have you tried a driver update for the GPU?
  • What is the GPU usage % in-game? You generally want this in the 90%+ range.
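If you'd rather capture these numbers in a log than eyeball an overlay, you can poll nvidia-smi and parse its CSV output. A minimal sketch, assuming the NVIDIA driver's nvidia-smi tool is on PATH; the field list and helper names are just illustrative:

```python
import csv
import io
import subprocess

QUERY = "temperature.gpu,utilization.gpu,clocks.gr"

def parse_smi_csv(text):
    """Parse one CSV row of nvidia-smi output into (temp_c, util_pct, clock_mhz)."""
    row = next(csv.reader(io.StringIO(text)))
    temp = int(row[0].strip())                # e.g. "74"
    util = int(row[1].strip().rstrip(" %"))   # e.g. "98 %"
    clock = int(row[2].strip().split()[0])    # e.g. "1905 MHz"
    return temp, util, clock

def poll_gpu():
    """Ask nvidia-smi for temperature, utilization, and graphics clock."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_smi_csv(out)
```

Call `poll_gpu()` in a loop while the game runs and you'll see at a glance whether utilization sits in the 90%+ range and whether the clock holds under load.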
Be very careful about comparing your results to a "similar" system on YouTube. Seemingly insignificant differences in hardware, game version, or in-game settings can cause drastic FPS differences if you don't know exactly what you're looking at.

If you're running 1080p in-game, then the majority of the load is going to be on the CPU (especially with a 2080S). As a general rule of thumb, the CPU tells the GPU what to draw in each frame. If you lessen the complexity of the picture (lower resolution and/or in-game quality settings), the GPU finishes rendering frame n before the CPU is ready to issue frame n+1. This results in <100% GPU usage, and it is most likely your situation: your FPS is controlled by your CPU speed. For example, your FPS would probably be roughly the same at 1080p regardless of whether you're at minimum or high quality settings.
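That rule of thumb can be sketched with a toy model (the frame times below are hypothetical, not measurements): each frame costs roughly max(CPU time, GPU time), so once the GPU stage is faster than the CPU stage, lowering settings further buys you nothing.

```python
def fps(cpu_ms, gpu_ms):
    """Toy model: frame time is bounded by the slower of the two stages."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Suppose the CPU takes 9 ms per frame no matter what,
# while GPU time scales with resolution/settings:
print(fps(9.0, 14.0))  # GPU-bound: lowering settings helps
print(fps(9.0, 7.0))   # CPU-bound: ~111 FPS
print(fps(9.0, 4.0))   # still ~111 FPS, capped by the CPU
```

Notice the last two cases land on the same FPS despite the GPU doing half the work, which is exactly the "same FPS at minimum or high settings" symptom described above.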
 
I'm actually running most games at 1440p, but sometimes, for instance in Apex Legends, I downscale to 1080p on high settings. I managed to do a couple of things and now run a bit more stably FPS-wise. I have RTSS monitoring my GPU (temp, usage %, and VRAM usage) and CPU (temp and usage). My GPU stays around 55-75% after I drop in, and my CPU usually sits around 40%; both have occasional spikes of about 20%. CPU temp usually sits at 60-70C while my GPU is almost always under 75C. My GPU driver is the most recent.
 
Fresh Windows install?
This computer is about a week old; nothing is on here besides Steam, iCUE, Spotify, Afterburner, and a few Steam games. I don't think it has anything to do with an outdated or tinkered-with Windows install.

The Max-Q version is about 30% slower than the desktop 2080 Super. Which one do you have? That would put you at roughly desktop RTX 2070 performance, and not the Super variant either.
Is there a way to tell without opening the case?
 
Just about a month ago I had purchased a brand new Alienware Aurora R11 with the following specs:
Dell OEM 2080 Super​
i7-10700f​
16GB HyperX Fury RAM​

I feel like something is limiting my GPU here. I've looked at many 2080 Super benchmarks and my numbers just don't compare. I haven't recorded exact figures, but this is roughly what I'm getting:
Warzone: (1440p Low settings) 90-100fps | (1080p Low-Medium Settings) ~110fps
Fortnite: (1440p Low settings) 70-110 Very inconsistent and laggy | (1080p Low-medium) ~100fps
Apex Legends: (1440p Medium) 85-100 | (1080p Medium) ~120 with inconsistency
I also have OC'd my card through MSI Afterburner. Core Clock +125 and Memory Clock +350.
I'm not sure how to pinpoint the issue here. I know the 2080 Super isn't the latest and greatest, but from my own research it's still a perfectly viable card in 2021 for 1440p gaming at medium-to-high settings in most games, even Warzone. I personally can't stand dropping to 1080p, since I've been gaming on 1440p monitors for about 3 years now.
I'm not looking to run something like Cyberpunk at ultra with RT on at 200fps. I just want to figure out what's causing the low, inconsistent frames on a fairly new card. My goal is a solid 120fps at 1440p in titles like those.

I've seen a post or two about RAM speeds and how they can cause performance issues for GPUs. I have no idea how to check my RAM speed or how to use XMP to my advantage. Also, could someone explain the role RAM plays in gaming performance alongside the GPU?
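To put rough numbers on why RAM speed matters: DDR4 peak bandwidth is transfer rate × 8 bytes per 64-bit channel. A back-of-the-envelope sketch; the 2666 vs 3200 MT/s figures are just a common JEDEC fallback and a typical XMP profile, not necessarily what this machine's kit runs at:

```python
def peak_bandwidth_gbs(mt_per_s, channels=2):
    """Theoretical DDR4 peak bandwidth: an 8-byte-wide bus per channel."""
    return mt_per_s * 8 * channels / 1000

# Without XMP enabled, many DDR4 kits fall back to a JEDEC speed;
# enabling the XMP profile raises the transfer rate (and so bandwidth):
print(peak_bandwidth_gbs(2666))  # ~42.7 GB/s
print(peak_bandwidth_gbs(3200))  # ~51.2 GB/s
```

In CPU-bound games that extra bandwidth (and the lower latency that usually comes with XMP) feeds the CPU faster, which is why enabling XMP in the BIOS is one of the cheapest FPS wins on a prebuilt.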
 
Run userbenchmark and share the public link to the results. This will tell us about your RAM.
 
In all of your examples you seem to be hitting around 100-110 fps regardless of resolution or settings fidelity, which is indicative of a CPU bottleneck. The RAM is a bit suspect as well: my old 3200 MT/s CL16-18... kit usually hit 108-112% on UserBenchmark, but it's a 2x16GB kit, which means dual-rank sticks (more performance) rather than the single-rank sticks most 2x8GB kits use.
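That plateau pattern can be checked mechanically: collect average FPS at several resolutions/settings, and if they cluster within a small tolerance despite large changes in GPU load, the GPU isn't the limiter. A hypothetical helper with the thread's rough numbers plugged in (the 15% tolerance is an arbitrary choice, not a standard):

```python
def looks_cpu_bound(fps_by_setting, tolerance=0.15):
    """True if FPS barely moves as GPU load changes: the classic CPU-bound sign."""
    values = list(fps_by_setting.values())
    spread = (max(values) - min(values)) / max(values)
    return spread <= tolerance

# Rough averages reported earlier in the thread:
samples = {"1440p low": 100, "1080p low-medium": 110, "1080p minimum": 112}
print(looks_cpu_bound(samples))  # True: ~11% spread despite big settings changes
```

A GPU-bound system would instead show FPS scaling roughly in proportion to the rendering load, so the spread across these settings would be far larger.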
 
I know most hardware troubleshooting starts with swapping in another CPU/PSU/set of RAM sticks to isolate the problem, but I don't have a spare CPU lying around.

Is there another way to test whether it's solely the CPU? Also, if it does turn out to be a CPU bottleneck, would you mind pointing me toward which CPU I should get for my motherboard and GPU? From what I know, some CPUs don't work on certain motherboards, and some GPUs pair better with specific CPUs.