Question: Low FPS with GIGABYTE RTX 3080 GAMING OC

Mar 21, 2022
Hi all,

I spent a lot to build my rig, and I am quite disappointed... it seems to be underperforming. I have checked everything I could think of. Please help, I will take any advice.

In Fortnite:
  • 1080p epic settings: 150-170 fps avg
  • 1440p epic settings: 120 fps
  • 4K epic settings: 60 fps
View: https://imgur.com/a/ZFoKMug


I saw a lot of tests running easily at 250 fps in 1080p with max settings, here:

https://www.gpucheck.com/fr-eur/gam...a-geforce-rtx-3080/intel-core-i7-11700k/ultra

  • NZXT H1 V2 including 750W PSU and 140mm AIO
  • Asus ROG STRIX Z590-I GAMING WIFI
  • i7-11700K (Stock settings + Asus Multicore Enhancement MCE enabled in BIOS)
  • Kingston 2x16GB DDR4 KF432C16BB1AK2/32 (Dual Channel and XMP I @ 3200MHz 16-18-18-36 1.35V enabled in BIOS)
  • Gigabyte GeForce RTX 3080 GAMING OC 10G rev 2.0 LHR ("prefer maximum performance" is set in the NVIDIA Control Panel, Re-Size BAR is enabled in BIOS and I verified it in NVIDIA System Info)
  • Freesync/Gsync is off
Drivers are up to date and the install is fresh; I have done nothing other than install drivers after the Windows 10 Pro 21H2 install

I haven't tried enabling DLSS, but even without it, this seems pretty low to me...

For the H1 V1 prebuilt, the case included a 10700KF + GeForce RTX™ 3060 Ti, and NZXT advertises some guaranteed FPS
for Fortnite:
  • 183fps@1080p
  • 159fps@1440p
https://support.nzxt.com/hc/en-us/a...w-can-check-if-my-PC-meets-the-FPS-Guarantee-

They say they use the highest graphical presets available in game. No DLSS, no ray tracing.

My goal is not to overclock everything to reach higher values; I should reach the expected values with stock settings, shouldn't I?

Plus I have run 3DMark and nothing is wrong, except it also says I should reach 195+ fps in Fortnite

View: https://imgur.com/a/EEH9eGd

I also checked GPU-Z; it says the GPU is perfcapped with vRel and Util (log codes 4 and 16) and once with Pwr (log code 1). I don't know if that can help; I am not sure I understand what they mean.
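From what I could find, vRel apparently means the clock is capped by the voltage reliability limit, Util that the GPU simply isn't fully loaded, and Pwr the power limit. If it helps, the same throttle flags can apparently also be logged from the command line with nvidia-smi (it ships with the driver); a minimal sketch, assuming nvidia-smi is on PATH:

```python
import subprocess
import time

# Standard nvidia-smi query properties (see `nvidia-smi --help-query-gpu`).
# clocks_throttle_reasons.active is a bitmask, e.g. 0x4 = software power cap.
FIELDS = ("timestamp,clocks.gr,clocks.mem,power.draw,utilization.gpu,"
          "temperature.gpu,clocks_throttle_reasons.active")

while True:  # one CSV line per second while the game runs
    line = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True)
    print(line.strip())
    time.sleep(1)
```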

So what's wrong?
 
what power supply make & model are you using and how long has it been in use?

In Fortnite:
1440p epic settings: 120 fps
my 11700K + RTX 3080 Ti has no trouble reaching and maintaining 120fps at 3440x1440 in the most demanding games with everything maxed out. i would say Fortnite is not very demanding though.
getting far over that for short periods is easy enough but it usually requires disabling RT effects.

i really wouldn't expect any more than what you're showing with highest settings and a 3080.
NZXT advertise some guaranteed FPS...for Fortnite
i wouldn't trust any retailer's claims of system performance.
after all they are all about making sales, not doing real tech demos and comparisons.

your 3DMark scores are perfectly normal.
but i also wouldn't rely on what they claim you may achieve in games either.
i always had a similar issue with them claiming my 8700K + 1080 Ti would do twice as well as it ever did in certain games.

try running some scans with userbenchmark or something similar and see if any other components show up as underperforming.
 
cpu/gpu temp during the game?
ssd/ram usage?
check the gpu power usage?

You have all this info in the MSI Afterburner screenshot.
The GPU is rated at 320W and it reaches that value in game, but GPU load is only around 75%.


what power supply make & model are you using and how long has it been in use?

Brand new Gold 750W PSU, no brand though, it came with the case...

my 11700K + RTX 3080 Ti has no trouble reaching and maintaining 120fps at 3440x1440 in the most demanding games with everything maxed out. i would say Fortnite is not very demanding though.

i wouldn't trust any retailer's claims of system performance.
after all they are all about making sales, not doing real tech demos and comparisons.

The thing is that the FPS are GUARANTEED by NZXT for the prebuilt. If a customer doesn't observe those fps, they should recall the product, so I am pretty sure they did not just throw out some numbers for marketing... I really believe I should be able to do better with a 3080 instead of a 3060 Ti... Maybe you could give it a try with the epic settings in 1080p (no ray tracing, no DLSS)? :)
 
Gold 750W PSU, no brand though
i would never trust a generic power supply for any desktop, especially a gaming system.
you are risking everything connected if you do not know the actual manufacturer and the model it may be based on.

there's a chance it isn't providing reliable power under extended max usage, like when running games for any length of time.
Drivers are up to date and the install is fresh; I have done nothing other than install drivers after the Windows 10 Pro 21H2 install
check whether you're on the latest available BIOS; the patch notes often report performance increases.

and when you state 'drivers', you mean all of the motherboard manufacturer's available updates or only those provided by Nvidia?

you don't mention any other games or any type of performance results regarding them.
have you verified that this game itself is running at full performance by uninstalling and deleting user options and then performing a fresh installation?
The thing is that the FPS are GUARANTEED by NZXT for the prebuilt
even if it is guaranteed you cannot expect the same, or even near the same performance unless you have the exact same setup that they are guaranteeing.
same drive, same motherboard with the same BIOS settings, same power supply, same system-wide cooling solutions, etc...
Maybe you could give it a try with the epic settings in 1080p (no ray tracing, no DLSS)?
i don't play these types of games.

but in Cyberpunk 2077 with everything maxed + Ray Tracing on and v-sync, G-Sync, & DLSS off;
i get minimum ~118 / maximum ~155 / averaging ~135fps.
and i would say it is much more demanding of both the CPU and GPU than Fortnite.
and also consider that this is at 3440x1440, which is quite a few more pixels than standard 2560x1440.

i normally keep v-sync and G-Sync on and haven't found a game yet that i can't max out without DLSS and maintain a constant ~120fps matching my 120Hz display.
I really believe I should be able to do better with a 3080 instead of a 3060 Ti
contact the card manufacturer and have them walk you through some troubleshooting.
if you're not happy with the results, return it.

try running some scans with userbenchmark or something similar and see if any other components show up as underperforming.
 
BIOS is updated.
When I say drivers I mean all drivers from my mobo manufacturer, plus the Nvidia drivers.
I forgot to mention that I use an HDMI port on the GPU, not the onboard graphics HDMI.
The PSU is not branded, probably from an OEM indeed.

have you verified that this game itself is running at full performance by uninstalling and deleting user options and then performing a fresh installation?
what do you mean by user options? Really, I have done nothing other than install Windows, update drivers, run some Cinebench R23 and 3DMark, install Fortnite, and set the graphics presets to epic

you don't mention any other games or any type of performance results regarding them.
I quickly tried God of War yesterday, and the GPU was at 97-98% load and drawing 350W at max.
FPS sits around 100-110 avg; it also seemed low compared to benchmarks for this config, don't you think?
Again 1080p ultra, no DLSS.
With this game, the LIMIT according to MSI Afterburner is POWER, I get that.
In Fortnite it is VOLTAGE, so probably related to the vRel perfcap found in GPU-Z...
What does it mean?
here he gets 90-100 fps in 1440p Ultra, no DLSS:
View: https://youtu.be/NiBX8Oeq034?t=132


even if it is guaranteed you cannot expect the same, or even near the same performance unless you have the exact same setup that they are guaranteeing.
same drive, same motherboard with the same BIOS settings, same power supply, same system-wide cooling solutions, etc...
I have everything better:
  • 11700K instead of 10700KF
  • 3080 instead of 3060 Ti
  • 750W PSU instead of 650W
  • 32 GB 3200MT/s RAM instead of 16 GB 3200MT/s
  • ASUS Z590-I instead of GIGABYTE B560-I
  • one more exhaust fan that wasn't present in the V1
They confirmed to me that they did no overclocking; everything is stock except XMP.

but in Cyberpunk 2077 with everything maxed + Ray Tracing on and v-sync, G-Sync, & DLSS off;
i get minimum ~118 / maximum ~155 / averaging ~135fps.
and i would say it is much more demanding of both the CPU and GPU than Fortnite.
and also consider that this is at 3440x1440, which is quite a few more pixels than standard 2560x1440.
We agree, it's a lot more demanding, and yet a lot better than what I could hope for at the moment.

contact the card manufacturer and have them walk you through some troubleshooting.
if you're not happy with the results, return it.
I would like to have a real confirmation that the GPU is the culprit and that indeed my fps are quite low before resorting to that...

I am quite lost, I have been troubleshooting this for a week and I can't clearly pinpoint the culprit...
What do you think?

EDIT:
Some other tests this morning with GTA V.
The game runs at 160-180 fps in 1080p with the Very High graphics presets (both DX10 and DX11),
BUT there is stuttering and the framerate drops to 50-60 fps very often.
There is definitely something wrong...
Does this mean a CPU bottleneck? With an 11700K, really???

View: https://imgur.com/a/m5zgLeY

EDIT 2:
Maybe an answer here:
It is rather a game-related issue than a "real" CPU bottleneck, apparently...

EDIT 3:
I re-enabled V-sync, but I only have a 60Hz display, so fps are now capped to 60. No stuttering at this framerate obviously; it would need a 144Hz display to properly test that...
 
what do you mean by user options?
game settings will be saved to user config files.
sometimes these can be out of whack compared to the developer defaults.

if the game has some type of corrupt file due to update or install issue;
uninstalling the game and deleting all user data folders (usually stored somewhere in C:\Users\**) and then reinstalling can sometimes help with performance.
Does this mean a CPU bottleneck? With an 11700K, really?
definitely not.

though it's possible that you're just crippling the CPU through default BIOS settings.
manufacturers today sometimes implement odd auto-clocking features and other useless, occasionally performance-harming, options in the BIOS.
i always get better performance manually configuring the majority of settings, even when not overclocking.

but by default this CPU should be boosting up to 5GHz when gaming / heavy processing.
seeing it locked at 4.6 on your OG screenshot is odd.
possible that such a low-demand game just doesn't need that much, but then it should be providing more frames to the GPU anyway.
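to see what the cores actually boost to in-game, you could log the '% Processor Performance' counter and scale it by the base clock. a rough windows-only sketch (the 3600MHz base clock of the 11700K is hardcoded; adjust for another CPU):

```python
import subprocess

BASE_MHZ = 3600  # i7-11700K base clock; change for a different CPU

# '% Processor Performance' reports current speed as a percentage of the
# base clock, so values above 100 mean the CPU is boosting.
ps = ("(Get-Counter '\\Processor Information(_Total)\\% Processor Performance')"
      ".CounterSamples[0].CookedValue")
pct = float(subprocess.check_output(
    ["powershell", "-NoProfile", "-Command", ps], text=True))
print(f"~{BASE_MHZ * pct / 100:.0f} MHz effective ({pct:.0f}% of base)")
```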

my guess is that you have a hardware limitation elsewhere.
what drive do you have the game installed on?

and make sure all Nvidia Control panel > 'Manage 3D Settings' global options and for this specific game are set for high performance and not including any extra enhanced visual effects.
I quickly tried God of War yesterday, and the GPU was at 97-98% load and drawing 350W at max.
FPS sits around 100-110 avg
God of War is actually one of the few games that i have trouble getting very high fps from, so it may not be the best test bench for this troubleshooting.
i can maintain 120fps but only max ~135-140 when doing tests. that's some of the lowest i see in newer games.

but you running 1080p with that fps is definitely low.
contact the card manufacturer and have them walk you through some troubleshooting.
I would like to have a real confirmation that the GPU is the culprit and that indeed my fps are quite low before resorting to that.
what's the difference if the manufacturer walks you through hardware troubleshooting or if you try to do it through a forum of volunteers?

though seeing your 3DMark scores, even without you ever actually running userbenchmark, we can see that the card is performing fine.
 
but by default this CPU should be boosting up to 5GHz when gaming / heavy processing.
seeing it locked at 4.6 on your OG screenshot is odd.
possible that such a low-demand game just doesn't need that much, but then it should be providing more frames to the GPU anyway.

my guess is that you have a hardware limitation elsewhere.
what drive do you have the game installed on?
An M.2 NVMe SSD, PCIe 4.0: a Kingston Renegade 2 TB.
Is it in the correct PCIe 4.0 slot of my motherboard?

and make sure all Nvidia Control panel > 'Manage 3D Settings' global options and for this specific game are set for high performance and not including any extra enhanced visual effects.
View: https://imgur.com/a/TDITMJk

does this seem ok?

what's the difference if the manufacturer walks you through hardware troubleshooting or if you try to do it through a forum of volunteers?
though seeing your 3DMark scores, even without you ever actually running userbenchmark, we can see that the card is performing fine.
I was talking about the return.
I tend to rely more on volunteers who are impartial. I am afraid that if I ask NVIDIA, they will immediately accuse any other piece of hardware but their own...
 
I am afraid if I ask NVIDIA, they will immediately accuse any other piece of hardware
Nvidia wouldn't offer any support for a Gigabyte model card anyway.
you'd have to contact Gigabyte support directly.
it may not really be of any value, they'd likely just have you run something like 3Dmark and see that it is performing fine there.
An M.2 NVMe SSD, PCIe 4.0: a Kingston Renegade 2 TB.
Is it in the correct PCIe 4.0 slot of my motherboard?
you'd have to check your motherboard manual or BIOS options.
possible only a single slot offers 4.0 but this type of difference wouldn't affect game loading times or performance.
as long as you're not on an HDD, speeds should be fine.

but to check other hardware's possible performance issues;
like i've mentioned a few times now, run userbenchmark
does this seem ok?
as long as extra AA or sharpening effects aren't on, it shouldn't affect performance in any way.
2 things i would change though;
Negative LOD bias to 'clamp'
and Trilinear optimization to 'off'.

one other thing to check;
load up the game and, while in it, alt+tab out and open Task Manager.
provide a screenshot of the 'Processes' tab showing 'Apps' & 'Background processes'.
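or, if easier than a screenshot, something like this dumps the top consumers to the console (a sketch using PowerShell's Get-Process; run it while the game is up):

```python
import subprocess

# Top 10 processes by accumulated CPU time, with their memory footprint.
ps = ("Get-Process | Sort-Object CPU -Descending | "
      "Select-Object -First 10 Name, CPU, WorkingSet | Format-Table")
print(subprocess.check_output(
    ["powershell", "-NoProfile", "-Command", ps], text=True))
```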
 
you'd have to check your motherboard manual or BIOS options.
yep done as explained

like i've mentioned a few times now, run userbenchmark
there it is
https://www.userbenchmark.com/UserRun/51405742

EDIT:
here is the screenshot after the 2 modifications in the NVIDIA Control Panel.
I started the MSI benchmark after the bus launch (beginning of the match).
While looking at it, I will give it another try with Windows Defender off.
View: https://imgur.com/a/NiOVR1I

EDIT 2: without AV
View: https://imgur.com/a/5dl6IGU
 
drive is under-performing and it's missing a score for your GPU.
seems you didn't complete the tests.
yep, I don't know why some data is missing; it went through though.
here is another run, this time the SSD is green:
https://www.userbenchmark.com/UserRun/51406355

GTA V's engine starts to break past 144FPS or so, it stutters like crazy; this is not your hardware.
yep I came to that understanding, GTA V is not my main concern anyway
 
2400MHz vs 3200MHz RAM can cause some noticeable performance difference, even more so with later gen CPUs.

if you have XMP enabled and it's still not set to manufacturer rated specs it could possibly be an issue with motherboard & RAM compatibility.
check the motherboard memory QVL and see if your kit is listed.
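to double-check what the board actually trained the RAM to, something like this works (a sketch; ConfiguredClockSpeed is what's actually running, and with XMP active both columns should read ~3200, though exact semantics vary by firmware):

```python
import subprocess

# Win32_PhysicalMemory: 'ConfiguredClockSpeed' is what the DIMMs are
# actually running at; with XMP on it should read ~3200, not the 2400 default.
ps = ("Get-CimInstance Win32_PhysicalMemory | "
      "Select-Object BankLabel, Speed, ConfiguredClockSpeed | Format-Table")
print(subprocess.check_output(
    ["powershell", "-NoProfile", "-Command", ps], text=True))
```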

Yep, I turned XMP DDR4-3200 on and off so many times I don't know what went wrong and where.
I manually re-enabled the speed to 3200 MHz, but I am pretty sure I didn't have to the other times.
I will redo the in-game test...

EDIT: here it is with XMP enabled and the speed manually set to 3200 MHz as it should be (it was at the beginning and I lost it somewhere in the middle).
It is better, but still lower than all the aforementioned sources.
View: https://imgur.com/a/2t0zw3p
 
here is another
your 3080 showing that 52 out of a hundred perform better is not really a good sign.
it could be that all of those were overclocked to their max but with modern cards that doesn't usually make that much of a difference in performance due to the manufacturers already testing and grouping them by model based on test speeds.

i'm not sure what memory speeds are supposed to be for this model but 4750MHz seems pretty low for a current gen card.
my 3080 Ti runs >9500MHz.
 
your 3080 showing that 52 out of a hundred perform better is not really a good sign.
it could be that all of those were overclocked to their max but with modern cards that doesn't usually make that much of a difference in performance due to the manufacturers already testing and grouping them by model based on test speeds.

i'm not sure what memory speeds are supposed to be for this model but 4750MHz seems pretty low for a current gen card.
my 3080 Ti runs >9500MHz.

I am even more lost...
I did it again, same results:
https://www.userbenchmark.com/UserRun/51407114
The GPU benchmark wasn't going through because of MSI Afterburner (I just had to close the app first).
EDIT:
I don't know what that UserBenchmark value is, but MSI Afterburner says 9500 MHz in game.
So what now...
 
i'm guessing, since 4750 x 2 = 9500, that this value is correct for your card, since your screenshot does show 9502MHz, but i'm not entirely sure.
many used to get confused by the x2 equation with DDR2 memory.
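quick arithmetic check, assuming the 3080 10G's spec'd ~19Gbps effective GDDR6X rate; each tool just bakes a different multiplier into the number it shows:

```python
# Same memory speed, different multipliers baked in by each tool
# (assumes the 3080 10G's spec'd ~19Gbps effective GDDR6X rate).
userbenchmark_mhz = 4750   # what userbenchmark showed
afterburner_mhz = 9502     # what afterburner shows in-game

print(userbenchmark_mhz * 4)  # 19000 -> ~19Gbps effective
print(afterburner_mhz * 2)    # 19004 -> same effective rate
```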
I see. You haven't a clue then. Typical to resort to that kind of humour in this situation.
FYI:
i'm not sure what memory speeds are supposed to be for this model but 4750MHz seems pretty low for a current gen card.
my 3080 Ti runs >9500MHz.