[SOLVED] Underperforming rtx 2070S?

Dec 23, 2019
35
0
30
0
Today I upgraded my GPU to ASUS rtx 2070 super. I was very excited to try it in games but it seems to be underperforming in Red Dead Redemption 2 on all max details (1080p). My specs are:

CPU: 4790K (Cooler Master 212 EVO cooler)
GPU: rtx 2070 super
MB: ASUS H97M-E
RAM: HyperX Fury 1600 Mhz DDR3
PSU: RM 750x 750W GOLD+

One important detail: MSI afterburner reports under 50% CPU usage (on all threads) and 100% GPU usage while the FPS is around 20. That is WAYYYY lower than the benchmarks suggest. All temperatures are well within the safe limits (~50C).

I used DDU to uninstall the old drivers and installed new NVIDIA drivers from the website (shouldn't matter though as I was upgrading from GTX 970).

What's wrong with my rig?
 
Dec 23, 2019
Your CPU will hold that card back by about 25 to 27%. The LGA 1150 platform is your wall right now; its CPUs are too weak for those cards.
Thank you for your answer.

How come my CPU is a bottleneck if no core reports anywhere near 100% usage (around 25% and under 50% on all of them)?

What CPU (either AMD or Intel) do you recommend to go with the 2070S?

Thanks.
 
Dec 23, 2019
Thanks for the list! Not a bad price either, all costs less than the GPU itself...

However, when I tweak the graphics settings in RDR2 I do get more FPS, e.g. when I turn down water detail or tessellation.

Also, in another game I tried, BeamNG.drive, when I turn reflection quality up and set frames per update to 6, the FPS drops to 40. Without reflection details it runs at 60.

Are these things really computed on CPU instead of GPU?
 
Dec 23, 2019
PCPartPicker Part List

CPU: AMD Ryzen 5 3600 3.6 GHz 6-Core Processor ($189.99 @ Amazon)
Motherboard: MSI B450 TOMAHAWK MAX ATX AM4 Motherboard ($114.99 @ B&H)
Memory: Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4-3000 Memory ($75.98 @ Amazon)
Total: $380.96
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2020-01-16 17:27 EST-0500
By the way, I used the pc-builds website to estimate the bottleneck, and I guess that's what you based your 25% CPU bottleneck figure on...

The same site, however, suggests using 32 GB of RAM to work well with that CPU ("AMD Ryzen 5 3600 will need at least 32GB of RAM to work well."). Is that really necessary? I mean, RAM is cheap nowadays, but gosh... 32 GB seems like overkill :D
 
Dec 23, 2019
Actually, mine was a guess. I was going to say 30%, but I backed it down a little. You don't need 32 GB.
Oh, okay, thanks. Just out of curiosity, how does one even calculate/guess such a number? In fact, how does a website's algorithm make such an estimate? I can only think of one (experimental) way to determine the bottleneck:

Take the hardware you want to evaluate and assume nothing else is a bottleneck. Run benchmarks on many games. Then swap the CPU for the best one available: if the frame rate goes up, the CPU was the bottleneck, and the percentage gain in frames is the bottleneck figure. Then put the tested CPU back in and swap the GPU for the best GPU available. Again, if the frame rate goes up, the GPU was the bottleneck, with the percentage calculated the same way.

Or is it just experience? If you know from some reference system that GPU 1 and CPU 1 don't bottleneck each other, and that GPU 2 and CPU 2 are each 20% faster, then you know a system with CPU 2 and GPU 2 won't be bottlenecked by either, and so on?
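The swap-and-measure procedure above boils down to a one-line calculation. A minimal sketch (the FPS numbers here are hypothetical illustrations, and pc-builds' actual algorithm is unknown):

```python
# Sketch of the swap-and-measure bottleneck test described above.
# All FPS numbers are hypothetical illustrations, not measurements.

def bottleneck_pct(baseline_fps: float, upgraded_fps: float) -> float:
    """Percent FPS gained by swapping one component for the best available.

    If swapping only the CPU raises the frame rate, the CPU was the
    limiter, and the gain roughly measures how hard it held the GPU back.
    """
    return (upgraded_fps - baseline_fps) / baseline_fps * 100

# Swap CPU for the best available: 60 fps -> 75 fps, so ~25% CPU bottleneck.
print(bottleneck_pct(60, 75))  # 25.0
# Swap GPU instead: frame rate unchanged, so the GPU was not the limiter.
print(bottleneck_pct(60, 60))  # 0.0
```

In practice you would average this over many games, since the limiter shifts from title to title.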
 
Dec 23, 2019
My previous system was 4790K+GTX970+16GB RAM. I switched the GTX970 for an RTX 2070S, but I didn't get any higher FPS in Red Dead Redemption 2 (1080p ultra), still around 30.

I was advised here to upgrade the CPU, so I got this new system and built it today:
AMD Ryzen 5 3600
32 GB of RipJaws (3200 MHz, 16-18-18-38, XMP enabled in BIOS)
all mounted on an ASUS X570 board.
Kept the ASUS RTX 2070 Super dual-fan OC.

Booting off a SATA SSD, playing from an HDD (not that it should matter, but at this point I'm questioning everything I know about PC hardware).

I still get only ~36 FPS at 1080p ultra settings. I don't get it. What the heck should I upgrade to get at least 60 FPS? Two Titan RTXs?

Jokes aside, all temperatures look reasonable and the air coming out of the case is cool. I have one big 200 mm front intake fan, one 140 mm rear fan and one 140 mm top fan. One thing I noticed is that in other games my GPU usage went up from just under 100% to a steady 100% after I upgraded the CPU (while the CPU still reports around 10-20% usage), so it seems like the GPU is the bottleneck again :( I knew I should've gotten a 2080S! -_-

Help pls :/ it really feels like I'm still gaming on my old PC. Or maybe, just maybe, RDR2 is a very poorly optimised game.
 
GarrettL

Dec 4, 2019
I have Red Dead 2.

3800x and 2070S but at 1440p.

That game puts all the load on the GPU. Look at the CPU and usage is usually 10-15%.

And there are an absolute ton of settings to play around with. Put the tree texture on ultra and it’s amazing but really low frame rates.

You’d have to list your exact settings.
 
Dec 23, 2019
When you upgraded the CPU, MB & RAM did you do a clean install of Windows, motherboard/GPU drivers & games?
I kept the old drives, so the games are installed on the hard drive as before (steam/R* games launcher). Everything got recognized in the same way, drives kept the same letter, everything looks pretty much the same. I did a clean reinstall of the NVidia drivers when I put in the new GPU.
 
Dec 23, 2019
And the HDD is not helping
Hm, this is interesting...

The way HDD speed (negatively) affects gaming performance is usually, in my experience, through sudden frame drops when something has to load right NOW. To suggest that a continuously moderate framerate is caused by the HDD makes RDR2 sound like a really badly optimized game; I haven't experienced HDD-related FPS drops in a game for a long time. I'm kind of getting flashbacks to GTA IV: that game was genuinely badly optimized, and I had to wait a couple of years for reasonably priced hardware that could run it.

I will test your theory when I get home and fire up RDR2 and check the disk usage.

As for my settings, I turned everything to the highest possible setting (1080p). I'm surprised this would be a problem for a 2019 game on 2019 hardware. Struggling with 1080p now that 4K is becoming a thing...

At some point I was debating getting an ultrawide monitor (mostly for driving games, where I'd benefit from seeing a little more of my field of view), but that's hardly worth it if the GPU is already the bottleneck and I'd be getting... 20 FPS? I'm not even considering upgrading to a 2080 Ti; the price tag is ridiculous. So I'm most probably looking at playing old games at a higher resolution, or new games at 1080p high settings, or at a higher resolution on medium settings.

Feel free to correct me if my thoughts are flawed here, I'd appreciate to learn more about stuff :)
 

WildCard999
I kept the old drives, so the games are installed on the hard drive as before (steam/R* games launcher). Everything got recognized in the same way, drives kept the same letter, everything looks pretty much the same. I did a clean reinstall of the NVidia drivers when I put in the new GPU.
When you replace the motherboard you should do a clean install of Windows/software as there can be conflicting motherboard drivers which will cause performance issues. And a Windows reset won't work (well), it needs to be a clean install.
 
Doing a CPU and motherboard swap, especially going from Intel to AMD, you need to do a fresh install of the OS. Yes, things have gotten better with Windows 10 and swapping parts, but a reinstall is the best way.

You can keep Steam on the HDD; you will need to reinstall the Steam client and point it to where your games are. You can also move the RDR2 game folder from the HDD to the SSD and repoint Steam to the new location without having to re-download the game.
 
Dec 23, 2019
So, today I did a clean reinstall of Windows, installed the motherboard drivers and the NVIDIA drivers... only to find out that the framerate in RDR2 didn't change at all. :/ These are my graphics settings:

Resolution 1920x1080
Refresh rate 60
VSync on
Triple buffering on
Shadow quality Ultra
Far Shadow Quality Ultra
SSAO Ultra
Reflection Quality Ultra
Mirror Quality Ultra
Water Quality High
Volumetrics Quality Ultra
Particle Quality Ultra
Tessellation Quality Medium
TAA High
FXAA On
MSAA x4
Graphics API DX12
Near Volumetric Resolution Ultra
Far Volumetric Resolution Ultra
Volumetric Lighting Quality Ultra
Unlocked Volumetric Raymarch Resolution On
Particle Lighting Quality Ultra
Soft Shadows Ultra
Grass Shadows High
Long Shadows On
Full Res SSAO On
Water Reflection Quality Medium
Water Refraction Quality Medium
Water Physics Quality Slider all the way to the right
Resolution Scale Off
TAA Sharpening Slider all the way to the right
Motion Blur Off
Reflection MSAA x8
Geometry Level of Detail Slider all the way to the right
Grass Level of detail Slider all the way to the right
Tree Quality High
Parallax Occlusion Mapping Quality Ultra
Decal Quality Ultra
Fur Quality High
Tree Tessellation Off

Averaging ~26 FPS. What should I lower that won't affect visuals too much but will relieve the load on the GPU? This is ridiculous, really. One shouldn't need a NASA-tier PC to run a game on ultra at 1080p...
 
Dec 23, 2019
Sorry for "spam", but this is interesting!

Actually, it seems like MSI Afterburner (RivaTuner) reports the wrong FPS in RDR2. The built-in benchmark reports 45 and 56 average FPS on high and ultra settings, respectively (with 15 and 30 minimums and a 60 maximum, capped by the refresh rate). Unfortunately, RDR2 does not report 1% and 0.1% lows, but the game runs relatively smoothly.

I decided not to trust MSI after seeing the reported FPS drop to 20 when I turned settings down to high. At that point it was obvious the game was really running at 60 FPS (20 FPS is not very playable, and this felt smooth).

I don't know what tool I should be using to benchmark games reliably (and I mean real gameplay, not the game's built-in benchmark; e.g. RDR2's benchmark has no scenes near water, swimming, or riding a horse across bodies of water, and I always see FPS drops near water).
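For reference, the 1% and 0.1% lows that capture tools report are just averages over the slowest frames in a frame-time log. A minimal sketch of the idea (the sample log below is made up for illustration):

```python
# Sketch: average FPS and 1% / 0.1% lows computed from a frame-time log
# in milliseconds, the way capture tools do it conceptually. The sample
# log below is made up for illustration.

def fps_stats(frametimes_ms):
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)  # slowest frames first

    def low(pct):
        # Average FPS over the slowest pct% of frames.
        n = max(1, int(len(worst_first) * pct / 100))
        return 1000 * n / sum(worst_first[:n])

    return avg_fps, low(1), low(0.1)

# 990 smooth frames (~60 fps) with 10 stutters (~30 fps) sprinkled in.
log = [16.7] * 990 + [33.3] * 10
avg, one_pct, point1_pct = fps_stats(log)
print(f"avg {avg:.1f}, 1% low {one_pct:.1f}, 0.1% low {point1_pct:.1f}")
```

This is why an average can look fine while the game still feels stuttery: the lows capture the worst moments the average hides.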
 
One important detail: MSI afterburner reports under 50% CPU usage (on all threads) and 100% GPU usage while the FPS is around 20. That is WAYYYY lower than the benchmarks suggest.
This is likely because you have triple-buffering with vsync enabled. Vsync prevents screen tearing by having the system wait until the monitor's next refresh cycle before sending the completed image to the screen. Triple-buffering tells the system to keep rendering new frames even if they are not being drawn yet. So, depending on which metric is being looked at, different numbers can get reported. Generally, vsync isn't going to be great unless you are maintaining frame rates higher than your screen's refresh rate, which appears to be 60Hz in this case. What monitor is it? Does it happen to support G-sync/FreeSync?
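The snapping effect of strict vsync can be sketched with a toy model (simplified double-buffered case; real drivers and triple buffering behave differently):

```python
import math

# Toy model of strict double-buffered vsync on a 60 Hz display: if the GPU
# can't finish a frame within one refresh interval, it waits for the next,
# so the displayed rate snaps down to 60, 30, 20, 15... Triple buffering
# relaxes this by letting the GPU keep rendering into a spare buffer.

def vsync_fps(raw_fps: float, refresh_hz: float = 60.0) -> float:
    if raw_fps >= refresh_hz:
        return refresh_hz
    # Refresh intervals each frame occupies, rounded up to a whole cycle.
    intervals = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals

# A card rendering ~25 fps raw would be displayed at a flat 20 fps.
print(vsync_fps(25))  # 20.0
print(vsync_fps(50))  # 30.0
```

This quantization is one reason a reported 20 FPS under vsync can correspond to noticeably higher raw rendering performance.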

RDR2 is really just a very demanding game, and is currently one of the most demanding on graphics hardware if you max its settings, going beyond what is reasonable for today's hardware. And some of those settings can cause big, double-digit hits to performance when set to ultra, in some cases without significantly improving visuals.

For example, having water physics maxed is going to tank your frame rate around bodies of water, so you probably don't want the slider all the way to the right; lowering it to around the 50-75% range makes sense. Water reflection quality, however, causes only a minor hit to performance, so you could actually turn that up to high. Water refraction is somewhere in between, but medium can improve performance a noticeable amount, again without a major impact on visuals. There are also a number of other settings where lowering them hurts visuals without really improving performance, and those should probably be left on ultra.

Hardware Unboxed did a good pair of videos when the game launched covering the relative performance hit and impact on visuals for each setting, so they might be worth a watch. You don't necessarily need to follow their recommendations entirely though, as some of the settings they suggest lowering only amount to around a 1-2% performance gain over the next highest setting, which is probably not going to be too important for a high-end graphics card like a 2070 SUPER running at 1080p. Others however, like water physics (when near water), lighting quality (at night), reflection quality, and MSAA, can cause very large hits to performance, potentially in the 20-40% range. And a lot of other adjustments can improve performance by around 5% or so, often without noticeably affecting visuals, so some of those might be worth adjusting too.
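Those per-setting hits compound, which is why a handful of maxed heavy settings can roughly halve the frame rate. A rough illustration, treating each heavy setting as a multiplier on frame time (the cost factors are hypothetical placeholders, not measured RDR2 numbers):

```python
# Rough illustration of how double-digit setting costs stack: treating each
# heavy setting as a multiplier on frame time. The cost factors here are
# hypothetical placeholders, not measured RDR2 numbers.

def stacked_fps(base_fps, cost_factors):
    frame_ms = 1000 / base_fps
    for factor in cost_factors:
        frame_ms *= factor  # each setting inflates the frame time
    return 1000 / frame_ms

# Three ~20-30% hits together roughly halve the frame rate.
heavy = [1.30, 1.25, 1.20]  # e.g. water physics, MSAA, reflection MSAA
print(f"{stacked_fps(60, heavy):.1f} fps")  # ~30.8 fps
```

The point is that no single setting explains a drop from 60 to ~26 FPS; it is the combination of several expensive ones.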

Their first video covers the basic settings, which will tend to have the largest impact on performance, while the second video covers the advanced settings, which are mostly linked to the basic ones, and can be used to extract a bit more performance or a bit better visuals out of them. Those water quality settings I mentioned before are detailed in there...

View: https://www.youtube.com/watch?v=385eG1IEZMU


View: https://www.youtube.com/watch?v=C3xQ33Cq4CE
 