[SOLVED] i7 4790k w/ RTX 2080Ti, Getting pretty low FPS

kennysmoothx

Distinguished
Jun 16, 2011
Hello Everyone,

I just recently got a ROG Strix 2080Ti, and I'm not very happy with the results I'm getting.

I play at 1440p, and games like Battlefield V are running at around 60-80 FPS.

Currently I have an i7-4790K with 16GB of RAM, and I'm wondering if the 4790K could be bottlenecking my graphics card?

I would think the 2080 Ti can do better than a ~70 FPS average in Battlefield V.

Any suggestions?
 
Solution
FPS seems a little low alright for that res. Have you got the RT features turned on? If so, that's probably why.

Your CPU isn't a slouch exactly, and is still relevant gaming wise. An OC on the CPU might help negate a potential bottleneck, but at 1440p the CPU plays less of a part.
That's a lot to ask from a 4790K; the RTX 2080 Ti is a monster.

Yes, it's bottlenecking the GPU.

Best thing you can do is a platform upgrade, CPU, MB, RAM. (9700K or 9900K)

The RTX 2080 would have been a better match for your CPU at 1440P.
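For anyone wanting to sanity-check the bottleneck question on their own rig, one rough rule of thumb (my own sketch, not an official tool) is to log GPU load and per-core CPU load in-game with something like MSI Afterburner: if the GPU sits well below full load while at least one CPU core is pegged, you're probably CPU-bound. The thresholds below are guesses; tune them against your own logs.

```python
# Heuristic: GPU well under full load + a pegged CPU core usually means the
# CPU (or a single engine thread) can't feed the GPU fast enough.
def likely_cpu_bottleneck(gpu_util, per_core_cpu_util,
                          gpu_threshold=90.0, core_threshold=95.0):
    """gpu_util: GPU load in percent; per_core_cpu_util: list of per-core loads."""
    return gpu_util < gpu_threshold and max(per_core_cpu_util) >= core_threshold

# Example: GPU at 70% while one core is maxed -> likely CPU-bound
print(likely_cpu_bottleneck(70.0, [99.0, 62.0, 55.0, 48.0]))  # True
print(likely_cpu_bottleneck(98.0, [80.0, 70.0, 65.0, 60.0]))  # False
```

This is only a first pass; frame-time consistency matters too, but it matches the advice above that at 1440p the GPU should be the one working hardest.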
 
Hmm I see, I have everything on Ultra so I think Raytracing is on.

But what confuses me is that I see benchmarks where the 2080ti gets much better frames on 1440p, and the FPS I am getting are comparable to what some rigs are getting @ 4k with battlefield 5. 🤔


Turn off Ray Tracing and see what you get FPS wise, it's a massive performance hit.
 
Yes, like I said, and Jankerson has too: RT can butcher the FPS. Turn the feature off and see what FPS you're hitting.

I'd agree a little with Jankerson, there could be a bottleneck in some games at 1080p, but at 1440p, it's much less so. Let us know how you get on.

You've a decent mobo, and a chip that can hit a 4.6GHz all-core OC relatively easily. That will certainly help with any bottlenecking.
 
Hey everyone I wanted to follow back up with all of you.

There was one thing that completely fixed my FPS:

Updating from Windows 10 1803 to 1809.

The 1809 update is the one that includes ray tracing (DXR) support and the DX12 improvements.

After this update, with DX12 enabled and nothing else changed, my FPS went from an average of 70 to an average of 120 at 1440p. Night and day.
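To put that jump in perspective, average frame time is just the reciprocal of FPS, so the improvement works out like this (quick back-of-the-envelope sketch):

```python
def frame_time_ms(fps):
    """Average time per frame in milliseconds."""
    return 1000.0 / fps

before, after = 70, 120
print(round(frame_time_ms(before), 1))         # 14.3 ms per frame before
print(round(frame_time_ms(after), 1))          # 8.3 ms per frame after
print(round((after - before) / before * 100))  # ~71% more frames per second
```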

It's also nice to hear the 4790K still holds up. You guys inspired me to get a nice cooling solution, and I went ahead and OC'd it, hoping to get some more time out of it.

You guys rock, and Tom's Hardware remains one of the best forums on the internet! Cheers!

EDIT: Just for clarification, I never had ray tracing enabled; RT requires 1809, and the 120 FPS is without RT.
 
I had a 4790K as well, except with a GTX 1060, and I noticed a huge FPS boost with DX12 enabled. Other than that, yeah, the CPU is the bottleneck, which is why I upgraded.

Hmmm, that's very odd. Usually NVIDIA GPUs don't do too well with DX12; AMD cards are much more adept. For games like BF1/BFV, switching from DX11 to DX12 yields much worse results and is a common complaint with NVIDIA GPUs. For eye candy DX12 looks brill, but it can really hit max/average FPS. DX11 is typically the better path for more demanding games on NVIDIA GPUs.
 
So I also had a 4790K paired with a 2080 Ti, and I'm pretty sure the CPU was not bottlenecking the GPU, as I ran some benchmark tests. That said, it's a 5-year-old motherboard/CPU with DDR3. I'm on Windows 10 build 1903 with the latest NVIDIA drivers.

For unrelated reasons I did upgrade to a 9900K with 64GB of DDR4-3000. Yes, the RAM is overkill; 32 or even 16GB would be fine for gaming, but I do some work with very large files that justifies it.

And I've been playing Borderlands 3, so before the upgrade I couldn't handle 4K Ultra -- I was getting ~48 FPS, and even then the CPU wasn't the bottleneck. So eventually I dropped down a notch to 1440p and was able to play at "Badass" with everything maxed out and no problem staying above 60 FPS.

So post-upgrade, I also installed a 1TB Samsung EVO NVMe drive on an MSI MPG Z390 Gaming Pro Carbon with a fresh install of Windows 10. (Note: if you do this, make sure you get the mobo BIOS update onto a flash drive and apply it before installation, otherwise you'll get constant lockups during install.)

Now I'm getting ~75 FPS at 4K Ultra, after dropping volumetric fog down a notch to medium; that made a bigger difference in framerate than everything else combined! The thing I love about Borderlands 3 is that settings apply on the fly, so you can see the effect of a tweak instantly. The benchmark gave me 64 FPS with those settings, but that's a stress-test scenario; 75 is typical during actual gameplay.

I'm getting about 50% more FPS post-upgrade. The spec numbers don't support that kind of gain, but that's what I'm getting. Granted, it's a clean install on a faster drive, but once the game is in memory, that shouldn't be a factor except when you're moving around and streaming in data. My other motherboard's BIOS hadn't been updated in years, so maybe there's something there. I can't explain why it's that much faster, but I'm not complaining.