[SOLVED] Same FPS after upgrade, wth?

H47E

Distinguished
Dec 12, 2013
35
0
18,530
I've upgraded from a 4670K (didn't overclock, since I'm not handy with OC and didn't have the motherboard for it) with 8GB of single-channel DDR3 1666MHz to an i5 9400F with 2x 8GB of DDR4 2666MHz. The RAM sticks are different brands and one of them doesn't support XMP, but they do run in dual-channel mode; I've checked in the BIOS, CPU-Z and AIDA64. Yet I still get the same FPS as with the previous CPU and RAM.

I thought the CPU would give me at least a 15-20 FPS boost, and I thought dual channel also boosts FPS in games. It's paired with a 2070 Super.


So what's wrong with it? I'm getting a higher refresh-rate monitor, so I'm trying to squeeze out as much FPS as I can within my budget.

I was thinking maybe it's the mismatched RAM brands, but that would still leave the CPU having no impact on FPS.
 
4745454b

Titan
Moderator
What games? What settings?

If the old system gave you 60 FPS at 1080p with details on high/ultra and the new system gives the same, it's because the 2070 Super is overkill for a 60Hz 1080p monitor. The monitor is the bottleneck for your system. We need a lot more info on games and settings to guess what's going on.
 

H47E

Distinguished
Dec 12, 2013
35
0
18,530
Well, with my previous CPU/RAM in Overwatch on Epic at 1080p with 150% render scale I averaged 118 FPS, and at 200% render scale I had 76-83 FPS. With this CPU it's unchanged.

GTA V at 3840x2160 via DSR, everything maxed out, no AA at all, MSAA reflections at 2x, shadows set to softest, and the Advanced Graphics options all off: averaging around 79-86 FPS, and I'm getting the same FPS with both CPU/RAM setups.

The only difference I've seen was in Everybody's Gone to the Rapture, all maxed out at 3840x2160 via DSR: the previous CPU/RAM got around 40-45 FPS and now I get 45-50.

And I know the card is overkill for a 60Hz monitor; that's why I want to buy another one (still waiting for it), but that doesn't mean I can't monitor the FPS with V-Sync off.

I guess I watched too many benchmarks on YouTube and set my expectations for this upgrade too high.
 

Phaaze88

Titan
Ambassador
Look at the in-game settings you're running, though.
Overwatch on Epic at 1080p with 150% render scale
That's a slap to the GPU.

GTA V at 3840x2160 via DSR, everything maxed out, no AA at all, MSAA reflections at 2x, shadows set to softest, and the Advanced Graphics options all off
Another slap to the GPU.
Not all AA settings are heavy on the GPU anyway, so even with those off, you've still got 4K DSR and the 150% render scale running.

Everybody's Gone to the Rapture, all maxed out at 3840x2160 via DSR
Third time's the charm?

The higher the resolution and graphics settings, the greater the GPU performance impact and the smaller the CPU's.
Sorry, but 4K DSR and high render scales are proving too much for the 2070 Super. The CPU doesn't care, because its work is easy at those settings regardless.
Turn the resolution and graphics down a notch, and FPS should improve.
 
Yeah, 4K isn't really a practical target for high refresh-rate gaming. The graphics card needs to render four times the pixels as it does at 1080p, and more than two times the pixels as at 1440p, so performance in pretty much any semi-recent graphically demanding game will be dictated primarily by graphics card performance at that resolution, even with a very high-end card. The more you turn up resolution or increase settings, the less of an effect CPU performance will have, since it will be waiting for the graphics card to complete its rendering more often than not.
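
To put rough numbers on that, here's a quick Python sketch of the raw pixel counts per frame (simple arithmetic only, not benchmark data):

```python
# Pixel counts per frame at common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

4K is exactly four times the pixels of 1080p and 2.25x the pixels of 1440p, which is why the graphics card dominates at that resolution.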

In general, the graphics card handles rendering the visuals on your screen, while the CPU typically handles things you don't directly see, like physics calculations and animations, enemy AI, netcode, sound processing, general game logic and so on. Typically, increasing resolution does not increase demand on the CPU, only the GPU.
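
As a rough illustration of that split, here's a toy frame-time model: CPU time per frame stays fixed while GPU time grows with pixel count. The millisecond figures are invented for the example, not measurements of any real game or this particular hardware:

```python
# Toy model: CPU work per frame is roughly constant; GPU work scales with pixels.
# Both time figures below are made up purely for illustration.
CPU_MS_PER_FRAME = 5.0        # physics, AI, game logic, etc.
GPU_MS_PER_MEGAPIXEL = 2.0    # hypothetical cost to shade one million pixels

def estimated_fps(width: int, height: int) -> float:
    gpu_ms = GPU_MS_PER_MEGAPIXEL * (width * height / 1e6)
    # Whichever stage takes longer sets the frame time (assuming they overlap).
    return 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{estimated_fps(w, h):.0f} FPS")

# 1920x1080: ~200 FPS  (CPU-limited: the GPU only needs ~4.1 ms)
# 2560x1440: ~136 FPS  (GPU-limited: ~7.4 ms per frame)
# 3840x2160: ~60 FPS   (heavily GPU-limited: ~16.6 ms per frame)
```

In this toy model, swapping in a faster CPU only changes the first line; the 1440p and 4K results are set entirely by the GPU.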

So, for example, if CPU A can perform its calculations 120 times per second in a particular game, and CPU B can perform them 150 times per second, that's not going to matter much if the graphics card can't keep up. If the graphics card were able to push 150 FPS at 1080p, you would likely see that difference. However, at 1440p the card might only be able to push 110 FPS, in which case both processors would likely perform pretty close to one another, and if the card can only push 60 FPS at 4K, either CPU would be waiting around for it to finish its rendering much of the time, and there would be almost no difference between the two.
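
That "whichever part is slower sets the frame rate" idea can be written out directly, using the hypothetical figures from the paragraph above:

```python
# Delivered frame rate is capped by the slower of the two components.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_a = 120   # CPU A prepares 120 frames per second in this hypothetical game
cpu_b = 150   # CPU B prepares 150 frames per second

gpu_limit = {"1080p": 150, "1440p": 110, "4K": 60}   # what the card can render

for res, gpu_fps in gpu_limit.items():
    print(f"{res}: CPU A -> {delivered_fps(cpu_a, gpu_fps)} FPS, "
          f"CPU B -> {delivered_fps(cpu_b, gpu_fps)} FPS")

# 1080p: CPU A -> 120 FPS, CPU B -> 150 FPS  (the faster CPU shows its lead)
# 1440p: CPU A -> 110 FPS, CPU B -> 110 FPS  (both wait on the graphics card)
# 4K:    CPU A -> 60 FPS,  CPU B -> 60 FPS   (entirely GPU-bound)
```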

Take, for example, the summary charts in this recent review at TechPowerUp for the Ryzen 5 3600 (they didn't do a full review of the 9400F, but it's included in the same charts). Scroll down a bit for the gaming results, and compare how the 3600 (or 9400F) performs against the lowest-end models in terms of average frame rates when paired with a very high-end 2080 Ti...

https://www.techpowerup.com/review/amd-ryzen-5-3600/22.html

At 720p (which no one would use a 2080 Ti for, so it's more a synthetic benchmark than anything), there's a large difference in average frame rates between those CPUs and the low-end Ryzen 3 1200 at stock clocks, which only gets around 60% of their frame rates at that resolution. At such a low resolution, the graphics card ends up waiting for the CPU pretty much all the time. At 1080p this difference shrinks a bit, but most of the benchmarked games are still limited by CPU performance. The difference shrinks even more at 1440p, and at 4K all the CPUs in the list see average frame rates within about 8% of one another, with most of the processors performing nearly identically. Plus, a 2070 SUPER, while decidedly high-end, is not quite up to the level of a 2080 Ti, so your 1440p performance at ultra settings would likely look rather similar to that 4K chart, and your 4K ultra results would see even less of a performance difference between CPUs.

Of course, different games can place different amounts of load on the CPU, and I suspect you would see more of a difference in games that heavily utilize more than four cores, like Battlefield V, where the additional cores of the 9400F would likely help smooth performance. And that's likely to become more common in future games.

In general though, if you want higher frame rates for high refresh-rate gaming, you're going to need to drop the resolution. Even that 1080p with 150% resolution scale in Overwatch works out to over 25% more pixels being rendered than 1440p. Resolution scaling, DSR, supersampling or whatever name it goes by is a good-looking but very poor-performing form of AA, and is generally only worth using if you have lots of graphics performance to spare and are willing to take a big performance hit to smooth the visuals out a bit. On a 1080p 60Hz screen with a high-end card, it could make some sense. If you are targeting optimum performance on a high refresh-rate screen though, use a less-demanding form of AA instead.
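
For anyone who wants to check that 25% figure, the arithmetic looks like this (assuming, as I understand it, that Overwatch's render scale applies to each axis):

```python
# 1080p at 150% render scale (applied per axis) versus native 1440p.
ow_150 = int(1920 * 1.5) * int(1080 * 1.5)   # 2880 x 1620
native_1440p = 2560 * 1440

print(f"1080p @ 150%: {ow_150:,} pixels")
print(f"Native 1440p: {native_1440p:,} pixels")
print(f"Extra work:   {ow_150 / native_1440p - 1:.1%} more pixels per frame")

# 1080p @ 150%: 4,665,600 pixels
# Native 1440p: 3,686,400 pixels
# Extra work:   26.6% more pixels per frame
```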
 
Solution