[SOLVED] From 2560x1080 to 2560x1440 - fps drop?

frozensun

Distinguished
Jun 30, 2018
303
7
18,695
Guys, roughly how much FPS will I lose switching from a 2560x1080 to a 2560x1440 monitor?
I found a topic here about that but lost the link.
Let's say the games I play run at 55-60 fps at 2560x1080 - how much would I get at 1440p?
 
Is there a way to calculate roughly how many fps we're talking about?
So I play FC 5 at 60 fps at ultra and Dying Light at around 55 fps at high.
I have some older titles installed too, so from what you wrote I'll lose 1/3 of the fps - basically FC5 will run at 40 fps and Dying Light at around 37, right?
Dunno mate, 1/3 of the FPS seems like too much.
This is all theoretical.
I could play at 40 fps, I don't mind - that's the bare minimum for me.
Two months ago I bought a 1660 Ti and still haven't opened the box, because I don't have issues at 1080p with the R9 390. But now I have a 1440p monitor and I'm too lazy to install the 1660 Ti - and maybe I'm wrong, but even that card isn't a 60 fps card at 1440p, right?
 
Yeah, it's going to be about a 0.75x multiplier.

What graphics detail settings are you currently using? Ultra? The R9 390 is getting a bit old - not exactly a 1440p card for modern titles at high/ultra settings. It's approximately equal to a GTX 1650 Super, RX 580/590, or RX 5500 XT (all of which are $150-$175 cards).

TBH though, I'd highly advise you to wait for the 2020 generation of GPUs before replacing your 390.
Yep, agree on that one. I paid 350 euros for the R9 390 and now I can't even sell it for 110 euros. Sad 🙁
 
I have some older titles installed too, so from what you wrote I'll lose 1/3 of the fps - basically FC5 will run at 40 fps and Dying Light at around 37, right?
Dunno mate, 1/3 of the FPS seems like too much.
That's how it works though. The higher the resolution, the greater the number of pixels that need to be rendered.
1440p has about 33% more pixels than 1080p ultrawide, so expect fps to drop accordingly.
 
To get a general idea, just compare the pixel counts of each resolution:
2560x1080 = 2,764,800
2560x1440 = 3,686,400
That's about 33% more pixels, so assume fps scales by the inverse - a 0.75x multiplier.
So, from 55-60fps down to roughly 41-45fps. Again, this is just an estimate.
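The pixel-count estimate is easy to check yourself. A quick Python sketch of the arithmetic above (the inverse-scaling assumption is only a rule of thumb - real games rarely scale perfectly with pixel count):

```python
# Pixel counts for each resolution
old_pixels = 2560 * 1080  # ultrawide 1080p: 2,764,800 pixels
new_pixels = 2560 * 1440  # 1440p: 3,686,400 pixels

# Rule-of-thumb multiplier: fps assumed to scale inversely with pixels
multiplier = old_pixels / new_pixels  # 0.75

for fps in (55, 60):
    print(f"{fps} fps at 2560x1080 -> ~{fps * multiplier:.1f} fps at 2560x1440")
```

That gives roughly 41 and 45 fps for the 55-60 fps range - ballpark only, and it assumes the GPU is the bottleneck.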
I just lost about 5 fps switching to 1440p, which is unbelievable... we all thought it would be a 1/3 FPS drop...
How come it's only around 5 fps?
 
5fps loss, where?
Minimum?
Maximum?
Average?
The same graphics settings?
In just one game? The results will not be identical across multiple titles.
Dying Light averages 55 fps, FC 5 around 55 too on average, but I could run a benchmark.
I had around 60 in both games before, almost a constant 60 fps, so -5 is a great result - no way it's 30% less fps.
I don't know about other games, probably unplayable in some... I tried Metro Exodus and it goes as low as 30.
 
As pretty much touched on, while the numbers are approximate, it's a matter of number of pixels.

The math is: current FPS x (old resolution ÷ new resolution) = new FPS

That assumes all settings/detail levels are kept the same.

But, I also agree strongly with @WildCard999 - why would you want to go from ultrawide to non-ultrawide?


As to you only getting slight drops - what's your current monitor's refresh rate? Do you have Vsync on at 60Hz? That might explain why games at your current resolution sit at 60fps, and why the higher resolution didn't drop it much.
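The formula above drops straight into code. A minimal sketch, assuming GPU-bound rendering with identical detail settings (exactly the assumption that a 60Hz Vsync cap breaks, as noted in this thread):

```python
def estimate_fps(current_fps, old_res, new_res):
    """Ballpark fps after a resolution change:
    current FPS x (old pixel count / new pixel count).
    Assumes GPU-bound rendering and identical detail settings."""
    (ow, oh), (nw, nh) = old_res, new_res
    return current_fps * (ow * oh) / (nw * nh)

# 60 fps at ultrawide 1080p -> estimated 45.0 fps at 1440p
print(estimate_fps(60, (2560, 1080), (2560, 1440)))
```

If Vsync caps the old number at 60fps while the card actually had headroom above that, the measured drop at the new resolution will be much smaller than this estimate.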
 
As pretty much touched on, while the numbers are approximate, it's a matter of number of pixels.

The math is: current FPS x (old resolution ÷ new resolution) = new FPS

That assumes all settings/detail levels are kept the same.

But, I also agree strongly with @WildCard999 - why would you want to go from ultrawide to non-ultrawide?


As to you only getting slight drops - what's your current monitor's refresh rate? Do you have Vsync on at 60Hz? That might explain why your current games sit at 60fps, and why the higher resolution didn't drop it much.
144Hz refresh rate. I went from an LG 25UM58-P to a 32-inch Samsung CJG50 at 144Hz.
When I ran those games I expected fps to drop as low as 40 or even below, just like Phaaze88 wrote, but luckily it didn't.
Perhaps I had higher fps than 60 with the previous monitor, who knows...
 
Hmm... your old monitor was rated for only 75Hz - but knowing some of those odd refresh models, it was something more like 75Hz vertical and 60Hz horizontal... I don't know why it would be like that either.

Your old monitor was likely running with a cap of 60Hz.
What was said about resolution is true, but refresh rate plays a part as well - I, at least, assumed your old monitor was a 100Hz+ model.
 
Solution