How much does performance change between 1080p and 768p?

Eckoshy

So I'm looking into upgrading my graphics card sometime soon. After a fair amount of research, I think the GTX 960 should suit my needs nicely and stay in my price range (about $200). But I have a somewhat odd concern: every time I look at GTX 960 benchmarks, the tests are run at 1080p, while my monitor's native resolution is 1360x768. I know playing at a lower resolution should improve performance, but I'm wondering by how much. For example, I saw The Witcher 3 benchmarked on a 960 at 1080p Ultra settings, averaging 40-50 fps. Assuming I run the same settings at 768p, how much improvement should I expect?
 
Solution
1360x768 = 1,044,480 pixels
1920x1080 = 2,073,600 pixels

Since your resolution is almost exactly half the pixel count, your performance will probably be about 70-80% higher (half the pixels doesn't translate into exactly double the performance), so you should be able to run The Witcher 3 at a constant 60 fps.
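To make the arithmetic concrete, here is a minimal Python sketch of that estimate. The 0.75 scaling factor is an assumption standing in for the "70-80% of the ideal speedup" rule of thumb above, not a measured value:

```python
# Back-of-the-envelope estimate of the fps gain from dropping resolution,
# assuming GPU-bound performance scales with pixel count. Real scaling is
# usually sub-linear, since geometry and CPU-side work don't shrink with
# resolution, so the ideal speedup is discounted by `scaling`.

def estimate_fps(fps_at_ref, ref_res, target_res, scaling=0.75):
    """Estimate fps at target_res from fps measured at ref_res.

    scaling: fraction of the ideal pixel-count speedup actually realized;
    0.7-0.8 is an assumed rule of thumb, not a benchmark result.
    """
    ref_pixels = ref_res[0] * ref_res[1]           # 1920*1080 = 2,073,600
    target_pixels = target_res[0] * target_res[1]  # 1360*768  = 1,044,480
    ideal_speedup = ref_pixels / target_pixels     # ~1.99x fewer pixels
    effective_speedup = 1 + (ideal_speedup - 1) * scaling
    return fps_at_ref * effective_speedup

# The Witcher 3 numbers from the thread: 40-50 fps at 1080p Ultra.
for fps in (40, 50):
    est = estimate_fps(fps, (1920, 1080), (1360, 768))
    print(f"{fps} fps at 1080p -> roughly {est:.0f} fps at 1360x768")
```

Running this gives roughly 70 and 87 fps for the 40-50 fps range quoted above, which is where the "constant 60 fps" expectation comes from.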
 
FPS performance is determined by the weaker of the CPU or the graphics card.
What are your CPU and GPU now?

You can get an idea of the benefits in a sort of backhanded way: run your games, but lower your resolution and eye candy, as in the sketch below.
If your FPS increases, it indicates that your CPU is strong enough to drive a better graphics configuration.
If your FPS stays the same, you are likely CPU limited.
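A small Python sketch of that test's logic, using two in-game fps measurements you'd take yourself; the 10% threshold is an arbitrary but reasonable cutoff, not a standard:

```python
# Minimal sketch of the bottleneck test described above: measure fps at
# your normal resolution/settings, then again with resolution and eye
# candy lowered, and see whether fps actually moved.

def diagnose_bottleneck(fps_high, fps_low, threshold=0.10):
    """Classify the likely bottleneck from two fps measurements.

    threshold: minimum relative fps gain that counts as a real
    improvement (10% here is an assumed cutoff).
    """
    gain = (fps_low - fps_high) / fps_high
    if gain >= threshold:
        # fps rose when GPU load dropped -> the GPU was the limit,
        # so a stronger card should pay off.
        return "GPU-limited: a graphics upgrade should help"
    # fps barely changed -> the CPU (or game engine) is the ceiling.
    return "CPU-limited: a faster GPU won't gain much"

print(diagnose_bottleneck(45, 72))  # big jump  -> GPU-limited
print(diagnose_bottleneck(45, 47))  # flat fps  -> CPU-limited
```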

 
Oh yeah, I've already done some tests; my CPU is more than capable of handling the game. It's just the GPU that's the problem. Anyway, thanks for the help, both of you.