Hey,
I was benchmarking last night with Superposition. While trying to see what others had achieved with a 1080 at 1080p, I stumbled across a YouTube video of a guy running his 1080 at 1080p Extreme (DirectX), which is what I would usually run too. He gets annoyed at the 36–37 fps he's seeing and quits out, then reruns at 1080p with the 4K Optimized settings and gets a max of 180 fps. So, as you can imagine, I made the exact same change, and my max fps went from 41 to about 203. I just don't get it. My score was over 21,000, which is crazy, right? Obviously those settings aren't as taxing as plain 1080p Extreme, but the difference is huge, and it only uses 98% of the card too. Can anyone shed any light on why the score is so different?
Many thanks.