[SOLVED] Are 21:9 ultrawides as hard for the GPU to run as 4k is?


NadeMagnet69

Prominent
Jul 20, 2020
Sorry, I meant 21:9 3440x1440 vs 4k for gaming. I see a lot of vids and articles saying that 4k just isn't as good for gaming as 1440.
I'm looking at this ginormous 4k beast right now. https://www.amazon.com/Acer-CG437K-...07Z1212CQ/ref=cm_cr_arp_d_product_top?ie=UTF8
But I only have an RTX 2080 Super Founders Edition, so it'll be hard to run and only hit around 60 Hz in many games, which is too low, since 60 is where I've been maxed out for the last 8 years with my 27-inch Korean Shimian 2K.

So now I'm looking at ultrawide 1440 instead and want to know what kind of refresh rates I can expect. If ultrawides affect performance too much, then in my opinion I'd be better off with the much larger 4K. I want to spend around $1k, give or take, max. I don't play FPS games as much as I play games like Total War: Warhammer. lol I've looked through about three dozen different monitors in the past few days and am starting to get lost.
Now I'm looking at
https://www.amazon.com/Alienware-Cu...07YLGH9Q5/ref=cm_cr_arp_d_product_top?ie=UTF8
and the
https://www.amazon.com/LG-34GN850-B...086XLLG28/ref=cm_cr_arp_d_product_top?ie=UTF8

I don't know which to pick. All I know is whatever I get I'm going to have to live with for "ages."
 

King_V

Illustrious
Ambassador
GPU load is approximately proportional to the number of pixels. The aspect ratio doesn't matter; it's the total number of pixels.

So:
  • 1920x1080 takes a certain amount of GPU horsepower.
  • 2560x1080 (ultrawide) takes more GPU horsepower because there are more pixels than 1920x1080
  • 2560x1440 is NOT ultrawide, but takes more GPU horsepower than 2560x1080 because there are more pixels
  • 3440x1440 is ultrawide, and takes more GPU horsepower, but because it has more pixels, not because it's ultrawide.
  • 3840x1600 is ultrawide, but takes less GPU horsepower than 4K because it only has approx 75% of the pixels of a 4K monitor
  • 3840x2160 (4K) is NOT ultrawide, but is more demanding on the GPU than any of the above.


Now, it's not exactly proportional. A 4K screen has 4x the number of pixels of 1920x1080, but it actually tends to deliver a little more than 1/4 of the fps you'd get at 1920x1080.
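
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The strictly linear scaling is my assumption for illustration; as noted above, real fps usually scales a bit better than this:

# GPU load assumed roughly proportional to pixel count (an approximation).
base = 1920 * 1080
for name, w, h in [
    ("1920x1080",      1920, 1080),
    ("2560x1080 (UW)", 2560, 1080),
    ("2560x1440",      2560, 1440),
    ("3440x1440 (UW)", 3440, 1440),
    ("3840x1600 (UW)", 3840, 1600),
    ("3840x2160 (4K)", 3840, 2160),
]:
    px = w * h
    print(f"{name:16} {px:>10,} px  ~{px / base:.2f}x the load of 1080p")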
 

NadeMagnet69

Prominent
Jul 20, 2020
Thank you, but I already realize all that. What I'm trying to find out is the real-world difference behind that "more/less GPU horsepower," from people who actually own these monitors. Not just benchmarks.
 

King_V

Illustrious
Ambassador
I'm not sure that you're going to see a whole lot of people who have in-person experience in changing from 4K to 3440x1440, or vice versa.

Game benchmarks are real-world differences.

In any case, assuming that your CPU is never the limiting factor in any games:
  • The 3440x1440 has about 60% as many pixels as a 3840x2160 (4K)
  • Therefore:
    • a 4K monitor will give you about 60%, maybe slightly more, of the performance you'd get with the 3440x1440 monitor
    • or, taking 1/0.6, the 3440x1440 will give you about 167%, maybe slightly less, of the performance you'd get with the 4K monitor.
Those are approximations. But between that and comparing game benchmarks, that's about the best you're going to get. People's experiences will vary slightly, but so will the settings they use, the stuff they have running in the background, etc.
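
As a hypothetical worked example of that conversion in Python (the 60 fps benchmark figure below is made up, purely for illustration):

px_uw = 3440 * 1440   # 4,953,600 pixels
px_4k = 3840 * 2160   # 8,294,400 pixels
fps_4k = 60           # hypothetical 4K benchmark result
# Pixel-count scaling predicts roughly 60 * 1.67 ~= 100 fps at 3440x1440.
print(f"~{fps_4k * px_4k / px_uw:.0f} fps estimated at 3440x1440")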
 
Solution
Oct 25, 2020
I've recently gotten myself a C49RG90, which is 32:9 at a res of 5120x1440, and I've been wondering the same thing.

Now, at 5120x1440, this monitor has about 12% fewer pixels than 4K, but is pixel count literally all that matters? I really don't know much about the internal workings of a GPU and how the whole thing sets about figuring out what to render, but the thought does occur that at 32:9, such a wide field of view means that there are literally more 'things' in the frame. In my super-naive head, more 'things' would presumably mean more items: to keep track of (position and speed), to figure out lighting and shadows for, and possibly to keep more textures in memory.

I don't know if some of this would ultimately fall on the CPU rather than the GPU, and I don't know if these things would even give rise to any extra processing at all. But I'd be keen to hear from someone who knows more about the inner workings whether 5120x1440 is actually about 10-15% faster than 4K, or about the same, or perhaps even a bit tougher.

As I type this, I realise that I'm literally on a 4K monitor in my home office, with my gaming rig upstairs. I could literally get an answer myself if it weren't such a crazy effort to lug this thing upstairs and replug everything. Perhaps when the kids are back at school after half term I'll steal a couple of hours in the day and do that. It's not quite the OP's question, but as an exaggeration of it, it should give an answer that can be inferred back to 21:9. If I do this, let me know what tools I should use; otherwise I'll just play a couple of games and keep an eye on fps to get approximate values.
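
For what it's worth, the pixel-count heuristic from earlier in the thread predicts the following (a rough sketch in Python, not a measurement; it ignores the wider-FOV geometry question raised above):

px_32x9 = 5120 * 1440   # 7,372,800 pixels
px_4k   = 3840 * 2160   # 8,294,400 pixels
# ~11% fewer pixels, so ~12.5% more fps than 4K, all else being equal.
print(f"{1 - px_32x9 / px_4k:.1%} fewer pixels than 4K")
print(f"~{px_4k / px_32x9 - 1:.1%} more fps, if load scales with pixels")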
 

NadeMagnet69

Prominent
Jul 20, 2020
It's a moot point now. Since I wrote this thread I bought the LG850. It had a backlight-bleed problem, but that wasn't a deal breaker. What was, though, was dealing with the idiocy of LG's US division, which I whined about here. https://forums.tomshardware.com/thr...enware-aw3420dw-for-gaming.3635251/?view=date
I'm happy with my 4K Acer CG7. It really does feel like two LG850s stacked on top of each other. Plus I got it for almost the same price as the LG; it's since gone up to around 1400 bucks.

Since then I also managed to return my RTX 2080 Super, with 2 days left of the 30 I had to return it. I had no idea the 30 series was so close to release or I never would have bought it to begin with. I was super stoked to get a 3080 AND get it for less than what the 2080 FE had ballooned up to. lol Naturally I'm not one of the ones who managed to get one, so I'm back on my old POS GTX 660. Had to dust off (lol, literally) some old game CDs, since those are about all it's still good for aside from watching vids or going online. But once I do manage to get a 3080, hopefully by the end of the year or the beginning of next, driving 4K will hardly be an issue. At least compared to the 2080 FE, anyway.