I've recently gotten myself a C49RG90, which is 32:9 at a res of 5120x1440, and I've been wondering the same thing.
Now at 5120x1440, this monitor has about 11% fewer pixels than 4K (roughly 7.37M vs 8.29M), but is pixel count literally all that matters? I don't know much about the internal workings of a GPU and how it sets about figuring out what to render, but the thought does occur that at 32:9, such a wide field of view means there are literally more 'things' in the frame. In my super-naive head, more 'things' would presumably mean more objects to keep track of (position and movement), more surfaces to compute lighting and shadows for, and possibly more textures to keep in memory.
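For what it's worth, here's a quick sanity check I did on the pixel counts, plus a sketch of the field-of-view effect. The FOV part assumes the game uses "hor+" scaling (fixed vertical FOV, with horizontal FOV widening with aspect ratio), which is common in modern games but not universal, and the 59° vertical FOV is just an illustrative figure:

```python
import math

# Pixel counts: 32:9 ultrawide vs 4K UHD.
uw = 5120 * 1440
uhd = 3840 * 2160
print(f"ultrawide / 4k pixels: {uw / uhd:.3f}")  # ~0.889, i.e. ~11% fewer

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV implied by a vertical FOV under hor+ scaling."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for aspect, label in [(16 / 9, "16:9"), (32 / 9, "32:9")]:
    print(f"{label}: horizontal FOV ~ {horizontal_fov(59, aspect):.1f} deg")
```

With a typical ~59° vertical FOV that works out to roughly 90° horizontally at 16:9 but about 127° at 32:9, so noticeably more of the scene survives frustum culling even though the total pixel count is lower.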
I don't know how much of this would ultimately fall on the CPU rather than the GPU, or whether these things give rise to any extra processing at all. But I'd be keen to hear from someone who knows more about the inner workings: is 5120x1440 actually around 10% faster than 4K, as the pixel counts would suggest, about the same, or perhaps even a bit tougher?
As I type this, I realise that I'm literally sitting at a 4K monitor in my home office, with my gaming rig upstairs. I could get an answer myself if it weren't such a crazy effort to lug this thing upstairs and replug everything. Perhaps when the kids are back at school after half term I'll steal a couple of hours and do it. It's not quite the OP's question, but as an exaggeration of it, the result should be easy to infer back to 21:9. If I do this, let me know what tools I should use; otherwise I'll just play a couple of games and keep an eye on the FPS counter for approximate values.
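Failing better suggestions, rather than eyeballing an FPS counter I'd probably log frame times and summarise them. A rough sketch, assuming a CSV with one frame time per row in milliseconds (tools like PresentMon or CapFrameX can export something along these lines; the column name and file name here are placeholders):

```python
import csv
import statistics

def summarise(path, column="MsBetweenPresents"):
    # Read per-frame times (in ms) from a capture log.
    with open(path, newline="") as f:
        frame_times_ms = [float(row[column]) for row in csv.DictReader(f)]
    frame_times_ms.sort()
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    # "1% low": FPS implied by the slowest 1% of frames, a common
    # smoothness metric that a plain average hides.
    worst_1pct = frame_times_ms[int(len(frame_times_ms) * 0.99):]
    low_1pct_fps = 1000 / statistics.mean(worst_1pct)
    print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")

summarise("capture.csv")  # hypothetical log file
```

That way the same run at 5120x1440 and 3840x2160 gives directly comparable numbers, including whether the ultrawide suffers more stutter rather than just a different average.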