Think back to the time when all monitors were 4:3, and 16:9 was new and expensive. Everybody ooh'd and ahh'd about how nice the widescreen was and how much more viewing area was now possible. The appeal of 21:9 isn't any different.
Many people have since moved up to using two or three widescreens, just for the extra space. Game on one, keep Discord, cheats, directions, maps, etc. on the other.
The downside is that 21:9 isn't a supported resolution in every game, and it isn't the native aspect ratio for most video or movies, which are 16:9. So you'll get letterbox bars, the same as watching anything really old that was shot in 4:3 on a 16:9 widescreen.
The upside is that a lot of newer content is native 21:9, immediately noticeable by the top/bottom bars on a 16:9 screen instead of the side bars. Most newer games support 21:9, so the game view is much closer to peripheral vision, not the tunnel view of 16:9 or 4:3 resolutions.
The downside to that is the extra pixels. It takes power to light up a pixel: 1440p has almost 1.8x as many pixels as 1080p, and ultrawide 3440x1440 has about 1.34x as many pixels as standard 2560x1440 (1440p). That can have a strong effect on the onscreen fps in GPU-bound games. So where you were close to 200 fps in a game, you may end up closer to 160 fps at the same graphical settings.
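If you want to sanity-check those ratios yourself, it's just multiplication. A quick Python snippet (the numbers are straight from the resolutions above, nothing else assumed):

```python
# Pixel counts for the three resolutions discussed above
resolutions = {
    "1080p  (1920x1080)":     1920 * 1080,
    "1440p  (2560x1440)":     2560 * 1440,
    "UW1440 (3440x1440)":     3440 * 1440,
}

for name, px in resolutions.items():
    print(f"{name}: {px:,} pixels")

# Ratios quoted in the post
ratio_1440_vs_1080 = resolutions["1440p  (2560x1440)"] / resolutions["1080p  (1920x1080)"]
ratio_uw_vs_1440   = resolutions["UW1440 (3440x1440)"] / resolutions["1440p  (2560x1440)"]
print(f"1440p vs 1080p:     {ratio_1440_vs_1080:.2f}x")  # ~1.78x
print(f"ultrawide vs 1440p: {ratio_uw_vs_1440:.2f}x")    # ~1.34x
```

So the GPU is pushing roughly a third more pixels per frame on the ultrawide, which lines up with the 200 fps → ~160 fps ballpark when the GPU is the bottleneck.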
CPU is possible fps. GPU is viewable fps. How many fps are lost, if any at all, depends on the game and how close that relationship is. CSGO doesn't lose any fps because the GPU isn't challenged at all, so all the fps from the CPU will show. Far Cry 6 with the HD texture pack will be a totally different story.
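A rough way to picture that relationship: what you actually see is whichever of the two limits is lower. This is a toy model with made-up numbers, just to illustrate the two cases above:

```python
# Toy model: the CPU sets a ceiling on frames prepared per second,
# the GPU on frames rendered per second; onscreen fps is the lower of the two.
def onscreen_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# CSGO-like case: GPU barely loaded, so every frame the CPU prepares shows up.
# Dropping to ultrawide barely moves the GPU number, so onscreen fps is unchanged.
print(onscreen_fps(cpu_fps=400, gpu_fps=900))   # -> 400 (CPU bound)

# Heavy-game case: GPU is the limit, so the extra ultrawide pixels
# directly cut the viewable fps even though the CPU could do more.
print(onscreen_fps(cpu_fps=200, gpu_fps=160))   # -> 160 (GPU bound)
```

The fps numbers here are illustrative, not benchmarks; the point is only that adding pixels lowers the GPU side of the `min()`, and it only costs you fps when the GPU side was already the smaller one.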