The 1080 Ti can mostly hit 60 fps at 4K (3840x2160) with max or very high details in modern games.
Given that 3440x1440 has about 60% (roughly 6/10ths) of 4K's pixels, it can probably manage around 100 fps at the same max or very high details (the previous 60 fps divided by 6/10ths, i.e. 60 fps times 10/6). That would be the card running full throttle, though.
If you're just looking to run at a constant 60, or maybe cap it at 75, the 1080 Ti should handle it well without having to run full tilt.
If you do go 2560x1080, the 1080 Ti wouldn't even break a sweat running it. Well, maybe fairly high utilization if you're pushing a full 144 fps.
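If you want to sanity-check the numbers above, here's the back-of-envelope math as a quick sketch. It assumes performance is GPU-bound and scales roughly linearly with pixel count, which is only a rule of thumb; real results vary per game.

```python
# Rough FPS estimate by scaling a known baseline with the pixel-count ratio.
# Assumes GPU-bound, approximately linear scaling -- a rule of thumb only.
BASELINE_RES = (3840, 2160)   # 4K, where the 1080 Ti does ~60 fps
BASELINE_FPS = 60

def estimate_fps(width: int, height: int) -> float:
    """Scale the baseline fps by the inverse of the pixel-count ratio."""
    ratio = (width * height) / (BASELINE_RES[0] * BASELINE_RES[1])
    return BASELINE_FPS / ratio

for w, h in [(3840, 2160), (3440, 1440), (2560, 1080)]:
    print(f"{w}x{h}: ~{estimate_fps(w, h):.0f} fps")
# 3440x1440 comes out to ~100 fps, and 2560x1080 (one third of 4K's
# pixels) to ~180 fps -- hence "not breaking a sweat" at 144.
```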
I understand that one of those monitors is VA and the other IPS, though I'm not clear on the details.
However, they are both G-Sync monitors. G-Sync is an adaptive-sync technology that's Nvidia-only, as it's proprietary. Since manufacturers have to pay Nvidia a hefty licensing fee for it, these monitors will be more expensive than an equivalent FreeSync monitor. It also locks you into the Nvidia ecosystem.
Originally, FreeSync, while free and open, was only supported by AMD cards, but Nvidia has finally started supporting FreeSync with their 10, 16, and 20 series cards. I'd look for a FreeSync monitor with a wide FreeSync range and Low Framerate Compensation (LFC), and save yourself a notable amount of money in the process (the "Nvidia tax", if you will).