I would also say: what's the point of a massive 4K display if you need to turn all the quality settings down to get a good framerate? If I've paid £1500+ for a GPU, plus however much this oversized monitor is going to cost, then I want games to bloody well look good.
While I agree that it will probably be quite a while before people can get the most out of a 240Hz 4K display, having access to both high resolution and a high refresh rate doesn't mean both need to be fully utilized at the same time. A screen like this will likely be the centerpiece of a home theater, so it won't just be used for high-refresh-rate gaming, but for other forms of entertainment as well. When watching films on it, resolution is what matters most, and you ideally wouldn't want anything less than 4K on such a large display. That's become the standard for large televisions these days, and few people are likely to spend thousands on a screen that can't at least handle it.
And then there are "AAA" game releases, which, again, many will want to experience at or near their maximum settings on a screen like this, even if that means not running them at high frame rates. Most games don't actually benefit all that much from very high frame rates anyway, so you're not really losing much by not running them near the screen's maximum refresh rate.
But there are also games like competitive first-person shooters, which are usually less demanding on hardware and see more benefit from running at higher frame rates. So one may want access to a 240Hz refresh rate for those kinds of games, while also having 4K resolution available for other games and films. That's not really any different from any other high-refresh-rate display on the market today: even at 1080p on the fastest graphics cards, some demanding games won't push over 100fps, because they're limited by CPU performance. Just because not all games can run near a screen's maximum refresh rate doesn't mean there isn't a benefit to having such a screen for the games that do.
This tech will also take years to come to market, at which point the top GPUs should have 2x-4x the performance of current ones. Even accounting for then-new games demanding more of the hardware, games should still run better by then.
I would question the likelihood of future games running "better". Hardware will become faster, but developers will utilize that hardware more. Just look at raytracing on the graphics side of things. In the coming years, that may become the standard for game lighting, perhaps eventually phasing out rasterized lighting entirely. And where today's raytraced lighting effects tend to cut a lot of corners to run in real time, future RT implementations could easily ramp up visual fidelity at an even greater performance cost.
And as I previously noted, CPU performance can be a big limiting factor. Per-core performance hasn't been increasing as much as it once did, and average yearly gains are likely to shrink as manufacturers run into limitations improving process nodes. In the short term, as game development transitions to the new generation of consoles and drops support for the prior generation with its much slower CPUs, we're likely to see many games become a lot more CPU-demanding. The processors in those consoles are much closer to what's in today's gaming systems, so if a game targets 60fps on console and fully utilizes a CPU core within that limit, one shouldn't expect much more than 100fps on today's fastest CPUs. And if a game targets 30fps, which isn't at all uncommon later in a console cycle, even the CPUs of a few years from now might struggle to maintain 60fps on PCs.
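To put very rough numbers on that, here's a back-of-the-envelope sketch of how a CPU-bound frame rate might scale with per-core performance. The linear scaling and the ~1.6x per-core ratio are illustrative assumptions on my part, not measurements:

    # Back-of-the-envelope: assumes a CPU-bound frame rate scales roughly
    # linearly with per-core CPU performance (a simplification).
    def projected_fps(console_target_fps: float, per_core_perf_ratio: float) -> float:
        """per_core_perf_ratio = desktop per-core perf / console per-core perf."""
        return console_target_fps * per_core_perf_ratio

    # Illustrative assumption: today's fastest desktop cores at ~1.6x the
    # per-core performance of the new consoles' CPUs.
    print(projected_fps(60, 1.6))  # ~96 fps from a 60fps console target
    print(projected_fps(30, 1.6))  # ~48 fps from a 30fps console target

Crude as it is, it lines up with the point above: a 60fps console target doesn't leave much headroom past 100fps, and a 30fps target makes even a steady 60fps a stretch.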