I don't think 4K necessarily needs to be part of the equation. If I put on HD (720p) film/TV or even a DVD (~480p typical), I'm not left guessing whether it was filmed with real people. Put on a modern "photorealistic" game at 4K, such as the new Indiana Jones one or the others mentioned, and you can tell it's virtual. Graphics seem to have entered an uncanny valley. It will take an enormous amount of resources on both the hardware and software side to get out of that valley, and raytracing isn't a magic bullet, only one part of a comprehensive solution.
Meanwhile, pushing the graphics boundary does nothing for gameplay, and the industry is oversaturated with games that nobody wants to play or pay for.
I don't think the entry-level cost of getting into gaming (at 1080p) is such a big deal. It would be great if GPU and other hardware prices improved, but even the post-Cezanne APUs alone (e.g. an 8700G, or a 6800H in a mini PC) deliver credible performance. You can find ways to get under the $700 mark, such as a refurbished office PC plus a low-profile GPU. You can pick up a discounted Steam Deck for under $400 and play at 720p. And any general-purpose PC is useful beyond gaming; plenty of folks will be using it all day, every day anyway.
But the games? Is anyone here actually paying hundreds of dollars a year for AAA titles? There is a massive supply of F2P titles, giveaways, and discounts before you even consider the effects of piracy and emulation. 90% of the gaming industry could collapse overnight, and we would still be good... FOREVER.