That's not the best comparison, though. Realtime raytracing in a game takes a lot of shortcuts to achieve playable performance, which often hurts visuals, whereas the offline-rendered scene is produced with a focus on quality over performance, in many cases well past the point of diminishing returns. The realtime routines cut a lot of corners to minimize the number of calculations performed: greatly reducing the number of light rays and bounces, aggressively denoising the result afterward, upscaling the output image, and in most cases only using raytracing for a portion of the scene's rendering.
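To make that ray-count tradeoff concrete, here's a minimal sketch of the Monte Carlo estimate at the heart of a path tracer (toy Python with a made-up one-light scene, not how any real renderer is written): with a realtime-style budget of a couple of rays the estimate is mostly noise, while an offline-style budget of thousands converges on the true value.

```python
import math
import random

def shade_pixel(samples: int) -> float:
    """Monte Carlo estimate of the diffuse light reaching a surface point:
    average the light seen along randomly chosen hemisphere directions."""
    total = 0.0
    for _ in range(samples):
        # Pick a cosine-weighted random direction above the surface.
        theta = math.asin(math.sqrt(random.random()))
        # Hypothetical toy scene: light arrives only within 30 degrees
        # of the surface normal, so the true answer is sin^2(30) = 0.25.
        total += 1.0 if theta < math.radians(30) else 0.0
    return total / samples

print(shade_pixel(2))     # realtime-style budget: wildly noisy (0.0, 0.5, or 1.0)
print(shade_pixel(4096))  # offline-style budget: converges near 0.25
```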
And of course, realtime raytracing today isn't performed on a CPU, but on specialized hardware designed for that purpose, so it's not exactly a direct comparison. That additional hardware doesn't really contribute to making a system feel responsive in general use, which seems to be the center of this discussion. Today's systems certainly get through demanding workloads much faster, but that doesn't necessarily translate to a more responsive user experience outside of those brute-force workloads.
And on that note, realtime raytraced lighting only works on that specialized hardware today because of a lot of heavy optimization. All those shortcuts, minimizing rays and bounces, denoising the messy output, and upscaling it in a way that still looks decent, are heavily optimized by people who know the hardware at a low level. That's pretty much the opposite of what we've seen in a lot of other software that isn't pushing the limits of modern hardware. A large portion of programmers today are outright bad at optimization, or at least aren't given the time to optimize, since there's little incentive to optimize code that will at least sort of work on most modern hardware.
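For a rough feel of what that denoising step buys, here's a deliberately naive sketch: a plain 3x3 box filter that trades a little blur for a lot less variance in a one-ray-per-pixel image. Real denoisers are edge-aware, temporally stable, and tuned to the hardware; this only illustrates the principle of spending cheap neighborhood math to hide expensive missing rays.

```python
import random

def denoise(img: list[list[float]]) -> list[list[float]]:
    """Crude spatial denoiser: replace each pixel with the mean of its
    3x3 neighborhood (clamped at the image edges)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

# Stand-in for a 1-sample render of a surface whose true brightness is 0.25:
# each pixel is either fully lit or fully dark, i.e. pure noise.
noisy = [[float(random.random() < 0.25) for _ in range(8)] for _ in range(8)]
smoothed = denoise(noisy)  # blurrier, but far closer to a flat 0.25
```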
And that's the problem. Outside of a limited subset of tasks that still push hardware to its limits in a clearly measurable way, modern software is largely becoming less optimized. Take a web browser. There's less "need" for the browser's developers to focus on performance than there was a decade or two ago, since the minimum hardware baseline is higher, so they're more prone to letting optimization slide.

It also doesn't help that Chrome has attained a near-monopoly on the browser space, much like IE did a couple of decades back, so we have far less competition between browser engines than when several of them were vying for attention. Microsoft's browser is now a Chromium reskin. Opera is now a Chromium reskin. And Chrome itself has become a bloated, unoptimized memory hog, with Google seemingly caring little about performance since it strong-armed its browser into the number one position a decade ago. Outside of WebKit, which has its own monopoly on Apple devices (and which Chromium itself was forked from), the only real competition is Firefox, though Mozilla seems content to no longer compete with Google directly, getting by on whatever limited resources it has as it slides into obscurity, perhaps with the eventual fate of becoming a Chromium reskin itself.

So the developers of Chromium face no significant competition to be measured against, and as a result performance (and new functionality in general) takes a backseat to simply maintaining the browser and keeping it secure, so that Google can keep using it for its primary task: harvesting user data.
This lack of optimization extends to web developers as well. Developing a website used to be a careful balancing act between making the site look good and keeping it usable on lower-end hardware and slow Internet connections, but that's generally considered less important today. With faster connections and faster hardware the norm, many web developers no longer care much whether their page is bloated with unnecessary images at larger-than-necessary file sizes, or with advertisements and tracking scripts that bog down performance unless the end-user deals with them. Combine that with the lack of optimization in the browser and the increased background activity from OS processes, and it's no wonder the web browsing experience of today ends up significantly less optimized than it was in the past.
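As a rough way to see that bloat for yourself, here's a quick sketch that tallies a page's transfer weight (assuming the `requests` and `beautifulsoup4` packages; it only counts the HTML plus `<img>` and `<script src>` resources, so it understates the real total by ignoring CSS, fonts, and anything loaded by script):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def page_weight(url: str) -> int:
    """Rough total transfer size in bytes: the HTML itself plus every
    <img src> and <script src> resource it references."""
    page = requests.get(url, timeout=10)
    total = len(page.content)
    soup = BeautifulSoup(page.text, "html.parser")
    for tag in soup.find_all(["img", "script"]):
        src = tag.get("src")
        if not src:
            continue
        try:
            total += len(requests.get(urljoin(url, src), timeout=10).content)
        except requests.RequestException:
            pass  # skip resources that fail to load
    return total

print(f"{page_weight('https://example.com') / 1024:.1f} KiB")
```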
There are undoubtedly many parts of a user's computing experience that perform better today, particularly thanks to hardware like SSDs that has helped counter the unoptimized software bloat, but there are definitely areas that have regressed, or shown no obvious improvement despite running on significantly faster hardware.