PS4 launched at $400 in 2013. That same year, Nvidia's $400 offering was the GTX 770. That card was not a good pick for 1080p gaming by 2020, and its 2GB of VRAM was quite limiting.
Usually, when a new console generation launches, the hardware is initially more powerful than what you can get from an equivalently priced PC. As was mentioned, the manufacturer subsidizes the hardware, expecting to make the money back through software, peripheral, and online service sales in the years that follow. That initial price advantage tends to fade not long after, though.

Your comparison also isn't the best, in that the GTX 770 came out a number of months earlier, and the following year the GTX 970 became available at a $330 MSRP. That card was around 50% faster than a 770 (or roughly twice as fast as the PS4's graphics hardware), and it's arguably still very viable for "mid-range" 1080p gaming today. Even the GTX 770 could likely keep up in most titles, being roughly on the level of a 1650, at least in its 4GB version (which cost around 10% more). I suspect you could run most games at better settings and get more performance out of a 770 4GB than a base-model PS4 can manage, considering its graphics hardware should be more powerful. You obviously shouldn't expect it to run newer, demanding games at high settings with a stable framerate, but the PS4 certainly isn't doing that either.
The last couple of years have demonstrated that Nvidia and AMD won't sell low-end products if they don't have to. They don't care if you don't want to spend more than X dollars; they'll simply ignore you and sell to someone who will pay that amount.
To be fair, they arguably couldn't sell the hardware at lower prices without resellers buying up all the stock and reselling the cards at inflated prices anyway. If someone's going to profit off the largely mining-induced inflated street pricing, it's arguably better for the manufacturers to get a cut than for all of it to go to middlemen leeching off the market. At least this way, some of the profits can be put toward things like future research and development. The current pricing is still bad, but it will likely continue to improve in the months to come.
No.
An 'optimisation' takes an existing system and simplifies it whilst producing the same, or indistinguishably similar, output. That is not the case with non-RT rendering: there, you take a simplified rendering system and add complexity in order to change the output to more closely imitate an RT output.
All that "complexity" can be thought of as an optimized way of producing a relatively similar result to raytraced graphics. Optimizations don't necessarily need to follow a similar path to produce the desired result.
Compared to other software, raytracing could be thought of as a bit like uncompressed video. Rendering uncompressed video is simple, but very demanding in terms of the amount of data being transmitted or stored. Compression formats like H.264 replace that simple representation with all sorts of convoluted algorithms in order to optimize the file size, at the expense of losing some precision in the resulting image. The result tends to be "close enough", though, at least if you don't look too closely, and anyone streaming video online or storing it locally will appreciate the benefits of the more optimized format.
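To put rough numbers on that analogy, here's a quick back-of-the-envelope comparison of uncompressed 1080p60 bandwidth versus a typical H.264 streaming bitrate. The figures are illustrative assumptions (24-bit RGB, no chroma subsampling, ~8 Mb/s as a common streaming target), not measurements from any particular encoder:

```python
# Back-of-the-envelope: uncompressed 1080p60 video vs a typical
# H.264 streaming bitrate. All numbers are illustrative assumptions.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 3          # 24-bit RGB, no chroma subsampling
FPS = 60

# Uncompressed: every pixel of every frame is stored verbatim.
uncompressed_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
print(f"Uncompressed: {uncompressed_bps / 1e6:.0f} Mb/s")   # ~2986 Mb/s

# A common H.264 target for 1080p60 streaming is around 8 Mb/s.
h264_bps = 8e6
print(f"H.264 (typical): {h264_bps / 1e6:.0f} Mb/s")
print(f"Ratio: roughly {uncompressed_bps / h264_bps:.0f}:1")  # ~373:1
```

A ratio in the hundreds-to-one range is the payoff for all that algorithmic complexity, which is essentially the same trade non-RT rendering makes relative to raytracing.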
That's not to say raytracing is bad, though. My point was more that the other methods of rendering game visuals have their advantages, and the big performance gains for a minor loss in visual fidelity are why developers have traditionally gone that route. And one shouldn't expect those other rendering methods to go anywhere for a number of years: most of the target market for games won't have hardware that runs raytraced effects well, which means supporting RT on top of traditional rendering will only add to development effort for quite some time.