Still not convinced $350 is low enough for the bottom SKU; it will probably get another $50 drop within months of launch.
That implies the 5700 is going to be their lowest-end part, which seems rather unlikely. I suspect we'll be seeing launches for a 5600, 5500 and so on over the coming months to fill in the lower price-points. They're likely starting with the higher-end to give their lower-end cards a bit more time to clear out, just as Nvidia did last fall.
AMD messed up pricing on the RX 570/580, and now it is struggling to get rid of them at $120-180, even with free game or Xbox Game Pass bundles that make the GPU itself nearly free if you care about the add-ons.
The launch prices of the RX 570 and 580 were fine. The cards were just a minor update to their RX 400 series counterparts, which had offered excellent value at the time they came out just 9 months prior. The 500 series might not have pushed performance-per-dollar any further, but I don't think anyone really expected much from a refresh so soon.
Unfortunately, the 500 series cards launched just as cryptocurrency mining was ramping up, and because they offered better compute performance than the competition, prices only moved upward after launch; it took more than a year for them to get back down to where they should have been. The same happened with Vega. By the time mining subsided and prices worked their way back down to purchasable levels, those cards were already over a year old, and I think many people just decided to hold out for the next generation. The glut of leftover cards was probably mostly down to the collapse of the mining market, and it clearly affected Nvidia too, which is likely why it took them two and a half years to begin launching a new generation of cards themselves.
As for bundled games, I doubt those cost AMD all that much. The Xbox game pass in particular is only 3 months, and can be thought of as an extended trial of a service that is currently not all that expensive to begin with. I suspect they get bargain-bin pricing on the other pack-in games as well.
They keep pushing to new graphical heights, but most of the engines don't scale back DOWN very well.
Sure they do. Just about any current game can get decent frame rates with max settings at 1080p on a $200-$250 graphics card, and with settings dialed back a bit, even on significantly less expensive hardware. 1080p is arguably still "mid-range", and if one wants higher resolutions or refresh rates, they will naturally be looking at somewhat higher-end hardware in the $300+ range to accomplish that well.
If anything, most games don't scale UP all that well. Those buying high-end graphics cards get a sharper image and/or moderately smoother frame rates than those buying a $200 card, and that's about it. Developers have been designing their games to perform well on mid-range hardware and consoles, so high refresh rates and resolutions are about the only things differentiating the high-end. I suppose that could actually be a good argument in favor of raytraced lighting effects, as they can potentially provide a more substantial benefit to visuals than just rendering more pixels or frames, at least past a certain point. If the upcoming generation of consoles pushes games to improve their visuals, we may see games making greater use of available PC hardware, and more reason for "mid-range" cards to offer more performance.