Both Nvidia and AMD need to put a lot more effort into improving efficiency while also increasing performance. They can't carry on shovelling in more and more power for better performance. It's ridiculous that a top-end graphics card can draw more than 450 watts.
The 40-series cards did significantly improve efficiency over the 30-series, though. A 4070 performs very similarly to a 3080 but only draws around 200 watts under load, whereas a 3080 draws upwards of 320 watts. A 4080 draws around 50% more power than a 4070, and a bit less than a 3080, but tends to perform around 50% faster than either of those cards. Likewise, a 4090 can draw over 400 watts while gaming, but it tends to be around twice as fast as a 4070. So the efficiency of all of those cards is fairly similar, and much better than the previous generation's.
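To make the perf-per-watt comparison concrete, here's a quick back-of-the-envelope calculation in Python using the rough figures above. All the numbers are estimates from this thread (performance normalized to a 4070 = 1.0), not measured benchmarks, so treat it as a sketch rather than real data:

```python
# Rough perf-per-watt sketch; figures are approximations from the
# discussion above, not benchmark results.
cards = {
    # name: (relative performance vs 4070, typical gaming power draw in watts)
    "RTX 3080": (1.0, 320),  # ~4070-level performance, previous generation
    "RTX 4070": (1.0, 200),
    "RTX 4080": (1.5, 300),  # ~50% faster, ~50% more power than a 4070
    "RTX 4090": (2.0, 420),  # ~2x a 4070, 400+ watts while gaming
}

# Relative performance per watt for each card
efficiency = {name: perf / watts for name, (perf, watts) in cards.items()}

for name, eff in efficiency.items():
    print(f"{name}: {eff * 100:.2f} perf-units per 100 W")
```

The 40-series cards all land in roughly the same place (around 0.5 perf-units per 100 W), while the 3080 comes in noticeably lower, which is the point being made above.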
One thing to keep in mind is that there is no fixed standard for what constitutes a "top-end card". A GTX 1080 Ti was $700 and drew around 250 watts, and it was arguably top-end, ignoring the semi-professional Titan cards that were hardly any faster but cost a lot more. If there is a market for higher-wattage cards that deliver noticeably more performance than the mainstream models, then it only makes sense to offer those as well.
Nvidia might shuffle model names and prices around to nudge people toward spending more on a higher-wattage card, but no one is requiring you to buy whatever card currently sits at the top. Niche cards in the $1000+ range are for those willing to spend extra for additional performance and who don't care much about the high power draw, heat output, and price. If you do care, then those products are not for you, and you should look for something else instead.