Historically, software drove hardware demand: typically it was the latest blockbuster 3D game that everyone wanted to play, but first they had to upgrade their GPU, CPU, RAM, or even sound card just to meet the minimum hardware requirements.
Nowadays, even modest GPUs that are two generations old can run virtually all modern games at 1080p with decent quality settings.
Back in the day, you needed the latest and greatest GPU to achieve playable frame rates in new releases - does anyone remember the 3dfx Voodoo add-on cards?
The only niche where nothing but the best available GPU is still good enough is professional rendering and video editing, because time saved on rendering translates directly into higher earnings.
But for the vast majority of customers, including office users and gamers, any decent card from the past five years will do the job.
So, AMD and Nvidia may want to acquire a game development studio and have it create new games that not only look extremely photorealistic and deliver the best gameplay and physics ever, but also require a 4090 Ti to run at 60 fps at 1080p on medium settings.
Moreover, both AMD and Nvidia need to cut prices by at least 30-40%.
Finally, they need to optimize their GPUs for much higher performance per watt, i.e. efficiency.
Frankly, who is going to spend 1500 bucks on a heat-generating, power-hungry card for games they could just as well play on a four-year-old card that costs 70% less and consumes 50% less power, or simply on a gaming console?