It's obvious the power is there; every release since the GTX 1000 series has shown it in memory & bandwidth increases.
Where they're not improving is the actual video standards. We've been running HDMI 2.0(b) & DisplayPort 1.4 for five years, and this is a new decade. 4K TVs running at 120 Hz are abundant and will soon be trickling down to value brands such as VIZIO.
These cards won't be able to take advantage of HDMI 2.1 & DP 2.0, both of which offer not only better video but more bandwidth, plus another big deal: eARC, which carries roughly 37x the audio bandwidth of HDMI ARC (~37 Mbit/s vs. ~1 Mbit/s). This is video we're talking about here; 8K TVs, which kicked off 4K at 120 Hz (& eARC), have been here for well over a year, yet an expensive GPU connected to these TVs can run at only half speed? And no eARC?
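To put rough numbers on that "half speed" point, here's a quick back-of-the-envelope sketch in Python. The blanking factor and link overheads are approximations, not the exact CVT-RB2/FRL timings, but the conclusions hold:

```python
# Back-of-the-envelope video bandwidth check (timings approximated;
# real CVT-RB2 blanking and FRL overhead differ slightly).

def required_gbps(width, height, hz, bits_per_channel, blanking=1.12):
    """Uncompressed RGB data rate in Gbit/s, with a rough blanking factor."""
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

links = {
    "HDMI 2.0b (TMDS, 8b/10b)": 18.0 * 8 / 10,   # ~14.4 Gbit/s effective
    "DP 1.4 HBR3 (8b/10b)":     32.4 * 8 / 10,   # ~25.9 Gbit/s effective
    "HDMI 2.1 FRL (16b/18b)":   48.0 * 16 / 18,  # ~42.7 Gbit/s effective
}

for label, hz, bpc in [("4K60 8-bit", 60, 8), ("4K120 10-bit", 120, 10)]:
    need = required_gbps(3840, 2160, hz, bpc)
    print(f"{label}: needs ~{need:.1f} Gbit/s")
    for link, cap in links.items():
        verdict = "fits" if cap >= need else "needs DSC or chroma subsampling"
        print(f"  {link}: ~{cap:.1f} Gbit/s -> {verdict}")
```

The takeaway: 4K60 8-bit just squeaks through HDMI 2.0b, but 4K120 at 10-bit only fits uncompressed on HDMI 2.1. That's the literal "half speed" these ports impose on a 120 Hz TV.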
Not all who buy discrete GPUs are hardcore gamers (if gamers at all); some of us, myself included, want better video, since computers are also a major source of TV content. 24GB of the latest HBM2 RAM running at higher bandwidth means nothing when the actual video standard used is the same as on 2015/2016 cards. Why? Because when plugged into the HDMI 2.1 port of a 4K TV, that bandwidth isn't there to be seen. While there are some active DisplayPort 1.4 to HDMI adapters that increase speed, these won't pass the full HDMI 2.1 feature set, and other standards (Dolby Vision/Atmos, eARC & others) may or may not make it through. Also, not all of these adapters are reliable, so it's best to have these capabilities at the source, in this case the GPU output ports.
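And since eARC keeps coming up: a quick sanity check on the audio side, using the nominal channel rates (real-world rates depend on the codec):

```python
# Rough ARC vs eARC audio bandwidth check (nominal figures).
ARC_MBPS  = 1.0    # ARC reuses the old S/PDIF-class channel, roughly 1 Mbit/s
EARC_MBPS = 37.0   # eARC's dedicated data channel, roughly 37 Mbit/s

# Uncompressed 7.1 PCM at 192 kHz / 24-bit:
channels, rate_hz, bits = 8, 192_000, 24
pcm_mbps = channels * rate_hz * bits / 1e6   # ~36.9 Mbit/s

print(f"7.1 PCM @ 192 kHz / 24-bit: ~{pcm_mbps:.1f} Mbit/s")
print(f"Fits ARC:  {pcm_mbps <= ARC_MBPS}")   # False: ARC needs lossy compression
print(f"Fits eARC: {pcm_mbps <= EARC_MBPS}")  # True: uncompressed audio passes
```

Which is why lossless formats like Dolby TrueHD (and TrueHD Atmos) only pass over eARC, never plain ARC.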
So another two-year (or longer) wait to get current... Oddly, until the middle of the past decade, motherboard & GPU OEMs would offer the latest standards ASAP to compete. Shame it's taking so long in the early 2020s to get current. 🙁
Cat