Yeah, but also consider that the RTX 4000 series has a lot more L2 cache than its predecessors. To the point that the RTX 4090's 72 MB of L2 is in the same league as the 96 MB of Infinity Cache on the RX 7900 XTX!
So, when you say RDNA 3 underperformed, how much of that is comparing it with RDNA 2 versus comparing it with a new RTX generation that might have done more to catch up than we expected?
It is a fact that the 7900 XTX (to focus on just one card for now) scales relatively worse from 4K down to 1080p than a 6950 XT does. Is it a big difference or, more importantly, a relevant one? Nah, I don't think so. I doubt there's any game out there that isn't CPU-bottlenecked at 1080p by anything above a 6900 XT/3090. Well, for rasterization at least.
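To make that "scaling" point concrete, here's a trivial sketch of the arithmetic (every FPS figure is a made-up placeholder, not a real benchmark result):

```python
# Resolution scaling: how much faster a card runs at 1080p than at 4K.
# Every FPS figure here is a hypothetical placeholder, not a measured result.
def scaling_ratio(fps_1080p: float, fps_4k: float) -> float:
    return fps_1080p / fps_4k

cards = {
    "6950 XT (made-up numbers)": (220.0, 90.0),
    "7900 XTX (made-up numbers)": (250.0, 125.0),
}

for name, (fps_1080p, fps_4k) in cards.items():
    print(f"{name}: {scaling_ratio(fps_1080p, fps_4k):.2f}x gain going from 4K to 1080p")

# If the newer card shows the smaller ratio, it's leaving performance on the
# table at 1080p, which is exactly what a CPU bottleneck looks like.
```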
One of the things MLiD and others leaked was that AMD wasn't going to use more cache on these initial cards because the added cost didn't justify the extra performance, and that seems to be reflected in the benchmarks and in the specs of the products actually being sold. I haven't seen anything from anywhere even hinting that a version with added cache on the MCDs is coming out soon.
And the cache, looking at RDNA 2's implementation, seemed to be most effective at lower resolutions anyway; AMD's own published hit rates fall off as resolution goes up. I'd imagine they'd have to give the cards a lot more cache to make it worthwhile at higher resolutions, where, by the looks of it, the GPU (the shaders) can't keep up anyway. Still, it would be interesting to see models with more cache and compare them one day.
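Here's a back-of-the-envelope illustration of why the hit rate falls with resolution. The bytes-per-pixel figure and the idea of counting only render targets are simplifying assumptions on my part; a real frame's working set also includes textures, geometry and so on:

```python
# Rough working-set math: only counts render targets (color buffers + depth)
# at an assumed ~32 bytes per pixel. Purely illustrative, not real numbers.
CACHE_MB = 96  # 7900 XTX Infinity Cache capacity

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BYTES_PER_PIXEL = 32  # assumption: several render targets plus a depth buffer

for name, (w, h) in resolutions.items():
    working_set_mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    verdict = "fits in" if working_set_mb <= CACHE_MB else "spills out of"
    print(f"{name}: ~{working_set_mb:.0f} MB of render targets {verdict} the {CACHE_MB} MB cache")
```

At 1080p (~63 MB) the hot surfaces fit; at 4K (~253 MB) they don't, so more traffic goes out to GDDR and the cache buys you less.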
And just food for thought: unless you yourself talk to someone from AMD, Intel or nVidia directly, or have access to confidential data, you have to doubt anything and everything you read and watch until people can actually get their hands on the hardware. I like leaked information but, much like everything else, it needs to be handled with caution.