I'd agree only partially with your advice.
First of all: don't buy anything you don't need (or you know you want)!
And try to avoid buying too far ahead for any "future needs".
In both cases you're far too likely to overspend today.
Where I agree is that it might be worth waiting, if your need isn't there yet.
And if then those 4090s turn up used at around €1000, they might be pretty good value.
For me, upgrading from a 3090 to the 4090 was an easy choice: it was driven by the wider range of variable-precision data types in CUDA for machine-learning experiments, and I needed the VRAM for the bigger LLMs.
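Those data-type differences are easy to check from code; here is a minimal sketch in PyTorch (my own illustration, assuming a reasonably recent CUDA build of torch):

```python
# Minimal sketch: report which reduced-precision types the installed
# PyTorch/GPU combination exposes (assumes a CUDA-enabled torch build).
import torch

if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
    print("compute capability:", torch.cuda.get_device_capability(0))
    print("bf16 supported:", torch.cuda.is_bf16_supported())
    # fp8 dtypes only exist in newer PyTorch releases, hence the hasattr guard
    print("fp8 (e4m3) dtype present:", hasattr(torch, "float8_e4m3fn"))
```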
That the 4090 was a pretty good gaming GPU as well didn't entirely escape me; I try to keep an eye on secondary use cases. Sure enough, one of my sons was happy to take the 3090 off me a year later at 50% of the price and no Christmas presents.
The LLM stuff has moved well beyond what you can reasonably experiment with in a home lab, so it's not very likely an RTX 5090 will even remotely keep pace, and I can't quite fit a Cerebras into my budget.
So will I need a 5090 for gaming? It doesn't seem very likely, at least if it's really just 30% extra performance over the 4090.
I spend most of my working day (and far too much of my leisure time) in front of a computer screen, but very little of that is gaming. So naturally it has to be big, flat, crisp and easy on the eyes, also by avoiding visual noise (such as reflections).
And that's how I bought my original primary: a 43" LG 4k IPS, non-glare, great for 2D work and rather affordable at the time (much less than the EIZO EV2411W 1920x1200 it replaced).
When all these new high-refresh, curved, wide, HDR or even OLED displays came out, they were the ones my kids chose; they can afford to use their PCs (my leftovers) primarily for gaming and fun rather than work.
Eventually I got tired of their games looking so much better, so I compromised on a Gigabyte high-refresh VA panel with HDR as my new primary, still 43", flat and 4k, because that screen still pays for what little fun I get. I run both side by side via a KVM, but often keep the older second screen off unless I want to monitor something there.
Unfortunately, while Full HD and 3k may look rather good on displays with those native resolutions, on a 4k screen chosen for visual detail those lower resolutions look obviously scaled. So whatever GPU I use, it has to deliver at native 4k.
And the pixel-count gap from Full HD or even 3k to 4k grows with the square of the resolution, so it is huge. I went through quite a few GPUs trying to fill it, in vain: R9 290X, GTX 980 Ti, GTX 1080 Ti, RTX 2080 Ti, RTX 3090 and eventually RTX 4090.
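To put rough numbers on that gap, a quick Python back-of-the-envelope (taking "3k" as 2560x1440, which is my assumption):

```python
# Rough pixel-count comparison; "3k" is assumed to mean 2560x1440 here.
resolutions = {
    "Full HD (1920x1080)": 1920 * 1080,
    "3k      (2560x1440)": 2560 * 1440,
    "4k      (3840x2160)": 3840 * 2160,
}
base = resolutions["Full HD (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x Full HD")
```

Native 4k means pushing roughly four times the pixels of Full HD every frame.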
Ryzens replaced my big Xeons with their high core counts but slow clocks, so even there I'm at last year's premium now (Ryzen 9 7950X3D). But my two favorite games, ARK (both variants) and Flight Simulator (again 2020 and 2024), just suck up hardware without delivering the fluid performance I expect.
ASE (the first ARK, and a pioneer on Unreal Engine 4) takes ages to load, because it uses tens of thousands of individual files and Windows just happens to be really slow at opening them (it starts faster from HDDs on Linux than from NVMe on Windows!), but at least it now delivers on graphics. ASA (the second ARK) uses a few huge files and thus starts much faster, but is a very early Unreal Engine 5 title with huge demands on GPU and CPU power.
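That per-file open overhead is easy to reproduce yourself; here is a minimal, purely illustrative sketch (nothing to do with how ASE actually packages its assets) that times many small reads against one big one:

```python
# Minimal sketch: per-file open overhead vs. one big file (illustrative only,
# not how ASE loads assets). Creates temp data, then times both read paths.
import os, time, tempfile

N, SIZE = 10_000, 4 * 1024        # 10k files of 4 KiB each (assumed sizes)
payload = os.urandom(SIZE)

with tempfile.TemporaryDirectory() as tmp:
    for i in range(N):            # many small files
        with open(os.path.join(tmp, f"{i}.bin"), "wb") as f:
            f.write(payload)
    big = os.path.join(tmp, "big.bin")
    with open(big, "wb") as f:    # one big file of the same total size
        for _ in range(N):
            f.write(payload)

    t0 = time.perf_counter()
    for i in range(N):
        with open(os.path.join(tmp, f"{i}.bin"), "rb") as f:
            f.read()
    t1 = time.perf_counter()
    with open(big, "rb") as f:
        f.read()
    t2 = time.perf_counter()

print(f"{N} small files: {t1 - t0:.2f}s, one big file: {t2 - t1:.2f}s")
```

The gap between the two cases is dominated by per-file open/close and metadata overhead rather than raw bandwidth, which is presumably what ASE's load times suffer from.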
And for the flight sims (2020 and 2024 equally) you either run a big ultra-wide curved screen, a triple setup with two angled side displays, or my preferred variant, a VR setup, because even on a big flat screen the immersion just isn't there otherwise.
For VR I started with the original Oculus DK1, got a DK2, a CV1 and then an HP Reverb that was finally good enough in terms of screen quality (2k x 2k per eye at 90 Hz). But the VR experience inside FS2020 was really, really bad, with the world just shuddering and stuttering, when that's exactly what can never happen in the real world, at least until the point of impact. And FS2024 has improved nothing; a Quest 3 with a very similar resolution only got rid of the cable (and sidestepped the dreadful end of Microsoft's Windows Mixed Reality support).
I keep an RTX 4070 in my 24x7 system on the same KVM and screens, and I was both delighted and annoyed to see how it kept creeping up in what it could deliver, with ever better DLSS support and general driver improvements across all "normal" games.
I'd dare say that the 4070 at 4k is often as good as the 4090 was at launch and it's two years of software evolution (games and drivers) that made all the difference!
And it's there, in the DLSS arena, where I'm not seeing anything new coming with the 5000 series. Perhaps Nvidia is really just keeping things under wraps, but if 30% uplift for every class at 20% extra cost is all it delivers, perhaps 4090s will last for a very long time yet.
Well, at least in theory. Should the 4090 crash and burn (still a distinct possibility after all; only the PNY 4070 still has a normal 8-pin connector), I'm much more worried about having to pay for a replacement at current prices.
Quite honestly, I expect nothing but disappointment from the 5000 series: where it improves beyond the 4090, that will very likely be of little practical benefit, and for the many other classes the spread in potential performance will be extremely wide. In some cases a much lower-ranked GPU might already deliver the very best performance and quality a game can actually produce at 4k; in other cases all that extra hardware won't help, or it's really the quality-enhancement features like DLSS (and their AMD/Intel/Sony variants) that make or break the gamer experience.
As to Intel's GPUs: I tried an A770 as a replacement for a GTX 1060, but it refused to work with DP and/or the KVM, so I went with the 4070 instead. I tried again more than a year later with a mobile A770, because it came basically for free in a Serpent Canyon NUC. That was OK for the price, but I'm happy to have passed it on; its idle power disqualified it from the µ-server use I had really bought it for.
AMD's Radeons: I started with ATI via an 8514/A clone in 1990, the very first graphics accelerator they ever produced. And I stayed with them until the R9 290X (actually, I did try an R9 Nano with HBM, too, but returned it). On the GPU side AMD has constantly managed to annoy me since, mostly via bad driver support (e.g. on Linux), early retirements or by refusing to work on Windows server editions.
I've been much more of a fan of their APUs, even if those share quite a few of the driver complaints.
Ultimately it was CUDA which got me tied into team green for dGPUs... somewhat reluctantly at first, but hard to dislodge now.
For me the PNY dual-fan Verto 4070, with its strictly dual-slot compact format and a single 8-pin power connector, is the card to beat for a console-class gamer PC, which I can build on a mobile-on-desktop Mini-ITX board. If AMD or Intel can match that power at a significantly lower price, or exceed the performance at the same price and similar wattage, they might be worth a thought.
The only thing "wrong" with the 4090 is its current price when new; if you need it now, the only reasonable alternative may be to get one second-hand.
I bought my 3090 shortly before the 4090 came out, and even though it wasn't cheap by any means (also around €1500, I believe), it let me do my job better than the 2080 Ti, which didn't support many of the critical ML data types. Staying ahead got me paid, and that in turn allowed me to upgrade to the 4090 to do the job even better.
Using consumer GPUs for reasonable AI work seems a window that is pretty much shut these days; Nvidia put huge resources into making that happen.
It might eat into the success of the 5090, because that secondary market for the 3090 and 4090 high-end consumer GPUs might have been bigger than they ever realized.
The high-end consumer GPUs have ridden an ML wave for a few years now, but AI workloads have moved far beyond them at the upper end, and games really are still tied to console-class hardware for mass appeal: more of a [vendor] player than a [consumer] player market, I'd estimate, if I were any good at predicting the future.