I chalk some of that up to guerrilla marketing. AMD has pushed multiple games with higher-than-usual VRAM requirements, and often you'll get a major outcry about poor performance on Nvidia (especially at launch) when that happens — anything to paint Intel or Nvidia in a bad light. The Last of Us Part 1, some of the Resident Evil games, Godfall, God of War... The list goes on. There are a lot of games where I can't help but question the "need" to push VRAM use beyond 8GB, particularly when the visual upgrades provided aren't actually noticeable.
That an AMD-promoted game had issues with Nvidia hardware at launch, and then those issues got (mostly/partially) fixed with driver and game updates within a week, is the real story IMO. We've seen the reverse as well, though I can't think of many recent examples unless you count Cyberpunk 2077 RT Overdrive. Actually, you could probably make the claim of Nvidia pushing stuff that tanks performance on AMD GPUs for any game that uses a lot of ray tracing, though in some cases the rendering does look clearly improved.
Part of this is probably also a rush to release the game ahead of May the 4th. Or at least that's the only explanation I can come up with. Was there any good reason to ship the game in a poorly running state, rather than spending a few more weeks on polish? Or maybe EA just didn't realize how bad things were, and releasing it to the public provided the needed feedback. Whatever the case, here we are.
I did not get early access or launch access, so the first time I tried the game was with the patches already available (and there's no way with Steam/EA to stay on the original launch version). Aaron, however, has been playing on an RTX 2060 Super at 3440x1440 with FSR2 Quality and a mix of ~high settings since launch and hasn't had too many issues. If you try maxing out settings on hardware that can't really handle those settings, don't be surprised if performance is poor.