I just didn't really get it.
Rather than reading the whole article, just look at the charts for the easiest understanding. E.g.:
1080p ultra:
The GTX 1070 sits at ~45 FPS while the RX 7600 sits at ~82 FPS. Almost double what the GTX 1070 can do. And this is the geometric mean across some of the latest games.
Namely:
The eight games we're using for our standard GPU benchmarks hierarchy are Borderlands 3 (DX12), Far Cry 6 (DX12), Flight Simulator (DX11 Nvidia, DX12 AMD/Intel), Forza Horizon 5 (DX12), Horizon Zero Dawn (DX12), Red Dead Redemption 2 (Vulkan), Total War Warhammer 3 (DX11), and Watch Dogs Legion (DX12). The fps score is the geometric mean (equal weighting) of the eight games.
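That "geometric mean (equal weighting)" is just the nth root of the product of the per-game FPS numbers. A quick sketch with made-up FPS values (the numbers below are illustrative, not the article's actual results):

```python
import math

# Hypothetical per-game FPS results for one GPU (illustrative only)
fps = [82, 95, 110, 74, 88, 79, 102, 91]

# Geometric mean: nth root of the product, so every game counts equally
# and one outlier title can't dominate the score the way it would in a
# plain average.
geomean = math.prod(fps) ** (1 / len(fps))
print(round(geomean, 1))
```

The geometric mean is what benchmark roundups usually use for exactly that reason: a single game running at triple-digit FPS can't drag the overall score up the way it would with an arithmetic mean.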
No clue what "rasterization" is.
GPU rasterization is simply the process of computing the mapping from scene geometry to pixels. In other words, it's the thing GPUs mostly do when playing games (actually when generating any kind of image, including your desktop).
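To make "geometry to pixels" concrete, here's a toy software rasterizer for a single triangle. Real GPUs do this massively in parallel in hardware; this sketch uses the classic edge-function test (a pixel is inside the triangle if it's on the inner side of all three edges). Vertices are assumed to be in counter-clockwise order:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area test: >= 0 means point p lies on the left of edge a->b
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    # Map one triangle (geometry) to the set of pixels it covers
    filled = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            if (edge(*v0, *v1, px, py) >= 0 and
                edge(*v1, *v2, px, py) >= 0 and
                edge(*v2, *v0, px, py) >= 0):
                filled.add((x, y))
    return filled

# One triangle on a tiny 10x8 "screen"
pixels = rasterize_triangle((1, 1), (8, 2), (4, 7), 10, 8)
```

A game frame is just this repeated for millions of triangles, plus shading each covered pixel, which is why the GPU is the part doing almost all the work.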
Then it started talking about how the top one, the RTX 4090, bottlenecks at 1440p and especially 1080p? So the most expensive, top-of-the-list GPU bottlenecks at 1080p? That's bad, right? lol
There is no such thing as a bottleneck.
In every system, performance is limited by what the hardware is capable of. Something is always the limit.
For example, in a CPU-bound game (e.g. Cities: Skylines), the FPS is limited by what the CPU is able to compute. The better the CPU, the more FPS you get. But since CPU-bound games are usually slow-paced strategy games, high FPS isn't needed for them, and the GPU plays little role in such games.
But in a GPU-bound game (e.g. Cyberpunk 2077), the FPS is limited by what the GPU is able to produce. The better the GPU, the more FPS you get. And since most fast-paced games are GPU-bound, high FPS matters quite a bit. That's why many people upgrade their GPU far more often than their CPU.
It is nigh-impossible to build a system that utilizes the CPU and GPU at the same level. Well, if you focus on one single game you can build such a balanced system, but once you switch to another game, all that balance goes out the window.
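The CPU-bound vs. GPU-bound idea above can be sketched as a toy model: the CPU and GPU each take some time per frame, and whichever is slower sets the frame time. All the millisecond numbers here are made up for illustration:

```python
def fps(cpu_ms, gpu_ms):
    # Whichever side takes longer per frame caps the frame rate
    return 1000 / max(cpu_ms, gpu_ms)

# GPU-bound game: a faster CPU changes nothing, only a better GPU helps
print(fps(cpu_ms=6, gpu_ms=16))   # 62.5 FPS, limited by the GPU
print(fps(cpu_ms=3, gpu_ms=16))   # still 62.5 FPS

# CPU-bound game: the GPU is mostly idle, only a faster CPU helps
print(fps(cpu_ms=25, gpu_ms=5))   # 40.0 FPS, limited by the CPU
print(fps(cpu_ms=25, gpu_ms=2))   # still 40.0 FPS
```

This is also why the article's point about the RTX 4090 at 1080p isn't "bad": at low resolutions the GPU finishes its work so fast that the CPU becomes the slower side, so the chart stops measuring the GPU at all.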