How about these two observations, which I did hint at in the article:
-------------------------
1: AMD's H.264 quality has basically not improved from at least Polaris (RX 400-series) through RDNA, and RDNA 2/3 only provide a minor bump in quality at the lowest bitrates. Performance did improve, however, from ~100 fps with the RX 590 to ~425 fps with RDNA 3. Also interesting: performance was faster with the RX 5700 XT than with the 6900 XT. Nvidia's H.264 quality has likewise remained virtually unchanged since at least Pascal at 1080p, but 4K encoding quality showed a clear bump going from Pascal to Turing (Turing through Ada have been pretty static, though).
Performance has also only improved from about 305 fps at 1080p with Pascal to 490 fps with Ada, or from ~83 fps at 4K with Pascal to ~134 fps with Ada. However, Nvidia's baseline performance and quality are both significantly better than AMD's to start with.
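This isn't the exact harness we used for the article, but here's a minimal sketch of how throughput and quality numbers along these lines can be collected, assuming an ffmpeg build with the relevant hardware encoders (h264_nvenc, h264_amf, h264_qsv) and libvmaf; the source clip name, bitrate, and output file are placeholders:

```python
# Minimal sketch: measure hardware-encoder throughput and quality with ffmpeg.
# Assumes ffmpeg/ffprobe on PATH, built with the listed encoders and libvmaf.
# "source.y4m", the 8M bitrate, and "out.mp4" are placeholders.
import re
import subprocess
import time

SOURCE = "source.y4m"  # raw/lossless test clip (placeholder)

def encode_fps(encoder: str, bitrate: str = "8M", out: str = "out.mp4") -> float:
    """Time a full encode and return frames per second."""
    probe = subprocess.run(
        ["ffprobe", "-v", "error", "-count_packets", "-select_streams", "v:0",
         "-show_entries", "stream=nb_read_packets", "-of", "csv=p=0", SOURCE],
        capture_output=True, text=True, check=True)
    frames = int(probe.stdout.strip())
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", encoder, "-b:v", bitrate, out],
        capture_output=True, check=True)
    return frames / (time.time() - start)

def vmaf_score(encoded: str) -> float:
    """Compare an encode against the source; return the mean VMAF score."""
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", SOURCE,
         "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True, text=True)
    match = re.search(r"VMAF score: ([\d.]+)", result.stderr)
    return float(match.group(1)) if match else float("nan")

for enc in ("h264_nvenc", "h264_amf", "h264_qsv"):
    fps = encode_fps(enc)
    print(f"{enc}: {fps:.0f} fps, VMAF {vmaf_score('out.mp4'):.1f}")
```

Note that wall-clock timing like this includes process startup, so for very short clips the fps that ffmpeg itself reports is a better measure.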
There's literally no good reason why, in over six years, AMD hasn't been able to close the quality gap in H.264 encoding at all. It's rather sad that the most popular video codec of the past 15 years or so (it was first published in 2003) has received so little attention from AMD.
And it's not like Nvidia is alone in doing better with hardware H.264 encoding. Look at Intel's QuickSync. Sure, the UHD 770 is behind Nvidia by a few points, but it's not the 10-20 point delta that AMD sees. I'm pretty sure the UHD 770's QuickSync isn't significantly changed compared to even Skylake's Gen9 HD 530, or maybe Kaby Lake's Gen9.5 HD 630. Actually, I think HEVC got most of the updates after the Ivy Bridge (3rd Gen Intel) generation, while H.264 support was mostly static, other than the quality bump with Arc.
-------------------------
2: HEVC encoding quality and speed sort of peaked in 2016-2017 and then received a bit less emphasis going forward. AMD's Polaris and Vega generation GPUs had slightly better HEVC quality compared to RDNA 1/2, though RDNA 3 has mostly recovered the lost quality. Speed got a big boost from Vega to RDNA, though, and then RDNA 2 actually got slower. Somewhat similarly, the GTX 1650 (Pascal encoder with Turing non-RTX GPU cores) had higher performance than the GTX 1080 Ti as well as the RTX 2080 Ti and RTX 3090. The RTX 4090 did reclaim the performance crown, however.
Again, this suggests that after the initial hype for HEVC (first released to the public in 2013), there was excitement over the next 3-4 years that it would replace H.264 as the codec of choice, but the royalty fees ended up killing that. So now companies are putting more effort into AV1 and hoping it can truly put H.264 in the rearview mirror.
-------------------------
The reason for the line charts on quality is that quality scales with bitrate: a line gives you a clear slope, while a bar chart doesn't. But if you really want the raw numbers, it's all in the bottom tables. I pointed out in the text that there was a lot of overlap in quality on the AMD vs AMD and Nvidia vs Nvidia comparisons, so some lines are obscured, completely so in the case of AMD's RDNA 2 vs RDNA 3 with H.264, as they have exactly the same quality results. Nvidia's 1080 Ti and 1650 are also identical for both H.264 and HEVC, while the 2080 Ti and 3090 are only perfectly identical in their HEVC results.
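For anyone curious why lines work better here, this is a minimal matplotlib sketch of that style of chart; the bitrates and VMAF scores below are made-up placeholders for illustration, not our measured results:

```python
# Minimal sketch of a quality-vs-bitrate line chart. The GPU labels and
# VMAF values are made-up placeholders, not the article's measured data.
import matplotlib.pyplot as plt

bitrates = [3, 6, 8]  # Mbps test points (placeholders)
results = {  # hypothetical per-GPU mean VMAF scores at each bitrate
    "GPU A": [88, 93, 95],
    "GPU B": [78, 86, 90],
    "GPU C": [86, 92, 94],
}

# One line per GPU: the slope shows how quality climbs with bitrate,
# and overlapping results show up as lines drawn on top of each other.
for gpu, scores in results.items():
    plt.plot(bitrates, scores, marker="o", label=gpu)

plt.xlabel("Bitrate (Mbps)")
plt.ylabel("VMAF score")
plt.title("H.264 1080p encoding quality (illustrative data)")
plt.legend()
plt.show()
```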
I did post a video of this yesterday, where I skipped the GTX 1080 Ti and RTX 2080 Ti quality lines for precisely this reason. I figured our readers would be able to grok the results, or refer to the numerical tables for the full data, since many of them prefer more data rather than less.
https://www.youtube.com/watch?v=elZH8iXGTPk