Nvidia has had AV1 decoding since Ampere, though I'm happy to see Intel making sure it works well too.
What I'm really curious about is encoding. The 4090/4080 16GB have dual encoders on the die. I wonder how they stack up against Intel's encoder.
If it's cheap enough, I honestly wouldn't mind having one as a backup GPU (the Ryzen 9 5900X has no integrated graphics) and/or for extra display outputs.
Nobody said the word "fake." Most people were pointing out that the thread didn't comply with the guidelines for the news section, which is why a moderator moved it to the appropriate area.
Does this really count as any sort of record? The 9900KS is simply a binned 9900K, and der8auer hit a 7.6GHz all-core overclock on one of those a long time ago.