The OP is using a 1440p monitor. Crank everything up and run Cyberpunk at 1440p and any CPU bottleneck will be gone. In most of the newer games there wouldn't be one anyhow, but as said before, you can always hand-pick games to get your desired results.
Who would even buy a 4080 for 1080p? Just to give you some numbers: a 5900X is roughly on par with an 11900K (actually a bit slower).
I didn't say that he was buying it for 1080p, he asked if there would be a bottleneck and since CPU performance is only ever really tested at 1080p or 720p, I told the OP what information I could find and showed him the video. At the end of my post, I did tell him that increasing resolution often nullifies it.
Nobody knows where the bottleneck would be at 1440p or 2160p so I just told him what information was out there.
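Just to make the resolution point concrete, here's a rough back-of-the-envelope sketch (all numbers made up purely for illustration, not benchmark data): the frame rate you actually see is whichever of the two sides is slower.

cpu_fps_cap = 150          # frames/s the CPU can feed the GPU; roughly resolution-independent
gpu_fps = {                # hypothetical GPU throughput at each resolution
    "1080p": 200,
    "1440p": 130,
    "2160p": 70,
}

for res, fps in gpu_fps.items():
    observed = min(cpu_fps_cap, fps)          # you get whichever side is slower
    limiter = "CPU" if fps > cpu_fps_cap else "GPU"
    print(f"{res}: ~{observed} FPS ({limiter}-limited)")

With those made-up numbers you'd be CPU-limited at 1080p but GPU-limited at 1440p and 2160p, which is why raising the resolution usually makes the CPU question moot.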
Cyberpunk 2077's latest update adds an awesome built-in benchmark, and lets you stress test your PC like Crysis should've.
www.tweaktown.com
Notice the 6600 is not even listed (the 6700 XT gets 53 FPS, which I guess is why the 6600 didn't make the chart).
And yes, that guy should not be allowed to even post to YouJunk anymore.
He compares a 10900K and a 6600 to a 3090 Ti powered by a 1700X, which makes it a 100% joke of a video; when he went to 1440p, he should have at least compared the two cards using the 10900K.

I think that you're missing the point of the video. His mission seems to be to teach noobs weird and wonderful things about PC tech. That's why he runs his show like he's Mr. Rogers. The comparison of the RX 6600 to the RTX 3090 Ti was just him using a completely insane and unrealistic setup to prove his point.
Noobs are notorious for building unbalanced gaming PCs and I think that he made this video to show just how important it is to try to strike some kind of balance between the CPU and GPU.
I saw another YouTuber, a young girl (I can't remember her name), who had a video called "Check out my first gaming PC build!". When I saw what she had chosen for parts, I felt like I was in the Twilight Zone. Her CPU was either an i7-13700K or an i9-13900K (I think it was the latter) with some expensive 360mm AIO cooling it, some expensive Z790 motherboard... and an RX 580 2048SP for a video card! I think that if she had seen this Tech Deals video, her build would've been fantastic instead of almost utterly useless.
I called it one of the most cringe-worthy gaming PC builds that I'd ever seen because, for what she must have paid, she could've had literally triple the gaming performance. To us experienced tech enthusiasts this is just common sense, but nobody is born knowing anything, and I think that channels like Tech Deals, whose target audience is clueless noobs, are as important as shows like Sesame Street are to children.
As for why I never forgot the RX 6600 and i9-10900K, well, that's because I never would've believed it if someone had just said it to me. The idea that the "lowly" RX 6600 would be bottlenecked by a 10th-gen i9 would've struck me as insane, but I couldn't deny what my eyes were seeing and it blew me away. Now, sure, the bottleneck is at 1080p (but that's what the RX 6600 is for anyway) and it's a very small bottleneck, but it does illustrate just how massive GPU performance gains have been lately when compared to CPU performance gains. It also showed just how much we underestimate so-called "low-end" gaming hardware.
I know that for super high-end cards, CPU bottlenecks are largely irrelevant because of the resolutions used, but the OP asked the question and I answered it as well as I could. I knew that what I had to say was going to sound guano-insane, so I posted the video just to show that I hadn't been sniffing glue (or worse).
