AMD's Ryzen 9 9900X processor tumbles to just $332 - power up a new AM5 PC with this holiday bargain

Unless 9900X3D unsuckifies the 9900 series, it's game over for this line. 7900X3D was bad all round. Hopefully AMD is weaving some magic to finally make 9900X3D viable. I'm not buying 9950X3D ever and want more than 8 cores of 9800X3D. If 9900X3D sucks, and Intel fixes AL gaming performance, I'll get 265K.
 
If you're expecting a miracle from software/OS scheduling, it would apply to the 7900X3D too. It's going to be the same layout: one CCD with V-cache, one without.

If Zen 6 moves to a 12-core unified CCX, then that will bury the 9900X3D, giving you 12 cores with no compromises, while the actual 9900X equivalent grows to somewhere between 16 and 20 cores.
 
AMD doesn't want to make 6-core CCDs.

They just happen when AMD tries to make 8-core CCDs: defects cannot be avoided.

Now, are these chips really bad or unusable?

I'd argue that many people won't even be able to tell whether they're using a 6- or 12-core machine instead of an 8- or 16-core variant.

Most games still use surprisingly little CPU power. I had FS2024 running on just 2 of the 16 cores (pinned via Process Lasso) on my 7950X3D the other day, and it did no better or worse in VR. And I couldn't measure any difference between the V-cache and non-V-cache CCD either, on my RTX 4090 at 4K.

Without VR, things are pretty smooth with far less hardware; with VR, it stutters badly no matter how much hardware you throw at it.

And while I was testing Windows 11 on older hardware, I was astonished to see how well an Ivy Bridge i7-3770 with a GTX 980 Ti ran games at 2560x1440: ten-year-old hardware, available for pennies second-hand, running rings around any Strix Point.

And most consoles are way weaker in the CPU department than these chips.

Personally, I've always tended to prefer perfect-bin variants, because saving a few hundred € on a system costing thousands overall didn't make sense, and because my machines are very multi-purpose.

But if you're a gamer looking for a way to shoehorn the bigger GPU into your budget, these "bin failures" may help you without really ever becoming a bottleneck.

Just have HWiNFO or similar running in the background and observe what's going on: there is a lot to learn there about the weak correlation between hype and reality. Then you can consider whether 400 FPS is tangible value when your monitor or your skills are outclassed far earlier.
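In the same spirit as watching HWiNFO, you can sample per-core utilisation yourself and see how little of the CPU games actually touch. A minimal Linux-only sketch that reads `/proc/stat` twice and reports each core's busy fraction (the function name `per_core_busy` is mine):

```python
import time

def per_core_busy(interval=1.0):
    """Return {'cpu0': busy_fraction, ...} over `interval` seconds (Linux-only)."""
    def snapshot():
        stats = {}
        with open("/proc/stat") as f:
            for line in f:
                # Per-core lines look like "cpu3 1234 0 567 89000 ...";
                # skip the aggregate "cpu " line.
                if line.startswith("cpu") and line[3].isdigit():
                    name = line.split()[0]
                    fields = [int(x) for x in line.split()[1:]]
                    idle = fields[3] + fields[4]  # idle + iowait ticks
                    stats[name] = (sum(fields), idle)
        return stats

    before = snapshot()
    time.sleep(interval)
    after = snapshot()
    busy = {}
    for cpu in before:
        total = after[cpu][0] - before[cpu][0]
        idle = after[cpu][1] - before[cpu][1]
        busy[cpu] = (1.0 - idle / total) if total else 0.0
    return busy

if __name__ == "__main__":
    for cpu, frac in sorted(per_core_busy().items()):
        print(f"{cpu}: {frac:.0%}")
```

Run it while a game is going and you'll usually see only a handful of cores doing anything meaningful, which is exactly the hype-versus-reality gap described above.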

And if things eventually change, you can still upgrade once prices and your budget find common ground.