The biggest issue people had with the FX was the temps. There's no such thing as a single "CPU temp" on these chips, but people are so ingrained by the Intel way, where 70° is the cutoff and there has to be an actual number they can understand, that they ran into trouble. The FX CPUs are literally built without any means of reading an accurate core temperature.
Instead, they report a thermal margin that winds down to zero, which you read with AMD OverDrive. It's an algorithm that takes the core loads, voltages, number of cores, package temp, and some other values and comes up with a headroom figure. So you could run a heavy load across several cores and get a value of 20, or run a light load maxing 2 cores and get the same 20. That doesn't mean you're at 20°, or that you have 20° left before maxing out; it simply means 20, and that's plenty above zero.
So when overclocking, you'd push every core to its max and hope to end up anywhere above zero; the actual number wasn't important, just its placement relative to zero. Generally, idle was in the 40s, Windows working in the 30s, heavier apps in the 20s, gaming in the teens, and stress tests in the single digits.
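A rough way to picture what that margin number is doing is below. This is a minimal illustrative sketch, not AMD's actual formula: the weights, the 70° ceiling, and the whole function are made-up placeholders, the only point being that very different workloads can land on the same margin.

```python
# Illustrative only: a made-up stand-in for how a "thermal margin" style
# number could be derived from load/voltage/package readings. AMD's real
# algorithm and coefficients are not public, so every constant here is a
# placeholder.

def thermal_margin(core_loads, core_voltage, package_temp, tj_max=70.0):
    """Return a unitless headroom figure that shrinks toward zero under load.

    core_loads   : list of per-core utilisation values, 0.0-1.0
    core_voltage : current core voltage in volts
    package_temp : package sensor reading in deg C
    tj_max       : assumed thermal ceiling (placeholder value)
    """
    # Estimate extra heat from how hard the cores are being pushed.
    load_heat = sum(core_loads) * 3.0                # placeholder weight per loaded core
    volt_heat = max(core_voltage - 1.2, 0.0) * 25.0  # placeholder penalty for overvolting
    estimated_temp = package_temp + load_heat + volt_heat
    # Margin is simply how far the estimate sits below the ceiling.
    return tj_max - estimated_temp

# Two very different workloads can land on the same margin:
print(thermal_margin([1.0] * 8, 1.35, 22))                          # heavy all-core load
print(thermal_margin([1.0, 1.0, 0, 0, 0, 0, 0, 0], 1.35, 40))       # light load, warmer package
```

Both calls print the same margin (about 20), which is exactly why the raw number on its own tells you so little about an actual temperature.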
Overclocking is not a cure for stutters. Ever. In Battlefield 4, the FX-8350 had the 2nd-best fps, right behind the i7-4790K and significantly ahead of the i5-4690K, even overclocked, and the results got even better once you pushed past 4.0GHz. Some say that was a fluke, but it was really a precursor: BF4 was the first game to leverage core count over core speed, like more modern games, unlike earlier games that leveraged core speed on a limited number of cores, which Intel excelled at.
The FX sucked at CS:GO, simply because that game is closed, only uses 2 cores with no rollover to the rest, so it's highly Intel-biased; higher IPC rules. And the FX generally had about 66% of the IPC of a 3rd-gen Intel. So the gaming experience changed depending on the game.
The biggest problem was the time gap between FX and Ryzen: Intel just kept getting stronger and better between 4th gen and 9th gen while FX did nothing, even if the gaming experience did improve somewhat as more games followed BF4's design. But with growing complexity and the need for IPC gains, that didn't last long. Now most games are open-ended: they'll use 1 master core and as many supporting cores as needed, generally starting at 4 and topping out around 10, so a 6-core/12-thread CPU is plenty good at the moment, and 4-core/8-thread CPUs can suffer a little on heavier titles, as long as there's sufficient IPC.
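If it helps to see the shape of that master/supporting split, here's a small sketch. It's not from any real game engine, and the task split and worker count are hypothetical; the point is that extra cores only help while there are chunks to hand out, and the master thread's per-clock speed still decides the frame time.

```python
# Illustrative sketch of the "1 master core + N supporting cores" pattern.
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk):
    # Stand-in for per-chunk work (AI, physics, particle batches, etc.)
    return sum(x * x for x in chunk)

def render_frame(world, workers=6):
    # The "master" thread splits the frame into chunks and farms them out...
    chunks = [world[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(simulate_chunk, chunks))
    # ...then does the serial, latency-critical part itself. Past a certain
    # worker count there's nothing left to parallelise, and the master
    # thread's IPC becomes the limit.
    return sum(results)

print(render_frame(list(range(10_000)), workers=6))
```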
Which makes the 12100 extremely good value for what it is, with lesser CPUs making up for it in thread count, to a point. In simple math, if a CPU had an IPC of 100 and a speed of 3.8GHz, that'd be 380 G instructions per second. A CPU with 50 IPC would need to be at 7.6GHz to get the same instructions per second, same fps. Since the FX only goes from 4.0GHz to maybe 4.6GHz with an OC, you end up moving from roughly 100G to 115G, nowhere near 380G, so relative fps is in the toilet, with very little gain seen overall even with a hefty OC. Simply not enough to warrant spending money on, other than for a nostalgia project.
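The same napkin math in code form, where the IPC figures are placeholders chosen to match the ratios above, not measured benchmarks:

```python
# Back-of-the-envelope throughput: relative IPC x clock = "G" instructions/sec.
# The FX "IPC" of 25 is an assumed placeholder so the 100G baseline works out.
def giga_instructions(ipc, ghz):
    return ipc * ghz

modern   = giga_instructions(100, 3.8)  # 380 "G"
fx_stock = giga_instructions(25, 4.0)   # 100 "G"
fx_oc    = giga_instructions(25, 4.6)   # 115 "G"

print(modern, fx_stock, fx_oc)
print(f"OC gain: {fx_oc / fx_stock - 1:.0%}")        # ~15%
print(f"Gap to the modern chip: {modern / fx_oc:.1f}x")  # still ~3.3x
```

Even a hefty OC only buys about 15% more throughput, while the modern part is still sitting more than 3x higher.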
Years ago, when the FX was newer and still somewhat competitive with Intel, a jump of 10-20fps from an OC was a good thing. But when a modern CPU starts out 50-100fps higher than the best overclocked FX result, moving from 100fps to 120fps isn't nearly worth it when, for roughly the same money, you could be starting out at 150-200fps in the same game.