From 60 FPS to 140 FPS, Nvidia research shows essentially linear increases in gamers' kill/death (K/D) ratios.
You mean "Nvidia marketing nonsense". Actually, a lot of what they claim about higher frame rates making it a bit easier to aim is true, and it's a pretty reasonable "article" for the most part, at least until you get down to the rubbish data tacked on at the end. The K/D ratio part is little more than deceptive marketing. They're trying to make it sound like getting a high-end graphics card and high refresh rate monitor will double a player's performance, but they're misusing the data to reach that conclusion. If you actually follow the links through to the study they're referencing, you see that this is how they acquired that data...
One of the common metrics of player performance in Battle Royales is kill-to-death (K/D) ratio -- how many times you killed another player divided by how many times another player killed you. Using anonymized GeForce Experience Highlights data on K/D events for PUBG and Fortnite, we found some interesting insights on player performance and wanted to share this information with the community.
So they simply looked at anonymous player statistics, and found that the players with higher K/D ratios were also more likely to be getting higher frame rates. They used this to support the suggestion that the high frame rates were what was making those players perform so much better, but in reality, the causation likely runs in the opposite direction. Namely, the players with the highest K/D ratios have them largely because they play the game a lot, and those dedicated players are more likely to buy higher-end hardware to support their gaming than the more casual players who don't have such high K/D ratios.
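To illustrate the point, here's a toy simulation (not Nvidia's data, and all the numbers are made up): if a single hidden factor like hours played drives both skill and hardware purchases, K/D and frame rate come out strongly correlated even though frame rate never affects K/D at all in the model.

```python
# Toy confounder simulation: hours played drives both K/D and FPS,
# so the two correlate even though FPS has zero causal effect on K/D here.
import random

random.seed(0)
players = []
for _ in range(10_000):
    hours = random.uniform(0, 2000)               # hypothetical yearly playtime
    skill = hours / 2000                          # skill grows with playtime
    kd = 0.5 + 2.0 * skill + random.gauss(0, 0.2)
    fps = 60 + 120 * skill + random.gauss(0, 15)  # dedicated players buy faster hardware
    players.append((kd, fps))

# Simple Pearson correlation between K/D and FPS
n = len(players)
mk = sum(k for k, _ in players) / n
mf = sum(f for _, f in players) / n
cov = sum((k - mk) * (f - mf) for k, f in players) / n
sk = (sum((k - mk) ** 2 for k, _ in players) / n) ** 0.5
sf = (sum((f - mf) ** 2 for _, f in players) / n) ** 0.5
print(f"K/D vs FPS correlation: {cov / (sk * sf):.2f}")  # strongly positive
```

Observational data like this can't distinguish "faster hardware makes you better" from "better players buy faster hardware", which is exactly the problem with the K/D chart.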
SMT won't really improve game performance, and games love high clocks. It's straightforward.
For an 8-core processor running today's games, that's reasonably accurate. For a 6-core processor, SMT is already helping to avoid performance hitches in some demanding games. That will only become more of a concern in the years to come, and may even impact performance on these 8-core parts within the next couple of years. It matters even more if people are running tasks in the background, whether it's streaming software or just a web browser, which review benchmarks don't really test for. With Intel adding SMT across the lineup of their new processors in the coming months, having 6 or 8 cores with SMT will become the norm for nearly all new mid-range or better gaming systems. Developers won't abandon targeting lower thread count processors overnight, but you will likely see more performance hiccups on systems lacking SMT down the line. The slightly higher per-core performance of the current Intel hardware in games will also be beneficial, of course, depending on the game, but the benefits of SMT are there too, even if they are not as straightforward at this time.
And as an example, the latest Threadripper was going to cost me an estimated $600 more per year in power consumption
This seems pretty unlikely even if the cost of electricity in your region were absurdly high. The current average cost of electricity in the US is about 13 cents per kilowatt-hour. At that rate, 1 watt of power draw running 24/7 for an entire year would cost you a little over a dollar ($1.14). For the processor to be costing $600 more per year at that rate, it would need to be drawing about 525 watts more than a competing processor doing the same amount of work, and would need to be running under full load for that entire year. If anything, the 3000-series Threadripper processors should be significantly more efficient than the competition, so I don't see any way those numbers could come anywhere close to panning out.
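The arithmetic above is easy to check. A quick sketch, assuming the ~$0.13/kWh US average rate mentioned in the post:

```python
# Back-of-the-envelope check of the $600/year electricity claim.
RATE_PER_KWH = 0.13        # USD per kilowatt-hour (approximate US average)
HOURS_PER_YEAR = 24 * 365  # continuous 24/7 operation

def annual_cost(watts: float) -> float:
    """USD cost to draw `watts` continuously for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * RATE_PER_KWH

print(f"1 W for a year: ${annual_cost(1):.2f}")   # -> $1.14

# Extra sustained draw implied by a $600/year difference:
extra_watts = 600 / annual_cost(1)
print(f"Implied extra draw: {extra_watts:.0f} W")  # -> 527 W
```

In other words, a $600/year gap would require roughly 525-530 watts of additional sustained draw around the clock, which no desktop CPU comparison comes close to.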
I don't want Intel to be poor value, but they are not even trying. As we all know, the 9900K should be the price of the 9700K; I won't buy Intel until then.
I mean, that's pretty much what should be happening in the coming months. The i7-10700K should more or less be an i9-9900K at a somewhat more reasonable price, from the looks of it. Of course, it will still be around $100 USD more than an 8-core, 16-thread Ryzen, plus the cost of a capable cooler, which for a processor with that level of heat output should add close to another $100. So while the value should be improved, I'm not sure it's quite there, and outside of very high-end systems, one would probably still be better off putting that money toward graphics hardware instead.