Hopefully DX12 helps the AMD CPUs a decent amount, like Mantle does in the games that support it. I haven't kept up with PC news like I used to, but it appears Broadwell wasn't a huge splash on the desktop (just like I'd been saying for a while when telling people not to spend money on an H97 over an H81 to 'upgrade' to Broadwell for a wimpy improvement). Now it's August and we're on Skylake; I watched the vid, and still, years later, there's no reason to upgrade over my stock i5 3570K. Somebody on Sandy Bridge would still be more than fine; even the old 2500K was still holding up and beating the FX 8350 most of the time. I would have liked to see the FX 6300 in those tests. From everything I've gathered over the years, in real-world gaming and even multitasking, the 6300 really isn't that much of a drop-off from the 8350.
Electricity costs and heat are a concern for me with the FX 8350, I'll be honest, but on a budget I'd get an 8320 (or even a 6300), pair it with a decent GPU for 1080p, play at high or tweaked ultra settings, call it a day, and be happy. I've honestly never seen a person with a high-end 120-144Hz monitor use an AMD CPU anyway (I could certainly be wrong though). Most gamers are on a 1080p 60Hz monitor, and quite a few are probably still on 1600x900 or even 720p.
With DX12 (although I'm wary of Windows 10 with all the spying and privacy issues), the AMD FX CPUs should age well; they'll be good for at least this entire console generation, since both consoles use low-clocked 8-core AMD APUs (Jaguar cores, not Piledriver, but eight weak cores either way, so games are being built with that in mind).
I'm just an average person who doesn't need the most FPS in the world or maxed-out ultra graphics, and as I get older I honestly care less and less about gaming. I'm writing this from my 'sucky' first-gen AMD Llano A8 laptop with 6GB of RAM, a slow 5400RPM HDD, and Windows 8.1, and it's speedy enough for everyday tasks and non-demanding games. My stock i5 3570K desktop with 8GB of RAM and Windows 7 on an SSD loads stuff instantly, and with a GTX 660 as the GPU I can still play most games at 1080p on high settings.
In the back of my mind I didn't like turning stuff down at first to play The Witcher 3, since my GTX 660 is the minimum GPU for that game, but after watching YouTube gameplay vids, all I did was turn AA off, set shadows to low, and turn HairWorks off. It still looks good, everything else is on high (not ultra), and the game plays smoothly with no lag. What I'm getting at is that, contrary to the online elitists, turning some settings down isn't the end of the world, and it saves a lot of money. Not everybody is a tryhard MLG gamer who has to have 120+ FPS or 4K ultra graphics. Some of us grew up on the SNES and black-and-white TVs 😉 at least that kept me humble lol.
In closing, I wouldn't have any qualms about getting a cheap 6300 or 8320, pairing it with 8GB of RAM and an R9 280X, and happily playing games at 1080p high (if not tweaked ultra). I'm not super into single-thread-heavy games like MMOs or RTSes (not my thing).
The new i5 is certainly nice, but not worth the money over my current 3570K, and even the 8350 in those tests stayed at roughly 60FPS anyway.
On a side note, geez, I guess I missed it, but did all those people saying to spend more on a Z97/H97 for Broadwell support go quiet, like the people who said "you'd never need more than 2GB of VRAM for 1080p"? (lol...)