It depends on the game, the resolution, the gpu and so on. There's no such thing as 'gaming' as a one-size-fits-all workload. Gaming ranges from solitaire to the witcher 3.
Comparing cpus using mid-grade gpus tells us little other than that in some games the gpu is the bottleneck. A gtx 1060 isn't a 1070 isn't a 1080. Showing benchmarks for a 1060 when referencing potential bottlenecks for a much stronger 1070 means nothing. It's not hating, it's plain fact.
If it were just the internet hating on amd's fx chips, amd itself would still be treating the fx as the greatest thing since sliced bread. As it happens, they realize it's not, which is why zen is such a departure from the fx lineup: better efficiency, higher per-core ipc, and smt closer to intel's hyperthreading than to fx's shared-module design. Why move away from something so incredible? That makes no sense.
Bf4 is a popular game but a poor example. It will likely be gpu limited before it's cpu limited, hence the similar fps. Looking at a game like arkham knight instead, the i7 clearly has the advantage with the 1060, an 11-14fps lead in minimum fps, while the fx 8350 barely maintains 60fps. Dropping lower can result in stutter/frame drops during intensive scenes on a 60hz monitor. Even when the 8350 is oc'd to 4.6ghz it only gains 3fps.
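To put rough numbers on the stutter point, here's a quick sketch of the frame-time math (the fps values are illustrative, not taken from the benchmarks): a 60hz panel refreshes every ~16.7ms, and with vsync on, any frame that takes longer than that slips to the next refresh, which is the stutter you see.

```python
# Frame-time math: why minimum fps matters on a fixed 60hz refresh.
# The fps values below are illustrative examples, not benchmark data.

def frame_time_ms(fps: float) -> float:
    """Average time spent rendering one frame at a given fps."""
    return 1000.0 / fps

VSYNC_BUDGET_MS = 1000.0 / 60  # ~16.7ms between 60hz refreshes

for label, fps in [("cpu holding 60fps", 60.0), ("cpu dipping to 52fps", 52.0)]:
    ft = frame_time_ms(fps)
    verdict = "fits" if ft <= VSYNC_BUDGET_MS else "misses"
    print(f"{label}: {ft:.1f}ms per frame, {verdict} the {VSYNC_BUDGET_MS:.1f}ms window")
```

A frame that misses the window gets held for a whole extra refresh, so brief dips below 60fps show up as uneven frame pacing even when the average looks fine.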
Those ashes of the singularity benchmarks break from the pattern a bit: instead of comparing min and avg fps as they do elsewhere, they drop min fps and show dx11 vs dx12 instead. Why leave out important benchmark results? It's an incomplete result compared to the other games, comparing only averages, which is only half the story. Since the other games are measured as min/avg, someone not reading the graphs carefully could easily get the wrong impression.
With a 1060 in arma 3, the i7's minimum fps is 24fps higher than the fx 8350's average. On a 60hz monitor there's a real difference between the low 40s and being able to maintain 60fps.
Of course at 1440p the cpus come closer together, as midrange cards like the 480 and 1060 become the bottleneck, pushed to their limits trying to render the additional pixels. That's what 1070s and 1080s are for, not the 1060 or 480. Gimping the gpu with too high a resolution to make two cpus look like they perform similarly is bogus benchmarking. It's a bit like firing up solitaire to claim a pentium 2 is just as fast as today's i7; it makes no sense.
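For a sense of scale, the pixel counts alone tell the story (plain arithmetic, not something from the linked benchmarks):

```python
# Why 1440p shifts the bottleneck onto a midrange gpu: pixels per frame.
res_1080p = 1920 * 1080  # 2,073,600 pixels
res_1440p = 2560 * 1440  # 3,686,400 pixels

increase = res_1440p / res_1080p - 1
print(f"1440p renders {increase:.0%} more pixels per frame than 1080p")
# -> 1440p renders 78% more pixels per frame than 1080p
```

Nearly 80% more work per frame lands entirely on the gpu, so the card maxes out and the cpu difference disappears from the graph.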
In a number of those games there's a large difference between the two, and on a 120hz or 144hz monitor it can matter a whole lot when one cpu delivers 40fps more than the other, as is the case in gta v.
That's not to say the op won't see a benefit from the 1070 with their fx over, say, a 750 ti; sure they will. Will the fx bottleneck it in a number of cases? Yes it will. I'm more or less pointing out how skewed that bunch of benchmarks from hardware unboxed is. Even if the comparisons were consistent with one another (min/avg fps in all the games, instead of min/avg in one, dx11 vs dx12 average fps only in another, dx11 vs vulkan average fps only in another), it's still comparing two gpus, neither of which the op was asking about, which makes the whole comparison a bit pointless for the op's question.
What good are benchmarks for the r7 240 when asking about the rx 480? Or for a gtx 1060/rx 480 when discussing the gtx 1070?
So long as someone is on a budget, or they're happy with how their fx cpu performs in the games they play, more power to them. No one is forcing anyone to upgrade components, but when the question is whether there's a performance difference, then yes, there is. Is it worth the upgrade? Only the person making the purchase can decide that. Will it improve someone's gaming experience? Only if the better performance applies to the games they play. Some people prefer one title over another or one type of game over another; some play one or two games while others want to play a variety with the best performance.
If someone is interested in playing just cause 3 on a 60hz monitor, then it matters squat if the fx plays doom just fine. Doom isn't jc3, and the fx paired with a 1060 (since those were the benchmarks mentioned earlier) is going to be held back in the low 50s fps while the i7 maintains an easy 60fps minimum and averages 76fps.
Since the division was mentioned: yes, the fx 9590 comes out 1fps ahead of the i7 6700k. But while the fx 9590 is already about maxed out, what happens when the 6700k is oc'd to 4.8ghz instead of 4ghz? Not that it matters much, since the game runs almost identically on an i3 6100. One game doesn't make or break the overall performance of a cpu, and games like the division or doom aren't cpu intensive at all. The benchmarks prove that.
http://www.techspot.com/review/1148-tom-clancys-the-division-benchmarks/page5.html
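To put the headroom point in rough numbers (stock clocks only, and assuming performance scales roughly with clock speed, which real games won't fully deliver):

```python
# Overclocking headroom, back-of-the-envelope. Assumes performance scales
# linearly with clock speed, an optimistic simplification.
base_6700k, oc_6700k = 4.0, 4.8  # ghz: stock base vs a common oc
base_9590, oc_9590 = 4.7, 5.0    # ghz: the 9590 is factory-pushed already

for name, base, oc in [("6700k", base_6700k, oc_6700k),
                       ("fx 9590", base_9590, oc_9590)]:
    print(f"{name}: {oc / base - 1:.0%} headroom over base clock")
# -> 6700k: 20% headroom, fx 9590: 6% headroom
```

The 9590 already ships near its ceiling, while the 6700k has a solid 20% of clock speed left on the table.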
Not bashing fx, just pointing out the facts. In less cpu intensive games it does fine, just like a dual core pentium, an i3, an i5, an fx 4320, etc. In cpu intensive games they tend to struggle and can hold back a more powerful gpu like the 1070. dx12 helps all cpus; it's not amd specific. Amd's chips do see more of the benefit due to their weaker ipc, since offloading the extra driver overhead is a definite boost for them, but they don't come out ahead of intel's cpus. We're still waiting on native dx12 games, mature dx12 drivers from both camps, and so on. Dx12 in fully functioning form is still in its early stages.
http://www.guru3d.com/articles_pages/total_war_warhammer_directx_12_pc_graphics_performance_benchmark_review,8.html