FX-8370 vs i7-5960X Gaming benchmark. Why is the FX so underestimated by the community?

Bem-xxx
Sep 20, 2015
An old CPU like the FX-8370 can match or beat the i7-5960X in some games, with lower frame time variance or higher frame rates.
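(Quick aside on what "frame time variance" means here: it's the spread of individual frame times around the average, and a high spread reads as stutter even when the average FPS looks fine. A rough, self-contained C++ sketch of the stats reviewers typically pull from a frametime log; the numbers are made up, not from any of the benchmarks below:)

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Toy example: compute the stats reviewers usually report from a
// frame-time log (times in milliseconds, e.g. a FRAPS/OCAT dump).
int main() {
    // Hypothetical capture: mostly ~16.7 ms frames with a few spikes.
    std::vector<double> ms = {16.7, 16.6, 16.8, 33.4, 16.7, 16.5, 41.0, 16.9};

    double sum = 0.0;
    for (double t : ms) sum += t;
    double mean = sum / ms.size();

    // Variance of the frame times: high variance = visible stutter,
    // even when the average FPS looks healthy.
    double var = 0.0;
    for (double t : ms) var += (t - mean) * (t - mean);
    var /= ms.size();

    // 99th-percentile frame time (a common "minimum FPS" stand-in).
    std::vector<double> sorted = ms;
    std::sort(sorted.begin(), sorted.end());
    double p99 = sorted[static_cast<size_t>(0.99 * (sorted.size() - 1))];

    std::printf("avg FPS: %.1f\n", 1000.0 / mean);
    std::printf("frame-time variance: %.1f ms^2 (stddev %.1f ms)\n",
                var, std::sqrt(var));
    std::printf("99th percentile frame time: %.1f ms\n", p99);
    return 0;
}
```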

Project Cars at 1080p:
[Image: fx-8370-vs-5960x_gaming-pcars_gtx960.jpg]


The Witcher 3 4K result is shocking:
[Image: fx-8370-vs-5960x_gaming-witcher3_gtx970-sliv2.jpg]


http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/?hootPostID=282b782e2a8d67e68bbb50aeea079e96

The FX-8350 also performs better than the i7 4930K at 4K.

http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html

With Win10 and modern games, the FX 8-core chips have a new life. I don't know why everyone suggests picking an i3/i5 over an FX 8-core.

Why are the FX 8-core chips so underestimated by the community?
 
Images aren't loading for me 🙁

Since you mentioned it, I wonder why Tom's Hardware hasn't benchmarked The Witcher 3 or any recent games yet. They haven't put out their monthly Best CPUs and Best GPUs for the Money articles, and there's no SBM article either.

It seems they benchmarked a specifically CPU-intensive portion of The Witcher 3 with some very unusual configurations (the Intel extreme CPUs tend to perform worse than the regular Intel CPUs in gaming). Other sites favor Intel:

http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html




This review was published around the same time as ours...
[link]
They also tested with SLI, which is even more CPU demanding...
[link]
Another CPU test can be found here...
http://pclab.pl/art63116-49.html
The Core i3 beats the FX-6000 series in this test with the 290X as well; the minimums are considerably better.
[link]
More testing done here...
[link]
Note the CPU results...
[link]
There is a detailed Witcher 3 performance thread here, and gamers are commenting on how well the open-world game plays with a low-end CPU...
[link]
More CPU scaling results that align with my own...
[link]
I have tested more of the game and didn't find results that were nearly as extreme as what was shown in that video. Still, I am happy to take another look at it; the game has been patched over half a dozen times since we ran those early tests.
 
Try playing games that benefit more from faster IPC than from more cores; in those, an i5 will be ahead of the FX 8xxx series. Look at how they fare in games like StarCraft:

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/5

When it comes down to it, the i5 is much more balanced than the FX-8xxx. Besides RTS and MMO-style games, online shooters are also quite reliant on CPUs with faster IPC. AMD is hoping that DX12 will improve their situation, since DX12 allows much lower CPU overhead and much more efficient use of multiple CPU cores, but not all games are going to use DX12 in the future.
 


A lot of games will be DX12.

-Deus Ex
-Hitman
-AotS
-Tomb Raider 2016
-Battlefront
-Mirror's Edge
-Fable: Legends
-Mass Effect Andromeda
-Ark
-King of Wushu
-Unreal Tournament 4
-Gears of War
-Arma 3
-DayZ Standalone
-Killer Instinct
-Halo Wars 2
-Star Citizen


 
Even if they use DirectX 12, that isn't going to help if the developers don't implement multithreading well for every part of their game. Arma 3 will probably continue to run like crap on AMD CPUs because the developer designed the game to have just about everything run on one core, and DirectX 12 on its own won't change that. Just because the draw calls are now multithreaded doesn't mean you won't hit a CPU bottleneck in some other part of the process, like AI, tracking other players, or object placement.
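To put that in code terms, here's a toy C++ sketch (not real D3D12 code; sleeps stand in for work, and the timings are invented) of a frame where command-list recording is spread across four threads but the simulation/AI step stays on one core. The frame time ends up pinned to the serial step no matter how well the rendering side scales:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Toy model of one frame: DX12-style multithreaded command-list
// recording speeds up the draw-call part, but the frame still can't
// finish before the single-threaded simulation step does.
void simulate_ai() {                       // AI/world update: serial
    std::this_thread::sleep_for(std::chrono::milliseconds(20));
}
void record_draw_calls(int chunk) {        // one command list per thread
    (void)chunk;
    std::this_thread::sleep_for(std::chrono::milliseconds(4));
}

int main() {
    auto t0 = std::chrono::steady_clock::now();

    // Draw-call recording split across 4 threads (the part DX12 helps).
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i)
        workers.emplace_back(record_draw_calls, i);

    simulate_ai();                         // the part DX12 does not touch

    for (auto& w : workers) w.join();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    // Frame time is dominated by the 20 ms serial AI step; the parallel
    // rendering share dropped to ~4 ms, but the bottleneck just moved.
    std::printf("frame time: %lld ms\n", static_cast<long long>(ms));
    return 0;
}
```

On an FX chip with weak per-core performance, that serial chunk is exactly the part that gets slower, which is why DX12 alone won't rescue a game built like Arma 3.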

As for why AMD CPUs aren't recommended for gaming: at mainstream resolutions they are a bottleneck with higher-end cards. They're still okay-ish with a midrange GPU, but move beyond GTX 960 and R9 380 performance levels and you will start to see a bottleneck unless you are pushing a very high resolution or playing a game that is almost entirely GPU-bound, e.g. Tomb Raider or The Witcher 3.

Part of the problem is that even if AMD does well at 4K, simply because everything becomes ridiculously GPU-bound, it doesn't really matter: if you have the money to buy $1,500 worth of graphics cards to drive 4K, you probably aren't going to buy an AMD CPU anyway, since you can afford an Intel setup. The savings from buying AMD would be insignificant compared to the cost of the GPUs alone, and you'd be stuck on a three-year-old platform missing a bunch of newer chipset features.