FX-8350 powerful enough for the GTX 1070?


Yummiesttag
Jul 13, 2014
I have an FX-8350 at stock clocks and I don't plan to overclock, but do you guys think it will be able to handle the 1070, or would it bottleneck?
 
Solution
This turned into an AMD/Intel flame war again. The OP was not talking about buying an 8350 and a 1070.
He ALREADY owns an FX-8350 setup.
Yummiesttag - If you want to buy a 1070, then buy one.
You won't get the absolute maximum use out of it with an FX chip, but that doesn't mean it's not a viable purchase.
It'll offer you future proofing on the GPU front regardless, and allow you to play any game on max graphical settings. Who cares if you're not pushing 100+ fps if you're running a 60 Hz screen? The 8350 is still entirely capable of pushing perfectly playable frame rates on any title out there.
Why not post comparisons for solitaire while we're at it? I'm sure I have an old P2-450 that competes as well as an 8350. It's already been mentioned that each game is a different program; if people lump 'gaming' together as one catch-all category, they're missing the point. If a user plays The Division, the AMD works well. Given the similar performance levels, it appears the game is not very CPU-heavy and is instead limited by the GPU. The i3, a mere dual-core CPU, keeps up with the best AMD has to offer at half the price, so apparently all you need is an i3 for The Division.

Same story with Doom: a 4th-gen i3 outperforms an FX-8370 by around 10 fps minimum and 5 fps average while costing $30 less. The Skylake i3-6100 in those comparisons runs $70 less, making Intel's i3 clearly the winner here.

Since an 8370 runs in the same price range as an i5 (which I noticed was conveniently left out of those DX12 comparisons), I see no advantage for AMD even with DX12. What happens when the OP goes to play Fallout 4?
http://www.techspot.com/review/1089-fallout-4-benchmarks/page5.html

From that benchmark comparison, the article reads, "The top 10 processors that we tested were all Intel with the fastest AMD processor, the FX-9590, being beaten by a Haswell Core i3 and dominated by a Skylake Core i3. Worse still, the FX-8350 was almost 30% slower than the Core i5-2500K and a little over 30% slower than the Core i5-4690K."

In Far Cry 4, the 8350 is bested by a cheaper i3 again.
http://www.techspot.com/review/917-far-cry-4-benchmarks/page5.html

According to TechSpot, even an overclocked Pentium G3258 will match an FX-8350 in Rise of the Tomb Raider. You realize that's a $65 dual-core CPU, right?
http://www.techspot.com/review/1128-rise-of-the-tomb-raider-benchmarks/page5.html

Also keep in mind that there's only one site offering those DX12 stats, which makes me question them a bit. Not just with this; the same goes for anything: CPU cooler performance tests, any sort of comparison. Why is there just one random benchmark out there and no similar tests done by TH, AnandTech, TechSpot, WccfTech, Eurogamer, etc.? Seems a bit dubious to me. The fact is that for Tomb Raider it's a DX12 patch, not a DX12-coded game, and the results are pretty poor, offering worse performance than DX11.
http://www.tomshardware.com/news/tomb-raider-dx12-vxao-patch,31396.html

Have a read on some other reviews of it and people's experiences with the dx12 patch, like this statement from overclock3d. "In all cases we have seen a performance decrease when using the DirectX 12 API in Rise of the Tomb Raider instead of DirectX 11, showing that a lot of work still needs to be done in order to make using the DirectX 12 API worth using in the game."
http://www.overclock3d.net/reviews/gpu_displays/rise_of_the_tomb_raider_directx_12_performance_review/6

Given the consensus that DX12 runs worse in Rise of the Tomb Raider, I'm going to venture that a single chart posted by some random site smells a bit like bunk. But hey, I'm the ignorant one, right? :)

 
@synphul, you're still an ignorant fanboy, bro. The difference in Tomb Raider between processors is only 6 fps, man. WTF? It's not worth a new $200+ system for that, not to mention that in DX12 the FX 8-core wins by a huge fps margin.

And Far Cry 4 is just an old engine badly optimized for multithreading. The FX-9590 is at the top in The Division, bro, and Ubisoft made both games.

 
I have an FX-8320 overclocked to 4.7 GHz; it runs great, even GTA V at max settings 1080p with a GTX 970. Some heavy parts drop to 50 fps, but I know GTA V is not optimized for AMD processors. BF4 runs at max settings, MSAA x2, 1080p at a constant 90 fps, with heavy parts down to 70.

<mod edit- Watch the language>
 
I just realized that Synpaul is a moderator so this will be my last time coming to this website... Clearly tomshardware is clueless seeing as how they have chosen him to represent them on the forum. He probably gets paid to lobby for intel... Rather than post a giant wall of text confirming that I am a dumbass *cough synpaul cough*...

Just watch this video. It will show you firsthand everything you need, and you can be the judge for yourself.
https://www.youtube.com/watch?v=WZ_5p9wd2dk

gg mic drop pz tomsh*tware
 
It wasn't intended to be an AMD/Intel war. The OP did ask about a 1070, and the FX-8350 is already a bit of a bottleneck with 970s. It's up to them whether they think it's a viable purchase or not, especially since they don't plan to overclock the FX-8350, which just about ensures a bottleneck. Whether a game is optimized to someone's liking or not doesn't change what it is; the games are the games. There are many routes to take: lump it and accept the performance of what you have, play only certain games, avoid others, etc.

I only wish I got paid to lobby for a hardware or software manufacturer; then I could quit my day job. Instead, the moderators here donate their time to helping others. My opinions are mine, and the info I provide I do my best to back up with multiple citations of work done by others, with which I have no affiliation. In the end, my advice as a fellow enthusiast is free, so take it or leave it; no one is out anything.

As was pointed out in the 3rd post by another long-time member of the tech community and another moderator here, the FX-8350 at stock speeds will be a bottleneck. SR-71 Blackbird suggested it would be a bottleneck and that an i5 would be a better route to take (assuming that's possible for the OP), as a friendly suggestion to get the most out of a piece of hardware they're looking to drop $300+ on.

Many others suggested there would be a bottleneck. In fact, the only people I see saying "just do it, it'll be fine, the FX-8350 is an awesome chip" are the fanboys with very little info to back that claim up, ridiculously accusing moderators of being paid shills for companies and so on. YouTube videos prove nothing and are one of the worst forms of comparison out there, next to CPUBoss and the like.
 


Going away won't matter. If you go on any other popular tech forum in general (like PCPP), they will say the exact same things as synphul said.
 


Tom's Hardware is not such a community. If we got paid to be fanboys, we would be so rich :lol:
The moderators aren't like what you think :kaola: It's time you left this community; this ain't "tomsh*tware". It is the best forum for technology, and you are a disgrace.
It's time you realized: a four-year-old chip, a heater, low IPC, etc. Why can't a more recent chip be better?
 
I just can't understand why people think they know what a bottleneck is...

A bottleneck is a drop in overall performance due to one particular piece of hardware not being able to keep up with the rest of the system.

A good example of a bottleneck would be 4 GB of RAM for modern games like GTA V: the lack of RAM can cause severe frame drops, and that's the way a bottleneck works.
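That "slowest component sets the pace" idea can be sketched as a toy model. This is my own illustration, not anything measured in this thread; the millisecond figures are made up:

```python
# Toy model (assumed numbers, not benchmarks): each frame needs both the CPU
# (game logic, draw calls) and the GPU (rendering) to finish their work, so
# frame rate is governed by whichever of the two is slower.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Approximate FPS when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical: a CPU that needs 14 ms per frame caps output near 71 fps,
# no matter how fast the GPU renders (CPU-bound / CPU bottleneck).
print(fps(cpu_ms_per_frame=14.0, gpu_ms_per_frame=8.0))

# If the GPU is the slower part instead, the CPU no longer matters much
# (GPU-bound), which is the usual case at high settings/resolutions.
print(fps(cpu_ms_per_frame=14.0, gpu_ms_per_frame=20.0))
```

Swapping in a faster GPU only lowers `gpu_ms_per_frame`, so once the CPU term dominates the `max()`, extra GPU power buys nothing; that's all "bottleneck" means here.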

On the other hand, different hardware performs differently, and that statement becomes more accurate once the particular software is added to the mix.

So depending on the particular software and the amount of processing being executed by the system, the FX 8xxx will perform closer to an i7 or closer to an i3... BUT

there's no reason to think the FX-8350 will bottleneck a GTX 1070, since it can handle GTX 980s in 2x SLI. On the other hand, that's not the same as saying the FX-8350 will get the best results out of the GTX 1070 in every single game.

So to the very specific question asked in this topic: NO, there's no real need to change your CPU in order to make the GTX 1070 run at its max, even with no overclock. But again, a little overclock will indeed help.

I would keep posting links to validate my points of view, but no one cares; all the fanboyism is quite fact-proof.

PS: if the question had been about a GTX 1070 paired with an i7-3770K at stock, I would have answered the same, and that CPU is the closest Intel counterpart to the FX-8350 (and I love that i7, for the record).
 
I'm not sure how the 3770k is the closest fx 8350 counterpart. Even an i5 3570k outperforms the fx 8350.
http://www.anandtech.com/bench/product/701?vs=697

As does the previous i5 2500k.
http://www.anandtech.com/bench/product/288?vs=697

The weak single-core performance of the FX series is what causes many of the issues: steeper fps drops and lower minimum fps in quite a few games. That's what Zen is hoping to overcome; AMD recognizes it's an issue and has addressed it. Many users with FX chips have helped overcome the weaker core performance by giving their FX a substantial overclock, though the OP already mentioned they have no desire to do so.

I don't know if the OP got scared off by the heated debate, but I don't recall them ever saying what their current GPU is. If they're gaming on an R7 240, then sure, a 1070 makes sense. However, if it's a higher-end AMD card or a GTX 970 or something, I would say a new CPU would benefit them more than a GPU would.
 


My 2500K is way better than my 8350 in pretty much everything, except when I'm running a ton of virtual machines.
 


You don't agree with him, therefore you are a fanboy. I don't get why people can't debate each other intelligently; instead they totally devalue their argument by sounding like a ranting 12-year-old. I hate the term "fanboy"; why not use something like "biased"?
 


When games are optimized for AMD, this is what happens, Intel fanboy :).
[video="https://www.youtube.com/watch?v=WZ_5p9wd2dk"][/video]
 
A bunch of videos showing that the Intel CPUs get better frame rates?

It's not an optimization issue; it's that AMD sacrificed single-core performance and crammed more cores onto the CPU instead. This is actually not bad for some things; my 8350 is very useful for running VMs and other highly threaded processes. But the vast majority of programs (and games) are coded to do most of the work in one thread, which the FX CPUs are bad at.
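The trade-off above is basically Amdahl's law: if most of the work is serial, per-core speed matters more than core count. Here's a rough sketch; the speed ratios and parallel fractions below are assumed for illustration, not measured from any of these CPUs:

```python
# Amdahl's-law toy model (assumed numbers, not benchmarks): relative run time
# for a fixed workload on a CPU with a given per-core speed and core count.

def run_time(single_core_speed: float, cores: int, parallel_fraction: float) -> float:
    """Serial part runs on one core; parallel part is split across all cores."""
    serial = 1.0 - parallel_fraction
    return (serial + parallel_fraction / cores) / single_core_speed

# A game doing ~70% of its work on one thread (parallel fraction 0.3):
# the hypothetical "fast quad" (1.5x per-core speed) beats the
# hypothetical "slow eight-core" despite having half the cores.
fast_quad  = run_time(single_core_speed=1.5, cores=4, parallel_fraction=0.3)
slow_eight = run_time(single_core_speed=1.0, cores=8, parallel_fraction=0.3)
print(fast_quad < slow_eight)  # lower time = faster

# A highly threaded job like running VMs (parallel fraction 0.95)
# flips the result in favor of more cores.
fast_quad_vm  = run_time(1.5, 4, 0.95)
slow_eight_vm = run_time(1.0, 8, 0.95)
print(slow_eight_vm < fast_quad_vm)
```

That's why the same FX chip can look fine in well-threaded workloads and fall behind a cheaper dual-core in games that hang everything off one thread.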
 


Pffff... that's really a stupid reply...
Of course the 8370 is way better than my "old" Q6600; that's the point, that's why I bought it and that's why I chose it.
It does more than I can throw at it, and it didn't cost me an arm.
The FX sucks at games? Not mine... It does pretty well, it even runs great here. It does great with my 40+ open tabs. It does great when encoding. It does great when I'm playing with photos.

Let me add something that defines Intel fanboyism: when an FX appears in a thread, the word "sucks" always comes out of those fanboys...
 


I don't know what that video proves. The 8350 seems to be doing worse than the i5s in every game other than The Witcher in some areas; pretty much what I expected.