FX-8350 powerful enough for the GTX 1070?


Yummiesttag

Reputable
Jul 13, 2014
I have an FX-8350 at stock clocks and I don't plan to overclock, but do you guys think it will be able to handle the 1070, or would it be a bottleneck?
 
Solution
This turned into an AMD/Intel flame war again. The OP was not talking about buying an 8350 and a 1070.
He ALREADY owns an FX-8350 setup.
Yummiesttag - if you want to buy a 1070, then buy one.
You won't get the absolute maximum use out of it with an FX chip, but that doesn't mean it's not a viable purchase.
It'll offer you future-proofing on the GPU front regardless, and allow you to play any game on max graphical settings. Who cares if you're not pushing 100+ fps if you're running a 60 Hz screen? The 8350 is still entirely capable of pushing perfectly playable frame rates on any title out there.


Why is it a stupid reply?

You compared your FX to an old Core 2 Quad, not a modern Intel CPU.

I still don't understand why pointing out the flaws of my FX 8350 makes me a fanboy. Would a real enthusiast ignore them?
 


Let's analyse your reply, then:
"Yeah coming from a Q6600 I can see why you would think the 8370 is good.."

It's about the same thing as writing "coming from a rusty iron anchor anything would be better, you have no real vision"

I said I switched from that CPU to the FX because you wrote:

So I meant that I did want to buy one now, and I did buy one. I looked carefully at the actual market, what's available, what I do, what I needed, and how much I wanted to spend.
I expected a big improvement over the 6600 and I got it. I would have got it with an iX too, but my choice went to the FX, that's all.
It's a stupid reply because I didn't buy a sack of junk. Like I said, I looked carefully at what I was doing with my rig, what I wanted it to do, what more I expected it to do, and how much I wanted to spend.
So I bought an FX 8370, a Gigabyte GA-990FX Gaming, and a Noctua NH-D15S, and could keep using my 32 GB of DDR3 RAM. All the games I play (I don't play a lot) are configured at max, no lag. None of the applications I use bother the FX at all. All the cores are used.
So yeah, coming from an 8-year-old CPU, I can see the difference indeed. That doesn't mean the 8370 (or even the 83XX/e) is bad at all.
The IPC is not as good as on the iX? I knew that. But like I said, the apps I use take advantage of more than one core, so...
 


In games that use 8 cores: no bottleneck. In games that don't:

MASSIVE BOTTLENECK
 


 
Don't know about the 8350, but I have the 8320, which is basically an 8350 when overclocked. I overclocked mine to 4.4 GHz, and I also bought the newly released Gigabyte G1 Gaming GTX 1070. And yeah, the 8320 is definitely hurting. Lots of stuttering during gameplay. FPS didn't really improve in my games (StarCraft II and XCOM 2), but I was able to hit 60 fps on ultra (no HairWorks) in The Witcher 3. Before the 1070 it was a 650 Ti, and I played at 25-35 fps on low or medium settings. In conclusion, I was kind of disappointed with my purchase of the 1070, as it didn't give me what I hoped for, which is basically 60 fps at very high detail in 1080p gaming. Yup, all I asked for is 1080p. Not 2K, not 4K. 10 <language edit> 80p. Is that too much to ask? And then again, maybe the 1070 is like... not my fault, bro. It's your AMD FX 8320 bottlenecking all the time, duh.

Mod edit: Please watch the language
 
GTA V uses DX11; that API (and other older APIs like OpenGL 4.5) distributes the CPU load between 2 main cores while only using a small percentage of the other ones. Here's a vid that shows it:
https://www.youtube.com/watch?v=x69UI_5UTlQ

On the other hand, DX12 / Mantle / Vulkan does distribute the load between cores much better:
https://www.youtube.com/watch?v=HOevOiJUQAE
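The difference described above can be sketched with a toy analogy (pure illustration, not real graphics code): the same "frame work" done on one worker versus split across several, the way DX12/Vulkan-style engines can build command lists on many cores in parallel. Everything here (the function names, the dummy workload) is made up for the sketch.

```python
# Toy analogy, not real graphics code: under DX11-style engines most draw-call
# work funnels through one or two threads; DX12/Vulkan-style engines split it
# across many cores. Here the "frame work" is a dummy CPU-bound sum.
from concurrent.futures import ProcessPoolExecutor

def build_commands(chunk):
    """Stand-in for building one command list: burn CPU on a slice of work."""
    start, stop = chunk
    return sum(i * i for i in range(start, stop))

def frame_work(n, workers):
    """Split the range [0, n) into `workers` chunks and combine the results."""
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    if workers == 1:                  # "DX11": everything on one core
        return build_commands(chunks[0])
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(build_commands, chunks))  # "DX12": spread out

if __name__ == "__main__":
    n = 200_000
    assert frame_work(n, 1) == frame_work(n, 4)  # same answer, more cores busy
```

Same result either way; the multi-worker version just keeps all cores busy, which is why a chip with eight slow cores stands to gain more from the newer APIs than one that was never waiting on two threads.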

As for a reasonable real-life example of this:
Tomb Raider with DX11: minor bottleneck (GPU at around 80% usage), still a really good experience.
(FX 8350 @ 4.5 GHz + GTX 980 Ti, 16 GB RAM, 1080p)
https://www.youtube.com/watch?v=z1NBHgKHIrc

Tomb Raider with DX12: no bottleneck at all, showing the importance of a newer API. Every CPU with more than 2 cores will see benefits from a newer API (yes, even old Phenom IIs and i3s), but I think the greatest benefit will come to the FX series.
(FX 8350 + GTX 980 Ti, 16 GB RAM, 1080p)
https://www.youtube.com/watch?v=DKKgA27BeoM

All in all, the FX 8350 used to be a mainstream high-end CPU; nowadays it can be considered a mid-range CPU (mid to high end for some tasks other than games). But it is by no means a bad CPU, and from what we know so far it will not bottleneck a GTX 1070, or in unoptimized-game scenarios the bottleneck will be so small that it won't be worth upgrading the entire rig.
 
Hey guys, this isn't really a solution, more like a question, haha. I currently have an 8350 CPU and I was looking into buying a 1070. I also have a GTX 660 card, which I know is underperforming with the CPU. I also don't have that much money to spend buying a new CPU AND GPU, because if I upgraded my CPU I couldn't get a new GPU, which would compound my problem even more. My question is: is it wiser to buy a 1070 and be set for a few years, or wait it out altogether? Also, if I get the 1070, will my Noctua NH-D15 be enough to cool my CPU even when I overclock?
 
Just stick with the 8350 for now; get the 1070, which will still give you a very nice increase, then save some more money. When you have enough, get either the new Zen or Intel, depending on power-to-performance. While you save, perhaps learn a bit of overclocking on your current CPU, as they're easy to play with. But if all you're looking for is 1080p 60 fps ultra in most games, then the CPU will probably last another year until you can afford a new setup. Oh, and don't listen to any of the fanboys on either side; just get your money's worth and enjoy.
 


Sir, please don't call people ignorant. Tom's Hardware is meant to be a place where users can collaborate on an issue, it's just sad when they start attacking each other.

Also, just because you don't agree with research does not mean that it's not sound data. I will agree that a single game does not represent the entire gaming community, but just because it's old and not optimized does not mean it's not being played. If a game is still widely played, then I wouldn't exclude it from the list to be benchmarked.

Also, the FX 8350 may have eight cores, but Windows uses them more like a set of four cores with hyperthreading because of the architecture of the CPU itself. More on that here.

Also, the FX 9590 is really just a factory overclocked 8350, so it suffers from the same problems as the 8350. These problems include: Windows's poor usage of the cores, low IPC, high heat output, etc. Yes, there are some motherboards that technically support the 9590 but many fail to deliver steady power without the VRMs overheating. Not to mention that the 9590 just draws so much power.



Alan, the URL that you posted no longer exists.
 
The internet loves hating on the FX chips.

Modern test of the 8350 vs. 6700k on a TON of games. Stock and overclocked:
http://www.hardwareunboxed.com/gtx-1060-vs-rx-480-fx-showdown/

Guess what? They are close in 1080p and pretty much identical in 1440p. BUT OMG FX BOTTLENECKS!>!@!@@!!>@!@@!

The FX chips are fine. They had a rough go of it when everything was single threaded, but that is no longer the case. BUT OMG FX BOTTLENECKS!>!@!@@!!>@!@@!

Oh, and I'm currently on the fence about whether I'm building a 1230 Xeon (budget i7) or FX 9370 (on sale) based PC. That's how I found this thread, so I have no real bias either way. The BS you read regarding the FX chips is astounding, though.
 
It depends on the game, the resolution, the gpu etc. There's no such thing as 'gaming' in a one size fits all application. Gaming ranges from solitaire to witcher 3.

Comparing cpu's using mid grade gpu's tells little other than in some games the gpu is the bottleneck. A gtx 1060 isn't a 1070 isn't a 1080. Showing benchmarks to a 1060 when referencing potential bottlenecks for a much stronger 1070 means nothing. It's not hating, it's plain fact.

If it were just the internet hating on amd's fx chips then amd would have accepted the fx as the greatest thing since sliced bread. As it happens, they realize it's not which is why zen is such a departure from the fx lineup. Better efficiency, higher ipc core performance, smt more similar to intel's ht. Why move away from something so incredible? That makes no sense.

BF4 is a popular game but a poor example; it will likely be GPU limited before it's CPU limited, hence the similar fps. When looking at a game like Arkham Knight, the i7 clearly has an advantage with the 1060, 11 to 14 fps higher in minimum fps, and the FX 8350 barely maintains 60 fps. Dropping lower can potentially result in stutter/frame drops during intensive scenes on a 60 Hz monitor. Even when the 8350 is OC'd to 4.6 GHz, it only gains 3 fps.

Those ashes of the singularity benchmarks go off the reservation a bit, instead of comparing min fps and avg fps as they do in others, they ignore min fps and instead choose to show dx11 vs dx12. Why leave out important benchmark results? It's an incomplete result vs other games, only comparing averages which is only half the story. Considering the other games are measured as min/avg if someone wasn't reading the benchmark graph correctly they might get the wrong impression.

With a 1060 the i7 gets 24fps better performance at min fps than the fx 8350 gets on average in arma3. On a 60hz monitor there's a difference between low 40's fps and being able to maintain 60fps.

Of course at 1440p the cpus become closer as midrange cards like the 480 and 1060 become the bottleneck, pushed to their limits trying to push the additional pixels. That's what 1070's and 1080's are for, not the 1060 or 480. Gimping the gpu with too high of a resolution to try and show two cpu's performing similarly is bogus benchmarking. A bit like firing up solitaire so I can claim a pentium 2 is just as fast as today's i7, it makes no sense.

In a number of those games there's a large difference between the two, and for someone using a 120 Hz or 144 Hz monitor it could matter a whole lot when one CPU delivers 40 fps more than the other, as in the case of GTA V.

It's not to say the OP won't get a benefit from the 1070 with their FX over, say, a 750 Ti; sure they will. Will the FX bottleneck it in a number of cases? Yes it will. I'm more or less pointing out how skewed that bunch of benchmarks from hardwareunboxed is, and how, even if their comparisons were more in line with one another (i.e., min/avg fps in all the games, instead of min/avg in one, DX11 vs DX12 avg fps only in another, DX11 vs Vulkan avg fps only in another), it's still comparing two GPUs, neither of which the OP was asking about, which makes the whole comparison a bit pointless for the OP's question.

What good are benchmarks for the r7 240 when asking about the r9 480? Or a gtx 1060/r9 480 when discussing the gtx 1070?

So long as someone is on a budget or they're happy with the results of their fx cpu in the games they play, more power to them. No one is forcing anyone to upgrade components but when the question is asked whether there's a performance difference then yes. Is it worth the upgrade? Only the person making the purchase can decide that. Will it improve someone's gaming experience? Only if the better performance applies to the games they play. Some people prefer one title over another, one type of game over another and some play one or two games while others want to play a variety with the best performance.

If someone is interested in playing just cause 3 on a 60hz monitor then it matters squat if the fx plays doom just fine. Doom isn't JC3 and the fx paired with a 1060 (since those were the bench's mentioned prior) is going to be held back at the low 50's fps while the i7 is maintaining an easy 60fps min and averaging 76fps.

Since The Division was mentioned, yes, the fx 9590 comes out 1fps ahead of the i7 6700k. However while the fx 9590 is about maxed out, what happens when a 6700k is oc'd to 4.8ghz instead of 4ghz? Not that it matters since the game runs almost identically on an i3 6100. One game doesn't make or break the overall performance of a cpu and in a game like the Division or Doom, neither are cpu intensive at all. The benchmarks prove that.
http://www.techspot.com/review/1148-tom-clancys-the-division-benchmarks/page5.html

Not bashing fx, just pointing out the facts. In less cpu intensive games it does fine just like a dual core pentium, an i3, i5, fx 4320 etc. In cpu intensive games they tend to struggle and can potentially hold back a more powerful gpu like the 1070. dx12 helps all cpu's, it's not amd specific. They see much of the benefit though due to the weaker ipc and offloading additional driver overhead is a definite boost for them. They don't come out ahead of intel's cpu's though. We're still waiting for native dx12 games, compatible dx12 drivers from both camps for gpu's etc. Dx12 in fully functioning form is still in the baby stages.
http://www.guru3d.com/articles_pages/total_war_warhammer_directx_12_pc_graphics_performance_benchmark_review,8.html
 
My response was for people claiming the FX will bottleneck a 970. Probably should have been clearer.

Overall, the difference between the two isn't that great. 10-20fps can be easily adjusted in the graphics settings. The 6700k is the better chip. No question. But it will also cost a couple hundred dollars more.

I doubt 1070 benchmarks between the two would be all that different from the 1060 I posted. Here are some with the 980ti, 980ti sli and GTX 1080:
[Benchmark charts: Mirror's Edge Catalyst, Forza, The Division, Tomb Raider, The Witcher 3, No Man's Sky, Battlefield 1, The Technomancer]


Gamegpu.com has more comparisons. These are just random picks.

Personally, I think the 1070 would be just fine with the FX8 or 9 series chips. Unless you are chasing the highest possible monitor refresh rate, you can get away with having a mid range CPU with a higher end graphics card(s).

The FX 4100 & 6100 are bottlenecks with a GTX 1080. An overclocked 8350 or 9590, not so much.
 
I have an AMD FX 8350 and got a reference GTX 1070 about a month ago. I can get 60 fps in Fallout 4, which is a CPU-heavy game to begin with, but I get some pretty bad frame drops and the GPU usage never goes above 50% or so. I'm feeling the bottleneck for sure.
 


Something isn't right somewhere. You are barely getting 1060 performance with a 1070. I find it VERY hard to believe that an 8350 would bottleneck a 1070 that badly. It's either driver or Fallout code related. Poor coding by developers isn't a hardware bottleneck.

The 8350 does just fine with the 1080, 980 Ti, even in 980 Ti SLI (benchmarks above). Yet it can't get over 50% with a 1070? Even in a CPU-intensive game, something else is holding back the performance.
 
Not really too surprising in Fallout 4 (which is capped at 60 fps out of the box, unless modded, because the game clock is tied to the frame rate).
http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference

When games are CPU intensive, the FX chips struggle. It's on a per-game basis; if the game were that terribly coded (some are), it wouldn't matter which hardware you throw at it. Intel CPUs do fine; FX CPUs struggle with a 980 Ti, which isn't far off from a 1070 in terms of performance. Even a 9590 drops to 35 fps and, in extreme cases, 16 fps. It's going to be felt in some areas of the game.

It's not all games; it's hit and miss depending on the title. The FX do fine in games with low CPU demands, as any CPU does. When it comes to CPU-intensive titles, the FX struggle a bit. Kind of a shame to get a 1070 that only shines in some games while having to cripple the graphics settings in others to try and drive fps up. I don't think too many people spend $400+ on a GPU just to have to turn graphics down to medium at 1080p.

As I said before, it's not that games aren't playable on an fx and a 1070 will make the gaming on an fx 8350 better than a 750ti. When the question is will the cpu hold the gpu back, in several cases it appears so. When the 1070 is capable of pushing more fps on intel cpu's in several games compared to the 8350, the fx is clearly holding the gpu back some. Is the performance loss a major deal? Depends on the person, some people are happy with 30fps, others want at least 60fps and others shoot for as high fps as they can get.
 
I forgot that Fallout was using the Skyrim engine. It makes more sense now why the 1070 is doing poorly with the 8350. The only other game I think would be a major bottleneck is Arma 3. Most modern games are getting away from that older CPU coding, so the FX and 1070 should do fine elsewhere.
 


 
I've done the stress test and logged the results. The GPU usage hovers close to 99%, and the CPU usage stays in the 20% range. When playing Fallout specifically, my CPU usage spiked above 70% once but averages about 55%. The GPU usage average was under 50%. My display is only 1600x900 and I'm running almost every setting on ultra or high. If there's something else holding it back, how could I figure that out?
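One thing worth checking: an aggregate CPU percentage can hide exactly this kind of bottleneck, since eight cores averaging ~55% can still mean one thread-starved core pinned at 100%. A minimal per-core sampler sketch, assuming a Linux box with /proc/stat (on Windows, Task Manager's per-core graphs or a logger like HWiNFO show the same thing):

```python
# Per-core CPU sampler via /proc/stat (Linux). If one or two cores sit near
# 100% while the rest idle, the game is thread-limited even though overall
# CPU usage looks low. Field order per proc(5):
# user nice system idle iowait irq softirq steal ...
import os
import time

def parse_stat(text):
    """Map 'cpu0', 'cpu1', ... to (busy_jiffies, total_jiffies)."""
    cores = {}
    for line in text.splitlines():
        if line.startswith("cpu") and line[3:4].isdigit():
            name, *fields = line.split()
            vals = [int(v) for v in fields]
            idle = vals[3] + vals[4]            # idle + iowait
            cores[name] = (sum(vals) - idle, sum(vals))
    return cores

def core_usage(before, after):
    """Per-core busy percentage between two samples."""
    pct = {}
    for name in before:
        busy = after[name][0] - before[name][0]
        total = after[name][1] - before[name][1]
        pct[name] = 100.0 * busy / total if total else 0.0
    return pct

if __name__ == "__main__" and os.path.exists("/proc/stat"):
    a = parse_stat(open("/proc/stat").read())
    time.sleep(1.0)
    b = parse_stat(open("/proc/stat").read())
    for core, p in sorted(core_usage(a, b).items()):
        print(f"{core}: {p:5.1f}%")
```

If one core pegs while GPU usage sags, that's the CPU bottleneck; if neither is maxed, look at the 60 fps cap, V-sync, or the engine itself.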
 


Arma 3, H1Z1, StarCraft 2, DayZ, Rust, ARK, GTA V, every indie title, and any open-world looter shooter will be bottlenecked by the 8350, as these titles are largely single-threaded.

Overclocking helps a lot, with gains of up to 30-40%.
 
Beholder88 - you're running a 1070 at 900p??

Then your GPU usage is fine; the 1070 is a true 1440p card, which is roughly 2.5x your resolution in pixels.
900p is just not going to tax it anywhere past 60% unless:

1. You're running CS:GO or similar at unlimited/uncapped fps (it will probably do 500+ fps on a title like that).

2. You enable DSR & run at 1440p+ resolutions.


^ Both these scenarios are just plain pointless though & will push GPU usage for nothing.

Essentially you could have bought a 1060 to play at 900p and saved money, for the same performance.

GPU usage is completely unimportant if you're happy with graphics settings & fps!!
 


Depends on the game, honestly. I know they aren't the best CPUs, but they handle my old cards fine: 1440p BF4 with 95% usage on both, on average. CPU-intensive games are where you will see lower GPU usage.

 


You could make a pretty big list of single-threaded games from 2014 and earlier that will struggle. But I still think moving forward the FX and 1070 will do fine together, because single-threaded coding is a thing of the past. The FX chips should age nicely. AMD had the right idea with multiple cores, but the timing was off.
 


Well, not exactly. SMT/Hyper-Threading combined with high-IPC cores is ultimately the superior design, hence why AMD is going that way with Zen.
 


The idea of multiple cores was going in the right direction, but nothing took advantage of them at the time. The FX chips do well with DOOM. Skyrim... not so much.

The FX 8350 and up should age well. New AM3+ boards are being released with modern features, which should give them a needed refresh.

Zen is the future of AMD for sure, but the FX chips are far from obsolete. Buy a new board, graphics card, and 1440p monitor, and you are set for the near future with an FX chip.