AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?


aznjoka

Distinguished
Mar 21, 2011
31
0
18,530
Nothing to complain about here; remember that AMD did this on the same 32nm process and kept the AM3+ socket, so the power consumption is no surprise, and it shows AMD is looking out for its AM3+ customers. Changing sockets with every new tick or tock would be a total drag. I see this as one of AMD's advantages, and they are certainly heading in the right direction.
 

cknobman

Distinguished
May 2, 2006
1,167
318
19,660
Putting this in perspective, I AM IMPRESSED.

If the FX-8350 had existed in June, I would have built a system based on it instead of the i7-3770K.
I know the 3770K is the superior processor in both speed and power, but the FX-8350 would have been enough for me, along with the ~$120+ in processor savings alone. Plus I gotta support my company if possible!!!
 
[citation][nom]Tomfreak[/nom]Looking at how poor AMD IPC(or per core performance) is. I dont think I will be buying AMD for long long time. I personally like high single threaded performance CPU.[/citation]

IPC is not per-core performance. Instructions per clock (IPC) doesn't even necessarily correlate with actual performance. Want proof? Athlon II and Phenom II have exactly identical IPC, yet neither identical performance nor identical per-core performance at a given frequency.
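Here's a rough sketch (a hypothetical model with made-up numbers, not measurements of any real chip) of why identical on-paper IPC doesn't guarantee identical performance: delivered throughput is roughly sustained IPC times clock, and sustained IPC drops as memory stalls pile up, for example when one chip has L3 cache and an otherwise similar one doesn't.

[code]
# Rough sketch: same peak IPC and clock, different sustained IPC (hypothetical numbers).
def sustained_ipc(peak_ipc, miss_rate, miss_penalty_cycles):
    """Average IPC once per-instruction memory-stall cycles are added in."""
    base_cpi = 1.0 / peak_ipc                    # cycles per instruction with no stalls
    cpi = base_cpi + miss_rate * miss_penalty_cycles
    return 1.0 / cpi

clock_ghz = 3.2                                  # same clock for both hypothetical chips
peak_ipc = 3.0                                   # same "paper" IPC for both

with_l3 = sustained_ipc(peak_ipc, miss_rate=0.01, miss_penalty_cycles=40)      # misses mostly caught by L3
without_l3 = sustained_ipc(peak_ipc, miss_rate=0.01, miss_penalty_cycles=120)  # misses go all the way to DRAM

for name, ipc in (("with L3", with_l3), ("without L3", without_l3)):
    print(f"{name}: ~{ipc:.2f} sustained IPC, ~{ipc * clock_ghz:.1f} billion instructions/s")
[/code]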

[citation][nom]razor512[/nom]Push the X6 1100 to 4GHz and it will beat the 8350 in every test except the ones that require special CPU optimizations.What tomshardware needs to do is another round of average overclocked CPU's to see if it is better for a user to get a X6 and overclock it or 8350 and overclock it.I am currently using a X6 1075T and when overclocked, easily beats the 8350 by a good amount in cinebench.Since your readers are likely to be the type that will build their own system and also upgrade their CPU, and most of all, overclock. ...[/citation]

Overclocking wouldn't change much.

[citation][nom]mayankleoboy1[/nom]The gaming benchmark methodology is somewhat irrelevant. The "average FPS" is good for lots of the CPU's. 45 Vs 60 is not much significant.But the frame latencies (ala techreport) are more relevant and informative. Of course, they are harder to grasp.[/citation]

You are correct that average FPS is not an accurate measure, but going from a frame-latency equivalent of 45 FPS to the equivalent of 60 FPS is most certainly a significant improvement in my eyes, especially in fast-paced games such as most first-person titles.
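For reference (just the arithmetic, not data from the review), those averages translate to frame times like this:

[code]
# Average frame rate to per-frame time: frame_time_ms = 1000 / fps.
for fps in (45, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 45 FPS -> 22.2 ms per frame
# 60 FPS -> 16.7 ms per frame
[/code]

Shaving roughly 5 ms off every frame is very noticeable in fast-paced games.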

 
[citation][nom]iam2thecrowe[/nom]considering all i really do is game and browse the web, im glad to stick with my 3570k, no regrets here. Really, what percentage of people do things that take advantage of 8 threads that would make this cpu worth a look? maybe 10%? People are happy that there is a decent improvement here, but imagine what they could have done if they had stuck with tweaking the phenom II core? and stuck 8 of those cores on a cpu? They would have a much better product today and would be much more competitive i think. Instead they went the pentium 4 route...... and we all know where that leads.......[/citation]

Actually, the modular micro-architectures are superior in concept to the older Athlon 64 through Athlon II/Phenom II/Sempron 10h micro-architecture. The problems are the cache, crap implementations (Piledriver improves this), and much more, but still, Phenom II would not have been better. Furthermore, AMD didn't really go the Pentium 4 route as much as you might think. Compare the FX-8350 to its 32nm price competitors, Intel's Sandy Bridge i5s, and you'll see that AMD still has plenty of wins. Furthermore, disabling one core per module is enough to give it a significant performance boost, as is overclocking the CPU/NB frequency.


So you get substantial per-core performance improvements without even touching the CPU frequency, and since you're disabling half of the CPU's integer cores, power consumption can go down significantly too. If you want a gaming performance comparison, keep in mind that neither Phenom II nor a 32nm die shrink of it would have a chance against a chip that can easily compete with Intel's i5s and i7s in performance per core.

[citation][nom]proffet[/nom]your lucky to get the L3 to unlock, mine didn't when I had the P2 X4 840... I was disappointed to say the least..[/citation]

That's not luck. The Phenom II X4 840 doesn't even have any L3 on its die, so there's nothing to unlock. The same is true for the Phenom II X4 850. All other Phenom II CPUs have it, and some Athlon II CPUs have it too, but failing to unlock a component on a CPU that doesn't even have that component, locked or otherwise, shouldn't be surprising.
 
[citation][nom]mikenygmail[/nom]What many reviewers and fanboys tend to miss over and over again is that AMD delivers the best performance-per-dollar and that ANY current model desktop CPU will run ANY software just fine. Unless you have some enterprise level software that brings a modern CPU to it's knees, ANY of the currently available desktop CPUs will run Windows or Linux based software just fine. In fact, Linux apps do even better in many cases than Windows bloatware.I have no idea if AMD will ever offer a discrete CPU to equal Intel's top of the line, over-priced models nor do I care. I buy what delivers the best performance for the price. I have yet to purchase any AMD desktop CPU that would not run ALL software as well as an Intel CPU, without any isses what so ever. ...[/citation]

AMD only tops the performance/price numbers in some situations. In others, they don't top those charts by a long shot. Furthermore, Ivy Bridge was just a die shrink with minor changes. Of course it didn't change performance much; it's almost identical to Sandy Bridge. Shrinking the process has no impact on performance by itself; only changing the micro-architecture, frequency, and such can do that. The point of shrinking the process is reducing power consumption, and there the Sandy-to-Ivy improvement outpaced AMD's improvements with Piledriver, even looked at from an efficiency-improvement standpoint rather than sheer power consumption. You accuse people of not looking at data objectively, yet you didn't do so either.

EDIT: I'd also like to point out that Ivy's heat problem is caused not by its process technology, but by the crap paste between the die and the IHS.
 

DryCreamer

Distinguished
Jan 18, 2012
464
0
18,810
Shame the CPUs will be $200+ for the next few months... Luckily I don't have plans to build a system just yet, so by the time I do, the prices should be at a more attractive equilibrium :)

Dry
 

cbrunnem

Distinguished
[citation][nom]blazorthon[/nom]When that becomes practical.[/citation]

How is it not practical, and how is single player practical when it comes to benching CPUs? It misleads everyone looking for BF3 CPU benchmarks to buy a CPU. They buy an AMD CPU, then come asking why they aren't getting the FPS they think they should. If they had seen real BF3 multiplayer benches, they would have been properly informed.
 
IMHO, most people (common buyers) who are told to get a GeForce 600 series card instead of a Radeon 7000 series card for power consumption reasons (which, as you know, do consume more power than the 600s) will say, 'Fine, but is it better?' Then we would probably say not really (say, 7950 vs 660 Ti)... So power doesn't matter that much, especially in a country where electricity is dirt cheap.

The same applies here: people who do more multi-threaded work will probably go for this, regardless of the power consumption. It's priced right.

But it's not the same for single-threaded work. This processor won't be my first choice for gamers or anyone who mostly uses single-threaded applications, but it will be for multi-threaded usage.

What I have hoped for from the start (since Bulldozer was first reviewed) is for AMD to make a $200 quad-core processor with good pricing and good single-threaded performance. That would probably make an AMD processor my first choice :lol:.

And I mean, look at this:

[Projected Cinebench single- and multi-threaded charts: pred-cinebench1.png, pred-cinebenchx.png]


AMD could still be caught on multi-threaded performance, because single-threaded performance feeds into it as well.

But anyway, I think this is a bigger step than SB to IVB was for Intel :). But yes, still not my first recommendation.
 
[citation][nom]cbrunnem[/nom]how is it not practical and how is single player practical when it comes to benching cpus? it misleads everyone looking for bf3 cpu benchmarks to buy a cpu. they by an amd cpu then come ask questions to why they arent getting the fps that they think they should. if they had seen really bf3 multiplayer benches then they would have been properly informed.[/citation]

BF3 multiplayer is incredibly difficult to benchmark consistently. You can't use pre-set runs with 64 players unless all 64 players are computer-controlled, so you'd need 64 online bots just to benchmark this properly. Even then, consistency is still difficult. Single player has no such problem at all.
 
[citation][nom]refillable[/nom]IMHO Most people (common buyers) that is asked to have a Geforce 600 series instead of Radeon 7000 series card for power consumption reason (which as you know, they consume more power than the 600) will say, 'Fine, but is it better?' Then we would probably said not really (say, 7950 vs 660 Ti)... So Power doesn't matter that much, especially in a country where electricity is dirt cheap.The same with this case, People that will do more multi threaded works will probably go for this, regardless of the power consumption. It's priced rightBut, not the same for single threaded works. This processor won't be my first choice for gamers or anything that pretty much use single threaded applications, but it will for multi threaded usage.What I hoped from the first (since bulldozer being reviewed) is that AMD made a $200 Quad core processor, with a good pricing and with a good single threaded performance. This would probably make my first choice of processor to an AMD processors .And I mean, look at this:AMD could be chased with its multi threaded performance because single threaded performance could affect it.But anyway, this is I think better than SB to IVB from intel . But yes, still not my first recommendation.[/citation]

The 7950 does have a performance advantage over the 660 Ti, especially if you use enough MSAA to bring that advantage out in most games at 1080p. Nvidia doesn't have the power consumption advantage at every level either; compare the 660 Ti to the similarly performing 7870 and this becomes obvious. AMD also seems to have more room for undervolting than Nvidia, as well as lower idle power consumption (especially in multi-GPU systems and/or with displays turned off).

Those 2013 projections probably aren't accurate, because the Steamroller implementation is set to be a much larger improvement over Vishera than Vishera was over Zambezi.

If we look at overclocking, Intel took a step backwards thanks to the crap paste between Ivy's die and IHS, while AMD took a step forward even with Piledriver. Then there are also, as I've already said, core-configuration changes and CPU/NB frequency overclocking to be had, which can let quad-module Zambezi models, let alone quad-module Vishera models, compete even with i5s and i7s in performance per core.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]cbrunnem[/nom]how is it not practical and how is single player practical when it comes to benching cpus? it misleads everyone looking for bf3 cpu benchmarks to buy a cpu. they by an amd cpu then come ask questions to why they arent getting the fps that they think they should. if they had seen really bf3 multiplayer benches then they would have been properly informed.[/citation]
In addition to what blazorthon says, from the article:
Of course, this only applies to the single-player campaign, which tends to be GPU-heavy. The multi-player element of Battlefield 3 is more taxing on processor performance. But because it’s difficult to create a repeatable benchmark involving 63 other players, we’ll move on to another game notorious for its emphasis on CPU speed.
If you don't read the review properly, that's not their fault.
 

devBunny

Distinguished
Jan 22, 2012
181
0
18,690
[citation][nom]matthelm[/nom]If you [Denmark] are paying that much [$0.40/kWh], why would you let it set idle, turn it off instead![/citation]

As I sit here, reading this web page, my CPU is idle - as it has been for most of the time that I've been working and browsing. I'm busy but the PC is not.
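For a sense of scale (assumed numbers: roughly 75 W at the wall for an idle desktop left on 8 hours a day; nothing here is measured), the idle cost is modest even at Danish rates:

[code]
# Back-of-the-envelope idle electricity cost: kWh = W / 1000 * hours, cost = kWh * rate.
idle_watts = 75          # assumed whole-system idle draw at the wall
hours_per_day = 8        # assumed time spent idling
rate_per_kwh = 0.40      # the ~$0.40/kWh Denmark figure from the quote

yearly_kwh = idle_watts / 1000 * hours_per_day * 365
print(f"~{yearly_kwh:.0f} kWh/year, ~${yearly_kwh * rate_per_kwh:.0f}/year at that rate")
[/code]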
 

mapesdhs

Distinguished
[citation][nom]ukee1593[/nom]^^ @sugetsu Yes I'd expect that a FX8350/8320 to be a very good build for a (high end) video editing/content creation build. ...[/citation]

"high end" is a 6-core or multi-socket system, not an $800 consumer box.

Ian.
 

swimomatic

Distinguished
Aug 24, 2009
16
0
18,510
These charts really helped me feel better about my aging DDR2 and Phenom II X6 rig. The only demanding game I play is BF3, and it seems to be much more GPU-dependent than CPU-dependent. Well done, Tom's!
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
@article:

Good to see AMD finally catching up! It probably takes the recommendation for a budget workstation (one that's not in Denmark :p ).

I actually think the FX-8350 is THE top-tier AMD CPU to recommend now, seeing that it outperforms or equals the Phenom II in most situations, including games.

Nice review Chris! :)
 

misiu_mp

Distinguished
Dec 12, 2006
147
0
18,680
One thing to remember when building a workstation is that AMD supports ECC memory on the (cheap) desktop. ECC memory is really cheap and available today, but Intel does not support it in its desktop parts; you need much more expensive server motherboards for that.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]swimomatic[/nom]Only demanding game I play is BF3 and it seems that it is much more GPU dependent than CPU dependent.[/citation]
The single-player campaign. THE SINGLE PLAYER.

Also, DDR3-1333 improved my frame rates in general over the DDR2-800 I was using before (same Core 2 Quad processor).
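For context (peak theoretical per-channel numbers, not measured gains), the bandwidth gap between those two speeds works out like this:

[code]
# Peak theoretical bandwidth per channel: effective rate (MT/s) * 8 bytes per 64-bit transfer.
for name, mts in (("DDR2-800", 800), ("DDR3-1333", 1333)):
    print(f"{name}: ~{mts * 8 / 1000:.1f} GB/s per channel")
# DDR2-800 -> ~6.4 GB/s, DDR3-1333 -> ~10.7 GB/s
[/code]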
 
mikenygmail, I agree about AMD CPUs being adequate for essentially every common task for which people use their computers. That's the ONLY part of your post I found agreeable. In particular, your assertion that AMD leads in performance per dollar is so obsolete as to be absurd. It USED to be true. Recent reviews and NON-synthetic benchmarks have now shown, though, that for most tasks (games in particular), which are not well-threaded, there is no longer any price point at which Intel does not outperform AMD; sad but true.
Your assertion of adequacy is precisely why AMD's focus on the APU makes such business sense. People without the means to build even $600 gaming rigs will be thrilled that they can buy an APU-based system and still play, even if the settings are lowered.
At the very high end, however, my best analogy on this chip comes from GW2. Bulldozer was such a failure that it put AMD in a "downed" state from which it has struggled to rally and get back in the game. For most high-end users though (especially gamers), PD is too little, too late. Sorry AMD, "you've been defeated."
There appears to be a niche market of professionals who might find PD preferable to an i5/i7, but only if its productivity minus TCO comes out higher over a typical business-ownership time frame.
 

proffet

Honorable
Aug 30, 2012
489
0
10,810

And here's the shameful part about that...
If you look again, the 980 BE and 1100T do as you stated: equal, or within a few FPS.
But realizing that this (FX-8350) is the second new release since the 980 BE/1100T (Deneb C3 / Thuban E0)
and it's still just not equaling them in terms of gaming performance is quite SAD...

And the fact that the FX-8350 is an 8-core **cough cough** 4-module **crap** chip...
wouldn't you think or expect it to be more than what it is?
I do..
(I do not play W.O.W.)

So the price needs to come down even more in my opinion, though not by much.
IPC still falls short, but yes, it's better than Bulldozer.
If you had/have an FX-8150, are you gonna dump it for this?
If so, then really?

They should have continued the Phenom line further.
I didn't know raising the base clock counted as a new arch or a new stepping... :p

Would FX-8350 be my first choice in a new build, though? Probably not. Although I’m impressed by the work AMD’s architects have done in the last year, performance remains too workload-dependent. And, inexpensive energy aside, I’m going to go with the more efficient implementation when all else is close to equal.
 

Considering that even the FX-4170 games as well as the FX-8350 does, your post is kinda irrelevant. The eight-core models aren't as good for gaming as the top quad-core model of their generation at stock. Please explain to me why I should expect an eight-core CPU to beat an otherwise similar quad-core CPU when you're comparing them in single- and/or lightly-threaded tasks without even optimizing the eight-core version for those tasks.
 

proffet

Honorable
Aug 30, 2012
489
0
10,810
We're not talking about the FX-4170 (I can destroy that argument later...),
and the 8-core FX-8350 doesn't exactly win by large margins in half of the benchmark suites that utilize all cores.
(I expected more..)

And don't give me that 'disable half the modules in-game for better performance' nonsense either... :non:
 