AMD FX-4170 Vs. Intel Core i3-3220: Which ~$125 CPU Should You Buy?

Status
Not open for further replies.

Christopher1

Distinguished
Aug 29, 2006
667
3
19,015
[citation][nom]EzioAs[/nom]Nice review as always. Nothing really too surprising but I guess it was quite necessary to compare the 2 CPUs at the same price point (not everybody prefers Intel). If Piledriver pulls through (I hope it does), then maybe AMD will have the slight edge at performance per dollar against the i3 Ivy Bridge[/citation]

The main reason some people don't like Intel is that they want AMD to keep Intel on its toes, and they don't want to admit it.
 

nickchalk

Distinguished
Aug 6, 2008
35
0
18,530
Hey Don, good review there. Could you put a Phenom II X4 in the benches as well?
The 965 sells for 85 euros here and I'm stuck with a five-year-old C2D E6750.
A decent motherboard with the 970 chipset and a Phenom II X4 965 would cut the upgrade cost down considerably, and we could see if the older CPU still has what it takes to compete with the fresh guys.

thanks.
 
[citation][nom]jaideep1337[/nom]damn the 4170 is really power hungry! AMD need to work on these power consumption issues first before they do anything else![/citation]

That's exactly what they're doing. Again, take a look at Trinity.
 
An i3 versus a quad-core Trinity K edition would be a much better comparison. If you only compare the i3s and the FX-4170, then the i3s have a clear win because they deliver similar gaming performance in almost all modern games with much lower power consumption. I'd take an A10-5800K or even an A8-5600K over an i3 because they are far more power efficient than the FX-4170 and they can overclock well.
 

Pieninja

Honorable
Oct 23, 2012
11
0
10,510
I don't mind the 4170; it performs well enough for all of my needs and is somewhat cheaper than most Intel chips.
 

bim27142

Honorable
Mar 9, 2012
22
0
10,510
Under load, that gap grows to a staggering 103 W.

OMG, even if someone gave me a 4170 for free, I still wouldn't accept it... these days, AMD's power envelope is almost at "unacceptable" levels... but that's just me... ;)
 
[citation][nom]bim27142[/nom]OMG even if someone gives me a 4170 for free i still wouldn't accept it... these days, AMD's power envelope is almost on the "unacceptable" standards... but that's just me...[/citation]

The 4170 is an older-generation CPU at this point. The FX-4300 uses significantly less power than the 4170 while offering a moderate performance advantage, and even the A10-5800K can match the 4170's gaming performance while drawing only a little more power under load than the i3-3220 (and less power at idle than the i3-3220, so it can probably make up that load-power deficit over time anyway). Saying that you wouldn't buy the 4170 and that it's representative of what AMD is doing now is like saying that you wouldn't buy an i7-920 and that it's representative of Intel's current power consumption. That's not fair to Intel, just like your claim isn't fair to AMD.
 

No, it's more like saying you wouldn't buy an i5-2400. The i7-920 was introduced three years before any Bulldozer CPUs.

But I agree. The FX-4300 is AMD's new budget gaming CPU, just like the i3-3220 for Intel, and those two should be compared now. I am glad to see AMD improving - it's good for everyone.
 

6wheels

Honorable
Oct 28, 2012
4
0
10,510
I don't understand why AMD hasn't moved to 22 nm or even 28 nm yet. I'm sure moving to a smaller manufacturing process would help reduce both the power consumption and the heat of the die. I miss the old AMD =(
 


I was going by the performance and power-consumption differences rather than the time frames, to give a more accurate impression of the differences between the CPUs. However, I see your point that the time frames aren't comparable.
 


AMD doesn't own its own fabs, and the cost of a die shrink would probably be much greater than the cost of tweaking the designs for the current process technology. Comparing the FX-4170 and the FX-4300, you get an improvement in power efficiency fairly similar to what a die shrink would deliver, and it was undoubtedly a lot cheaper for AMD, so there's little motivation for a die shrink when there are better options in terms of R&D cost.
 

6wheels

Honorable
Oct 28, 2012
4
0
10,510



They don't have to produce it themselves; they can outsource production, just like they do with their graphics division. Most new AMD GPUs are 28 nm.
 


The GPUs kinda needed the shrink to meet their goals, and it made sense there because AMD was already building a whole new architecture, so they might as well design it for a new process rather than just tweak the current one. The same is not true for their CPUs, and I explained why in my previous post.

EDIT: I just re-read your post.

AMD does not outsource the development of its graphics, and it already outsources the production of both its CPUs and its graphics cards. The rest of my earlier comment still applies, too.
 

v351

Honorable
Oct 18, 2012
8
0
10,510
No mention of the 965 BE? I was under the impression that it can overclock better than a 4170 and has a similar multitasking/ multithreading advantage over the i3. Also cheaper unless maybe you buy an expensive cooler to max out its overclocking potential.
 
[citation][nom]v351[/nom]No mention of the 965 BE? I was under the impression that it can overclock better than a 4170 and has a similar multitasking/ multithreading advantage over the i3. Also cheaper unless maybe you buy an expensive cooler to max out its overclocking potential.[/citation]

It also has spotty availability and lacks support for modern instructions that are seeing increased use in new software, and newer CPUs such as the FX-4300 are better for gaming than any Phenom II CPU, both at stock and when overclocked.
 

bassbeast

Distinguished
Dec 14, 2010
74
0
18,640
Or you could just do as I did and get a Phenom II X6 for $110. That's six FULL cores and plenty of room to OC later if I want, and the money I saved on the chip and the board went toward more RAM. In every test I've seen, they had to put the eight-core FX against the six-core Thuban and it barely won; in a fair 6-vs-6 battle, Phenom II wins, and they can still be had for crazy cheap at places like Tiger.

I hope AMD fixes the mess and comes out with another winner, but frankly the "half core" design just doesn't cut it for me and my customers, not when I'm getting Athlon X3s for $60, X4s for $70, Phenom II X4 BEs for $90, and X6s for $110. The bang for the buck is MUCH better with the Phenom II-based chips than with FX, which is still too hot, too power hungry, and too low on IPC. It acts more like Hyper-Threading than full cores to me, so until the supply runs out, I think I'll stick with the Phenoms.
 

cleeve

Illustrious
[citation][nom]v351[/nom]No mention of the 965 BE? I was under the impression that it can overclock better than a 4170 [/citation]

Overclock better? Not at all. The Phenom X4s have little headroom; many samples cap out under 4 GHz.
 
[citation][nom]bassbeast[/nom]Or you could just do as I did and get a Phenom II X6 for $110, that's six FULL cores and plenty of room to OC later if I want to, and the money I saved on it and the board went towards more RAM. Every test I've seen with FX they had to place the FX 8 against the Thuban 6 and barely won, in a fair 6 VS 6 battle Phenom II wins and they can still be had for crazy cheap at places like Tiger.I hope AMD fixes the mess and comes out with another winner but frankly the "half core" design just doesn't cut it for me and my customers, not when I'm getting Athlon X3s for $60, X4s for $70, Phenom II X4 BE for $90 and X6 for $110, the bang for the buck is MUCH better with the Phenom II based than the FX, which is still too hot, too power hungry, and too low on the IPC. It really acts more like hyperthreading to me than a full core, so until I run out I think I'll stick with the Phenoms.[/citation]

Actually, there are two cores per module, just with one FPU shared between them instead of one each (which isn't relevant for integer workloads, and almost all consumer workloads I'm aware of are integer, not floating point). The scaling problem is caused by a front-end bottleneck, especially in the x86 decoders, that will be rectified in AMD's next micro-architecture, Steamroller. Besides, scaling is already fairly close to 100% even on Bulldozer and Piledriver (around 75-85% is a heck of a lot better than around 10-30%). And in the context of gaming it doesn't really matter anyway, because no game I'm aware of uses eight threads so efficiently that this would bottleneck highly threaded gaming performance (although it does cause some havoc in lightly threaded games if the scheduler doesn't handle it properly, much like Hyper-Threading did in its early days).

Bulldozer may be too hot and power hungry, but it's already cooler and less power hungry than Phenom II, and Piledriver is even more so. The FX-4300 outperforms any Phenom II CPU in almost all games, and it does so significantly more efficiently than Phenom II, albeit still not nearly as efficiently as even an Intel Sandy Bridge i3, let alone Ivy Bridge.

Furthermore, as Cleeve pointed out, Bulldozer and Piledriver overclock better than Phenom II (granted, in my experience the X6s at least manage to overclock a little better than the X4s), so all you did was pay less for lower overclocked performance in exchange for higher stock performance, power consumption, and heat output.
 
[citation][nom]Headbomb[/nom]I was thinking about getting a 4170 for my gaming PC but is it really worth it?[/citation]

If you want an AMD CPU for a lower end gaming system, then the FX-4300 is a better option. It has about the same price, significantly lower power consumption, and greater performance.
 
G

Guest

Guest
The review's methodology is unfair. Tom's Hardware, you're mistaking how CPU frame rates should be measured. You must let the games hit their limit on the CPU, not the GPU. How? Low resolutions, low presets, and optionally a hardware-accelerated sound card (yes, sound cards with acceleration let CPUs run faster).

Anyway, pairing a $120 CPU with a 7970 is unrealistic; you should find the kind of preset that's playable at a 60 fps minimum with a video card of equal value, or at most 150% of the CPU's price. Of course the 7970 serves the scientific purpose of measuring CPU power, but it's the wrong method; a six-core i7 wouldn't have fared differently, so you can't say "hmm, so the FX is as good as the others!" You also say 30 fps is barely playable; I say anything under 55 is barely playable and unnerving. Why would a $120-CPU user play at maxed-out settings just to feel sluggish, slow, choppy, and delayed, and to miss inputs?
 
[citation][nom]drejeck[/nom]The review is forcibly unfair. TomsHW you're mistaking the way a CPU fps is measured. You must let the games find their limit on the CPU not the GPU. How? Low resolutions, low presets and optionally an accelerated sound card, yes, sound cards with acceleration makes cpus go faster.Anyway a 120 dollar cpu with a 7970 is over rated, you should find the kind of preset playable @ 60fps minimum with an equal value videocard or 150% of cpu price. Of course the 7970 serves the scientific purpose of measuring cpu power, but it's wrong method, an i7 with 6 cores wouldn't have fared different so you can't say "uhm, so the FX is good as the others!". And also you say 30fps is barely playable, I say under 55 is barely playable and unnerving, why would a 120 dollar cpu user play at maxed out settings just to feel sluggish, slow, choppy, delayed and missing inputs?[/citation]

The whole point was to measure CPU performance at practical resolutions, and the review did that quite well. The 7970 was there to alleviate the GPU bottleneck while still using practical resolutions, and for that purpose it was the right card for the job.

Also, anyone who says that 55 FPS is barely playable is clearly not the average gamer (or is lying), and certainly not an entry-level gamer, so that opinion isn't really relevant to the subject. Most people consider roughly 30 FPS the cut-off for whether something is playable, so that's the threshold used in a review intended to be relevant to as many people as reasonably possible.
 