AMD FX-4170 Vs. Intel Core i3-3220: Which ~$125 CPU Should You Buy?

Something to keep in mind... why AMD is in trouble. (And I've built nothing but AMD systems in the past).

1 - PCIe 3.0: AMD systems will NOT have this until 2014...

2 - Socket hell: AM3+ is for upper-end users? Meanwhile FM1 is replaced by FM2 (using the SAME damn chipset - WTF!), which is the only AMD platform with native USB 3.0 support, and still no PCIe 3.0. HP and Dell, etc. have been selling FM1/FM2 systems for a while... which to get? No compatibility between them.

3 - Buy a NEW Z77 board for $80~$120 and you can drop in a Sandy Bridge or Ivy Bridge CPU. You get native USB 3.0 and PCIe 3.0... today... and any $50~$400 Intel LGA 1155 CPU will work.

4 - The TDP is very bad... the computers run warmer and use more power... not good.

Back in the Pentium 4 era, AMD ran at lower clock rates with less heat and energy while delivering better performance... now AMD has taken a page out of the P4 playbook for some stupid reason.

I want AMD to be more competitive... and for most people a $60~$100 CPU will handle their needs.

I'm quite happy with my $190 i5-3570K, which pretty much kills any AMD CPU out there in every way.

Sorry AMD... Come out with a better product, I will be more than happy to BUY it.
 
The A85X is a Trinity-only chipset, though it's very similar to the A75. You are correct, though: AM3+ is stagnating progress, unless PCIe 3.0 really doesn't matter (though AMD does have PCIe 3.0 graphics cards, so I don't understand why the chipsets lack it).
 
I am starting to think Tom's is flat out biased against AMD.

Both of these processors have been out for about a year. What is new in an Ivy Bridge vs. Bulldozer review? This is beating a dead horse. We get it: Bulldozer sucks.

The only effect of this story's timing is confusing shoppers and putting a negative light on Piledriver when it comes out in two weeks.

Straight propaganda.
 
[citation][nom]belardo[/nom]Something to keep in mind... why AMD is in trouble. (And I've built nothing but AMD systems in the past).1 - PCIe 3.0 - AMD systems will NOT have thing until 2014.[/citation]

So you get PCIe 3.0...what exactly are you going to do with it?

How will it measurably change your life over PCIe 2.0?

You do know that a lot of these standards are mostly about market churn, not any real benefit for the end user.

"I have to buy that one as it has lots of 3's on the box and this one only has 2's!"
 
[citation][nom]belardo[/nom]Something to keep in mind... why AMD is in trouble. (And I've built nothing but AMD systems in the past).1 - PCIe 3.0 - AMD systems will NOT have thing until 2014...2 - Socket Hell, AM3+ is for upper end users? While FM1 is replaced by FM2 (using the SAME damn chipset - WTF!) which is the only AMD chipset with native USB 3.0 support. No PCIe3.0 HP and DELL, etc have been selling FM1/2 system for a while... which to get? No compatibility.3 - buy a NEW Z77 board for $80~120, you can put in Sandy Bridge or IVB CPUs. Get native USB 3.0, PCIe 3.0... today... any $50~400 intel 1155 CPU will work.4 - the TDP is very bad... the computers run warmer and use more power... not good.Use to be in the Pentium 4 Era, AMD ran at a lower clock rate, less temp and energy while giving out better performance... AMD took a page out of P4 for some stupid reason.I want AMD to be more competitive... and for most people a $60~100 CPU will handle their needs.I'm quite happy with my $190 i5-3570K, which pretty much kills any AMD CPU out there in any way.Sorry AMD... Come out with a better product, I will be more than happy to BUY it.[/citation]

PCIe 3.0 is hardly important at all, especially when many of AMD's boards have more than enough lanes for PCIe 2.0 x16/x16, aka the same bandwidth as PCIe 3.0 x8/x8. Native USB 3.0 support isn't very important either, and many boards have it through non-native controllers.
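To put rough numbers on that x16/x16 vs. x8/x8 claim, here's a quick back-of-the-envelope sketch in Python (the per-lane rates are the commonly quoted approximate effective figures, ~500 MB/s for PCIe 2.0 and ~985 MB/s for PCIe 3.0, so treat it as an estimate):

[code]
# Rough per-card bandwidth: PCIe 2.0 x16 vs. PCIe 3.0 x8.
# Approximate effective rates per lane (assumed, not measured):
PCIE2_MBPS_PER_LANE = 500   # PCIe 2.0: ~500 MB/s per lane
PCIE3_MBPS_PER_LANE = 985   # PCIe 3.0: ~985 MB/s per lane

pcie2_x16 = 16 * PCIE2_MBPS_PER_LANE   # 8000 MB/s
pcie3_x8 = 8 * PCIE3_MBPS_PER_LANE     # 7880 MB/s

print(f"PCIe 2.0 x16: {pcie2_x16} MB/s per card")
print(f"PCIe 3.0 x8:  {pcie3_x8} MB/s per card")
# Within ~2% of each other, so a board running 2.0 x16/x16 gives each
# card about the same bandwidth as a 3.0 board running x8/x8.
[/code]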

TDP is irrelevant here. Yes, AMD's CPUs can consume too much power, but that isn't what TDP measures, and you should know that already. Besides, their Trinity APUs don't draw too much power, so it's not all of their processors.

Looking at Trinity, the Bulldozer-derived Piledriver architecture isn't just some Netburst clone; it's an excellent micro-architecture that is very power efficient even at high frequencies with four cores. Bulldozer was bad because of its terrible implementation, not its concepts.

Yes, AMD needs to improve, but you could at least be more reasonable.
 


Actually, the FX-4170 is absolutely, hands down, the best stock performance Bulldozer has to offer for the money, IMHO.

From an app standpoint I think it's slightly superior to the i3-3220. It's unlocked so there's more flexibility for power users. Multi-threading performance is notably superior.

It's too bad about the power efficiency, but to me it doesn't automatically make the processor 'suck'.





Not sure how to address this other than to point out that it's simply untrue.

The i3-3220 is brand new. It's been available at retail for, like, ONE MONTH.




Interesting conclusion. Completely false, and it betrays your personal attitude and bias more than it reflects ours, I think. 😉


 
Quite interesting how a CPU with the Bulldozer architecture, which most people in the tech world call 'an utterly failed product', is able to match Ivy Bridge in some ways and show where it's strong: in applications that use multiple cores, it can touch Ivy Bridge's performance.

The power consumption is still unacceptable, though. Twice as much; that's not good at all. Not sure if they can fix this in the next architecture upgrade (after Piledriver, Steamroller or something? Piledriver's power consumption is still high, by the way). There is always the classic argument that something with higher power consumption ends up much more expensive once you compare the bills over three years. :lol: But the good thing is that the FX-4170 can overclock while the i3s can't, which is a nice bonus for budget enthusiasts.

I would still choose/recommend the i3, though, because of the lower power consumption. But that changes if the buyer wants to do more application work and doesn't care about power. And hey, Piledriver-based CPUs are launching soon. Could they improve the power consumption and gaming performance? We'll see, and that's going to be truly interesting for us enthusiasts watching the competition in the CPU market.



Eh dude... slow down. I am not a fanboy, but I want to push back on wrong facts being used to bash any company.

I know it's disappointing that AMD doesn't have a chipset that supports PCIe 3.0, and the funny thing is that they already have it on the GPU side. But PCIe 3.0 slots are not (at least nowadays) that important and won't make any big difference.

Native USB 3.0 support? Get an ASUS board. With the M5A97 or something similar for $100 you get a USB 3.0 controller, just a non-native one. So it doesn't matter that much, to be quite honest with you.

TDP? That's not what you want to say, mate. Yes, I know the CPU's power consumption is bad, but TDP has no direct relationship to how hot the computer runs. In fact, Bulldozer CPUs run pretty cool: an FX-8150 overclocked to 4.2 GHz with a Hyper 212 Evo shouldn't go past 60 degrees at most (also keep in mind that Ivy Bridge had temperature problems at launch, though that may not matter). As for what TDP actually means... look it up.

Surely everyone here wants AMD to be more competitive, but they are not in as big trouble as you described. Their CPUs are sometimes fine and even better than Intel's offerings for some users. Still, I'm not sure why they keep losing revenue. Probably a flawed marketing strategy or lower demand.

The 3570K isn't $190, it's $230, and that's the discounted price. The 8150 is $190. So the 8150 might be a better choice for some people because it renders Cinebench (for example) the same as or just a bit slower than the 3570K while being $40 cheaper.

Have a nice day :).
 
I must say that not one customer I have dealt with has ever asked about the power consumption of a system let alone a CPU.

I have pushed low power systems to customers on the basis of "hey you leave them on a long time and you can save a little cash on your bills!"

They just shrug and go "whatever you say! Are they cheaper/better/faster?"

IMO CPU power usage is something to consider, but I think its overall impact on a system is minor at this point, especially with regard to most people's usage patterns. It's just another benchmark pissing match.
 


Missed my point.

If you run a $125 CPU with a $400 GPU you don't see your actual bottlenecks. Your GPU isn't going to be stressed at the same level a lower-end GPU would be. I'm willing to bet that if the system costs were brought into line on both systems, the results would be a lot different. Move the bottlenecks to where they would be in an actual system at the same cost. Basically, a real-world system, not just some thought experiment.

I get annoyed when people benchmark only one component; it's utterly useless. You build a computer with many interacting components, not just a CPU.
 
[citation][nom]daglesj[/nom]I must say that not one customer I have dealt with has ever asked about the power consumption of a system let alone a CPU.I have pushed low power systems to customers on the basis of "hey you leave them on a long time and you can save a little cash on your bills!"They just shrug and go "whatever you say! Are they cheaper/better/faster?"IMO CPU power usage is something to consider but I think its overall impact on a system is minor at this point especially with regards to most people usage patterns. It's just another benchmark pissing match.[/citation]

Power consumption matters; people simply don't care about it for some odd reason. Maybe they don't realize how expensive it can be, just like with loans and such. People often don't realize that they can't afford something, or that they'd struggle to afford it (granted, a computer is unlikely to get anywhere near that extreme), just because it looks cheaper when you quote a monthly price instead of a total cost.
 
[citation][nom]Cleeve[/nom]Actually, the FX-4170 is absolutely, hands down, the best stock performance that bulldozer has to offer for the money IMHO. From an app standpoint I think it's slightly superior to the i3-3220. It's unlocked so there's more flexibility for power users. Multi-threading performance is notably superior.It's too bad about the power efficiency, but to me it doesn't automatically make the processor 'suck'.Not sure how to address this other than to point out that its simply untrue.The i3-3220 is brand new. It's been available at retail for, like, ONE MONTH.Interesting conclusion. Completely false, and it belies your personal attitude and bias more than it reflects ours I think.[/citation]

My point is the dropping of an Ivy Bridge vs. Bulldozer review right before Piledriver comes out.

The review is pointless because this comparison has been covered many times on this site, and the timing is fishy because it's just going to lead people into thinking the FX-4320 and FX-4170 are roughly the same.
 
[citation][nom]ddpruitt[/nom]Missed my point.If you run a $125 CPU vs a $400 GPU you don't see your actual bottlenecks. Your GPU isn't going to be stressed at the same level a lower end GPU is. I'm willing to bet if the system cost were brought into line on both systems the results would be a lot different. Move the bottlenecks to were they would be in an actual system at the same cost. Basically a real world system, not just some though experiment.I get annoyed when people benchmark only one component, it's utterly useless. You build a computer with many interacting components, not just a CPU.[/citation]

The point was to see how well these CPUs can perform, and ensuring that the graphics card isn't a bottleneck is necessary for doing that. This article wasn't about whether both CPUs perform adequately with lower-end graphics cards; it was about seeing at what points the CPUs become bottlenecks. This might not have been what you were looking for, but you shouldn't treat it as if it should be.

It is what it is, and for what it is, it did what it was supposed to do: show us what these CPUs are capable of. The entire point of this article was to benchmark the CPUs, so you can hate it all you want, but it is you who has missed the point. This was a review of the CPUs, not of two low-budget gaming systems and their games, so of course it is the CPUs that were benchmarked.
 


The review is not pointless. It is an awesome, timely comparison.

There are very few reviews showing the FX-4170, and it makes a lot of sense to do one as soon as the Ivy Bridge i3-3220 is released, because it's almost the exact same price.

The FX-4320 will have better efficiency than the 4170, but its performance will probably be lower since its clocks are lower. Still, this is a great lead-in to a Vishera review when they become available.

It makes perfect sense, and the timing is impeccable. IMHO, it's your objections that are pointless.
 
[citation][nom]jacobdrj[/nom]All the segments you mentioned would be served better with different CPUs. Many of them less expensive than these. SB Pentiums. IB Pentiums. Both better choices. [/citation]

Better choices than FX-4170 and i3-3220 based on what metric? Are you arguing that lower performance helps workstation and general purpose builds? I don't buy that.

A workstation or general-purpose build is better served by a $120 CPU if that's your budget.


 
Has anyone thought about what a 100 W difference means? That is ONE typical incandescent light bulb's power consumption, from before the new (dimmer, mercury-containing) "green" energy-saving bulbs available now. I find it ridiculous that the power consumption of one additional light bulb in your home would concern anyone. People just don't put these power numbers in perspective!
 
[citation][nom]torque79[/nom]has anyone thought about what 100W difference means? That is ONE typical incandescent lightbulb's power consumption before the new (dimmer, containing mercury) "green" energy saving lightbulbs available now. I find it rediculous that the power consumption of an additional lightbulb in your home would concern anyone. People just don't put these power numbers in perspective![/citation]

We've already put it in perspective. We don't want to spend that much extra money when we could instead get an i5 for more money up front, but less total cost later on.
 
[citation][nom]torque79[/nom]has anyone thought about what 100W difference means? That is ONE typical incandescent lightbulb's power consumption before the new (dimmer, containing mercury) "green" energy saving lightbulbs available now. I find it rediculous that the power consumption of an additional lightbulb in your home would concern anyone. People just don't put these power numbers in perspective![/citation]

So your argument is essentially that if you don't pay attention to your energy costs, they don't actually exist. Interesting!
 
OK, fine, I'll do some math to illustrate how stupid power efficiency is as a consideration. One person said a monthly cost is not enough, so I'll do an annual cost. I'll start by saying I'm no mathematician; I'm just making a point from my personal perspective.

A few assumptions:
- the 100 W difference between the two processors is ALWAYS occurring while I'm playing games (not realistic). Let's assume ALL I do is play games.
- I typically play PC games after the kids go to bed, around 9pm-12am (very generous)
- my off-peak electricity cost (when I'm playing) is 6.5c per kWh
- my weekend gaming is about the same hours as weekdays

3 hrs per day x 350 days (extremely generous estimate, only subtracting my vacation weeks) = 1050 hours. I have to pad the numbers a little to allow for some idle time (though that's only a 20 W difference) in case I leave the computer on overnight by accident or to download a large file. Let's say 10 days x 21 hrs = 210 idle hours + 1050 = 1260 hrs/year.

100 W x 10 hrs = 1000 Wh, or 1 kWh
1260 hrs / 10 hrs = 126, so 126 kWh per year
126 kWh x 6.5c = $8.19/year.

I tend to upgrade every 5 years = $40.95 total

yeah... big concern.
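If anyone wants to rerun this with their own numbers, here's the same calculation as a short Python sketch (the 100 W gap, the padded hours, and the 6.5c/kWh rate are just my assumptions from above):

[code]
# Extra energy cost of a ~100 W power-draw difference, using the
# padded assumptions from the post above.
extra_watts = 100          # assumed worst-case load difference
hours_per_year = 1260      # 3 h/day * 350 days + 210 padded idle hours
rate_per_kwh = 0.065       # assumed off-peak rate, $/kWh
years_owned = 5            # typical upgrade cycle

extra_kwh = extra_watts * hours_per_year / 1000       # 126 kWh/year
per_year = extra_kwh * rate_per_kwh                   # ~$8.19/year
lifetime = per_year * years_owned                     # ~$40.95

print(f"{extra_kwh:.0f} kWh/year -> ${per_year:.2f}/year, "
      f"${lifetime:.2f} over {years_owned} years")
[/code]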
 


Most people pay more for electricity than that, and that doesn't account for the fact that electricity costs per kWh generally aren't constant, even for the same person over the course of a year. Furthermore, that's nearly $40. Going from a $125 FX CPU to a $160-170 i5 makes a whole lot of sense given the performance difference. Also, if you typically upgrade every five years, the i5 would last longer than the FX because of its performance advantage, so you could go more than five years with the i5 where you would have gone five with the FX, saving even more money through fewer upgrades over time. A better comparison, if you want AMD, would be either a Trinity APU with the GPU disabled or an eight-core FX CPU such as the FX-8120 with the second core of each module disabled, allowing performance similar to the FX-4170 even without overclocking while consuming less power than even the FX-4100.
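To put that in break-even terms, here's a rough sketch using the same energy math and the prices in question (the $125 FX and $165 i5 prices and the 100 W load gap are assumptions for illustration, not measured figures):

[code]
# Rough 5-year total-cost comparison: $125 FX-4170 vs. ~$165 Core i5,
# assuming the FX draws ~100 W more for ~1260 h/year at 6.5c/kWh.
fx_price, i5_price = 125.00, 165.00
fx_energy_penalty_per_year = 100 * 1260 / 1000 * 0.065   # ~$8.19

for year in range(1, 6):
    fx_total = fx_price + fx_energy_penalty_per_year * year
    print(f"Year {year}: FX ${fx_total:.2f} vs. i5 ${i5_price:.2f}")
# By year 5 the FX's energy penalty (~$41) has eaten the i5's ~$40
# price premium, before even counting the i5's performance advantage.
[/code]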
 
So you're saying that because the AMD processor uses $40 more electricity over 5 years (my hours and numbers were inflated to show I'm trying to be fair), people would be willing to spend more right now on a processor? If they had more money to spend right now, I don't think they would choose this option. Of course $40 alone would not break the bank, but this is just one component of a budget build where ALL components are chosen to save money RIGHT NOW, not over the course of 5 years.
 


Then you have the i3 option and the A10-5800K option, both of which save that money while having similar performance and pricing. Not that the 4170 is a bad CPU, but it is not ideal.
 
I may be out of date on this, but historically AMD motherboards are always cheaper, especially at the budget-build level.

I know I probably sound like a fanboy, but I bought an i5 myself. It was obviously the better performer at the price I was willing to pay. If the motherboard is cheaper and saves you money right now at this price point, I don't think most people would have their budget shaken by the $0.68/month extra electricity cost in my example scenario.
 