Torn between getting Intel or AMD


zigar01

Howdy Folks,

So I have an aging Phenom II X6 1035T processor. I got it OC stable at about 3.4 GHz and it's served me well that way for the past couple of years, but I recently decided to get myself a GTX 780 (haven't purchased it yet) and know that that processor is probably going to bottleneck that GPU.

The motherboard I currently have is socket AM3+, so I can easily upgrade to an FX-8320 or 8350, but I seem to be reading everywhere that the Intel Core i5-4670K or the i7-4770K beats the pants off of the FX-83XX chips.

I guess the bottom line is whether or not it's worth it for me to spend an extra $200-300 to get a new motherboard/Intel CPU or just save that cash and get the FX-8320/8350.

Also, as a side note, is anyone expecting a price drop of the GTX 700 series soon? Just wondering if I should pull the trigger sooner or later on the GPU :).

Thanks!

TL;DR - Worth it to spend an extra $200-300 to go with the newer Haswell Intel processors or save my cash and get an FX-8320/8350?
 
Solution


I have an Asus M5A99FX Pro mobo.
 
I surely have a lot to reflect on. All you guys have been very helpful. What I think I am going to do for now is buy the new GPU and see how it performs with my current CPU. If I find it wanting, I'll take it from there. In about a month's time I'll have a bit more money (I get paid mileage for my job, so I get a monthly reimbursement check that I use on entertainment-related purchases) and will be able to afford the Intel mobo/CPU setup if I really want it. I know it's impossible to predict, but would one have more longevity than the other? I know some people out there with 2-3 year old Core i5 & i7 processors that are still going strong, but I don't know anyone (personally at least) whose AMD chip has lasted them that long (save myself, but 3 years later I'm looking for an upgrade).
 
Well, there is the gimmick. I've seen more than a few posts by knowledgeable people who are still running Phenom II X4s, which have been around since 2010, and the six-core FX chips, which have been around since 2012. However, with AMD sticking to a few chipsets and just making bigger/faster CPUs to match the mobo at a relatively cheap price, upgrading is easier/cheaper than doing the same with Intel, who changes motherboards every few years.
 


The fact that you think that processors draw more energy at idle than they do under load, and stuck by that statement when I made you aware of your error, makes me distrust your entire knowledge of technology. Skimming through your points, I couldn't really tell what you were on about; it was basically a medley of saying how I was wrong yet also proving me right, so I wasn't entirely clear on that.

The summation of my points is that the OP clearly stated that they already have an AMD motherboard, and with that being known, it is not worth spending close to $500 just for a 10% increase in performance at most. The 8320 may not be the most powerful chip on the market, but it holds its own, and with the new consoles having 8-core processors, more games will start to utilize more threads to keep the consoles from going outdated by next year.

Also, TekSyndicate did a comparison of power consumption between the 4770K and 8350, and the difference actually wasn't that huge. I believe that in an area with average electricity costs, it was about $20 more a year to run the FX chip over the Haswell. That's not enough over 4 years to justify buying an Intel processor and motherboard.
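To put that annual figure in perspective, here's a rough back-of-the-envelope sketch in Python. The 60W average extra draw, 8 hours a day of use, and $0.12/kWh rate are illustrative assumptions of mine, not measured numbers, so treat the result as ballpark only.

# Rough annual cost of a CPU power-draw difference (illustrative assumptions, not measurements)
extra_watts = 60          # assumed average extra draw of the FX chip under a mixed load
hours_per_day = 8         # assumed daily usage
price_per_kwh = 0.12      # assumed electricity price in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year, roughly ${extra_cost_per_year:.0f}/year extra")
# prints: ~175 kWh/year, roughly $21/year extra, i.e. in line with the ~$20 figure above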
 
Now let's try this again... Where in the world did you read that statement in any of my posts? Are you sure you read it right?

I will summarize just to avoid any accusations of lazy reading:

i7-2700:
Idle - 8W
Browsing/movies - 12W
Full load ~52C - 72W
Full load ~78C - 81W

i3-4130:
Full CPU + IGP load ~60C - 46W
Full CPU + IGP load ~75C - 51W

The fact that you think people's posts don't deserve your attention to read them clearly makes me question your age, attitude, knowledge and... well, basically everything. Skimming through stuff won't get you far in life.

Also, the end cost of the power consumption depends on where you are, who your electricity supplier is, what your contract is, etc. TekSyndicate's figures are true, but only for a limited set of users. Cheers
 


I am not "skimming" at all, in fact I read all of the comments very well so that I could accurately provide an answer to the OP's question. Very clearly in this statement you said that not only do Intel chips draw more power at idle than they do on load, which is vastly incorrect, but you also say that FX chips draw 130 W at idle. On stock settings, FX Chips generally draw 120-130W at full load depending on the conditions the processor is placed under and the specific CPU in question. Of course, if overclocked, this number will potentially be higher, but saying that any CPU draws close to 130W at idle is absolutely preposterous. At idle, my FX-6350 before I overclocked it downclocked itself quite nicely and didn't consume much power at all when idle, and the i7 will consume much less than 75W at idle, some people measure it to be between 30-40W of consumption. My motherboard is only rated for max 125W TDP, so there is no way in hell my FX-6350, which is overclocked to 4.4 GHz, pulls 160W on stock speeds, yet hasn't fried my motherboard in 3 months. Please stop spreading misinformation.

You're all of a sudden completely changing your wording, as you can see from what I originally quoted from you, in an attempt at damage control on your ignorant misinformation. Plus, you are now comparing older-architecture, lower-power CPUs with fewer cores, etc. The fact that you are ignorant enough to think all of this is kind of ridiculous, and that you can make those absurd statements when you are supposed to be knowledgeable about processors especially baffles me.

 
Your arrogance and lack of any common sense, let alone knowledge, is just unbelievable.

I.
Your FX-6350 has 75% of what an FX-8350 has. The FX-8350 will always consume more power. FX-8350 =/= FX-6350.

II.

The FX is OCed to 4.4 GHz as well as the i7-3770K. The i7 will pull around 95W/100W out of the motherboard (depending on temperature) at stock, 75W at full load. The FX will pull over 160W out of the motherboard (depending on temperature) and 130W? at stock (not entirely sure here).

All of this is at full load. Nowhere in that whole paragraph is "idle" said. I do have to apologize - I misplaced the comma (sorry, English is one of my foreign languages). It should be "The i7 will pull around 95W/100W out of the motherboard (depending on temperature), at stock - 75W at full load." - there, fixed. Or maybe "and at stock - 75W under load".

Stock means stock clock. Stock has nothing to do with load. Stock =/= idle. There was no idle anywhere in that post.


III.
Let me break it down for you:
i7-4770
Stock (no IGP): around 75W
OC 4.4 GHz (no IGP): around 95/100W

FX-8350
Stock: 130W?
4.4 GHz: 160W

IV.
http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/7

Complete system with FX 8350 at stock / 4 GHz: 213W

Complete system with FX 8350 at 4.8 GHz: 364W

That is a delta of 151W, which means the overclock alone increases the power consumption of the CPU itself by about 151W. Even if the FX 8350 consumes less than 125W at stock, the delta is too great. My estimate of 160W for an FX 8350 at 4.4 GHz is a rough guess. Power consumption does not scale linearly with MHz; it scales faster than that.
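As a rough illustration of that superlinear scaling, here is a crude Python sketch. It assumes dynamic power goes roughly as frequency times voltage squared and that voltage has to rise roughly in step with frequency when overclocking, so CPU power ends up scaling roughly with the cube of the clock; the 125W stock draw is an assumption for illustration, not a measurement.

# Crude model of CPU power vs. clock speed (all inputs are illustrative assumptions)
stock_clock = 4.0        # GHz, FX 8350 stock
oc_clock = 4.4           # GHz, the overclock discussed above
stock_cpu_power = 125.0  # W, assumed stock CPU draw

# Assume P ~ f * V^2 with V rising roughly with f, so P scales roughly with f^3
scaling = (oc_clock / stock_clock) ** 3
estimated_oc_power = stock_cpu_power * scaling
print(f"Estimated CPU draw at {oc_clock} GHz: {estimated_oc_power:.0f} W")
# prints roughly 166 W - in the same ballpark as the ~160W rough guess above,
# and well below the full 151W delta seen at the 4.8 GHz overclock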

V.
You continue to believe that TDP is actual power consumption. That is a very, very false belief.

http://www.intel.com/content/www/us/en/benchmarks/resources-xeon-measuring-processor-power-paper.html

Or if you are too lazy to read:

http://www.overclockers.com/forums/showthread.php?t=708365

mokrunka:

"More typically, TDP stands for Thermal Design Power. It means that whatever you are using to cool the electronic component should be capable of dissipating this much heat, in order to keep the part from failing/malfunctioning/melting into a pile of goo when under the "maximum perceived load" (usually a reasonable estimate of the maximum amount of power the part will require).

It does not mean that if your TDP=100W, then your CPU is using 100W; only that for whatever amount of power the CPU IS using, it requires the dissipation of 100W in order to function properly at that performance level."

VI.
I was giving information available to me, from tests done by me, which may or may not be 100% relevant but should be helpful. It was to illustrate the difference between TDP and ACP. But since you do not understand the basic concepts behind those, it "baffles you". No wording has been changed, and none of the points I made in my posts have either, except for grammatical errors I sometimes make, for which I apologize. English is not my only foreign language, plus I am only human. Forgive me for the damn comma; go grab the yellow "for dummies" books on physics and electronics and read properly. Have a nice day.
 
My favorite thing about you, Shneiky, is that you continuously insult me in every single comment you make, yet end every one with "Have a nice day" or "Cheers". Really good conduct, buddy.

Neither of us is being helpful to this thread, so I am going to go ahead and be the mature one and stop your string of misinformation and trash-talk here. "Have a good day"
 
You started with the insults, and not on just one occasion; I only replied with insult-like words in my last post. My "favorite" thing about you, apcs13, is how you disrespect people, make false accusations, insult others, fail to self-reflect, and, when confronted with the same type of attitude as your own, pretend to leave as the mature one. Well, have a nice day anyway.
 
Yeah, I don't think the TDP thing is a feasible argument for spending more on an Intel board and chip. Even if you did save a little bit of energy, you would still have to take into account the money spent on a perfectly good Asus board that is still in its prime, left to collect dust. That would be a loss and a shame.
 
Considering there is a very good possibility that a decent percentage of the people here, for one reason or another, do not actually pay for electricity, and that $2 a month at most really is chump change compared to the hikes the A/C and/or heat cause, I honestly feel the differences in wattage actually consumed by different CPUs are a moot point and realistically not worth considering, unless you are a 'green' sob or someone so ignorant of computers in general that they believe the 'latest technology, saves you money on electricity!' advertisements on Newegg and the like.
 



Very good mobo for overclocking the 8320.
 


No. They get Intel because people tell them to. They look at benchmarks and immediately decide it's the better chip to have, when it's way more than they need. You mentioned before that, oohh, Intel is faster with apps. Tell me this: for a normal user, such as say 60-70% of the people on here, how much performance gain would you see in a browser, Microsoft Office, or maybe even Windows Media Player? Zilch. It would perform exactly the same. And to go get an i7-4770K for gaming? LOL. That is funny. All you really need is an i5.
 
And that from a person with an AMD FX-9370; how much difference did you see over an 8320, and couldn't you have OCed it yourself? As far as editing, streaming and screen capture while gaming go, an i7 is better than an i5. Multi-monitor, SLI or whatever - i7 > i5. Some people want an i7 now and don't want to deal with upgrades for years. No one knows what the next few years will bring.
 
You're right, Shneiky, in some aspects, but you're wrong in a few aspects as well; the FX-8320 usually can't be OCed as high as the 8350, so that's something to remember when talking about OCing an 8320 to 9370 levels. You're right about not wanting to upgrade, though of course that does incur a larger initial outlay; when you already have a great mobo, it'd be a bit odd not to at least go with an FX-8320 first.
 


First of all, I grabbed the 9370 because it was indeed cheaper than what an 8350 was going for, and I am guaranteed 5 GHz out of the box. They are binned, remember? Who said the average user does editing and screen capture while gaming? That's pretty specific and therefore gives me the notion that's what you do, not everyone else. Streaming has nothing to do with processor speed, as most processors on the market can easily manage what a router feeds.
 
My main concern is whether or not I'll get better performance out of an i5/i7 vs the FX chips. I stream a bit, but mostly play games on a 1080p 60 Hz monitor (although I'll probably upgrade the monitor in a few months to a 120 Hz or maybe one of those fancy G-Sync ones, as I bought the current one in a pinch after my old one burned out on me). I don't want to have to worry about upgrading the GPU (hence my purchase of a GTX 780, which arrives Thursday 😀) or my CPU for at LEAST 2-3 years. If that means shelling out an extra $200-300 now, that's fine by me since I already invested over $500 in a GPU, but if I can get similar performance out of the less expensive platform, then I'd be a fool not to.
 
The FX-8350 or 8320 will perform nicely for you, especially in streaming.

The i5/i7 or Xeon 1230 v3 (as fast a CPU as the 4770 but can't OC and has no IGP; costs a bit more than an i5 but less than an i7) are superior, but I'm not sure they're worth the cost of a new mobo + new CPU.

If you're made of money, get the Xeon 1230 v3 plus any socket 1150 mobo you like, depending on your feature requirements, or, if you want ultimate performance, an i7-4770K + Noctua NH-D14 + an OCing motherboard.
 
Pretty much, you can figure on the 8320/8350 being around for a few more years as one of the mainstream AMD processors, unless AMD's design team starts really reading some of these posts, gets a clue, and figures out a way to really smack it to Intel with a CPU that is obviously better than the top i5/i7 CPUs from both LGA1150 and 2011. The 9 series is just better-binned 8 series with a higher factory OC. Obviously they aren't doing the job, and at 220W TDP, it's no wonder they come with an AMD version of an H80i.

I really do like AMD, and have for years, but apart from Intel's better architecture and all that brings, the only real reason I can think of for why Intel performs better is that 90% of the software out now is tailored for single-threaded use. If AMD could find a way to tap into THAT, Intel wouldn't stand a chance.
 
Memhorder,
Well, I don't think an FX 9370 does ensure 5 GHz out of the box, but if you got it for less than an FX 8350, then that was a good deal and I have to apologize. And well, what I am is a 3D (Maya)/compositing generalist, and that is how I pick my hardware; I'm not an over-the-top gamer. I do game a bit, but if I can sacrifice 50% gaming performance to gain 5% rendering power/software performance, I will do it any day of the week (and hence going with a 650 Ti instead of an HD 7790 at the same price, or instead of an i5 + top-tier GPU).

Depending on the streaming software used, the screen capture is either completely on the CPU or CPU + GPU accelerated. And yes, streaming is connected to your screen capture and the compression on it, which depends on the streamer (since it is going to be streamed, you can't just output raw), and hence on your CPU or CPU + GPU. Also, the OP never said exactly what he does, so we were trying to look at all scenarios.

Now back to the question. Bottom line: an i7 will outlive an FX 8350. My 2011 Sandy Bridge is still kicking even after 3 years, is stronger than the one-quarter-older 1100T, beat the FX 8150 when it came out, beats or is on par with the FX 8350, and neither Ivy Bridge nor Haswell compelled me to upgrade. Whereas 90% of those who got a Bulldozer either moved to Piledriver later or to Intel.

I am not a die-hard fan or anything. I wish it were the other way around. I miss the time when Intel had that stillborn child called the Pentium 4 and AMD was ruling. Since my job is stuck in the CPU world (well, 3D rendering is still 99% software and done on the CPU), I want some movement in the CPU segment. But there is none. AMD has been losing it since 2005, when the Pentium Ds came around. For over 15 years I have been jumping around and always comparing Intel and AMD (well, I have not had an AMD CPU since 2002), but whatever happens, in the past 10 years Intel always outlives AMD. Be it superior IPC, more effective L3, better branch prediction, more memory bandwidth to the CPU, or something, something small, it always happens. Sometimes I wish AMD had done a die shrink on the Jackhammer with minor tweaks here and there, put it against Sandy Bridge and won, instead of going with that Bulldozer nonsense, but hey, things just went like that.

Intel outlives AMD, and that is why people buy Intel. It has been a no-brainer for the past 10 years.
 


Actually, the 9590 guarantees 5 GHz; the 9370 guarantees 4.7 out of the box, which is still significant, and it has the high TDP, which I saw as a good thing. I went against the grain and bought it because I figured it would be a fun and unique chip to have, due to everyone scorning its thermal design, and no one else would have it. It was cheaper. Although I did have to buy a 240mm radiator to keep it cool, that didn't bother me, because I've never owned a closed-loop cooler before and quite frankly I think it's cool. Literally :) I guess that is how a nerd like me would think. The chip was a significant upgrade from what I had before. It is very fast and I don't regret going AMD.
My initial statement about how you wouldn't "see a difference" between the two systems was aimed more at a person who goes on the computer to check FB and email, stream and download music and videos, play games, surf the web, use Microsoft Office, and all the general things people do with them from day to day. Microsoft Word and media players run to the best of their ability, and having a high-end CPU ain't gonna change that, is all I'm saying.
I'm not really sure what you mean by "outlives", but I hope AMD stays in the CPU game. There was a rumor they were going to give up on high-end CPUs and focus on APUs, but that was quelled when they stated they are not leaving enthusiasts behind and will keep things competitive price- and performance-wise. The FX lineup will move through 2014 and on, hopefully doing more with the AM3+ socket. Without them we would be paying out the ying yang for a mediocre CPU, not to mention all having the same computer. I like a little variety. It would be very neat if Nvidia got into the CPU game, as they are already tapping into it.