Intel Haswell vs. AMD Steamroller


Graham Seyffert

Apr 29, 2013
So I'm looking to build a new computer over the summer. I like gaming, but I'm also a CS Major, so I'll be using the computer both for work and leisure. I have most of the components decided on, but I'm undecided as to the CPU I want to get. I've never built a computer before, so I don't have too much experience with this. In terms of the AMD CPUs, I know they're cheaper, and also have 8 cores compared to Intel's 4. But the new Steamroller architecture doesn't come out until early October 2013, whereas Haswell is being released early June. On the Intel side, I'm looking at the i7-4770k, and for AMD I'm looking at the FX-8350 8 Core (waiting for Steamroller). Price is less of an issue; I want a CPU that'll run fast for a number of years. Which would you guys recommend?
 
Either chip will be plenty fast, but as a CS major I would be more concerned with things like machine virtualization, UEFI vs. BIOS, and multi-OS compatibility. So my preference would be the 4770 non-K proc (the K series cuts out some aspects of hardware virtualization, such as VT-d) with a high-end mobo and a CUDA-enabled video card.
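
If you want to check what a given chip actually exposes before buying, the CPU-side virtualization flags are listed in /proc/cpuinfo on Linux. A minimal sketch, assuming Python on a Linux box (note that VT-d is a platform/chipset feature and won't appear in these flags):

```python
# Check the CPU-side x86 virtualization flags on Linux via /proc/cpuinfo.
# "vmx" = Intel VT-x, "svm" = AMD-V. VT-d (directed I/O) is a
# platform/chipset feature and will NOT show up in these flags.

def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("VT-x (Intel):", "vmx" in flags)
print("AMD-V (AMD): ", "svm" in flags)
```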
 


OpenCL > CUDA...check Tom's reviews...CUDA is far less utilized than OpenCL.
 


Good point on the OpenCL; much of the stuff I've encountered in the scientific field is coded directly in CUDA, but OpenCL might get you further in other disciplines.
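
If you want to see what a machine can actually target, here's a minimal device-enumeration sketch, assuming the pyopencl package and at least one vendor OpenCL runtime are installed. The portability argument for OpenCL is that NVIDIA, AMD, and Intel devices all show up through the same API:

```python
# List every OpenCL platform and device visible on this machine.
# Assumes pyopencl plus at least one vendor OpenCL runtime is installed.

import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name)
    for device in platform.get_devices():
        print("   ", device.name, "-", cl.device_type.to_string(device.type))
```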
 
I would like to point out that while the FX has a TDP of 125W, it's not pulling 125W all the time, and the extra power it does pull doesn't drive up the power bill as much as you'd think either. It's not the fastest, but it's most definitely not as bad as people portray it. Yes, Intel is very powerful and efficient, but their price/performance ratio is not as good as AMD's. An AMD system is just cheaper overall but offers performance that does challenge Intel to a degree. I'm not trying to sound like a fanboy, but it's true.
 


I don't know if I'm just having one of those days but I found myself laughing uncontrollably when I read that sentence. Not sure but it was either the idea of a time machine, or that somebody has one and it's currently unused and in storage. Thank you AM2A, I so needed that 😀
 


What hafijur doesn't seem to realize is that he's trolling: talking about something he doesn't understand and trying to fish for a flame war. Your statements are not supported by facts, so please come back when you have facts that support any of the nonsensical statements you make.
 
I just can't stop smiling when people compare cores with different base speeds at the same clock... If the i5-3570K has a 3.4-3.5 GHz base, it can OC to about 4 GHz while still being stable, but the FX-8350, which starts at 4 GHz, can be OC'd to 4.8 and still be stable. I don't see the point in comparing an OC'd i5 to a base-clocked 8350.
Okay, I do understand AMD chips tend to run much hotter than Intel's, but with good water cooling you can still get awesome performance and a great OC from an AMD.

I personally OC'd my 8350 to 4.8 GHz on a Sabertooth 990FX, and I'm using a Thermaltake Bigwater 920 Plus to cool all that. I get much better performance than my friend who OC'd his 3570K to 4 GHz, which is a 15-20% overclock just like mine.

And as for the power consumption: today you can get an 850W 80 Plus Bronze modular PSU for around $150. So if, for example, you buy a single high-end card like a GTX 680 from EVGA or even ASUS that draws 200W or so, I don't see how you would reach the maximum output of your PSU. I know HDDs take power too, but doing the math you have about 350W used out of 850, so you still have 500W free for any other hardware that needs it.
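
To make that arithmetic concrete, here's a rough tally for this kind of build. Every wattage below is a ballpark assumption, not a measurement, and real component draws vary:

```python
# Rough PSU headroom tally for the kind of build described above.
# All figures are ballpark assumptions, not measurements.

psu_watts = 850                 # 850W 80 Plus Bronze modular PSU

draws = {
    "GTX 680 (GPU)":     200,   # ~200W estimate for a single high-end card
    "FX-8350 (CPU)":     125,   # TDP as a ceiling, not a constant draw
    "mobo + RAM + fans":  60,   # assumed ballpark
    "drives":             15,   # assumed ballpark
}

used = sum(draws.values())
print(f"~{used}W used, ~{psu_watts - used}W of headroom")
```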

When I bought my gaming build I opted for an 8350, especially because I found that both the AMD-compatible motherboard (Sabertooth 990FX) and the CPU itself were way cheaper than their competitors from Intel (the Sabertooth Z77 is $30-50 over the 990FX). I still don't regret my choice after over 6 months with my 8350, and my computer is able to run ALMOST every game at max settings. There are some games I can't run maxed out, but that's not a CPU issue, it's definitely a GPU issue (I only have a GTX 560 Ti).

I say that using my personal experience as my argument.

Thanks for reading, and sorry for my bad English (I usually speak French).
 


None of this makes much sense. Why is it unreasonable to compare CPUs at the same clock speed? That would only seem to make sense to me. Your argument that the i5 tops out at 4GHz while the AMD chip makes it to 4.8 is complete nonsense. They both overclock well with adequate cooling. In fact, the Intel chip doesn't even need aftermarket cooling to match the AMD chip's stock clock speeds. There are plenty of i5-3570K chips out there running at 4.8 GHz all day long, perfectly stable. AMD simply clocks their chips higher out of the box because their IPC is bad and it's the only way to keep up. Clock for clock and core for core, the four-year-old Phenom IIs are even slightly better than Piledriver.

Power consumption is a big deal. Not everyone wants to sink $150 into a power supply, and even then you might want more headroom for SLI or Crossfire down the road.

The Sabertooth Z77 motherboard is a complete rip-off. It offers virtually no advantages relative to a $150 board. Comparing it to the AMD Sabertooth is irrelevant; both are overpriced, with the AMD unit possibly being justified by its offering of PCIe 3.0.


I'm not biased toward either company and both chips are good, but let's not spread misinformation.
 
The difference in power consumption at peak is 40W...if a 500W PSU would work in your Intel rig, an extra $10 would get you a 550W PSU for the AMD rig.

Additionally...with an up-front cost difference of $40 between the two, based on my electricity costs it takes 4.2 years and change to make up the difference in electric bills if you run your PC maxed out 6 hours per day, 365 days per year. Since most people won't use anywhere near that level of performance from their PC, I expect that to recoup your $40 in electric savings by going Intel you'd actually need to run closer to 8 years.
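
The break-even math is simple enough to sketch. The $0.11/kWh rate below is an assumed figure, not the poster's actual rate; plug in your own:

```python
# Break-even time for a CPU price premium vs. its power-consumption saving.
# The electricity rate is an assumption; adjust for your own bill.

def breakeven_years(price_delta, watt_delta, hours_per_day, usd_per_kwh):
    """Years of use before the energy savings repay the price difference."""
    kwh_per_year = watt_delta / 1000 * hours_per_day * 365
    return price_delta / (kwh_per_year * usd_per_kwh)

# $40 price gap, 40W peak difference, maxed out 6 hours/day:
print(breakeven_years(40, 40, 6, 0.11))   # ~4.1 years, in line with the above
```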
 


To answer the OP's question straightforwardly: since he is asking about things that are not yet released, it's a bit tough to offer a suggestion that actually means anything (wait until the CPUs are released and we'll talk).

Comparing Intel to AMD is as old as the day is long, and it seems like we're at a point where there are so many different things to compare that you really have to know exactly what you are using the machine for, where the majority of your time will be spent, and how much it will cost. By this I mean what specific programs you will be running, what kinds of games you will be playing, etc. This is a day and age when people benchmark just about everything, and if you will be using widely used software for your work, I can promise you someone out there has benchmarked its performance (or in your case, they will) with various CPUs.

So we'll make it easy:
Gaming:
What games do you play?
What resolution do you play at?

Work
What programs will you be using for work?

Once upon a time CPUs were much more straightforward and easier to compare. Today they do so much more that to get a real comparison you have to know exactly what you are using them for.

In the meantime I would research your preferred programs or games with the current AMD/Intel CPU offerings (as I said I bet there's plenty of benchies out there already) as a head start.

 


This. Additionally, though, be sure to look at benchmarks for that program in your preferred OS. If you run Linux, the results can be dramatically different from the results in Windows.
 
That's exactly why I came to the conclusion that there is no real big deal in that wattage difference. And 6 hours per day at 100%? Is there really anybody who does that with normal use? As for the comment about not wanting to buy a $150 PSU for your build with a high-end CPU, that made me laugh a whole lot. I think the most important thing in a computer is balance; there is no point in buying a $200 CPU and putting it on a $50 mobo, and I think the same goes for the PSU. There is no point building a high-end or even mid-range computer without getting a power supply you can be sure is really good. If people can't afford a $150 PSU, they should go for a $700-800 prebuilt computer instead of trying to build one themselves, because it will just end up unbalanced, won't perform at its best, and will be bottlenecked. I think spending 10% of your budget on the PSU when building a long-lasting computer is a good deal.
 
You haven't followed what I said. I said when you use a good PSU, 150W isn't much; you could probably connect a Titan or even a 690 to an average CPU once the 8350's power is accounted for. Considering the most power-hungry video cards don't even take 300W, you could go with an 8350 and a 650W PSU and still easily have 100-150W left after your CPU and card are covered. And again it comes back to balance: you wouldn't connect that 7850 to a power supply that barely has the minimum power left; that would be really dangerous. Personally I use a 650W 80 Plus Bronze PSU that I paid around $90-110 for a year ago, and it's plenty of power for my GTX 580 plus my 8350 and my other hardware. I understand I probably can't go to an SLI build, but when you have the money to afford a good SLI build, most of the time you have the money to afford at least a 3770K, which, NOW, would be the better choice... Personally, I've never seen a "mid-range gaming build" with SLI, because it's pretty useless, especially when you are building a brand-new computer.
 
It's just such an insanely personal decision these days. To know what's the best setup for yourself you really have to be specific. "I'm using it for both games and work" just doesn't cut it anymore.

15 years ago, "work" for the average person was spreadsheets and email, hence high-end rigs were really made for hardcore gamers. Today my 11 year old does video editing...

Nowadays both gaming and "work" computers are a whole different ball of wax. Even gaming has evolved from that neck-and-neck race of CPU vs. GPU, where they had to be close because one would bottleneck the other. CPUs have (I think) surpassed anything that's needed for gaming, as today's games are more GPU-driven (I'll take the top processor of 4 years ago with a current GPU over the opposite any day to run current games).

For gaming:
Aside from the technological end of it (meaning fancy new lighting features, textures, PhysX, etc.), the most hardware-demanding aspect of gaming seems to be resolution. Many gamers are using big 27" monitors at 2560x1440, up to dual and triple setups of these monitors, bringing that number to 7680x1440. The GPU is the most important component to consider for this, by a long shot. I'm not saying the CPU doesn't count, but I believe that by today's standards, if you're all about gaming, the order is definitely GPU first.
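
Pixel counts make the point. As a rough simplification (real scaling depends on the game and settings), GPU load grows with rendered pixels, all else being equal:

```python
# Rough pixel-count comparison across the resolutions mentioned above.
# A simplification: real GPU scaling depends on the game and settings.

resolutions = {
    "1920x1080 single":  1920 * 1080,
    "2560x1440 single":  2560 * 1440,
    "7680x1440 triple":  7680 * 1440,
}

base = resolutions["1920x1080 single"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.1f}x 1080p)")
```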

I myself am a gamer; I use a single monitor at 1920x1080, plus I record and edit a lot of gameplay for YouTube.
My specs are:
PhenomIIx6 1090T @3.9Ghz
Corsair H80i water cooler
GTX560TI 2GB x2 in SLI
16GB system ram.

I play the latest FPS games and I edit videos using Windows Live Movie Maker, Sony Vegas, Power Director. I record with Fraps and Dxtory.

My next upgrade (in about 4 months) will be video card/cards. In a year or so from now I will upgrade my motherboard/processor (mostly for the motherboard, not so much for the processor) and add a solid state drive. My dumb motherboard is SATA II (but that's a long story), so I'm not getting an SSD until I get a new mobo/CPU.

For me, I can't justify the price difference between an Intel processor and an AMD one when compared to their performance. This of course is for me personally and for what I use it for. My point is just that I am curious as to what programs the OP will be using for work and what his gaming will be like, and then we can certainly look into what runs best and for the best price.
 


The difference between the i7-3770K and the FX-8350 is 41W. The i5-3570K is the same CPU without HTT. The power consumption is not drastically different.

Look at Tom's reviews.
 


The power consumption difference is much more than 40w between an i5-3570K and an FX-8350. You need to learn how to do some research before posting such information. While the i5-3570K is not present in the chart below, the i7-3770K is, and its difference against the FX-8350 is 107w according to that chart.

Like I replied to your post in a different thread, the actual cost of electricity depends on how much you are paying per kWh. Additionally, using the correct peak/maximum power consumption figures also helps with calculating an estimate.

http://www.techspot.com/review/586-amd-fx-8350-fx-6300/page7.html

[Chart: peak system power consumption, FX-8350 vs. i7-3770K, from the TechSpot review linked above]


 


If you are going to reference a source, then at least provide a link to the source. Posting the chart would be good too.

Tom's Hardware's review of the FX-8350 shows an average power consumption difference of about 63w between the i7-3770K and the FX-8350. However, that does not represent max/peak power; it is just an average, which folds both idle and max/peak power into one overall figure. The difference between 41w and 63w is relatively small, I suppose, but that means your 41w figure has a 34% margin of error vs. the average power consumption.
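
To illustrate why peak and average diverge, here's a toy sketch; the idle/load draws and the duty cycle below are made-up illustrative numbers, not measurements from either review:

```python
# Why "average" and "peak" diverge: average consumption mixes idle time
# and load time. All numbers below are illustrative assumptions.

idle_w, peak_w = 75, 200   # assumed whole-system draw at idle vs. full load
load_fraction = 0.3        # assumed fraction of time spent at full load

average_w = idle_w * (1 - load_fraction) + peak_w * load_fraction
print(f"peak {peak_w}W, average {average_w:.0f}W")  # peak 200W, average 112W
```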

Source
http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-16.html

[Chart: average system power consumption, from the Tom's Hardware FX-8350 review linked above]



 
http://www.tomshardware.com/reviews/ivy-bridge-benchmark-core-i7-3770k,3181-23.html

That's an average difference of 46W against a CPU that's notably less efficient in average consumption as well...

Here's the 3570K; this review shows the difference in power consumption between the two to be roughly 10W.

http://www.tomshardware.com/reviews/core-i5-3570-low-power,3204-13.html

That's going to make the difference give or take 53W in average consumption...

Now, hafijur, where was I wrong again? Oh, that's right...I wasn't.
 




If you are going to give advice, then you need to understand the meaning of the words and phrases you are using. Peak means something very different from average.


I believe this topic has run its course; it is therefore being locked to prevent any further flame bait.
 