AMD FX: Energy Efficiency Compared To Eight Other CPUs

Status
Not open for further replies.

techpops

Distinguished
Jul 3, 2009
56
0
18,630
What worries me most now is the CPU price increases I'm seeing with Intel and even AMD's Phenoms. The knock-on effects of this processor range are really just starting to show. I have nothing positive I can say about Bulldozer at this point, and I don't believe any future revisions will help either. If I were AMD and could possibly afford it, I'd drop the whole architecture and start again. Really, how much would it cost just to shrink the top 1100T Phenom? It has to be cheaper than trying to flog this dead horse for a few per cent of speed bump a year, and I presume the same measly savings in power too.
 

The 2500K went from $220 to $215 on Newegg, and it can be had for even cheaper at superbiiz.com and Amazon.
 

masterasia

Distinguished
Feb 9, 2009
1,128
0
19,360
Power consumption and heat have always been AMD's problem, right from the get-go. These benchmarks didn't surprise me. I really wanted AMD to come out with a winner here. Let's face it, AMD is not known for being innovative. They've always copied Intel (which Intel let them do). They just need to copy a little bit better.
 

travish82

Distinguished
Aug 1, 2007
33
0
18,530
[citation][nom]The Greater Good[/nom]As far as the average user goes, yes. This is correct. I run BOINC so my desktop is always at 100% CPU load 24/7 365; 4 cores/8 threads always maxed.[/citation]


You, my friend, are a liar... I was wondering what BOINC was, so I googled it... and guess what??? The BOINC servers are down right now. I guess you aren't running at 100% then, huh?
 

ashkal

Distinguished
Mar 25, 2010
35
0
18,530
Tom's Hardware's efficiency test is totally wrong. Power consumption should be measured against the total transistor count of the processor. If you do that, AMD's Bulldozer seems the winner in all respects. Change the test parameters.
 

Check out all the reviews in regards to BD and power consumption. They all say the same thing...that BD is a power sucking hog that should have never been released to begin with. It's a total FLOP in every sense of the word.
 

linuxlowdown

Distinguished
Nov 2, 2011
7
0
18,510
There is one key (and huge) variable you have neglected in your equation - were all your tests conducted on software compiled with Intel's C compiler and its optimisations? If so, it would be necessary to do a just comparison using AMD's Open64 (version 5) C compiler with its Bulldozer optimisations. If you don't know what I'm talking about (and I'm sure you do, old Tommy boy), then I suggest you head over to Phoronix.com. Until then, I rather suspect that this article is less scientific and more about sales promotion (or negative publicity).
 

ashkal

Distinguished
Mar 25, 2010
35
0
18,530


Simple maths: you can't compare apples to melons. BD has TWO billion transistors, whereas the others have less than a billion.
 

linuxlowdown

Distinguished
Nov 2, 2011
7
0
18,510
[citation][nom]masterasia[/nom]Power consumption and heat have always been AMD's problem, right from the get-go. These benchmarks didn't surprise me. I really wanted AMD to come out with a winner here. Let's face it, AMD is not known for being innovative. They've always copied Intel (which Intel let them do). They just need to copy a little bit better.[/citation]

Before you make fanboy generalisations, you ought to know your stuff. Lifted from Wikipedia regarding Pentium 4 -

"Intel also released a series of Prescott supporting Intel 64, Intel's implementation of the AMD-developed x86-64 64-bit extensions to the x86 architecture."

So in your mind, AMD does not innovate but rather just copies Intel. BS son.
 


i read phoronix's fx 8150 and fx 4100 reviews when they came out. in the fx 8150 benches, bd barely kept up with a 2600k, sometimes edging it out. the 2500k could keep up with the fx 8150 in the benches (probably using a lot less power than the bd setup). the review concluded that bd cores don't scale well. and that was on linux programs that are optimized for multithreading.
the fx 4100 turned out to be a crap cpu, performing behind llano and core i3.
bd doesn't need negative publicity for the bad performance, the facts are enough. :D


according to your maths, bd should perform better because it has more transistors, right? but the core cpus easily keep up with or beat the fx cpus with fewer transistors (and they have built-in gfx).
 

ashkal

Distinguished
Mar 25, 2010
35
0
18,530
More transistors, more power consumption. Here we are talking about efficiency and power consumption, not performance. To correct myself: say 0.0000001 mW for one transistor - that gets multiplied by a billion transistors.
 

you know what? i totally did not know that.
yet it makes me wonder why sandy bridge cpus are more power efficient than nehalem cpus, and why fx cpus have more transistors (no igp) yet are less power efficient than sandy bridge (where all the cpus have an igp).
imo you shouldn't calculate cpu power efficiency by disregarding cpu performance. efficiency does depend on input and output power, but cpu performance plays a vital role. here fx's efficiency is being compared to other cpus, and that's why performance is important. otherwise atom and zacate would be overall better cpus than bulldozer.
and both bd and sb cpus employ power gating and other power saving features. fx's power saving features just work less efficiently.
and sandy bridge cpus don't have a billion transistors. i suspect they didn't need that many. :)
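To make the two efficiency metrics in this exchange concrete, here is a minimal Python sketch. The transistor counts, wattages and benchmark scores below are illustrative placeholders (the 2-billion figure is the one quoted in this thread), not measured data:

```python
# Contrast two ways of ranking CPU "efficiency" debated above:
# power per transistor vs. performance per watt.
# All numbers are illustrative placeholders, not real measurements.

cpus = {
    # name: (transistors, package power in watts, benchmark score)
    "FX-8150":  (2_000_000_000, 125.0, 100.0),
    "i5-2500K": (  995_000_000,  95.0, 110.0),
}

def power_per_transistor(transistors, watts, _score):
    """Watts consumed per transistor (lower looks 'better' by this metric)."""
    return watts / transistors

def perf_per_watt(_transistors, watts, score):
    """Benchmark score delivered per watt (higher is better)."""
    return score / watts

for name, (t, w, s) in cpus.items():
    print(f"{name}: {power_per_transistor(t, w, s):.2e} W/transistor, "
          f"{perf_per_watt(t, w, s):.3f} score/W")
```

By the first metric, a chip can look frugal simply because its power is spread over more transistors, even while drawing more total watts; by the second, only the work you get back for each watt counts, which is the crux of the disagreement here.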
 

techelite

Distinguished
Sep 18, 2010
14
0
18,510
[citation][nom]de5_roy[/nom]i read the phoronix's fx 8150 and fx 4100 reviews when they came out. in the fx 8150 benches bd barely kept up with a 2600k, sometimes edging out. 2500k could keep up with fx8150 in the benches (probably using a lot less power than the bd setup.). the review concluded that bd cores don't scale well. that was on linux's programs that are multithreading optimized.the fx4100 turned out to be a crap cpu performing behind llano and core i3.bd doesn't need negative publicity for the bad performance, the facts are enough.[/citation]

I suggest that you go back to Phoronix and read their latest articles on testing Bulldozer with the Open64 C compiler (AMD's own engineered compiler) to understand what I'm talking about. The ubiquitous MS Windows and associated software (including test suites) are compiled with Intel CPU optimisations because Intel dominates the CPU market, which makes sense in the marketplace. But if we want to compare the engineering value of Bulldozer in a pure sense, it is hardly accurate to use Intel-optimised software, is it? When it comes to supercomputers, over 90% use Linux. And when it comes to servers, Linux dominates again. These areas of computing matter greatly, and they are the markets at which this chip design is overwhelmingly aimed.
 
you guys ought to read the stuff you refer to.
and please stay on topic.
i use fx and bd interchangeably sometimes.



first things first:
i did not know that blender, 7zip, lame mp3 and handbrake were intel optimized. how ignorant of me. (and darn those intel-favoring freewares.)
you're right in the sense that benchmarks should not use intel-optimized software, although the test setup's software didn't seem intel-biased to me (that ms powerpoint one might be favoring intel, maybe use libreoffice's calc?). ;)
supercomputers? why are supercomputers relevant here? this is about fx's performance and power efficiency compared to amd's other desktop cpus and intel's desktop cpus.
linux? why is linux's domination in servers and supercomputers relevant here?
if amd is really aiming the fx cpus overwhelmingly at linux server/supercomputer systems, they can disregard the huge number of regular ubuntu or windows users on desktop pcs. oh wait, they already designed the valencia and interlagos cpus for those high performance computing systems. zambezi, i.e. fx, is the cpu being tested in this article.

now about the phoronix articles:
both newer articles conclude that fx is still hit or miss, and even with amd's optimizations and amd's suggested test setups it sometimes underperforms.
none of the newer articles compare fx with intel cpus while running amd-optimized software. (common knowledge: amd's drivers usually suck.)
none of the phoronix articles have provided any power consumption or power efficiency comparison figures so far.
phoronix doesn't even show what test setups (specs, temperature, noise level etc.) they're using.
phoronix is a separate site with their own rules and hardly relevant here.
there is a separate forum thread that tracks phoronix's exploits with bulldozer, not this one. i like to read what phoronix does, but that's not the point here.

if the next posts are similarly off-topic and full of irrelevant matters, i'll simply ignore them.

32 nm sandy bridge cpus (and a lot of old 45 nm nehalem cpus) are still more power efficient than 32 nm fx.
 

techpops

Distinguished
Jul 3, 2009
56
0
18,630


I'll just add here that Maxon, the guys who coded the Cinebench benchmark (which is really just the engine from Cinema 4D in a nice benchmark package), are on record saying that the choice between AMD and Intel hardware makes absolutely no difference to what runs best on their code, or on the rest of the graphics industry's, which includes all the heavyweight 3D rendering packages, Photoshop and so on. So while some desperate ideas can be thrown around about how you should use AMD-optimised benchmarks as well as Intel ones, in real-world terms it makes no difference.

All this talk about needing AMD-optimised code does is make some AMD fans look like lunatics grasping at anything to make their case, while the bulk of real fans - fans like me, someone who has always used AMD processors - just sit here stunned at how bad BD really is, concluding that Intel has to be the next upgrade. Even if only power concerns interest you, there's just no case to be made for AMD right now on the desktop, or in any kind of crossover workstation-class system.

Real AMD fans should be angry at AMD and let them know they need to sort this out. I suspect AMD is in a transitional phase here, moving away from the desktop but just not wanting to admit it, as there is still money to be made. The bulk of future profit could come from servers. I'm not personally convinced of this, but you could argue the case without resorting to lunatic ideas.

Probably more relevant to us enthusiasts is just how miserable BD is when overclocked, if power consumption is even slightly important to you.

And finally, Intel fanboys kicking AMD fanboys while they're down is not only unattractive, it's neither clever nor helping one side win over the other. You just look pathetic. Get real - these are huge companies, not your friend who needs your support through good times and bad. Sane people choose what's best based on a performance/price comparison, and I wish we could all do that and put these silly branding games away. There's enough craziness in the cult of Mac for the whole industry without building little fan cults around processor manufacturers.
 

judge_dredd

Distinguished
Nov 4, 2011
1
0
18,510
AMD really disappointed me with BD. The "more, but weaker, individual cores" approach says it all. The people responsible for this travesty need to be fired. Then hire back the architects who forced Intel into the fetal position many years ago. A weaker FPU than the Phenoms? C'mon. What were their execs smoking?

kurtcunningham wrote:
my co-worker's sister makes $70 an hour on the laptop. She has been unemployed for 8 months but last month her check was $7699 just working on the laptop for a few hours. Read this site ho.io/p9qj

This site you posted is a scam. Everyone should avoid it!
 

techelite

Distinguished
Sep 18, 2010
14
0
18,510


This is for the benefit of those following this post.

AMD settled with Intel in court over compiler optimisations as recently as late 2010. Purposefully crippled code leads to poor performance, which leads to power inefficiency. Relevant and on topic. I'm not sure what the outcome would be on a neutral testing platform (i.e. Linux) with software compiled with optimisations geared to each brand of processor. But I'd sure like to know the truth, rather than believing the clearly unreliable results from the biases inherent in the test bench used in this article. Wouldn't you all?

Quote from Wikipedia http://en.wikipedia.org/wiki/Intel_C%2B%2B_Compiler

Intel and third parties have published benchmark results to substantiate performance leadership claims over other commercial, open source and AMD compilers and libraries on Intel and non-Intel processors. Intel and AMD have documented flags to use on the Intel compilers to get optimal performance on Intel and AMD processors.[18][19] Nevertheless, the Intel compilers have been accused of producing sub-optimal code with mercenary intent. One developer, in 2009, wrote[20]:
“ The Intel compiler and several different Intel function libraries have suboptimal performance on AMD and VIA processors. The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string is "GenuineIntel" then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version. ”

This vendor-specific CPU dispatching decreases the performance on non-Intel processors of software built with an Intel compiler or an Intel function library - possibly without the knowledge of the programmer. This has allegedly led to misleading benchmarks.[20] A legal battle between AMD and Intel over this and other issues has been settled in November 2009.[21] In late 2010, AMD settled an US Federal Trade Commission antitrust investigation against Intel.[22]

The FTC settlement included a disclosure provision where Intel must:[23]:
" ...publish clearly that its compiler discriminates against non-Intel processors (such as AMD's designs), not fully utilizing their features and producing inferior code."
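The dispatching behaviour alleged in that quote can be sketched in a few lines of Python. This is a toy model of the logic only, not Intel's actual code: the vendor ID strings are the real CPUID values, but the feature names and functions are illustrative:

```python
# Toy model of the CPU-dispatcher behaviour described in the quote above.
# A fair dispatcher picks the best code path the CPU's instruction sets
# support; the alleged behaviour additionally gates on the vendor ID string.

CODE_PATHS = ["sse2", "sse3", "ssse3", "sse4.2", "avx"]  # slowest -> fastest

def fair_dispatch(features):
    """Pick the fastest code path the CPU actually supports."""
    supported = [p for p in CODE_PATHS if p in features]
    return supported[-1] if supported else "generic"

def vendor_gated_dispatch(vendor_id, features):
    """Alleged behaviour: only 'GenuineIntel' gets the full feature check."""
    if vendor_id == "GenuineIntel":
        return fair_dispatch(features)
    return "generic"  # slowest path, even if the CPU supports better

intel = ("GenuineIntel", {"sse2", "sse3", "ssse3", "sse4.2", "avx"})
amd   = ("AuthenticAMD", {"sse2", "sse3", "ssse3", "sse4.2", "avx"})

for vendor, feats in (intel, amd):
    print(vendor, "->", vendor_gated_dispatch(vendor, feats))
```

Both hypothetical CPUs here report identical instruction-set support, but only the "GenuineIntel" one is routed to the fast path; that gap between `fair_dispatch` and `vendor_gated_dispatch` is exactly the complaint the FTC disclosure provision addressed.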
 
G

Guest

Guest
Why even release this CPU?
What a waste of time, money and resources.
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
Sorry, if Intel's compiler discriminates against AMD, then AMD should be designing around that, not Intel. Also, it sounds remarkably like it isn't that it "discriminates against non-Intel processors" so much as that it is optimised for Intel processors... once again, not Intel's problem.
 

techelite

Distinguished
Sep 18, 2010
14
0
18,510
[citation][nom]Draven35[/nom]Sorry, if Intel's compiler discriminates against AMD, then AMD should be designing around that, not Intel. Also, it sounds remarkably like it isn't that it "discriminates against non-Intel processors" so much as that it is optimised for Intel processors... once again, not Intel's problem.[/citation]

Draven35, it was determined in a court of law and settled (along with other proven monopolistic practices, such as paying channel vendors not to promote, stock or sell AMD processors) with court-ordered compensation from Intel. The Intel compiler was found to "...discriminate against non-Intel processors (such as AMD's designs), not fully utilizing their features and producing inferior code". Inferior code equates to crippled potential performance. AMD have tried to work around this, producing their own C compiler, which is now maturing. The problem is that the Windows operating system is compiled using the Intel compiler (which makes sense prima facie, as the majority of the home computers for which it is designed use Intel processors). So when one benchmarks an AMD processor using Windows as the platform, it will show inferior performance. How much so, I don't know. But it should not be ignored when benchmarking an AMD processor against an Intel one, particularly bearing in mind that the home computing market is not the only one. In fact, I would dare say that most home consumers don't care about a processor's energy efficiency (present company excluded). However, energy efficiency is a huge concern for server farms and supercomputers, and Linux/Unix is used predominantly in that space. Hence the need to test on that platform to find the true performance of each company's chips, using each other's compilers and optimisations.
 

techpops

Distinguished
Jul 3, 2009
56
0
18,630
[citation][nom]techelite[/nom]Draven35, it was determined in a court of law and settled (along with other proven monopolistic practices, such as paying channel vendors not to promote, stock or sell AMD processors) with court-ordered compensation from Intel. The Intel compiler was found to "...discriminate against non-Intel processors (such as AMD's designs), not fully utilizing their features and producing inferior code". Inferior code equates to crippled potential performance. AMD have tried to work around this, producing their own C compiler, which is now maturing. The problem is that the Windows operating system is compiled using the Intel compiler, so when one benchmarks an AMD processor using Windows as the platform, it will show inferior performance. How much so, I don't know. But it should not be ignored when benchmarking an AMD processor against an Intel one. Energy efficiency is a huge concern for server farms and supercomputers, and Linux/Unix is used predominantly in that space. Hence the need to test on that platform to find the true performance of each company's chips, using each other's compilers and optimisations.[/citation]

The funny thing about this is that it only came out right after the fail that was Bulldozer. There was no mention of it in the tech press before then, and AMD didn't bring it up as any kind of excuse for BD's lack of performance. In fact, only some AMD fans have picked this story up - not really in the press as such, but in comments around articles about Bulldozer.

Come on guys, get real, accept it: BD is a flop. Let's hope AMD do better next time, but there's no need to go all conspiracy-theory nuts and declare all benchmarks everywhere null and void. Geez, this is getting pathetic.
 