It is not exactly true that one Intel core is much stronger than one FX core. You repeat this very often, so let me say that if the cores are working in 'isolation' from one another, that is roughly correct, but when the cores are working together as a whole, each Intel core loses performance due to poor scaling. 8350rocks wrote some useful analogies; I recommend you read them.
An Intel core is stronger than any FX core in any task. I am referring to any single-threaded task, or any task where we are comparing EQUAL numbers of cores. Your point makes no sense: more cores means the same or weaker scaling (this is why, for the same total performance, any programmer wants fewer cores). I'm not talking about HT here, but core-for-core performance.
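As a rough illustration of the "fewer cores" point, here is a minimal Amdahl's-law sketch; the parallel fraction and core counts are assumed purely for illustration, not measured from any chip:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the workload that parallelizes.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assumed parallel fraction, purely illustrative

# Per-core efficiency drops as cores are added, so reaching a target
# throughput with fewer, stronger cores wastes less of the silicon.
for n in (4, 8):
    s = speedup(p, n)
    print(f"{n} cores: speedup {s:.2f}x, per-core efficiency {s / n:.0%}")
# 4 cores: speedup 3.08x, per-core efficiency 77%
# 8 cores: speedup 4.71x, per-core efficiency 59%
```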
Either you did not understand what I said or you just ignored it.
The poor scaling of Intel chips is also the reason why Xeon-based chips are/were unpopular in supercomputers. The most powerful supercomputer, Titan, uses AMD Opterons.
I think that's because they signed a contract with AMD.
If you believe that AMD put a gun to the head of every supercomputer maker in the top 5 to make them avoid superb Intel technology, then you are more foolish than I believed.
In any case your single-core argument does not invalidate my point. I see a CPU as a whole, because it was designed as a whole.
HT is not magic sauce. I said it does not work for certain kinds of games, and you ignore that with your claim. Why not explain your invalid point to someone who needs to disable HT because of a performance regression? No wait... all of this was explained to you before, and you were given links showing how HT can increase or decrease performance depending on the specific situation.
Most games show some improvement from HT.
This is not true, and it is why most people recommend an i5 over an i7.
Crysis 3
Near doubling of performance from the Pentium to the i3. Yes, we are comparing SB vs IB and 3.3 GHz vs 3.0 GHz, but that can't explain a near doubling of average fps.
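To put rough numbers on that: the fps values below are hypothetical placeholders, only the clock speeds come from the comparison above:

```python
# Clock speed alone can explain at most a 3.3/3.0 difference, in either
# direction. Hypothetical fps values, not taken from the review.
pentium_fps, i3_fps = 30.0, 58.0   # placeholder numbers for illustration
clock_ratio = 3.3 / 3.0            # max speedup attributable to clock alone

observed = i3_fps / pentium_fps
print(f"observed speedup: {observed:.2f}x; clock explains at most {clock_ratio:.2f}x")
print(f"left over for HT + IPC: {observed / clock_ratio:.2f}x")
```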
There are other examples too. Most of the time the advantage is slight but still there. There are games where the i7 performs better than the i5 because of HT.
According to the cited review, the Haswell 4770K gives 1% more average performance in games, but shows performance regressions of up to 5% in some specific games, while increasing power consumption by about 10%. Add the USB3 issues and the PSU incompatibilities and you will start to understand why some people here are calling it "Failwell", "Hasbeen",...
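Framed as perf-per-watt, using just those two percentages (a sketch, assuming both figures apply to the same workload):

```python
# +1% average performance for +10% power consumption is a net efficiency
# loss relative to the previous generation (Ivy Bridge normalized to 1.0).
perf = 1.01
power = 1.10
print(f"perf/watt vs previous gen: {perf / power:.3f}")  # ~0.918, i.e. ~8% worse
```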
That review is GPU-bottlenecked. Is it gaming power consumption? (I don't think so.)
A GPU bottleneck does not introduce performance regressions. No, it is not gaming power consumption, but an ordinary power consumption measurement, as in other reviews.
Anyone familiar with CPUs knows the power consumption argument used by anti-AMD people. The same people are now trying to dismiss power consumption as irrelevant, because Haswell increases it. LOL
Funny that you appeal to the mobile space and give power consumption figures for desktop processors. More LOL
Yes, because if their power consumption (efficiency) sucks on the desktop, it's going to suck in mobile. Mobile is difficult to compare because every system is different, so it makes sense to compare the architectures using desktop numbers. If you look at the benchmarks you will see that a 17-watt Ivy i7 ULV competes with, and generally beats, the 35-watt A10-4600M in straight CPU performance.
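A quick way to frame that claim (the benchmark scores below are made-up placeholders; only the two TDPs are real):

```python
# If a 17 W part merely ties a 35 W part on a CPU benchmark, it is
# already about 2x more efficient; actually beating it widens that.
i7_tdp, a10_tdp = 17.0, 35.0
i7_score, a10_score = 100.0, 100.0  # hypothetical equal scores

ratio = (i7_score / i7_tdp) / (a10_score / a10_tdp)
print(f"perf-per-watt advantage at a tie: {ratio:.2f}x")  # ~2.06x
```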
No. By your own logic, mobile Haswell will be less efficient than Ivy because the desktop part is less efficient, but that logic is wrong.
Those 17 watts do not account for the superior GPU of the A10... and "beats" depends on which benchmarks you are using. The i7 will win in Intel-oriented benchmarks, sure.
The figures that you give are from a review where they used every possible way to increase the power consumption of the AMD chips while cutting down their performance. It has been a while since I analyzed their review, but I recall those figures perfectly, because it is not the first time someone has given them to me. The last time was a guy who is very well known for his Intel bias and fantastic claims: he assures anyone he meets that his FX consumes 200W more than his i7, which is nonsense.
I am writing this from memory and maybe some detail is wrong, but overall it is right. This is for the FX-8350 and the i7-3770K.
They selected the more power-hungry mobo (Asus) for the AMD chips and the more power-efficient mobo (MSI) for Intel. Even at idle the difference between the Asus AM3+ and the MSI AM3+ was about 10W, and about the same for the Asus 1155 vs the MSI 1155. With power-saving settings selected in the BIOS, the difference between the Asus AM3+ and the MSI AM3+ was larger still. The review does not explain what configuration they used, but I would not be surprised if they selected power saving for the Intel chips but not for the AMD ones.
This may be true, but there is still a very large power gap between Ivy Bridge and FX. There is a 96-watt spread between the 3770K and the 8350. They are on the balanced profile (SpeedStep and Cool'n'Quiet are working).
Comparing like to like (e.g. Asus mobo to Asus mobo) cuts 13 W, and accounting for the ~7% extra consumption due to the FX hotfixes, the 96 W shrinks to about 77 W; that is roughly a 20 W reduction just from doing a fairer review! Of course, there are reviews using other configurations that reduce the gap to 47 W, e.g. the Bit-Tech review.
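For what it's worth, the arithmetic behind those numbers; the FX load power I plug in for the 7% correction is my own assumption, only the 96 W, 13 W, and 7% come from the posts above:

```python
gap = 96.0            # reported 3770K vs 8350 spread, in watts
mobo_penalty = 13.0   # Asus vs MSI board difference, like-for-like

# 7% hotfix power bug, applied to an assumed ~90 W of FX CPU load power
fx_cpu_power = 90.0   # assumption for illustration only
hotfix_penalty = 0.07 * fx_cpu_power  # ~6.3 W

fair_gap = gap - mobo_penalty - hotfix_penalty
print(f"adjusted gap: {fair_gap:.0f} W")  # ~77 W, i.e. a ~20 W reduction
```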
They selected a high-performance G.Skill memory kit for Intel, but an "Entertainment" Patriot kit for AMD. Moreover, the AMD chip ran with memory 20% under the stock speed, generating an artificial performance gain for Intel.
How many times do I have to say that memory speed rarely matters for CPU performance (outside of a few things such as WinRAR)? The Tech Report review used the same RAM for all the recent CPUs.
For Intel? Sure. For AMD FX and APUs? No.
They used W7 and installed the FX hotfixes manually. As shown by Tom's, the hotfixes introduced a performance regression for AMD chips (2% on average, but larger for video tasks). Moreover, the patches had a bug that affected the power consumption of FX chips, because it blocked some power-saving states. According to Tom's, the hotfixes increased the power consumption of FX chips by 7% on average.
Why then do so many people argue that the hotfixes are required?
The automatic hotfixes and those included in W8 are one thing; the manually installed hotfixes are another. The latter are what generated the problems with performance and power consumption.
That is how they managed to make the FX run slower while consuming more power. A while ago I did an estimation of the real power consumption and task energy, and if my memory does not fail me, the FX moved from 8.1 to near 6.0.
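The kind of correction I mean, as a sketch: all the input watt and time values below are hypothetical, and the method is simply energy = average power x task time, with the 13 W mobo delta, 7% hotfix power bug, and 2% regression from the posts above undone:

```python
# Task energy combines both distortions: the hotfix bug inflates power
# and the performance regression inflates runtime, on top of the mobo delta.
measured_power = 190.0   # hypothetical measured FX system load, watts
measured_time = 150.0    # hypothetical task runtime, seconds

# For simplicity the 7% is applied to system power here, not just the CPU.
corrected_power = (measured_power - 13.0) / 1.07  # drop mobo delta, undo power bug
corrected_time = measured_time / 1.02             # undo the 2% regression

measured_energy = measured_power * measured_time / 1000.0     # kJ
corrected_energy = corrected_power * corrected_time / 1000.0  # kJ
print(f"{measured_energy:.1f} kJ measured vs {corrected_energy:.1f} kJ corrected")
```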
No professional review has ever seen FX compete with Ivy or even Sandy Bridge on efficiency.
Nobody said they give the same efficiency. Read what has been said.
I don't recall the 3960X setup right now, but wasn't it using memory overclocked by 40% to increase its performance?
They are all running RAM of the same speed and timings, with the exception of the first-gen Core i systems. Furthermore, any memory overclock would have an insignificant effect on performance in that test (0-2%).
I checked again, and yes, they used stock speed, but they improved the performance of the Intel chip by selecting a quad-channel configuration.