Intel Quadcore Vs. AMD Octacore - Gaming and future octacore-optimized development.


prankstare

Hey,

So we all know Intel's architecture is better and more energy- and performance-efficient per core/thread, but how about multi-tasking performance? Also, do you think that in the near future not only games but also most computer programs will benefit from using 8 physical cores, the way next-gen consoles are doing for games?

The reason I'm asking is that I'm a bit torn between buying Intel's "faster" but more expensive quad-core i5-3570K and AMD's "slower" but much cheaper eight-core FX-8350. However, if 8-12 months from now software is eight-core optimized all the way (including games and general multi-tasking), then I think the "slower" (for now) AMD solution is worth it.

So, any ideas?

Thanks!
 




I had an old GTX 570 lying around, and built it up as a file server / thingy I hook up to my 55-inch TV to watch movies on.

That doesn't change the fact that HW offered so much improvement over IB that, when you round the % increase to the nearest 10...it rounds down to 0. That's not progress...that's stalling.

Yeah, stalling, because if they release their best stuff at good prices, AMD would go out of business and the gov't would crawl all over their asses, because >Lel MONOPOLY.
Intel is being held back; anything AMD improves, Intel will release something slightly better at a higher price point. Intel loves AMD fanboys that buy AMD just because AMD is AMD, and not Intel. These are the people we like to call "Hipsters".

 


If the Linux part is true, it should not be very difficult to load a different kernel and have, say, Linux Mint on your gaming system. I am running Win7, and in an Arch VM right now; Linux is > Windows for basically everything but gaming.
This could possibly be a selling point for Sony; they could even sell it with a full PC distro to dual boot with...
 
Actually...the hipster crowds tend to buy Intel...just basing this off of target market analysis done for the game I am working on.

Either way...Intel is actually held back by several things at this point, and it isn't AMD. Sorry to burst your bubble, but this is reality:

1.) 22nm yields were poor for Intel, and 14nm is proving to be an extremely difficult process for them to master, hence pushing back Broadwell and releasing a Haswell refresh next year. This is mostly why overclocking has been hit and miss on their CPUs after Sandy Bridge.

http://www.dailytech.com/Report+Intel+Delays+14+nm+Broadwell+Schedules+Haswell+Refresh+for+2014/article31770.htm

2.) Intel uses bulk wafers, which, compared to other types of processes, give much lower yields with much more variable quality. They also use a complex tri-gate architecture on their CPUs to increase performance, which lowers yields further.

http://en.wikipedia.org/wiki/Silicon_on_insulator

(The link above is for SOI, which is the process AMD uses)

http://en.wikipedia.org/wiki/Finfet

3.) Because AMD is using a larger process, there is more room to improve, and they haven't hit the same brick wall Intel has. What this really means to you is that AMD is close on a larger process; when they get down to the transistor size Intel is at, their performance will exceed what Intel has accomplished on the same process.

http://semimd.com/blog/tag/14nm/
 


The condition was battery life while web browsing...you overlooked that part specifically. Also, the discussion was about the Pentium M, not hasfail!

Show me 10 hours battery life on the IBM laptop you said you had with the Pentium M while web browsing...stop deflecting the question. Produce the benchmarks or reviews showing the 10 hours while web browsing.

EDIT: As an aside, I don't know about you, but my laptop is turned off when I am not using it...battery life at idle is worthless to me. Your idle power consumption argument is like putting a dress on a dog...at the end of the day...you still have the same dog...now it's just wearing a dress.
 
Additionally...that doesn't change that HW offered so much improvement over IB that, when you round the % increase to the nearest 10...it rounds down to 0.

That's not progress...that's stalling.

That's because they didn't increase the CPU much. They fixed a few things here and there, but they bumped up the IGP a lot. It's still behind AMD's newest offering (so far; we still need to see the best IGP), but it's much improved over IB's IGP. And somehow that's a fail....
 
I have read all the thread.

Have you guys considered comparing Intel to AMD at the same clock speed? I mean, the 8350 is 4.0GHz and the i7 is 3.5GHz, but we all know that Intel processors are easy to OC to 4.0GHz.

In this scenario I think Intel wins hands down. It is the best CPU for the enthusiast!
 


If you're a dedicated PC Gamer, it is...you're not using the iGPU for anything unless you're trying to build a $500 system, and even then you'd be better off with something like an A10-5800K, seeing as it outperforms the 4770K in gaming performance when using the iGPU.
 


Show me a laptop with HD 5200 graphics for under $1000...for that money you can buy a gaming laptop with a dedicated GPU that will outperform the HD 5200 by a long shot.

Additionally...it performs at about 60-65% of the level of the GT 650M...which is not, by the way, anywhere close to the GTX 650 in performance.

[benchmark charts: HD 5200 vs. GT 650M]


Note: HD 5200 cannot play Crysis 3 on medium settings @ 1600x900.

 
This thread reminds me of Saturday Night Live's Point/Counterpoint skit with Dan Aykroyd and Jane Curtin!

On a serious note, I own both a 3770K rig AND an 8350 rig.
The 3770K is OC'd to 4.4GHz on an Asus P8Z68-V Pro/Gen3 mb and the FX-8350 is OC'd to 4.6GHz on an Asus Sabertooth 990FX Rev1 mb. In gaming the 3770K is quicker. However, the 8350 is hardly a slouch.

What I find silly is the insinuation that because a certain CPU might be quicker in a few benchmarks it is better overall. I find that overall the 3770K is a faster chip than the 8350. However, it should be, as it was more expensive (even though I snagged one from MC for $229.99 + tax).

The whole future-proofing idea is crazy. Intel has now moved to the 1150 socket for Haswell. Though I hope the 990FX chipset will support Steamroller, I really have my doubts. The Steamroller core appears to be debuting in APUs. I really wonder if AMD will bring it out for the 990FX chipset at all.
 
If you're a dedicated PC Gamer, it is

I find it odd that you think HW is a fail, but consider PD and probably BD just fine. As I mentioned above, HW didn't increase IPC much at all, at least on the CPU side. BD/PD, however, took a step back from Stars. PD is nearly back to where Stars was, but is still a percent or two behind. Why label HW a fail yet sing the praises of PD?

[iTunes benchmark chart]


What I find silly is the insinuation that because a certain cpu might be quicker in a few benchmarks it is better overall.

Huh? Assuming "better" means faster, how is the CPU that DOESN'T stumble on the occasional game not the better CPU? Even more so when it uses less power? Or supports more things (Thunderbolt, PCIe 3.0)? That is silly? This is why we benchmark CPUs/GPUs in the first place: to find out which is better for our money.
 
While I consider Bulldozer to be a giant leap forward in architectural design for CPUs, I consider the execution to be a failure. BD should have been PD, PD should have been what SR will be, etc. I feel it was rushed to get "something" out to compete with Intel, probably wholly (or at least in part) motivated by lack of cash flow because of the outgoing management's poor decision making.

Having said that, though...Piledriver is a much better execution of the hardware design's capabilities. I consider it a drastic improvement (and at nearly 15% across the board, it was).

I am not biased in my criteria for performance improvements... BD was a flop, while PD is considerably better. SB was good, IB was an incremental improvement (but approached 10% almost across the board), and HW is a flop.

It's ok for Intel to flop...it happens. But I am not one to dance around it, I call a spade what it is...a spade.

EDIT: In all honesty...I expected at some point this would happen. Intel's chase for ever shrinking transistors was bound to hit a brick wall using bulk wafers (which give relatively poor yields), and using a complicated process with many steps in tri-gate (which makes poor yields even poorer). I applaud the effort, but it's not unexpected.
 


We are not discussing laptops now...thank you for your input.
 
http://www.mobiletechreview.com/notebooks/Sony-Vaio-Pro-13.htm

We don't usually combine these two sections, but Intel's fourth generation platform ties both in for some significant changes. Is Intel's new Haswell CPU/GPU much better than Ivy Bridge in terms of performance? No.

our Core i5 easily ran for 6.5 hours of actual use time (not standby) on its not very high capacity 4,470 mAh battery using the Balanced performance profile with WiFi on and brightness at 50% (with auto-brightness turned on in Vaio Control Center).

Battery life will depend on what you're doing with the laptop: streaming HD video uses more power than editing MS Word documents and processing HD video will consume more power than web browsing or doing email.

The Vaio Pro 11 and 13 are available with Intel Core i5-4200U and i7-4500U dual core CPUs with Intel HD 4400 integrated graphics. There's no option for HD 5000 graphics, and we suspect there won't be because the Vaio Pro is too thin to allow for adequate cooling for the somewhat hotter GPU. Sony has the Vaio Duo 13 for those who crave slightly faster integrated graphics. RAM is soldered to the motherboard, so you can't upgrade it later. The machine is available with 4 and 8 gigs of DDR3 1600MHz RAM. You can get the Vaio Pro with a 128, 256 or 512 gig SSD drive and it uses the new, faster PCIe standard rather than mSATA for extremely fast storage performance (our 128 gig SSD is made by Samsung). This SSD will likely be the only upgradeable part in the machine.

To access internals, you'll need to remove three screws: one under the sheet battery connector rubber cover and two covers near the back edge of the machine. You can then remove the bottom cover, which wraps around the sides of the laptop. Since the CPU and RAM aren't upgradable, the only part of interest is the PCIe SSD drive, once aftermarket drives are available.

Website: www.sony.com

Price: starting at $1,249

Nothing about that is at all impressive. $1,250 for a crappy ultrabook with 6 hours of battery life doing minor tasks, and that's only if you set the screen brightness to 50%.

Haswell is not "the second coming"...
 


What happened to 10 hours of battery life? That one only gets 5 hours...and most of that comes from the bigger/more efficient battery. Also, when you take a number like 3 hours and turn it into 5 hours, that's a 67% increase, and it sounds like a HUGE number...but in reality it's not a dramatic increase.

For example, going from 1 hour of battery life to 2 hours is a 100% increase...but in the grander scheme of things that's only one more hour. Whereas in our example above, the 67% gained was 2 hours.

So, while 57.5% sounds ENORMOUS...it's all relative...and 5 hours of life on a laptop does not impress me. In fact, that's pretty average. I am guessing IB must've had horrible power consumption before, if they needed a 57.5% gain plus the additional battery capacity to get to 5 hours of battery life.
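
To put rough numbers on that "it's all relative" point, here's a minimal sketch (the 1→2 and 3→5 hour figures are just the illustrative examples above, not measurements):

[code]
# Percent gains vs. absolute hours gained; figures are the illustrative
# examples from the post above, not benchmark data.
def battery_gain(old_hours, new_hours):
    pct = (new_hours - old_hours) / old_hours * 100
    return pct, new_hours - old_hours

for old, new in [(1, 2), (3, 5)]:
    pct, extra = battery_gain(old, new)
    print(f"{old}h -> {new}h: +{pct:.0f}% on paper, but only +{extra}h of actual use")
# 1h -> 2h: +100% on paper, but only +1h of actual use
# 3h -> 5h: +67% on paper, but only +2h of actual use
[/code]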
 


The old laptops with big batteries claiming 8 hours of battery life were getting that at idle. No laptop, even in this day and age, gets 8 hours of battery life during peak usage...much less 10-12 hours.
 


Well, I got this [strike]old[/strike] BRAND NEW computer with THREE, count them, THREE Intel Pentium 4s in SLI that you might be interested in. It has 300% [strike]MOAR[/strike].

Lol.

I would also like to know about AMD's improvement in the mobile sector... I have an A6 laptop that runs Minecraft at ~50 fps on far render distance, and I was quite impressed, actually. Battery life is terrible though; not sure if Toshiba just uses bad batteries or if the chip really draws that much power.
 


The issue I have specifically with that number is precisely this:

The battery life figure compares the outgoing model to the new model, and they did some cocktail napkin math to extrapolate how much of the improvement came from the larger battery and how much came from the CPU. Additionally, they didn't take any of the other changes from the old model to the new model into consideration in their figure.

So, while Haswell may have improved battery life at idle, I am not sure that the improvement is really 57%...because several other review sites (notably not AnandTech.com) have seen battery improvements over IB in other laptop models that were actually quite a bit lower.

So either that improvement on that specific laptop is an anomaly and the real figure is much closer to a 30% across-the-board improvement in idle power consumption (which seems far more likely), or the rest of the review sites are wrong and AnandTech, who got 57% with cocktail napkin math, is correct.

I am not scoffing at 30% idle power consumption improvement, however, that doesn't translate into a real world 30% improvement in battery life under productivity work.
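
For what it's worth, here's a minimal sketch of how I'd separate the battery-capacity effect from the platform-efficiency effect; the runtimes and capacities below are made-up placeholders, not figures from any review:

[code]
# Normalize runtime to minutes per watt-hour so the bigger battery doesn't
# get credited to the CPU. All numbers are made-up placeholders.
def efficiency_gain_pct(old_runtime_min, old_capacity_wh,
                        new_runtime_min, new_capacity_wh):
    old_min_per_wh = old_runtime_min / old_capacity_wh
    new_min_per_wh = new_runtime_min / new_capacity_wh
    return (new_min_per_wh / old_min_per_wh - 1) * 100

headline = (500 / 300 - 1) * 100                    # ~67% if you only compare runtimes
normalized = efficiency_gain_pct(300, 45, 500, 54)  # ~39% once the larger battery is factored out
print(f"headline: +{headline:.0f}%, per watt-hour: +{normalized:.0f}%")
[/code]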

That's my point, perhaps I was unclear, if that's the case I apologize.

 
Haswell brings a massive battery life increase.

[battery life chart]


504 minutes is 8.4 hours.

MacBook Air

(Haswell also supports LPDDR3 which is used in the MBA 13")



More than 10 hours on a 54 watt hour battery. 47% increase



Nearly 9 hours on medium, 67% increase

Nearly 5.5 hours under heavy testing, 55% increase.



A massive 50%+ increase in battery life on the MBA (similar though not identical batteries) with comparable CPU performance (Haswell is very slightly slower) and slightly better GPU performance (about 10-15% better).
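
As a quick sanity check, you can back out the implied average platform draw from capacity and runtime. A minimal sketch; the 54 Wh / ~10 h figures are the ones quoted above, while the 2012 baseline numbers are assumptions for illustration:

[code]
# Implied average platform power = battery capacity / runtime.
def avg_draw_watts(capacity_wh, runtime_hours):
    return capacity_wh / runtime_hours

print(f"2013 MBA (quoted above): {avg_draw_watts(54, 10):.1f} W average")   # ~5.4 W
print(f"2012 MBA (assumed):      {avg_draw_watts(50, 6.8):.1f} W average")  # ~7.4 W
[/code]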

Other reviews

Tech Radar

http://www.techradar.com/reviews/pc-mac/laptops-portable-pcs/laptops-and-netbooks/11-inch-macbook-air-2013-1158387/review

11 incher
In our test, we set the screen brightness to around 50% and streamed the news channel on BBC iPlayer for an amazing seven hours, 15 minutes. This is more than double what its 2012 counterpart could manage.

Another Review

http://arstechnica.com/apple/2013/06/same-wrapper-all-new-candy-center-the-2013-macbook-air-reviewed/4/

ran two tests on both the 2013 and 2012 MacBook Airs with the screen brightness set to 50 percent and the keyboard backlight disabled. The first opens five webpages in Safari at the rate of one every 20 seconds, closes the window, and then starts over again while an MP3 is looped. This replicates a light-to-medium workload in which browsing, opening and closing apps, and occasional media consumption are the main activities. This is the kind of workload that Haswell really shines in. The 2012 Air lasted for five hours and 52 minutes under this workload, while the 2013 Air boosted this to nine hours and 48 minutes. Perform the same test without the looped MP3, and the 2012 Air lasts for seven hours 51 minutes while the 2013 Air goes for an amazing 11 hours and 49 minutes.

Looking around, you can certainly see that the battery life gains from Haswell for ULV parts are simply phenomenal.
 
(Haswell also supports LPDDR3 which is used in the MBA 13")

This is my point...it isn't just 4th-gen Intel getting those gains...there have been other system-wide optimizations to draw less power, and the combination yields greater battery life. It's not "Haswell decreased power consumption by 50%," because it didn't do that. Laptop manufacturers have put together a total package that draws less power.

For example...a Toshiba laptop with an AMD A8 APU gets 6 hours of battery life at peak usage. The same laptop from last year only got 5 hours. That's a 20% gain from improvements that had nothing to do with the CPU or the battery.

Notice no one is talking about how much Trinity mobile increased battery life year over year? Because there was no newly released product to go along with the other optimizations. Intel is just capitalizing on the normal progress of laptop platforms to make their CPUs look far more efficient than they really are.

This is desktop, but the results are the same...24% reduction in idle power consumption:

[chart: Haswell desktop idle power consumption]


Now, how can you sit back and claim that a 24% reduction in idle power consumption equates to a 50% increase in battery life? That doesn't add up at all. I would grant that a 20-25% increase in laptop battery life is because of Haswell...when battery life is measured at idle or during low-power tasks.
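
Here's the arithmetic behind that objection, as a minimal sketch (treating battery life as capacity divided by average platform draw; the 24% and 50% figures are the ones in dispute above):

[code]
# If runtime = capacity / average platform power, cutting platform power by a
# fraction r stretches runtime by 1/(1 - r) - 1.
def runtime_gain(power_reduction):
    return 1 / (1 - power_reduction) - 1

def reduction_needed(runtime_gain_target):
    return 1 - 1 / (1 + runtime_gain_target)

print(f"24% lower platform draw -> +{runtime_gain(0.24):.0%} runtime")         # ~+32%
print(f"+50% runtime needs {reduction_needed(0.50):.0%} lower platform draw")  # ~33%
[/code]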

Here is arstechnica talking about it:

http://arstechnica.com/gadgets/2013/06/the-u-is-for-ultrabook-intels-low-power-dual-core-haswell-cpus-unveiled/

All of this adds up to a set of improvements that don't radically change the Ultrabook form factor, but they may solve some of the problems keeping certain Ivy Bridge Ultrabooks and tablets from being great. Ivy Bridge tablets like the Surface Pro are the strongest case in favor of this argument, but the poor battery life of Acer's original Aspire S7 or Asus' twin-screened Taichi are also good examples. Haswell might not enable radical form-factor changes, but we suspect that several almost-great Ivy Bridge devices will be able to use Haswell to get the boost they need.

It seems that IB's power consumption in ULV was so lackluster that the improvement of this generation over the last is magnified, because there was so much room for improvement to begin with; combine that with other low-power optimizations and you get the results you posted above.
 


That chart you have there is amazing. On a desktop system a lot of power goes to non-essential needs (the mobo is larger and less efficient, fans, higher idle clocks, etc.; it's the reason a desktop GPU idles at 10 watts while the mobile version idles much lower at 2-3 watts, since it has a smaller PCB and, for the most part, no dedicated fan). On a mobile system the rest-of-system power use is MUCH lower than on a desktop (essentially take that chart and cut 15 watts off every figure), making the CPU a much larger power user proportionally.

That chart shows that, assuming the same mobo power consumption, Haswell idles using more than 10 watts less power on the CPU alone. That is pretty amazing.
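
To illustrate the proportion point, here's a minimal sketch (the 15 W rest-of-system difference and the ~10 W CPU saving are the figures above; the specific wattages are otherwise made up for illustration):

[code]
# The same absolute CPU saving matters more when the rest of the system
# draws less. Wattages below are illustrative, not measured.
def total_draw_reduction_pct(rest_w, cpu_old_w, cpu_new_w):
    old_total = rest_w + cpu_old_w
    new_total = rest_w + cpu_new_w
    return (old_total - new_total) / old_total * 100

cpu_old, cpu_new = 15, 5  # ~10 W idle saving on the CPU in both cases
print(f"desktop (rest of system ~20 W): -{total_draw_reduction_pct(20, cpu_old, cpu_new):.0f}% total draw")  # ~29%
print(f"laptop  (rest of system ~5 W):  -{total_draw_reduction_pct(5, cpu_old, cpu_new):.0f}% total draw")   # ~50%
[/code]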


 


AT IDLE there is lower power consumption; as soon as you load the CPU, the power draw is higher on the desktop parts. I haven't found mobile numbers for laptops with a similar type of breakdown...(it's quite odd that there's such a lack of detailed power usage data over given intervals).

I am extremely familiar with how mobile solutions work, everything is optimized for low power. As a matter of fact, there have been many advances in LPDDR memory and other areas recently that, combined with 4th gen Intel, make the entire package consume less power.

http://www.tomshardware.com/reviews/core-i7-4770k-haswell-review,3521-18.html

Look at those power consumption figures...and that's not the only one.

There are several others showing similar consumption numbers. So, unless the laptop CPUs are dramatically different (they aren't, btw)...the power-saving improvements shown in battery life are largely not the 4th-gen Intel CPU's doing (outside of idle power consumption improvements).
 