FX 8350 or i7 4770k


ANIR0X2K00L

Honorable
Mar 10, 2013
69
0
10,640
I need help with this topic. I am getting a new PC and I can wait till the end of this year. The problem is that I will have to keep this PC for the next 5 to 6 years. I can spend extra to get a 4770K when it releases, but from what I have heard, next-gen games will run better on AMD hardware and will require more cores. I play graphically intensive games like Battlefield 3, Crysis 2, and Crysis 3, and I will be getting Battlefield 4, so this should give you an idea of what type of games I play; all of these games need a good CPU, especially Crysis 3 and possibly Battlefield 4. Please just tell me whether I should go with the FX-8350 or the i7-4770K (when it releases in June), or whether I should wait for AMD's Steamroller or Intel's Broadwell. And please also list a good motherboard, like an Asus Maximus or Crosshair Formula or something better. One problem I have noticed is that the AMD chipset, i.e. the 990FX, doesn't have all the latest features.

Any help is greatly appreciated. Please understand that I don't want to make a mistake, because if I make one I will regret it for the next 5 to 6 years.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Most older games are single- or dual-threaded. This means, roughly, that they use about 12.5--25% of an eight-core FX chip, but about 25--50% of a quad-core i5/i7.
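To make those percentages concrete, here is a minimal back-of-the-envelope sketch (Python; simplified by assumption: it just divides fully busy game threads by physical cores and ignores Hyper-Threading, turbo, and the FX's shared FPUs):

    # Rough upper bound on how much of a chip a game can occupy:
    # fully busy threads divided by physical cores.
    def utilization(threads, cores):
        return min(threads / cores, 1.0)

    for label, cores in [("eight-core FX", 8), ("quad-core i5/i7", 4)]:
        for threads in (1, 2):
            print(f"{label}: {threads} thread(s) -> {utilization(threads, cores):.1%}")

    # eight-core FX: 1 thread(s) -> 12.5%
    # eight-core FX: 2 thread(s) -> 25.0%
    # quad-core i5/i7: 1 thread(s) -> 25.0%
    # quad-core i5/i7: 2 thread(s) -> 50.0%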

Future games will be heavily threaded, and thus will use the full potential of eight-core FX chips. Consoles such as the PS4 use eight-core chips from AMD, and the PS4 developer kit uses an eight-core FX chip... This is the reason why all triple-A game developers recommend the FX-8350 as the best CPU for future gaming. See the Eurogamer article.

If you plan to play only older games, then the i5 is probably the choice for you, because the i7's HT is useless for those games, and the Haswell 4770K ("Failwell") only provides about 1% more gaming performance than Ivy Bridge while being more expensive and drawing more power at the wall.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


And those multicore chips from Intel suffer performance-wise (bad multithreading scaling) and are power hungry. Intel's six-core Extreme chips consume far more power than AMD's eight-core designs.
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060




Unfortunately, perf/watt is really all that matters in the mobile space, and with a shrinking desktop CPU market, AMD is not in that good a position.

[Chart: x264 peak power consumption]

[Chart: x264 task energy]

They seem to be roughly equivalent in power consumption (differing test setups), except that the 3960X is much faster.



 
G

Guest

Guest
So what I'm hearing from the AMD guys is "if you want to buy the perfect gaming PC, buy a PS4 or whatever the new Xbox is called." Because if all you are doing is gaming, then why even buy a PC? Why not just spend $400 or $500 on a console that will play all the same games that an $800 AMD PC can play, and even some console exclusives? Nobody is going to buy an AMD PC for anything other than gaming. For the price of a current top-of-the-line AMD gaming system (8350 + 7990) you can get BOTH next-gen consoles, a few games, and a membership to LIVE.

Sure, you can upgrade the PC, but those upgrades are minimal. By the time the 8350 isn't able to run new games at full resolution and medium settings, a new console will be out and it will cost less to get one than to upgrade your PC.

The original poster wanted to know what the best all-around PC is between a 4770K and an 8350. Not the best one-dimensional gaming PC. Not the best value PC. The best, period.

"Advanced" Micro Devices, Inc. the company: This is purely about their business model. They are betting the farm on consoles, obviously. The way they do business now, they are making close to nothing on their GPUs and APUs just to undercut the competition. In fact, according to their stock chart, they are losing money by selling inferior-quality products at a very low retail price; a lot of money. In 2006, the common stock symbol AMD was worth $40 per share. As of today, it is worth a shade over $4. Just last year, it was $8. The stock is plummeting; they have been losing half their value every year. What do you think is going to happen if there is even a single problem with either console? Even if it doesn't have anything to do with an AMD component, the company will cease to exist as we know it. They will go the way of Atari and most likely be bought out by a giant tech name or disappear for good. Their market cap is a little under $3 billion, which sounds like a lot to us, but compared to Intel's $120 billion or Apple's $425 billion, it is nothing. Apple fluctuates by $3 billion or more each day sometimes. And this is the second-largest CPU company. They are betting it all on x86 consoles at a time when ARM-based mobile devices are completely dominating the personal computing sector. If you want to use the word "fail" creatively, how about in a way that makes sense. Fail is exactly where AMD is headed, and this discussion won't even be an issue in a few years. Less if there are YLODs and RRODs by the boatload with the next-gen consoles. Intel made a very smart move avoiding the console market altogether and giving AMD not one, but TWO chances to fail.

Sure, games will be heavily threaded in the future... but not CPU threads. This is the reason why even mid-range GPUs have nearly 1,000 shader cores. Those are the cores that matter in games. Having 8 or even 10 or 20 cores doesn't do much good when each individual core is weak. Intel doesn't need more than 4 cores in their CPUs because their individual cores are enough. Games do not "require" multiple cores. When a multicore CPU maxes out the resources of one core, it spreads the load to the other cores. This is why in current triple-A games a 3570 is only using 2 cores at most at any given time, and the processes in those two cores equal what the 8350 needs 3 or 4 cores to accomplish in the same amount of time. AMD's x86 instruction set is identical to Intel's x86 instruction set, since they are both 64-bit processors. Ironically, since AMD was the first to design x86-64 processors, Intel has to pay them royalties for it. So, yes, there are different "kinds" of x86, but the instruction set used in the new consoles is the same as the one used in both AMD and Intel PCs, since none of them are 32-bit.

 

elemein

Honorable
Mar 4, 2013
802
0
11,160


There were some very, very heavy-handed statements in this post, some with even a personality of their own. I feel like there needs to be some balancing here, and I have replied in bold.
 

8350rocks

Distinguished


But in the mobile space, AMD solutions have competitive power consumption...as they do in the server space as well. Why would you compare mobile solutions by showing desktop power consumption?
 

8350rocks

Distinguished


Erm, why would you buy an Intel PC under the same premise? You wouldn't.

> Sure, you can upgrade the PC, but those upgrades are minimal. By the time the 8350 isn't able to run new games at full resolution and medium settings, a new console will be out and it will cost less to get one than to upgrade your PC.

Intel hasn't given Nehalem owners a solid reason to upgrade yet...what's your point?

> The original poster wanted to know what the best all-around PC is between a 4770K and an 8350. Not the best one-dimensional gaming PC. Not the best value PC. The best, period.

Welcome to the conversation...did you just get here?

> "Advanced" Micro Devices, Inc. the company: This is purely about their business model. They are betting the farm on consoles, obviously. The way they do business now, they are making close to nothing on their GPUs and APUs just to undercut the competition. In fact, according to their stock chart, they are losing money by selling inferior-quality products at a very low retail price; a lot of money. In 2006, the common stock symbol AMD was worth $40 per share. As of today, it is worth a shade over $4. Just last year, it was $8. The stock is plummeting; they have been losing half their value every year. What do you think is going to happen if there is even a single problem with either console? Even if it doesn't have anything to do with an AMD component, the company will cease to exist as we know it. They will go the way of Atari and most likely be bought out by a giant tech name or disappear for good. Their market cap is a little under $3 billion, which sounds like a lot to us, but compared to Intel's $120 billion or Apple's $425 billion, it is nothing. Apple fluctuates by $3 billion or more each day sometimes. And this is the second-largest CPU company. They are betting it all on x86 consoles at a time when ARM-based mobile devices are completely dominating the personal computing sector. If you want to use the word "fail" creatively, how about in a way that makes sense. Fail is exactly where AMD is headed, and this discussion won't even be an issue in a few years. Less if there are YLODs and RRODs by the boatload with the next-gen consoles. Intel made a very smart move avoiding the console market altogether and giving AMD not one, but TWO chances to fail.

If AMD is headed for failure, then woe be to Intel...they're commanding market share in a saturated market with stagnant growth and offer uncompetitive products in most other arenas, with the exception of servers. They also don't offer a dedicated GPU solution at all and have no diversity. They're just able to keep the dam propped up for the time being, staving off the flood.

Additionally, Intel wanted the console market...badly...and so did Nvidia, which is why Nvidia had such bitter comments after Sony chose the AMD integrated solution. Intel has been tight-lipped, but don't think for one second they're happy they lost. It wasn't Intel's choice to make; Sony chose AMD.

Additionally, AMD's stock price has fluctuated heavily over the years...that has nothing to do with anything. Their stock price has declined, but recently it's up double what it was 2 weeks ago...now what? You know that back in 2008 Ford stock fell from $60/share to below $2/share, right? Where is it now? Would you have bet they would fail?

> Sure, games will be heavily threaded in the future... but not CPU threads. This is the reason why even mid-range GPUs have nearly 1,000 shader cores. Those are the cores that matter in games. Having 8 or even 10 or 20 cores doesn't do much good when each individual core is weak. Intel doesn't need more than 4 cores in their CPUs because their individual cores are enough. Games do not "require" multiple cores. When a multicore CPU maxes out the resources of one core, it spreads the load to the other cores. This is why in current triple-A games a 3570 is only using 2 cores at most at any given time, and the processes in those two cores equal what the 8350 needs 3 or 4 cores to accomplish in the same amount of time. AMD's x86 instruction set is identical to Intel's x86 instruction set, since they are both 64-bit processors. Ironically, since AMD was the first to design x86-64 processors, Intel has to pay them royalties for it. So, yes, there are different "kinds" of x86, but the instruction set used in the new consoles is the same as the one used in both AMD and Intel PCs, since none of them are 32-bit.

This is categorically wrong; PC game ports will be coded for 8 x86 cores...since that's what is in the consoles. Why would you gimp your game by not utilizing every last bit of capability available to you? You wouldn't. It might take game developers a year or 2 to figure out how to wring the maximum possible performance out of the new consoles...but don't think for one second they're going to treat PC ports as "special" to accommodate Intel products. As a matter of fact...most of the game developers I know like AMD a lot better than Intel. A great many of them want to utilize AMD's strengths, because this technology is now starting to move toward a future with enormous possibilities. If Intel has their way...the future will keep looking back at the past. Innovation is the way, and Intel has been caught sleeping.

EDIT: FYI, the newest PC ports will not be coded for HTT...it would be a superfluous waste of code for a game coded to run on a system that doesn't use HTT.

 
G

Guest

Guest
No, if I have an Intel PC, I'd buy a console as well, which is about the price of a video card. No one who takes computers seriously uses them only for playing games that 13-year-olds play. Nobody would buy an AMD PC AND a console, because according to you they are exactly the same thing.

What's my point, what's my point? Hmmm... Intel is better than AMD in every possible way, period, and your posts do nothing to prove otherwise.

Like I said, there is no such thing as "8 x86". Thanks for proving you have no idea how a CPU works.

Your "proof" is that you think game makers would want to do something, and why not, right? This is also known as speculation.

My proof is that a single core can in theory be more powerful than a 100-core CPU, which is fact. More cores never means more power. Ever. Nothing is "coded" for multiple cores. Where off the top of your head do you get this stuff from? Any computer engineer knows that. If one core can do the job, then that is all that is needed. Intel's current consumer-level chips have 4 good cores. AMD's current chips have 6 or 8 gimpy cores. A computer program might be advanced enough that it requires a certain CPU to use all 8 cores, but if a well-designed single-core CPU can execute all the instructions in the same amount of time as the 8-core, then the program isn't going to crash and wonder where the other 7 cores are. That is simply not how it works, nor will it ever be like that. Sure, that single-core chip will need about as many transistors as the 8-core has, but there is no need to divide the workload. In fact, it would be more efficient if the CPU was working as a single unit. Here is a surprising fact: the 3770K has approx. 1.4 billion transistors; the 8350 has 1.2 billion. This is how the 3770K and 3570K make up for a lack of cores. Not only do they have more transistors, they are better 3D transistors, which you deny for some reason, and which are more efficient because they produce less leakage, are nearly twice as fast, and use less power than the plain-Jane transistors found in the 8350. Intel is already selling CPUs with 3D transistors at a time when all the other silicon companies, including AMD, have yet to even produce a working prototype or have basically given up on the idea completely (which AMD has). Core count doesn't matter.

Nobody is talking about hyperthreading. Nor is the main function of a PC playing games. You are obviously some basement-dwelling gamer who has no idea how the AMD PC you built works. You may know how to snap a few boards together, install an OS, and download and install pirated games (because why else would anyone want an AMD PC over a console?), but you don't know a thing about how it actually works. I'm not a computer engineer, but at least I don't make things up simply to win a meaningless argument on an internet forum. If you really think that AMD is better, then good for you. Keep buying their products as long as they make them, because it isn't going to be for much longer.

You tell me what you think is going to happen, what might happen. I show you actual facts and things that are currently happening or have already happened. Why do I get the feeling the next reply will be more of the same?
 
G

Guest

Guest
By the time AMD comes out with a mass-produced 3D-transistor CPU, Intel will have probably released a quantum CPU. Thanks, your logic is rubbing off on me.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


It is not exactly true that one Intel core is much stronger than one FX core. You repeat this very often, so let me say that if the cores are working in 'isolation' from one another, that is roughly correct; but when the cores are working together as a whole, each Intel core loses performance due to poor scaling. 8350rocks wrote some useful analogies; I recommend you read them.

The bad scaling of Intel chips is also the reason why Xeon-based chips are/were unpopular in supercomputers. The most powerful supercomputer, Titan, uses AMD Opterons.

In any case, your single-core argument does not invalidate my point. I see a CPU as a whole, which has been designed as a whole.

HT is not magic sauce. I said it does not work for certain kinds of games, and you ignore that with your claim. Why not explain your point to someone who has to disable HT because of the performance regression? No wait... all of this was explained to you before, and you were given links showing how HT can increase or decrease performance depending on the specific situation.

According to the review cited, the Haswell 4770K gives 1% more average performance in games, but performance regressions of up to 5% in some specific games, while increasing power consumption by about 10%. Add the USB3 issues and the PSU incompatibilities and you will start to understand why some people here are calling it "Failwell", "Hasbeen",...
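Taking those quoted figures at face value (about +1% performance for about +10% power; these are the review's numbers as cited above, not independent measurements), the perf/watt change works out roughly like this:

    # Perf/watt if performance rises ~1% while power rises ~10%
    # (figures quoted from the review above; illustrative only).
    perf_gain = 1.01
    power_gain = 1.10
    ratio = perf_gain / power_gain
    print(f"relative perf/watt: {ratio:.3f} ({(ratio - 1) * 100:+.1f}%)")
    # relative perf/watt: 0.918 (-8.2%)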

Anyone familiar with CPUs knows the power-consumption argument used by anti-AMD people. The same people are now trying to dismiss power consumption as irrelevant, because Haswell increases consumption. LOL

Funny that you appeal to the mobile space and then give power consumption figures for desktop processors. More LOL

The figures that you give are from a review where they used every possible trick to increase the power consumption of the AMD chips while cutting down their performance. It has been a while since I analyzed their review, but I recall those figures perfectly, because this is not the first time someone has quoted them to me. The last time was a guy who is very well known for his Intel bias and fantastic claims: he assures anyone he meets that his FX consumes 200W more than his i7, which is nonsense.

I am writing this from memory and maybe some detail is wrong, but overall it is right. This is for the FX-8350 and the i7-3770K.

They selected the most power-hungry motherboard (Asus) for the AMD chips and the most power-efficient motherboard (MSI) for Intel. Even at idle, the difference between the Asus AM3+ and the MSI AM3+ was about 10W, and it was about the same for the Asus 1155 vs the MSI 1155. With power-saving settings selected in the BIOS, the difference between the Asus AM3+ and the MSI AM3+ was larger still. The review does not explain what configuration they used, but I would not be surprised if they selected power saving for the Intel chips and not for the AMD ones.

They selected a high-performance G.Skill memory kit for Intel, but an "Entertainment" Patriot kit for AMD. Moreover, the AMD chip ran with memory 20% under the stock speed, generating an artificial performance gain for Intel.

They used Windows 7 and installed the FX hotfixes manually. As shown by Tom's, the hotfixes introduced a performance regression for AMD chips (2% on average, but larger for video tasks). Moreover, the patches had a bug that affected the power consumption of FX chips, because it blocked some power-saving states. According to Tom's, the hotfixes increased the power consumption of FX chips by 7% on average.

That is how they managed to get the FX to run slower while consuming more power. A while ago I did an estimation of the real power consumption and task energy, and if my memory does not fail me, the FX moved from 8.1 to near 6.0.
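For anyone unfamiliar with the task-energy metric being argued over here: it is just average power draw multiplied by how long the task takes, so a chip that pulls more watts but finishes sooner can still use less energy per task. A minimal sketch of the calculation (the wattages and durations are invented for illustration, not measurements of any specific chip):

    # Task energy (Wh) = average power (W) x duration (s) / 3600.
    # All numbers below are invented for illustration only.
    def task_energy_wh(avg_power_w, duration_s):
        return avg_power_w * duration_s / 3600.0

    print(task_energy_wh(125, 200))  # 125 W for 200 s -> ~6.9 Wh
    print(task_energy_wh(95, 300))   # 95 W for 300 s  -> ~7.9 Wh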

I don't recall the 3960X's setup now, but wasn't it using memory overclocked by 40% to increase its performance?
 

elemein

Honorable
Mar 4, 2013
802
0
11,160


You have shown a very, very limited understanding of hardware, software, and business in this post. Your closed-mindedness and fanboyism have poisoned you. Arguing or reasoning with you is pointless. I will make factual statements shortly; if you do not agree, you are pushing away FACTS. That is nonsensical.

- Intel is not better than AMD in every way. Let's put an AMD 7990 vs an HD 4000 and see who wins.
- If you were referring to processors strictly, I point you to the mobile sector. While heat and power consumption may not be AMD's forte, they absolutely SLAUGHTER Intel processors in IGP power; it is truly shameful to watch an HD 4000 vs a 7660G. Embarrassing, actually.
- If you meant processors down to the x86 level, I refer you to try an Intel Atom N2800 alongside an AMD C-70. I dare you to say the Atom is better. I outright dare you.
- It is not speculation to think that console developers will use all 8 cores at their disposal. It is common sense. They will do this, I guarantee it.
- If a console costs $500, a $500 AMD PC will outperform the console in usability, practicality, and performance.
- Your understanding of transistors is... menial at best. A 3D transistor is NOT faster than a "plain Jane" one. They are more power efficient and run cooler, which theoretically lets them clock higher. There is nothing that makes them inherently faster, however. If two processors are 100% identical except that one uses 3D and the other uses conventional transistors, they will perform identically. The 3D one will be able to get by with less voltage, however, leading to plenty of benefits. Speed isn't one of them.
- AMD will be around for a long time. It is indisputable.
- Software can be coded for cores. Multiples of them. I can give you an experiment to try this out too, if you have $50 and like robots and PICAXE chips and have a passion for electronics, but the point is: software can be written for, and utilize, multiple cores. You won't see it done in entry-level Java and C# programming, but you can do this. You can certainly assign different tasks to each core. For example, in a game like CoD, one core can handle hard-body physics and manage projectiles and vectoring, while another core can perform mapping, communications, and transactional calculations like health and stamina. The thing is: this is actually how games ARE coded. Now what if I want to run this game on a single core? By your logic, my game would crash as I'm "missing" a core. Maybe that's where your logical understanding of coding comes from, but this isn't how it works. Instead of crashing, the program would perform ALL the tasks on that single core, at the OS level (see the sketch below). This is how it works.
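A minimal sketch of that last point (Python, with hypothetical subsystem names; the OS scheduler decides where the threads actually run, so the same code works on eight cores or on one, just more slowly on one):

    import threading

    # Hypothetical game subsystems, each given its own worker thread.
    # On an 8-core chip the OS can place these on separate cores; on a
    # single core it time-slices them. Either way the program runs --
    # it never crashes looking for "missing" cores.
    def physics():  print("physics: hard bodies, projectiles, vectoring")
    def world():    print("world: mapping, communications")
    def gameplay(): print("gameplay: health, stamina, transactions")

    threads = [threading.Thread(target=f) for f in (physics, world, gameplay)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()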

Any questions?
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060


 

8350rocks

Distinguished


As pointed out just above me...you have ZERO understanding of the concepts being discussed.

You have taken a non-technical source somewhere, and tried to turn it into a technical exposition.

So, show me your facts.

Instead of talking about all this nonsensical rubbish, which you clearly believe, supply facts. Show evidence that even 10% of what you're saying is true.

You can't do it, but I invite you to try...then you can go back under your troll bridge and start re-reading "Hardware for Dummies" or whatever it was you thought made you an expert.

You, sir, clearly are not grasping the concepts being discussed.

EDIT:

Facts:

Intel's i7-3770K has more transistors because it has built-in graphics; the FX CPU does not.

Intel uses planar FinFET; where are you coming up with "3D transistors"? They don't have that technology yet.

AMD FX CPUs are more powerful than the console CPUs, but they get bogged down running the Windows OS. Consoles don't have such interference, so lesser hardware can be more efficient under those circumstances.

More cores means more threads. The FX-8350 can run up to 12 threads concurrently...the i7-3770K can run as many as 8.

A single-core CPU can be powerful on single threads, but if you have software coded to run 100 threads...the 100-core CPU will obliterate the single-core CPU.
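For readers who want numbers behind that 100-thread point, Amdahl's law gives the scaling ceiling; a quick sketch with made-up serial fractions (not benchmark data):

    # Amdahl's law: speedup = 1 / (serial + (1 - serial) / cores).
    # The serial fractions below are invented for illustration.
    def speedup(serial, cores):
        return 1.0 / (serial + (1.0 - serial) / cores)

    for serial in (0.0, 0.05, 0.5):
        print(f"serial = {serial:.0%}: 100 cores -> {speedup(serial, 100):.1f}x")

    # serial = 0%: 100 cores -> 100.0x (the "obliterate" case)
    # serial = 5%: 100 cores -> 16.8x
    # serial = 50%: 100 cores -> 2.0x (weakly threaded code barely gains)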

Questions:

Think I am lying? Then why does the FX-8350 outperform the i5-3570K and even the i7-3770K in Crysis 3, which uses many cores?

If AMD has 6 or 8 "gimpy" cores...then why do they beat many Intel products that don't have "gimpy" cores?

If AMD's product is inferior...why are they having so many design wins on multiple fronts where Intel and Nvidia were offering competing hardware?
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060

 

SomeNickname

Honorable
Nov 17, 2012
13
0
10,510

1 disabled because of yield? That's like selling every dual-core as a quad-core: "well, it does have 4 cores, but 2 have been disabled because of yield." And the big plus of consoles is that they don't really need an OS, just firmware. Yes, with all the social media stuff going on at the moment there might be more background processes, but still not enough to justify designating a whole core just for that.


So... you actually think that games coded for consoles, which always run into resource limits a few years after release, will only use part of the resources because the developers can't be bothered to use all of them? All you need is a nice engine with good threading. If you don't want to code/fund one, license one (e.g. CryEngine). If you produce a game so detailed that you need that power, you can afford some way to make use of all the cores.
 

elemein

Honorable
Mar 4, 2013
802
0
11,160


This could get messy using multiple bolds, so I'll just cut them out and reply individually:

> The two aren't in the same category.

Trying to point out that Intel has no dedicated GPU category whatsoever: Intel cannot possibly be better in every way than AMD, as they cannot compete at all in the dedicated GPU market. That was the purpose of that statement, and it is also a fact.

> Slaughter? Not really. The 7660G is around 30-40% better than a non-ULV HD 4000 in most games. Intel generally has unpredictable gaming performance. Sometimes the HD 4000 is almost as good as the 7660G, sometimes it's a lot worse. Not to mention that Intel slaughters them in CPU performance.

30-40% is indeed a slaughter; considering that every generation of CPU and GPU is only about 10% better overall than its predecessor, the HD 4000 is a comfortable 3-4 generations behind. That's also comparing it in the 3630QM; now take the i3-3110M, which is in the same price range as the A10-4600M but runs the HD 4000 at an even lower clock, and compare again. The results are even more staggering. It is a sound win for AMD.
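A quick compounding check on that 3-4 generations figure, taking the roughly-10%-per-generation estimate above at face value:

    # Compound a ~10% per-generation improvement (estimate quoted above).
    for gens in (1, 2, 3, 4):
        print(f"{gens} generation(s) behind -> {(1.10 ** gens - 1) * 100:.0f}% gap")
    # 1 -> 10%, 2 -> 21%, 3 -> 33%, 4 -> 46%  (a 30-40% gap ~= 3-4 generations)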
> Not really sure about this, but here is a review between the E-450 and the D2700:
> http://uk.hardware.info/reviews/2700/cpu-shoot-out-intel-atom-d2700-vs-amd-e-450
> Fairly even.

These two processors are not in the same category and serve different purposes as well. The E-450 is a ULV processor used in everything from small notebooks to full desktop PCs (I've seen it before... it's saddening...), whilst the D2700 was meant for an HTPC or a low-power server. Moreover, the E-450 is ULV, whilst the D2700 is an Atom, a power level even lower, and the E-450 was meant as a general-purpose APU. And whilst it IS impressive for the D2700 to come remotely close to the E-450 in CPU performance, the IGP in the E-450 genuinely slaughters (by much more than 30-40% in this case) the IGP in the D2700.

> I highly doubt this will be true at release when you look at the big picture.

I believe it will be. Upon release, feel free to PM me and I'll make a build list of parts that will beat the PS4 or Xbox Infinity/720. All I have to beat is an 8-core Jaguar; I don't think that'll be too much of a challenge.

> Yes they will, but will they have 8 cores at their disposal? Probably not. 1 will likely be reserved for the OS and one possibly disabled for yield reasons. Not to mention that those are weak cores compared to any desktop.

Actually, yes, the OS will have a dedicated CPU completely apart from the 8-core Jaguar. It will be an ARM chip, IIRC.

> No doubt that software can be written for more than one core, but it can be difficult and time-consuming to do so.

True, though it is very, very necessary in complicated software like games, and it is done very often in the console space.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


An important note here. As Mark Cerny has explained on several occasions, Sony let developers choose the hardware of the PS4, and the developers chose 8 cores rather than 4. It is fair to think that these same developers are the ones who now claim that the FX is the better CPU for future gaming.
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060


 


elemein

Honorable
Mar 4, 2013
802
0
11,160


Again, this is gonna get messy as all hell... just snipping out what I need to reply to. Also, I'm very glad that we can keep this conversation very civil. Kudos to you for not having a massive e-peen.

> LOL what, 3 generations behind? In mobile, Arrandale -> HD 3000 was 100% better, -> HD 4000 was 40% better. That's really one generation, and Haswell is going to close the gap almost all the way (disregarding GT3).
> The 3630QM runs the IGP at 1150 MHz; the i3-3110M runs it at 1100 MHz (but with less cache). It's a sound win for AMD, true, but the difference is only that great for the A10; the A8, with 2/3 the shaders, is pretty much equal to the HD 4000, and at ULV levels the HD 4000 is surprisingly competitive.

Fair enough when comparing IGP generations. Also, Intel is definitely a very, very strong competitor when you pit ULV Intel vs ULV AMD (A-series and E-series): the A-series IGPs at ULV level are only a bit better than the HD 4000 at low clocks, and the E-series is actually beaten. I give ULV Intel CPUs the ribbon for all-around power; the only thing that disappoints me about them is their price. E-series and A-series APUs are available at a significantly lower price than the Intel i3-3217U or i5-3317U-- but that's beside the point. I agree that Intel is, overall, better in the ULV mobile sector. Fair point. However, it is NOT fair to say that Haswell will close the gap between the HD 4000 and the 7660G. First and foremost, Haswell will have the HD 4600, HD 5000, HD 5200, and HD... something else that'll be used in the desktop. There will be many levels, so it'll be hard to compare. But more so, the real reason they're not comparable is: Richland and Kabini. They'll bring along new IGPs that, I'll confidently say, will soundly beat Intel yet again in IGP power. I'm confident in it.

> The review also showed power consumption, which was pretty much equal. That was why I picked that comparison. Atom and Bobcat compete quite well in straight CPU performance. Bobcat's GPU kills Atom's, though.

When just comparing power consumption, sure, they do get a little bit closer. But if you set aside power consumption and CPU power, the E-series is significantly better at, well... everything else. Price, IGP, availability, everything else really.

> Yeah, I know this; it's just that given all the problems devs are having now, more work is not what they need. The Xbox 360 is tri-core (dual-threaded) and the PS3 is one main processing unit plus 6 extras.

I feel this statement isn't really a good argument. While it is true that it takes more time to write multicore software, the fact of the matter is: it'll surely happen, no doubt. PS4 and Xbox 720 developers will fight tooth and nail to squeeze every last Hz out of the IGP and CPU. Laziness will be a factor, but if laziness were as big a deal as you say, every single console (and computer, for that matter) would be a single core or a dual core. Nothing more.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
You have a bad habit of replying without formatting the message adequately, mixing everything up. OK, I will do the same when replying to you.



 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Then you are not interested in the Haswell 4770K, because it has poorer performance per watt than Ivy.



If Intel managed to make you believe this nonsense, then it's no wonder they are laughing.
 

8350rocks

Distinguished


If you honestly believe the i7-3970X is a "low end" product...then there is no point in further conversation with you.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
Since you do not format properly, I will reply to you in the same way.