AMD Plans New FX Processors to Go Up Against Ivy Bridge

[citation][nom]Earnie[/nom]I get a kick out of comments like this, they only come out after their (Nvidia) new card was released. Where was this comment last week? Not to mention, without AMD in the processor market, we'd all be enjoying our $500 i3s.[/citation]

Last week AMD claimed victory by comparing the 7970 to the 580, a card that's over 18 months old and last gen... apples and oranges. Now the current-gen card is out with the 680... 'nuff said.
 
So basically this article should be titled "AMD will not keep up with Ivy Bridge with their new FX releases." For that matter, they can't even keep up with Deneb; the FX series is a total and absolute failure.
 
Wow, it is amazing how foolish all you Intel fanboys are. There is really no practical use for the fastest Intel chip unless you are making Star Wars or something. I mean, as I stated earlier, my BD runs fast. All games I play run on ultra in 3D with a 570. What is so hard to understand? Now, having said that, I am impressed with Intel tech, pretty awesome stuff. But very expensive. Right now I have my BD running at 4.6GHz with the H100; it idles at 19C and never goes above 46C. I will say again, it plays ALL GAMES I HAVE at ultra in 3D. How exactly does that suck? Yes, BD is not the best chip in the world, but I got mine for under 300 bucks and it has more than enough power. And if I could afford it, I would get the best chip possible just because it's cool. But since I enjoy traveling, and having sex while traveling, and Euro women are expensive, I keep my builds to a reasonable budget, which, by the way, PLAYS ALL GAMES AT ULTRA IN 3D NO PROBLEM.
 
Whatever! I own an AMD processor and I'm happy with it; however, it will probably be my last. My i5-based laptop with Hyper-Threading on performs as well as, if not better than, my six-core Phenom II. So AMD's mouth shouldn't write checks their ass can't cash.
 
Last time I checked, a motherboard plus an AMD A8-3850 costs less than Intel's SB i5-750.
And for those who actually like the fact that the underdog has problems: think again about what you'll be buying from Intel if AMD is gone, and for what price.

[citation][nom]jurassic512[/nom]I get a kick out of phony predictions. If prices went that far up, the majority of consumers would not upgrade their computers for MUCH longer, forcing new OSes to come once every 5 years instead of every 2, thereby leaving Intel and Microsoft hanging by a thread, and those are just two out of MANY players that would be affected. Look at the price of gas. Do you think people are using it like it's still $2/gallon, or do you think they are being more conservative? And gas is more essential than cutting-edge desktop PCs.[/citation]

How old are you, kid? Haven't we been there in the '90s?
Oh, and there goes overclocking. K-series CPUs not costing much more is just for a while.
 
So much for "it's not about AMD vs Intel anymore."

But don't give up, AMD! As badly as you've been doing lately, we still need you to prevent Intel from inflating into a mega-monopoly!
 
[citation][nom]reconviperone1[/nom]Whatever! I own an AMD processor and I'm happy with it; however, it will probably be my last. My i5-based laptop with Hyper-Threading on performs as well as, if not better than, my six-core Phenom II. So AMD's mouth shouldn't write checks their ass can't cash.[/citation]
[citation][nom]ohim[/nom]Misleading title is misleading, but seeing people comment around here about the Core i3, you must be a marketing idiot and a fanatical Intel fanboy to even try to compare the i3, whatever the version, with Bulldozer. The i3 can't even keep pace with a Phenom II while multitasking. Yeah, Intel is good at the top, but below the i5 line it's crappier than what AMD has to offer. And here I'm not talking about games, where the charts are about a few fps.[/citation]

Phenom II X4s enjoy a significant lead in highly threaded performance over a Sandy i3, but the FX-4100 is barely above the i3s in highly threaded performance, and the Ivy Bridge i3s look like they will match the FX-4xxx CPUs there. Certainly not great, but it is not crap. The i3s are also a lot faster for dual- and single-threaded work, so they win significantly in dual/single-threaded games while matching the FX quad cores in highly threaded work.
 
[citation][nom]jurassic512[/nom]Even with die shrinks (ticks), more than just clocks and TDP are changed. They make architectural changes/tweaks as well, e.g. QuickSync 2.0 and Intel HD Graphics 4000, which has been said to be on par with current Llano APUs in terms of graphics.[/citation]

The HD 4000 is supposed to be 60% faster than the HD 3000 on equivalent processors (different processors, even from the same family, have different clock frequencies on their HD graphics, so the HD 3000 on the 2600K and on the 2500K actually perform somewhat differently), so it just catches up to, and maybe surpasses, the A4's graphics. The A8s are supposed to be something like 300% or 325% faster than the HD graphics on the i3s, so the HD 4000 probably doesn't even come close despite its advantage over its predecessor. It shouldn't even beat the A6's graphics and shouldn't touch the A8s.
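
To put those multipliers side by side, here's a quick sketch chaining the rough figures quoted above (all of them are this thread's approximations, not benchmarks):

[code]
# Rough relative iGPU performance, normalized to the Sandy Bridge HD 3000.
# All multipliers are the approximations quoted above, not measurements.
hd3000 = 1.0
hd4000 = hd3000 * 1.6    # "supposed to be 60% faster" than the HD 3000
a8_igp = hd3000 * 4.0    # "300% or 325% faster" than the i3's HD graphics, so ~4.0-4.25x

print(f"HD 4000 vs HD 3000: {hd4000 / hd3000:.2f}x")
print(f"A8 vs HD 4000:      {a8_igp / hd4000:.2f}x")   # ~2.5x, so the HD 4000 is still far behind
[/code]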
 
[citation][nom]dragonsqrrl[/nom]AMD has a long history of platform upgradeability, something Intel has started doing only recently. I agree that the number of viable upgrade paths to Bulldozer is fairly limited, but it's still an option available to those who want it.[/citation]

Intel processors all used the same LGA 775 socket for years. Most of the P4s use it, and all of the Core 2 series uses it except for the Extremes, some of which (or was it only one of them?) use the LGA 771 socket for dual-processor compatibility. It wasn't until Nehalem that Intel shied away from this. Sure, some of the old motherboards didn't run the newer processors, but many of the newest LGA 775 motherboards are still compatible all the way down to the first LGA 775 P4s. Intel, for whatever reason, decided to keep Nehalem-based processors locked onto their own sockets, LGA 1156 and LGA 1366. However, the current LGA 1155 socket is good for two generations, the Sandy and Ivy Bridge architectures, so Intel decided to give us a little leeway. It kinda sucks how the next architecture, Haswell, uses a different socket (LGA 1150, if I remember correctly), but the changes at that point are well deserving of it, like the changes made from LGA 1366 to LGA 2011.

Sure, AMD has had more consistent cross-compatibility, but to say that Intel only recently started offering it is completely wrong. In fact, it's only within the last few years that Intel abandoned cross-compatibility by switching sockets. However, Intel makes changes that make a new socket reasonable, so the change from LGA 775 to LGA 1156 was kinda justified. I don't know why we went from LGA 1156 to LGA 1155, but the other changes have had good reasons. We saw the northbridge getting integrated into the processor, improving memory latency significantly (among other things), but then a lot of pins had to be added to the socket for the memory. LGA 2011 sees 40 PCIe lanes integrated into the processor, and that's also a lot of pins. LGA 2011 also adds another two memory channels, necessitating more pins still.

AMD integrated the memory controller long before Intel did (AMD did it back in 2003 with the Athlon 64/FX families, if I remember correctly, and Intel didn't until Nehalem in 2008 or 2009) and hasn't made many changes there since, so they didn't need a new socket for that. AMD also isn't adding PCIe lanes to the processor, so no big changes there either. AMD's processor changes don't seem to be very major for communication outside the processor, so they don't need to change sockets completely as often as Intel does.

The point of amuffin's post (as I see it, at least) is that Intel has easy and large upgrade paths available from the start. The i3s are already very capable gaming processors, but the i5s and i7s go even farther above them despite being from the same family. Basically, you can get a high-end CPU, then upgrade to an even higher-end CPU if you want to. With AMD, the higher-end CPUs right now are fairly parallel upgrades for gaming. Want to go from an FX-4100 to an 8150 to get a 100% performance boost? Well, in games it's not going to make much difference at all, so you might as well stick with the lowest-end option you can.

With Intel, there are CPUs that are literally tiers above their already very good low-end offerings. Want a 50-100% performance increase over an i3? Get an i5 or i7 and overclock it to around 5GHz, and you've got that huge performance boost you wanted. Sure, you also need a massive graphics system to pair with such CPU performance, but it's an option, and an incredibly future-proofed option at that.

There are no games that come to mind (except maybe BF3 with 64 players or something like that) where the i3s are really hammered by anything in AMD's FX lineup. There are no games where the i5s are killed (usually not even close) by the Phenom II lineup. The i3s are good for even 6970 or GTX 580/Radeon 7950/7870-class cards. AMD's best processors start to become bottlenecks with such cards in resource-intensive games, and cards like those aren't really topped out by AMD's low- and mid-range options anyway.
 
[citation][nom]zeratul600[/nom]I'm an AMD fanboy, but they can't compete with Intel anymore??? Why? Could someone explain it to me? I even remember the very few months when AMD was ahead of Intel with x64 processors and dual cores while Intel was pushing 10GHz processors at 300 degrees... I know that AMD is doing great with the GPU thing, and APUs for entry level, but I just don't understand how AMD got so far behind Intel??? I don't like it, but if I have to buy a new PC, it will be an Ivy Bridge (I'm waiting), and I'll do it with a tear on my face... AMD, you were the chosen one![/citation]

Since Nehalem came out several years ago, AMD has never beaten it, while Intel has beaten it twice: once with Sandy and again with Ivy. Nehalem is ahead of Phenom II, which is ahead of FX. Sandy is far ahead of Nehalem, and Ivy is somewhat ahead of Sandy. There are several tiers of architectures between AMD's best from years ago and Intel's current lineup, let alone Intel's architectures from years ago. AMD has been behind ever since Core 2 back in 2006, but the gap widened every year after that as Intel released consistently better architectures while AMD stumbled around. After Athlon 64/FX came Phenom, which was junk; then Phenom II came out and matched Core 2 in performance while Intel had Nehalem. By the time AMD came out with a new architecture, Bulldozer (the new FX), Intel had already gotten another architecture out (Sandy) that beat Nehalem significantly. Intel was already most of the way done with their next architecture, Ivy, and it comes out soon. AMD has stagnated and made mistake after mistake. That is how they got so far behind. Right now, Sandy is usually around 50% faster clock for clock than FX and around 40% faster clock for clock than Phenom II.
 
[citation][nom]Russosaur[/nom]I gave up on comparing the 2 chips, honestly. I have come to a conclusion: for the price, AMD is your best friend; if you have deep pockets and want great performance, go Intel.[/citation]

Intel has just as much, often more, performance for the price outside of the sub-$100 range. The i3s are better than FX for most games and on par for the others, but the FX-4100 uses around 75% more power than the i3s, and the FX-4170 uses far more, bringing that difference from 75% to well over 150% more than the current i3s. These power usage differences are so great that even though the 4100 is cheaper than an i3 up front, it becomes more expensive over a three-year period (a common amount of time to keep a processor).
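
As a rough sanity check on that three-year claim, here's a sketch with illustrative numbers; the prices, wattage gap, usage hours, and electricity rate are all assumptions, not measurements:

[code]
# Hypothetical three-year cost comparison: FX-4100 vs. Sandy Bridge i3.
# Every input below is an assumption for illustration, not a measured figure.
fx_price, i3_price = 110.0, 125.0   # assumed street prices in USD
extra_watts = 50.0                  # assumed average extra draw of the FX
hours_per_day = 4.0                 # assumed daily usage
rate_per_kwh = 0.12                 # assumed US electricity rate, USD/kWh
years = 3

extra_kwh = extra_watts / 1000.0 * hours_per_day * 365 * years
extra_cost = extra_kwh * rate_per_kwh
print(f"Extra electricity over {years} years: {extra_kwh:.0f} kWh = ${extra_cost:.2f}")
print(f"FX-4100 total: ${fx_price + extra_cost:.2f} vs. i3 total: ${i3_price:.2f}")
[/code]

With these particular numbers the FX ends up around $136 against $125 for the i3, so the up-front savings do evaporate; plug in your own rate and hours to see where the crossover lands.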

The Ivy Bridge i3s should match the FX-4100 in highly threaded performance and beat it greatly in lightly threaded performance, all while using half the power. The difference between the Ivy i3s and the FX-4170 will be even greater. Keep in mind that the Ivy i3s are supposed to be priced at the same points as the Sandy i3s are right now, and the Sandy chips will get price cuts (some of which have already begun).
 
[citation][nom]Anonymous[/nom]I have the FX-8150; it replaced the 1100T. Both excellent CPUs. Sure, they don't bench like Intel, but they don't need to. I play BF3 at ultra with my 570 and can have as many apps running as I like without a problem. I don't really get all the hate; they are good, affordable CPUs, end of.[/citation]

The 1100T is a better gaming CPU than the FX-8150. The 8150 is only better for work that uses eight threads. The i5s and i7s are FAR better gaming processors, and the i3s actually meet or beat the FX CPUs in gaming, so there is a lot of very good reasoning behind disliking AMD's current CPUs. Guess what: the Phenom II architecture is several years old now, and the Bulldozer architecture is even slower clock for clock than Phenom II and Core 2 (two very similarly performing architectures). Bulldozer CPUs only creep ahead of their predecessors through sheer clock frequency, which can't go much higher than it already is due to power usage restrictions.

Sandy and Ivy have huge performance per clock and are good at overclocking (the i5s and i7s, that is), all while being very power efficient.

[citation][nom]kartu[/nom]Last time I checked, a motherboard plus an AMD A8-3850 costs less than Intel's SB i5-750. And for those who actually like the fact that the underdog has problems: think again about what you'll be buying from Intel if AMD is gone, and for what price. How old are you, kid? Haven't we been there in the '90s? Oh, and there goes overclocking. K-series CPUs not costing much more is just for a while.[/citation]

Last I checked, the i5-750 is NOT a Sandy Bridge processor. It is Nehalem-based, hence it sits on the LGA 1156 socket. Besides that, it is in a different segment of the desktop market than Llano (it's mid-range; Llano is low-end), and it's a few years old. The current i5s can be had for almost as little as an A8 anyway, so your argument is flawed no matter what angle you look at it from. The i5-750 is cheaper than an A8 plus motherboard.

Also, only Sandy needs K edition CPUs to get good overclocks. All of Intel's other architectures (including Ivy Bridge) support BCLK overclocking. K edition just means multiplier overclocking, and only Sandy CPUs need it because only they can't do large BCLK overclocks. All Sandy Bridge i3s, i5s, and i7s have a four-digit model number, e.g. i3-2100, i3-2105, i5-2400, i7-2600, sometimes with a single-letter suffix denoting low power (S, T) or an unlocked multiplier (K). Three-digit model numbers (with or without a letter suffix) are all LGA 1156/1366, Nehalem-based processors. Fail troll is fail.

[citation][nom]popatim[/nom]AMD's 6 core will beat the dual core Ivy cpu's ... I hope[/citation]

Six-core FX processors (FX-6xxx) keep up with the Sandy Bridge i5s in highly threaded performance, and the eight-core FX chips (FX-8xxx) are right behind the Sandy Bridge i7s. The quad-core FX chips (FX-4xxx) are just ahead of the Sandy Bridge i3s. Phenom II X4s are right above the FX-4xxx, and the Phenom II X6 sits between the FX-6xxx/i5s and the FX-8xxx.

i3 < FX-4xxx < Phenom II X4 < i5 < FX-6xxx < Phenom II X6 < FX-8xxx < i7

That's how the highly threaded performance goes.
 
[citation][nom]monsta[/nom]Last week AMD claimed victory by comparing the 7970 to the 580, a card that's over 18 months old and last gen... apples and oranges. Now the current-gen card is out with the 680... 'nuff said.[/citation]

AMD has been doing that ever since the 7970 came out, when there was nothing else to compare it to. I don't care if Nvidia's top card is five years old; if it is all there is for comparing the best of Nvidia against the best of AMD at the time, then it is what the new AMD card gets compared to. The same would be true if it were AMD that had an old card being compared to a new Nvidia card because AMD had no new cards at the time. Besides that, the 680 has its troubles. Overclocking performance on it and the 7970 is very similar, possibly with the 7970 creeping ahead of the 680 most of the time.

The 680 is a very highly clocked design. Remember, the 7970s can almost always take a huge overclock, but the 680 struggles with even fairly moderate overclocks. Sure, a lot of people don't want to overclock, don't do a whole lot of GPGPU compute or play games that use it (Dirt 3 and Civ 5 come to mind), and most games aren't limited by 2GB of VRAM until you go far over 1080p, but true enthusiasts want the best, and that usually means overclocking. Overclocked, the 680 shows significant weakness, especially in compute-heavy and memory-bandwidth-heavy games (the Crysis games come to mind here).

Does the 680 win? Only at stock. Do I give Nvidia credit regardless of that? Damn straight I do, also for pricing it at a better place than AMD did, so AMD will be forced to lower their ridiculous prices on the 7900 cards down to the levels we see with the Radeon 6000 series and the current prices for the GTX 500 cards (i.e. after the recent price cuts). Still, the 680 only manages about two thirds of the 7970's single-precision performance and less than one fifth of its double-precision power. It seems as if AMD and Nvidia have switched places for who has the more general-purpose graphics cards, but the difference is even more drastic than before. The GTX 680 is like a super-pumped GTX 460/560 replacement that was renamed (in fact, that is what it is). It has roughly similar DP compute performance to the GTX 460.

I find it odd that Nvidia decided to cripple their consumer cards to force Nvidia buyers toward the Tesla and Quadro families for decent DP performance, whereas AMD keeps decent DP performance on their consumer cards. In the past, Nvidia has been fairly compute-heavy, even on consumer cards (look at Fermi; it's very compute-heavy compared to VLIW5 and VLIW4). The problem with relinquishing this trait is that more and more games are supposed to be compute-heavy, and AA is memory-bandwidth-heavy too. The problem there is that the 680, despite being a big jump over the 580 in gaming performance, has the same memory bandwidth. The 680 shows less efficient AA performance (a bigger gap between AA-on and AA-off performance) than the Radeon 6000 and GTX 500 cards; instead of stepping back, AMD stepped forward here and has the most efficient AA performance of all with the 7900 cards. Nvidia took what could have been a truly great card and held it back; had they increased the memory interface width to 384 bits and kept the SP-to-DP ratio at the 8-to-1 it was with the 480 and 580, it would have been one. Those really are my only problems with it: stupid things that Nvidia did.

AMD also gave its 7900 cards very conservative clock frequencies and kept the same number of ROPs as the 6900 series despite greatly increased performance pretty much everywhere else, so it's not just Nvidia making stupid mistakes. AMD had to redesign parts of the memory interface because the previous designs for the ROPs and memory controllers didn't allow keeping 32 ROPs with a 384-bit bus. Had AMD simply gone up to 48 ROPs, this wouldn't have been a problem. However, that would also have increased die space (and power usage) considerably, so I guess it's somewhat understandable. Despite having too few ROPs, it still manages very balanced performance, so it seems to have been a decent compromise.

The conservative clock frequencies are fixed easily through overclocking. However, Nvidia's problems aren't so easily fixed. There's little to nothing we can do to change the SP-to-DP ratio, and there's even less we can do about the memory bandwidth problem, since the memory is already clocked as high as it will go (without decreasing performance instead of increasing it); it's already over 1500MHz. Given the choice between a 7900 card and the GTX 680, I'd go for the 7900 card, so long as the drivers stop limiting it to 1125MHz shader clocks and (at least most of) the other driver issues get fixed. Even at 1125MHz, the 7970 creeps up near the GTX 680 in the second part of the GTX 680 review. Considering that there is room to easily hit 1.3GHz on the 7970, I think it might meet or beat the GTX 680 if given a chance. However, the 7970 seems to be considerably less power efficient than the GTX 680. It seems that AMD and Nvidia have switched sides from their previous differences, except AMD still has more VRAM.

We currently see Nvidia being more power efficient, using smaller dies, and designing less for general-purpose compute. However, I'd go for the 7970 if it sees a price drop and I can overclock it to 1.3GHz or so. If it drops far enough below the GTX 680, the increased power usage of the 7970 might not be enough to offset its price advantage over the 680 through the electric bill. I can't deny that the 680 is a great card, but its horrible DP performance and mediocre SP performance (both compared to the 7900s) don't make me happy. Like I said earlier, more and more games may make greater use of GPGPU capabilities, and Kepler's intentionally crippled DP performance does not a happy, future-proofing customer make. If two 7970s or GTX 680s can run most, if not all, games at great frame rates even at 5760x1080, how long would it take for two of either card to become the limiting factor for 2560x1600 gaming? By that time, would these games be more compute-, AA-, and/or memory-capacity-heavy? If so, then the 7900 cards are obviously the way to go.
 
[citation][nom]cardoski[/nom]Wow, it is amazing how foolish all you Intel fanboys are. There is really no practical use for the fastest Intel chip unless you are making Star Wars or something. I mean, as I stated earlier, my BD runs fast. All games I play run on ultra in 3D with a 570. What is so hard to understand? Now, having said that, I am impressed with Intel tech, pretty awesome stuff. But very expensive. Right now I have my BD running at 4.6GHz with the H100; it idles at 19C and never goes above 46C. I will say again, it plays ALL GAMES I HAVE at ultra in 3D. How exactly does that suck? Yes, BD is not the best chip in the world, but I got mine for under 300 bucks and it has more than enough power. And if I could afford it, I would get the best chip possible just because it's cool. But since I enjoy traveling, and having sex while traveling, and Euro women are expensive, I keep my builds to a reasonable budget, which, by the way, PLAYS ALL GAMES AT ULTRA IN 3D NO PROBLEM.[/citation]

That Bulldozer at 4.6GHz is a rough equivalent to a stock i5-2500K in gaming performance, so of course it plays games very well. However, the i5 could do the same with its stock cooler at almost half the power usage. Sorry, but you're the fool here if you think that getting ANY FX processor plus an H100, just to match a stock-cooled, stock-clocked i5-2500K while using FAR more power, was a good deal. i5-2500K? $230 or so on Newegg. FX-8150 plus H100? Over $400 together if you bought them more than a few weeks ago, before AMD cut prices again. Then add the several dozen more dollars you spend powering it instead of the i5 (counting the difference, not the total, to remain fair to both) over a year or two of being on more than four hours a day. That's in US dollars.

Oh, but please, try to explain how your purchase was better. Sure, the FX may multitask better, but the i5-2500K would do that more than well enough, and this is really more of a gaming comparison than a productivity one. The i5 can then get a Cooler Master Hyper 212 Plus or Evo and be overclocked to about 4.4-4.6GHz, where it is still far more efficient than the FX and multitasks about as well, unless you are doing some huge work in the background while you game (I know I don't do anything like encoding while I'm gaming, and if I did, I would prefer an i7 to that FX+H100 because the i7 would be faster, more power efficient, AND a LOT cheaper up front too). Does this help you understand why the BD chips suck? The only one worth buying whatsoever is the FX-4100, and even then only if it gets a price cut when Ivy surfaces, or the Ivy i3s will beat it even more soundly than the Sandy i3s do.

If I want decent performance from AMD, I'll search around for a Phenom II, but the difference between Phenom II and Ivy is similar to the difference between Sandy and FX, so it's obviously getting obsolete too. I've said time and time again that AMD needs to pull a rabbit out of the hat with Piledriver. I know it can be done, because many of Bulldozer's greatest problems aren't huge fixes (well, not huge in this context), but AMD has been a little all over the place lately. I'm not some Intel fanboy and I really want AMD to succeed, but I do recognize that Intel has a massive lead right now. Phenom II versus Nehalem really isn't a bad gap for AMD, but FX versus Ivy Bridge is horrible, with Ivy reaching for more than 60% more clock-for-clock (IPC) performance than the Bulldozer FX CPUs.
 
[citation][nom]belardo[/nom]Title is wrong... Oh well. I'm an AMDer, in general... even selling some AMD systems last year. But when Bulldozer came out... I went Intel. I waited, and waited... and was disappointed. First of all, Sandy Bridge is about 5~50% faster than anything AMD has. Ivy Bridge is about 10% faster than Sandy Bridge. AMD requires almost twice the power of Intel... more heat. By all means, Llano is great in its class, but it is becoming over-priced. AMD is NOT #1. They haven't been since Core 2 hit the market... and Intel screwed AMD with illegal business practices, which did hurt AMD's R&D abilities. AMD is not competing well against Intel, much less themselves. When their "8 core" CPU has trouble surpassing their own 4-6 core CPUs from a year ago... it's a PROBLEM. AMD did the most retarded thing: they made their own version of the Pentium 4/NetBurst... and seeing so many people LEAVE AMD shows the screw-up. We will someday find out WHO came up with the crappy design. The FX chips are marketed as 4/6/8 cores... but their design is Hyper-Threaded (NetBurst), so they are really nothing more than 2/3/4 core CPUs. They should be sold as such. The FX-8150 should be $150 at the most... as it's still (STILL) slower than the i5-2400, which sells at about $140~150. The FX-8000 series should be priced well below $200... not $260. Someone said that the older X6 AMD 960T was $200 cheaper than Intel...? HUH? I can walk into a store in Dallas and pick up the i5-2500K and a motherboard for $240; the 960T with a typical board is $200~220. Also, the 960T is pretty much the first X6 from AMD... the i5-2400 will bitch-slap it any day... and still run much, much cooler.[/citation]

Sandy is only ~50% faster clock for clock than FX, and closer to 40% faster clock for clock than Phenom II. Ivy will be more like 50-60% faster clock for clock than Phenom II, but Intel isn't THAT far ahead of Phenom II just yet. Llano is only good if you compare it against other systems that lack a graphics card. The A8-3870K is something like $130 or $140, and a Pentium G620 plus a Radeon 6670 is about $130. Guess what? The Pentium system has about 50% higher frame rates in most games. Even AMD-CPU-plus-discrete-card solutions are better, because you can get an Athlon and a Radeon instead of a Llano and save money. Llano is like a $70 CPU plus a $40 graphics card sold for a higher price than the two halves are worth together. Performance-wise, it's basically a low-clocked Athlon II plus a Radeon 5550 (A8s only; all A6s and A4s are even slower). The 6670 is far faster, and a standalone Athlon II has higher clock frequencies for the money, thus higher performance, or similar performance for less money.
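
A quick frames-per-dollar sketch using the rough numbers above (both the prices and the 50% frame-rate gap are this post's approximations, not benchmark data):

[code]
# Value comparison with the approximate figures quoted above; nothing here is measured.
systems = [
    ("A8-3870K APU", 135.0, 1.0),     # ~$130-140, baseline frame rate
    ("G620 + HD 6670", 130.0, 1.5),   # ~$130, ~50% higher frame rates claimed
]
for name, price, rel_fps in systems:
    print(f"{name}: ${price:.0f}, relative fps {rel_fps:.2f}, fps/$ {rel_fps / price:.4f}")
# With these assumptions the discrete pairing gives roughly 55% more frame rate per dollar.
[/code]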

Intel has no quad-core CPUs this low end except for the cheapest Core 2 Quads from several years ago and the Nehalem i3s (dual cores with Hyper-Threading, but similar quad-threaded performance nonetheless), so technically, AMD wins here on performance, but that's just because Intel is neglecting the sub-$100 CPU market. The cheapest Intel CPUs truly worth buying are the i3s. The only low-end CPU really worth buying (and really, the only one anyway) is the Celeron G530, at $42 or so for a dual-core, 2.4GHz part that beats out the Semprons in the same price range. Besides it, AMD pretty much wins until FX and Phenom II run into the i3s.

Bulldozer actually isn't a bad architecture. Most of its problems are its memory/cache interface's huge latencies and its design methods (BD is a purely computer-designed chip, whereas all other CPUs are partially or purely designed by hand). Fixing those two problems alone should bring FX up to Nehalem-like performance per clock at between Nehalem and Sandy power efficiency. Considering FX's huge clock frequency potential, it could fight it out with Sandy, at least in performance, with a much improved IPC and slightly higher clock frequencies. It wouldn't yet match Sandy's efficiency, but if priced right, it would actually be a massive step in the right direction.

If AMD makes some architectural fixes too, then it may be able to compete with Ivy. Basically, Piledriver and its successors have great potential, so long as AMD doesn't have more serious screw-ups. I'm worried about that because it's something AMD's CPU teams have managed to do consistently over the past few generations: of the last five or six years, AMD has had three architectures, and two of the three had significant problems. I think that FX is having new-architecture growing pains rather than being a complete flop like NetBurst was. Look at Phenom: it gave way to Phenom II, a design good enough that we still see it in common use. Phenom II is like a more power-efficient Core 2: similar IPC, but more efficient.

For highly threaded work, the FX-8150 runs circles around all the i5s. In gaming it gets run around by the i5s and sometimes even the i3s, but not in highly threaded productivity. Hyper-Threading is NOT tied to NetBurst (most NetBurst processors did NOT have it); some NetBurst processors were simply the first to have it. Also, no, Bulldozer is not like Hyper-Threading. Hyper-Threading takes a single, fast core and uses its leftover resources for a second thread that runs on the core when the main thread is not using those resources. Bulldozer has two cores per module that share some resources, but not the ALUs and such that are shared in Hyper-Threading. The two technologies are actually very different. Look at the results, too. The Sandy Bridge i3s have two very fast cores, and Hyper-Threading usually gets you another ~30% of performance by using otherwise idle resources. Bulldozer has two slow cores per module.

Here is some oversimplified math that helps explain it. Say we have an i3 at 3.3GHz (I think the 2120 runs at 3.3GHz) and the FX-4100 at 3.6GHz with its turbo disabled (i3s don't have turbo anyway). The i3 has two 3.3GHz cores with about 50% more IPC than the FX, and Hyper-Threading improves highly threaded performance by about 30%: 3.3 * 2 (two cores) * 1.3 (Hyper-Threading) * 1.5 (IPC difference between Sandy and Bulldozer) = ~12.87.

The FX is 3.6 * 4 = ~14.4. Oversimplified, but it shows the difference in performance quite clearly: the FX-4100 outperforms the i3 in theoretical highly threaded performance by almost 12%. In reality, the difference is fairly close to that, though often a little less.
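
That back-of-the-envelope model is easy to play with in code; the inputs are just the assumptions stated above (the clocks, a ~30% Hyper-Threading gain, and a ~50% IPC edge for Sandy), not measured results:

[code]
# Toy throughput model: clock (GHz) x cores x SMT factor x relative IPC.
# All factors are the assumptions from the paragraph above, not benchmarks.
def throughput(clock_ghz, cores, smt_factor, relative_ipc):
    return clock_ghz * cores * smt_factor * relative_ipc

i3_2120 = throughput(3.3, 2, 1.3, 1.5)  # two fast cores, ~30% from Hyper-Threading, ~50% IPC edge
fx_4100 = throughput(3.6, 4, 1.0, 1.0)  # four slower full cores, no SMT, baseline IPC

print(f"i3-2120 ~{i3_2120:.2f}, FX-4100 ~{fx_4100:.2f}")
print(f"FX-4100 theoretical lead: {(fx_4100 / i3_2120 - 1) * 100:.1f}%")  # ~11.9%
[/code]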

Here we see that Hyper-Threading increases performance significantly, but only by about 30% (a figure cited by many sources across the internet; look it up if you want proof), while the four cores of the FX scale much more linearly because they are full cores. Basically, Hyper-Threading duplicates some resources, whilst Bulldozer duplicates cores. They are closer to polar opposites than to being similar.

One thread on the i3 runs at more or less full speed on a core, and so does a second. The same is true for the FX, so it has a huge disadvantage here: core against core, the i3's cores are FAR faster. However, Hyper-Threading and additional cores beyond the first two have somewhat different effects on performance. The third and fourth threads on the i3 only run at around 30% of the performance of the first two, because the first two get most of the CPU's per-core resources. The FX, however, nearly doubles in performance, because it has four full cores to use instead of two cores each shared by two threads. The FX scales much better, but the i3 has much higher single- and dual-threaded performance while having similar quad-threaded performance. With four threads, performance is similar; with a mere two threads, the performance difference is huge.
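
Extending the same toy model across one to four threads makes that scaling picture clearer (same assumed clocks, IPC gap, and ~30% Hyper-Threading gain as before):

[code]
# Toy scaling model for 1-4 threads, using the same assumptions as the sketch above:
# each i3 thread beyond the second adds only ~30% of a core via Hyper-Threading,
# while every FX thread gets a full (slower) core.
i3_core = 3.3 * 1.5   # per-core clock x assumed relative IPC
fx_core = 3.6 * 1.0

for threads in range(1, 5):
    i3 = i3_core * (min(threads, 2) + 0.3 * max(threads - 2, 0))
    fx = fx_core * threads
    print(f"{threads} thread(s): i3 ~{i3:5.2f}, FX ~{fx:5.2f}")
# 1-2 threads: the i3 leads big; at 4 threads the FX pulls slightly ahead, as argued above.
[/code]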

Obviously, these are not similar technologies. Hyper-Threading duplicates certain per-core resources so that two threads share a single core; Bulldozer's modules share non-core resources between two more or less independent cores. Intel's solution favors lightly threaded performance, whilst AMD's favors highly threaded performance. Honestly, I prefer Intel's, but AMD's obviously works too.

The problem with FX's high-core-count CPUs is that their performance scales very well for highly threaded work, but hardly at all for lightly threaded work. The FX-4100 is almost identical to the FX-8150 for gaming, but the 8150 is almost twice as fast for highly threaded work. So, if AMD prices the 8150 at $150 and the 4100 at $110, then for roughly 36% more money you get nearly double the highly threaded performance. Sure, consumers would love that, but it leaves AMD selling CPUs hugely undervalued for their highly threaded performance just because their lightly threaded performance is negligibly higher than that of much cheaper models with far fewer cores. For the money, the FX CPUs already have more highly threaded performance than Intel. We have the 8120 under $200 despite it being right behind a $300 i7 in highly threaded performance, with the 8150 somewhat more expensive.

The 8150 is a poor choice for anyone who overclocks, because despite its higher price and clock frequency than the 8120, it isn't really binned any higher: the two overclock to the same frequencies at the same voltages, using the same amount of power. The 8150 is only for people unwilling to touch BIOS settings for a minor performance boost. Intel actually bins their chips for performance, whereas AMD's binning is more about finding defects than performance at a given voltage. For example, the i5-2500K uses less power than the i5-2400 if you put them both at the minimum stable voltage for the same clock frequency, because it has a slightly higher bin. The i7-2600K also overclocks slightly better than the i5-2500K, and the 2700K a little better than the 2600K. Phenom IIs, Athlon IIs, Semprons, and FXs (if not going even further back) tend to overclock just as well as their higher-clocked brothers.

The FX CPUs only have trouble beating Phenom IIs in situations where clock-for-clock performance matters. For example, the FX-4xxx CPUs are beaten by the Phenom II X4s, but the Phenom II X6s are beaten by the FX-8xxx CPUs in highly threaded performance. Windows 8 is supposed to give FX a decent performance boost too, so let's watch the Windows 8 performance numbers closely to get a fuller picture to analyze.
 