AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic; anything else will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
I remember someone posting an article a while back, the last time this came up, that actually showed AMD closed the fab before the alleged Intel dealings happened; I think it was because their new Dresden fab was coming online at the time. I wasn't defending AMD at all, but rather pointing out the one thing some people miss, even to this day: Intel, no matter what, has more production capacity than AMD, simply by having more fabs.

Even if AMD was selling every last CPU, when they cannot meet demand, companies will go elsewhere to meet it.

Then add in Intel's marketing and greater name recognition, and that means they will still sell even more.

I think if AMD hadn't bought ATI right after Core 2 hit, and if Barcelona had been a success, it could have been a much different story. I think AMD would have profited and had the funds to build more fabs and put more into R&D.

But that didn't happen. They overpaid for ATI, Barcelona was not successful, especially with the recall on server parts, and so AMD started taking losses, which hit them hard as all get out. If the same events had happened and Intel hadn't "bribed" the OEMs, AMD still would have lost market share and still would have had losses.

Still, this is what I can say: Intel and AMD have gotten over it; everyone else can too.
Might want to check some facts then. AMD announced the fab closures at the exact same time that Gateway publicly announced they would not sell AMD CPUs, the same time that other companies started doing the same thing. This was the era of the Athlon, just about to release the Athlon XP to wipe the floor with Intel in performance, and no one will sell your CPUs?

Just a reminder of CPU status back then: http://www.xbitlabs.com/articles/cpu/display/athlon-xp-1800.html

And you're not going to sell it? At this point in time, 2001, AMD had no clue why they were losing sales. They didn't find out till around 2005.

http://www.theregister.co.uk/2001/09/25/amd_shuts_fabs_fires/

AMD was absolutely NOT fab constrained. Think about this from a business standpoint instead of saying "oh, that's an old fab, doesn't count." You have two choices: sell, or upgrade.

Choice 1: Currently you can't sell the CPUs you make. Does it make sense to put $1B into upgrading for future sales (1-3 years is a rough timeframe for an overhaul) that you don't know are coming?

Choice 2: Currently you can't sell the CPUs you make, so you save some money and, unfortunately, fire people, then close and sell the plant.

This is what you do when you're making money:

http://www.eetimes.com/electronics-news/4084184/Samsung-to-upgrade-U-S-fab-cuts-500-jobs

And again, as soon as that upgrade was done ... http://www.affinityproperties.com/wordpress/2012/01/18/samsung-could-spend-1b-to-boost-austin-complex/

you upgrade. Yeah, some people lost their jobs, but when the fab re-opened, more jobs were needed. Kind of hard to keep people working without machines for 2-3 years.

Say it all you want that it's in the past; it doesn't change the fact that the damage done is permanent and cannot in any way, shape, or form be undone. Intel won; that's why AMD is "no longer competing with Intel."

Intel got off easy paying AMD $1.25B. This doesn't help AMD recover the time lost trying to figure out why they couldn't sell anything rather than making sure the next great CPU was in the R&D shop. IMO justice would have been to allow AMD access to all Intel research for 1 year and to give AMD an entire fab plant.
 
Nice try there JS, really slick.

I am sure if the A8 won you wouldn't be using it as an example for this.

And to that I reply,

What conclusions can be drawn from this data, then? Clearly, the A8-3870K is a better platform for general productivity, particularly when you run threaded applications (or do a lot of multi-tasking) able to leverage four physical cores. The Pentium G620 and discrete Radeon card combine to form a superior gaming system.

Intel simply doesn’t have anything in the same price range able to match the A8-3870K’s blend of graphics performance, capacity for threaded apps, and overclocking headroom.

There’s no denying the strength or sensibility of an APU in a compact form factor where value and complexity are closely related. But we're talking about full-blown desktops here. If you're building a general-use desktop PC, the A8-3870K works really well as a low-cost jack-of-all-trades. It does everything fairly well right out of the box, but there's not much of an upgrade path.

Looks like AMD did "win" overall, if you even want to call it a contest. The Intel was better at "gaming" because it had a dGPU. Put in a cheaper APU along with a compatible dGPU (ACF) and suddenly the AMD chip wins at "gaming". Determining which was the better "gaming" chip wasn't the point of the article, though; it was to evaluate each one's strengths and weaknesses.

Several times, in many posts, I've stated that APUs on desktops don't really make sense unless you're going ultra low end, $500 or less. In which case you wouldn't be using the "top of the line" APU but one that was ~$100 or less. At that budget the Pentium + dGPU loses its entire appeal, as there is no longer room to get a dGPU. The APU trounced the Pentium in anything that used three or more cores; games rarely need more than two, hence the Pentium provided just enough power. In that comparison half of the AMD's cores are just sitting idle, thus I recommended the tri-core as a compromise. You still get superior performance but at a lower cost that allows for a dGPU to be acquired, which would put the A6 + dGPU on par with or better than the G620 + dGPU.

Of course none of this matters to you; you'll never say anything nice or positive about AMD.
 
Hold on a minute with that. My team and I built a program, and we designed it as many processes which communicated by way of a well-defined IPC mechanism. It worked quite well. Not a high-performance system like a game, but it worked quite well and was very easy to build upon. I recall counting 35 or so processes. No one actually wrote a startup script, so I wrote one in Perl and everyone began to use it. Now, this is an extreme case, but we had our reasons for building it that way. My point is that software can be built this way. I am aware that you pay for such a design with complexities in synchronization, and this takes some time to work out, but I believe in it. I was at one point concerned about performance, but we learned that the OS handled it quite well.
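
(Purely for illustration, not the actual system described above: a minimal sketch of that style of design, with one process spawning a worker and talking to it over a pipe, the simplest well-defined IPC mechanism on a Unix box. All names here are made up.)

#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    int fd[2];                      /* fd[0] = read end, fd[1] = write end */
    if (pipe(fd) == -1)
        return 1;

    if (fork() == 0) {              /* child: a single worker process */
        close(fd[0]);
        const char *msg = "result from worker\n";   /* hypothetical payload */
        write(fd[1], msg, strlen(msg));
        close(fd[1]);
        _exit(0);
    }

    close(fd[1]);                   /* parent: block until the worker reports */
    char buf[64];
    ssize_t n = read(fd[0], buf, sizeof buf - 1);
    if (n > 0) {
        buf[n] = '\0';
        fputs(buf, stdout);
    }
    close(fd[0]);
    return 0;
}

A real system like the 35-process one above would spawn many workers and use a richer protocol, but the principle is the same: each process does one job and the kernel handles the plumbing.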

I understand that some processes (in the general sense of the word) cannot be broken down this way; however, in general I very much believe that if you sit down and agree to do it this way, you will be successful. While this does not entirely deal with the issue of making code execute faster, it should be part of the solution, especially today. You are too pessimistic about this.

Synchronization = Performance Bottleneck. You may be using many, many processors, but that doesn't mean the code is operating in parallel.

Heck, I recently worked on what the designer called a "well threaded" piece of code; "well threaded" because both cores shot to 100% during execution. The reason this happened, though, was because the designer did something akin to:

/* poll loop: keeps re-checking the flag until the task finishes */
while (x.status != complete)
    sleep(10);

Hence why I tend to be very skeptical of core-loading statistics from Task Manager when it comes to defining what is and is not a "well threaded" program.

[BTW: Messages are the proper way to handle the above situation, so you don't thrash, constantly polling for status. With the exception of sleep(0) (which executes any outstanding APCs, which can be useful), using sleep in a program is NEVER correct.]
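
(To make that concrete, here is a minimal sketch of the blocking alternative using a POSIX condition variable; the struct and function names are made up for illustration.)

#include <pthread.h>

struct task {
    int complete;                   /* 0 = pending, 1 = done */
    pthread_mutex_t lock;
    pthread_cond_t done;
};

/* waiter: sleeps in the kernel until signalled -- no polling, no thrash */
void wait_for_completion(struct task *x)
{
    pthread_mutex_lock(&x->lock);
    while (!x->complete)            /* loop guards against spurious wakeups */
        pthread_cond_wait(&x->done, &x->lock);
    pthread_mutex_unlock(&x->lock);
}

/* worker: sets the flag and wakes the waiter exactly once */
void mark_complete(struct task *x)
{
    pthread_mutex_lock(&x->lock);
    x->complete = 1;
    pthread_cond_signal(&x->done);
    pthread_mutex_unlock(&x->lock);
}

The waiting thread consumes no CPU until the signal arrives, which is exactly the kind of thing a core-loading graph in Task Manager won't show you.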

I think the biggest program I ever worked on, from a thread standpoint, was about 50 or so threads. While each specific task in the program was relatively easy to spread across multiple cores, each task had to be done in sequence. As a result, even though each task was very parallel in nature, because the overall flow was serial, we only gained about 15% performance using a second CPU core, with diminishing returns after that. That's typical for most programs: you can thread a specific part of the code, but the overall flow is serial. That's your performance bottleneck.
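
(That ~15% is textbook Amdahl's law, by the way: speedup = 1 / ((1 - p) + p/n), where p is the parallel fraction of the runtime and n is the core count. Working backwards from the numbers above, as a rough estimate of mine: 1.15 = 1 / ((1 - p) + p/2) gives p ≈ 0.26, so only about a quarter of the runtime was actually parallel, and no number of extra cores can push the speedup past 1 / (1 - 0.26) ≈ 1.35x. Hence the diminishing returns.)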
 
Regarding the Intel monopoly thing...

You guys have it wrong, and probably most of the people "defending" AMD don't realize a simple fact: AMD would do the exact same thing given the opportunity. Companies are ALL about profit and the bottom line. That's one thing; the other is that Intel did wrong and screwed with my hard-earned money. Don't you guys forget that: when Intel crippled AMD's chances to sell, they messed with your HARD-EARNED MONEY, period. That's not something to forgive, guys; you have to knock on Intel's doorstep and slap them hard. I mean, yeah, what's USD$40 more for a CPU, right? Just like a couple of gallons of fuel, a bunch of food, etc... Don't forget, and don't forgive that easily.

Cheers!
 
lol
We need both companies (or new players) to keep this game of new, affordable, wide-ranging competition running forever.
This will favor the consumer.
:lol:
If only one company is left, then all we'll see is the AMD/Intel hater boys and girls.

Unfortunately Intel has tried their hardest to make this a one-player market. They own the patent and licensing rights to the x86 instruction set; thus any program compiled for x86 will only run on a CPU they designed or a CPU they allowed someone else to design. The only reason AMD even has a license is that IBM forced Intel to grant it back in the 80's. IBM required at least two manufacturers for every component of the IBM PC. This was done to protect IBM from any single manufacturer shorting them on parts or price gouging in case IBM's PC became successful. What they didn't expect was people figuring out that if there are multiple manufacturers of each component, then non-IBM companies could put those components together, which gave rise to the "IBM clone". Ever since IBM + DOS + Windows became popular, Intel has done everything in its power to be the sole producer of x86 CPUs and thus control the entire IBM PC clone market, which now accounts for 90~99% of personal computers in the world.

A short synopsis of the legal action between those two.

http://news.cnet.com/Intel-and-AMD-A-long-history-in-court/2100-1014_3-5767146.html
 
What you constantly fail to get, Keith, is that just because you are crazy in love with AMD, it doesn't mean that people who currently have an Intel-based system are just as in love with Intel.

Most people who are active on computer forums and have Intel systems are simply crazy in love with computing, and will take whichever company's products work best for them.

They are not on some immature crusade of sticking to one company no matter what.

So people who are crazy in love with computing, will comment all over the place on computing products, not just stick to the makers of their currently owned product.

It also means that when a diehard of one company is making outlandish statements that have little to no basis in the real world, they will face opposition from people who find their crazy claims to be an affront to logic and reasoned debate.


What you fail to constantly get is that I actually pity you and others with closed minds, who refuse to think for themselves and also refuse to grow up and stop insulting other people just because they will not bow to your perceived illusion of superior intelligence. Someday you might realize that people who will not blindly accept your opinions as truth are not all stupid. (Although based on your post history, it is doubtful you have that ability.)

Just because someone will not accept your diatribe does not mean that anything they say automatically has no basis in the real world. (Such as in my case... nothing I posted has been incorrect or illogical, even though you seem to have a personal and perhaps psychological need to pretend it is so. But I understand that you can't debate against reality, so you haven't bothered trying to dispute anything I previously posted, since you know it to be true.)
 
Nice try there JS, really slick.

I am sure if the A8 won you wouldn't be using it as an example for this.

And to that I reply,

What conclusions can be drawn from this data, then? Clearly, the A8-3870K is a better platform for general productivity, particularly when you run threaded applications (or do a lot of multi-tasking) able to leverage four physical cores. The Pentium G620 and discrete Radeon card combine to form a superior gaming system.

Intel simply doesn’t have anything in the same price range able to match the A8-3870K’s blend of graphics performance, capacity for threaded apps, and overclocking headroom.

There’s no denying the strength or sensibility of an APU in a compact form factor where value and complexity are closely related. But we're talking about full-blown desktops here. If you're building a general-use desktop PC, the A8-3870K works really well as a low-cost jack-of-all-trades. It does everything fairly well right out of the box, but there's not much of an upgrade path.

Looks like AMD did "win" overall, if you even want to call it a contest. The Intel was better at "gaming" because it had a dGPU. Put in a cheaper APU along with a compatible dGPU (ACF) and suddenly the AMD chip wins at "gaming". Determining which was the better "gaming" chip wasn't the point of the article, though; it was to evaluate each one's strengths and weaknesses.

Several times, in many posts, I've stated that APUs on desktops don't really make sense unless you're going ultra low end, $500 or less. In which case you wouldn't be using the "top of the line" APU but one that was ~$100 or less. At that budget the Pentium + dGPU loses its entire appeal, as there is no longer room to get a dGPU. The APU trounced the Pentium in anything that used three or more cores; games rarely need more than two, hence the Pentium provided just enough power. In that comparison half of the AMD's cores are just sitting idle, thus I recommended the tri-core as a compromise. You still get superior performance but at a lower cost that allows for a dGPU to be acquired, which would put the A6 + dGPU on par with or better than the G620 + dGPU.

Of course none of this matters to you; you'll never say anything nice or positive about AMD.

I like how you try to put words in my mouth. Guess my set of GPUs from 2003 to current. Can't remember, as I have said it time and again? 9700 Pro -> 9800XT -> X850XT -> HD2900 Pro -> HD4870 -> HD5870, and I plan at some point to go to an HD7970 even if Kepler is better. Why? Because I like AMD's (ATI's) GPUs better, and I like CCC better than nView. Even more interesting is the fact that I have bought multiple AMD products in the past year and no Intel. I plan on an APU for my next HTPC, as it makes sense there, not for gaming, just for video playback. I'm also planning to grab a Phenom II X4 for my wife's machine before they become unobtainable, as I feel she needs a quad core.

I have said that the APUs are great on the IGP side; their CPUs just are not so great on a per-core and per-clock basis vs Intel. That's just fact, not me hating AMD.

And the point of the article was to compare them in the same price range, something people tend to do. If someone is gaming, why would they go with an APU like the 3870K when a G620 and HD6670 is the same price but better? If they want overall productivity, then I can see going with the APU, even though it still uses more power than the Intel + GPU alternative.

And noob2222, if AMD has moved on, so can everyone else. Besides, all that money wasted on lawyers can now be focused on R&D and development instead.
 
What you fail to constantly get is that I actually pity you and others with closed minds, who refuse to think for themselves and also refuse to grow up and stop insulting other people just because they will not bow to your perceived illusion of superior intelligence. Someday you might realize that people who will not blindly accept your opinions as truth are not all stupid. (Although based on your post history, it is doubtful you have that ability.)

Just because someone will not accept your diatribe does not mean that anything they say automatically has no basis in the real world. (Such as in my case... nothing I posted has been incorrect or illogical, even though you seem to have a personal and perhaps psychological need to pretend it is so. But I understand that you can't debate against reality, so you haven't bothered trying to dispute anything I previously posted, since you know it to be true.)
Keith,
In a previous discussion many months ago, before Thuban was released, you questioned how I could be sure that AMD would have a Turbo function in their CPUs.

I responded I knew they would have it because AMD's designers are not stupid.

You stated that you weren't so sure that AMD would have Turbo, and hoped that they didn't, because Turbo was a kind of dishonest technology or some other rot, and that if AMD did implement Turbo, you would go into your BIOS and disable it. :lol:

What kind of reality is that?

Keep in mind, I am free to use CPUs from any x86 maker; you have limited yourself to a very different situation.

It is not hard to work out who should be pitied here.
 
The new AMD server direction is...

http://semiaccurate.com/2012/02/29/amd-buys-seamicro-and-what-a-buy-it-is/


Anyone else seeing this as bad news? AMD had a pretty good relationship with Cray, and they almost killed Cray's profits last year by being so late with Bulldozer. The server market is highly competitive.

For a $5B company, they sure are splintering off into a lot of potential markets.
 
Keith,
In a previous discussion many months ago, before Thuban was released, you questioned how I could be sure that AMD would have a Turbo function in their CPUs.

I responded I knew they would have it because AMD's designers are not stupid.

You stated that you weren't so sure that AMD would have Turbo, and hoped that they didn't, because Turbo was a kind of dishonest technology or some other rot, and that if AMD did implement Turbo, you would go into your BIOS and disable it. :lol:

What kind of reality is that?

Keep in mind, I am free to use CPUs from any x86 maker; you have limited yourself to a very different situation.

It is not hard to work out who should be pitied here.

BZZZZT, WRONG... I have not limited myself in any way. I am free to buy any brand I wish. I'm just not stupid enough to limit myself by blindly accepting the opinion of the majority, and am free to actually have my own opinion based on facts and research. But you seem to think that people who don't accept the opinion of the majority are clueless, which only shows that you are blind to reality.

As for the rest of your rant... it is interesting how you have quoted me out of context. Let me restate the truth: using a Turbo feature while quoting the base frequency on graphs is completely dishonest in a review that is supposed to be presenting "clock per clock" comparisons. Only someone naive would think it is acceptable to pretend the base clock frequency is what the CPU is running at in a single-threaded game or application if the Turbo feature is not specifically disabled. And it is completely dishonest for Intel not to provide the ability to see what the running frequency of the chip is while it is in a turbo mode. Why are these facts so hard for you to understand? Other than when benchmarking "out of the box" experiences, the turbo features of any brand must be disabled when doing comparative reviews, or the review results don't mean anything, UNLESS the exact and true running frequency is provided for all chips.
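
(On Linux, at least, you can watch the reported per-core clock yourself while a benchmark runs; a minimal sketch of mine that just dumps the "cpu MHz" lines from /proc/cpuinfo:)

#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[256];
    FILE *f = fopen("/proc/cpuinfo", "r");
    if (!f)
        return 1;
    /* /proc/cpuinfo reports one "cpu MHz" line per logical core; run this
       in a loop alongside a single-threaded load to catch turbo kicking in */
    while (fgets(line, sizeof line, f))
        if (strncmp(line, "cpu MHz", 7) == 0)
            fputs(line, stdout);
    fclose(f);
    return 0;
}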

As for the discussion of what AMD's engineers should or should not add... I vehemently oppose hyperthreading and still think it is one of the worst ways to accomplish that goal. It is also basically worthless if a program is efficiently threaded anyway. I also don't like the current implementation of shared resources AMD has introduced, but at least it is real hardware and a much more elegant solution. When they tweak it and get rid of the minor problems, it might actually be acceptable. But hyperthreading won't be. Ever.
 
BZZZZT, WRONG... I have not limited myself in any way. I am free to buy any brand I wish. I'm just not stupid enough to limit myself by blindly accepting the opinion of the majority, and am free to actually have my own opinion based on facts and research. But you seem to think that people who don't accept the opinion of the majority are clueless, which only shows that you are blind to reality.
This claim of yours that it is about the opinion of the majority is a red herring on your part, as no one is saying that.

Facts and research are a great thing, but zealots have a funny way of creating their own "facts", when the real facts don't suit them.

As for the rest of your rant... it is interesting how you have quoted me out of context. Let me restate the truth: using a Turbo feature while quoting the base frequency on graphs is completely dishonest in a review that is supposed to be presenting "clock per clock" comparisons. Only someone naive would think it is acceptable to pretend the base clock frequency is what the CPU is running at in a single-threaded game or application if the Turbo feature is not specifically disabled. And it is completely dishonest for Intel not to provide the ability to see what the running frequency of the chip is while it is in a turbo mode. Why are these facts so hard for you to understand? Other than when benchmarking "out of the box" experiences, the turbo features of any brand must be disabled when doing comparative reviews, or the review results don't mean anything, UNLESS the exact and true running frequency is provided for all chips.
If I had quoted you out of context, you wouldn't have been asking me how I knew AMD would bring in a Turbo scheme, and you wouldn't have said you would turn Turbo off in the BIOS on your future AMD system.

Now I know why you don't want to revisit these crazy statements you made, but there is nothing out of context in my bringing up what you said.

As for the discussion of what AMD's engineers should or should not add... I vehemently oppose hyperthreading and still think it is one of the worst ways to accomplish that goal. It is also basically worthless if a program is efficiently threaded anyway. I also don't like the current implementation of shared resources AMD has introduced, but at least it is real hardware and a much more elegant solution. When they tweak it and get rid of the minor problems, it might actually be acceptable. But hyperthreading won't be. Ever.
What kind of dreamworld do you live in where software is always "efficiently threaded anyway"?

When is this software utopia going to arrive? Hardware makers should not be ignoring the real world situation.

It is frequently stated that hyperthreading only costs 5% of die space and on average returns far more than that on non-gaming software; that makes hyperthreading acceptable right now and makes a complete mockery of your claim that hyperthreading will never be acceptable.

Does anyone really doubt that your objection to hyperthreading is simply because Intel has it and AMD lacks it?

That certainly seemed to be the only basis for your objection to Turbo at the time, and why you were uncertain that AMD would be implementing it in the future.
 
If IPC means more than cores, whatever the scenario, then why doesn't Intel make a 1-billion-transistor single core and just improve IPC on that one core every generation?

So yes, cores are important. I think that anyone who says software will not scale past 4 cores is crazy. It will take time, but we cannot sit on 4 cores forever. People probably thought the same thing when they went to dual cores. Contrary to what was probably marketed ("Two CPU cores is like having two CPUs!"), people who knew a lot about CPUs probably thought the idea was ridiculous.

I think so, anyway; I can't say for sure, since I was pretty young back then and didn't know what a CPU was. The only reason I say this is because of what people say now, which is that going over 4 cores is not going to happen, as it is not necessary.

Of course, what happens after we move on from transistors is up in the air, but software will scale past 4 cores. In fact, it already does.

That's not to say BD was a good product that software just isn't ready for. No, BD is bad, and making up excuses for why it is bad is just a waste of time. It has been said a hundred times before, but it is true: AMD should have made a product that would be good now, not when software allows it to be.

I speak to those who believe that BD will always be bad because software will not go beyond 4 cores.
 
A8-3870K fully OC'd + HD 6xxx Hybrid CF
vs.
equally priced 2x HD 6xxx + SB

I want to know which one will perform better in a GPU-heavy game (CF enabled), in a CPU-heavy game (heavily threaded), and in benchmarks.

I think the A8-3870K is for budget overclockers, as they can get the max out of it, while to OC an SB setup you need an expensive CPU + board.

(Also, I think that if AMD can lower the price of the 3870K to $100-120, then it will beat any Intel Pentium G + dGPU setup at that price.)
 
ctbaars wrote : I think you posted the wrong link. It doesn't say, ""
Your avatar catch phrase is funny and oddly appropriate for the discussion...except your avatar picture is a goat, not a ram :na:
:lol: Ewe, I stand corrected! I guess that means I don't have kids ... :sol: The avatar, (wave hand), I did not alter; otherwise it be a wether. And, no, it doesn't explain the "no kids" ... :non:
 
And noob2222, if AMD has moved on, so can everyone else. Besides, all that money wasted on lawyers can now be focused on R&D and development instead.
So we should pretend that what Intel did didn't happen and give them a big hug? :ange:

Time can never be recouped. Sure, they can focus on R&D now, but they are still on a delayed timetable from where they would have been had Intel not broken the law. AMD can't go back in time and undo their fab spinoff.

The truth of the matter is that in the cross-license agreement that AMD made public, they pay royalties to Intel on every CPU sold, and their reward is to get stabbed in the back for figuring out how to make a better CPU than Intel.

I will never forget that this is how Intel does business. What's to stop them from doing it again?
 
This claim of yours that it is about the opinion of the majority is a red herring on your part, as no one is saying that.

Facts and research are a great thing, but zealots have a funny way of creating their own "facts", when the real facts don't suit them.

Here is what you refuse to see:

I have not claimed that nobody should buy Intel. I just do not blindly advocate Intel, since they are not the only viable choice. People who pretend that they are the only viable choice are ignoring facts and reality.

Basically you want me to accept your opinion as being more "real". You are attempting to claim that if I do not blindly accept your opinion, then I must not have the facts or somehow don't know what I'm doing. That would be a very wrong supposition.


If I had quoted you out of context, you wouldn't have been asking me how I knew AMD would bring in a Turbo scheme, and you wouldn't have said you would turn Turbo off in the BIOS on your future AMD system.

Now I know why you don't want to revisit these crazy statements you made, but there is nothing out of context in my bringing up what you said.

I would never have asked somebody with your argumentative demeanor and opinions a serious question, since your answer would not be based on anything that I would consider relevant.

And as I said before: I would definitely turn off Turbo in the BIOS, if possible, for all benchmarking, as I have always maintained. I would probably not turn it back on, since it is not really useful for much more than e-peen while benchmarking.


Does anyone really doubt that your objection to hyperthreading is simply because Intel has it and AMD lacks it?

That certainly seemed to be the only basis for your objection to Turbo at the time, and why you were uncertain that AMD would be implementing it in the future.

This is the part you are probably confusing with what you said above: I would most definitely turn off hyperthreading in any brand of CPU. If AMD had it, I would immediately disable it. But we know AMD won't add it, because they prefer to add actual cores instead of a gimmick like hyperthreading, which again is mostly useful only for e-peen and benchmarking.

I'm not going to keep responding to your personal insults and your selectively remembered claims about prior posts. If you want to actually link to a prior post, we could perhaps discuss it; but otherwise you need to shut up about it and get over it.
 
A8-3870K fully OC'd + HD 6xxx Hybrid CF
vs.
equally priced 2x HD 6xxx + SB

I want to know which one will perform better in a GPU-heavy game (CF enabled), in a CPU-heavy game (heavily threaded), and in benchmarks.

I think the A8-3870K is for budget overclockers, as they can get the max out of it, while to OC an SB setup you need an expensive CPU + board.

(Also, I think that if AMD can lower the price of the 3870K to $100-120, then it will beat any Intel Pentium G + dGPU setup at that price.)
For example: A8-3870K OC'd to 3.6-3.7 GHz + CM Hyper 212 EVO + Radeon HD 6670 1GB DDR3/GDDR5 H-CFX - $140 + $35 + $70/$90~ = $245/$265~. If you count shipping, the 6670 DDR3 is $75~ total and the GDDR5 $107~; the CM 212 EVO is $41~. I didn't count rebates.
vs.
Pentium G620 ($70) / G840 ($85) + the rest can buy you cards like the Radeon HD 6870, 6850, 2x 6570 ($63 + $7 shipping), 2x 6670 DDR3, etc. I tried to pick the cheapest of each card.
There is a lot of room for various CPU + graphics card combos within the budget; many more can be made, e.g. Athlon II X4 641 + 6870. Since you specified the 3870K OC'd, I used that as the base. You can use the stock cooler for the OC, but I think the CM Hyper 212 EVO will run the OC'd CPU better (AFAIK the 3850/3870K runs quite hot under load at stock settings).
The 2x 6570 can deliver fps similar to a 6790, though the CFX combo might have microstuttering.
After building the combos using my poor math skills... the Pentium + dGFX combo will easily outperform the OC'd 3870K + Radeon HD 6670 H-CFX setup in gaming, but the APU will perform better in multithreaded tasks, multitasking, etc. OC'ing extends the multi-everything performance lead even farther; however, when you OC the APUs, power efficiency drops fast.
AMD does seem to have an unlocked APU at $120 - the A6-3670K.
IMO you picked kind of an unfair competition - the OC'd 3870K + 6670 will always lose in gaming if you keep the budget fixed, while the Pentium becomes almost useless for moderate/heavy multithreaded workloads.
Edit: one more thing - a 4-core CPU at stock or overclocked over 3 GHz, plus a GPU, does seem like a viable gaming setup regardless of the IGP inside (i5 2320 vs 3850/3870K)... until they're tested for gaming performance...
 
This is the part you are probably confusing with what you said above: I would most definitely turn off hyperthreading in any brand of CPU. If AMD had it, I would immediately disable it. But we know AMD won't add it, because they prefer to add actual cores instead of a gimmick like hyperthreading, which again is mostly useful only for e-peen and benchmarking.
So what is your objection to hyperthreading exactly? :heink:

It is not as though one must make a choice between more cores and hyperthreading.

Are you suggesting that if one bought a Quad with Hyperthreading, one should disable hyperthreading and run it as a 4 core, 4 thread CPU, rather than a 4 core, 8 thread CPU?

If that is your view, why?

And who knows what AMD does in the future.

For the next 4 or more years they are stuck with the BD design and will be working on trying to improve it, and I agree that it is very unlikely they would attempt to bring hyperthreading into it, but who knows what happens in 4 or 5 years when their next major CPU architecture gets released.
 
Anyone else seeing this as bad news? AMD had a pretty good relationship with Cray, and they almost killed Cray's profits last year by being so late with Bulldozer. The server market is highly competitive.

For a $5B company, they sure are splintering off into a lot of potential markets.

Dunno if I'd call it bad news - AMD seems to be covering more bets, such as possibly using large ARM arrays for servers, similar to what Intel supposedly wants to do with Atom-based servers. Anyway, this should give them more options, esp. in the 'non-competing with Intel' arena.
 
So what is your objection to hyperthreading exactly? :heink:

It is not as though one must make a choice between more cores and hyperthreading.

Are you suggesting that if one bought a Quad with Hyperthreading, one should disable hyperthreading and run it as a 4 core, 4 thread CPU, rather than a 4 core, 8 thread CPU?

If that is your view, why?

And who knows what AMD does in the future.

For the next 4 or more years they are stuck with the BD design and will be working on trying to improve it, and I agree that it is very unlikely they would attempt to bring hyperthreading into it, but who knows what happens in 4 or 5 years when their next major CPU architecture gets released.
I will interject here and say yes.

[image: gaming benchmark chart from the AnandTech review linked below]

http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/20

There are 4 of 10 cases where HT hurts performance; 40% of games actually appear to run better with HT off, so why would any gamer run an HT machine...

Rather than adding another reply, I will edit here: yes, it's a fairly minor difference in performance, but $100 isn't a minor price difference.
 
There's more to using a computer than games; in other applications, like encoding, rendering, etc., HT will help. The old HT debate goes back to Pentium 4 days, and it was proven that HT was beneficial in most applications, including the running of your OS, which is multithreaded. And if you look at the bench you showed, it is less than a 5 FPS difference in gaming.

So yes, if all you do on your computer is game (kind of pathetic IMHO; might as well buy an Xbox 360 and not bother with a PC), then get a 2500K. But if you use your computer for business to do something silly like make money, then HT is beneficial.

And this is from a proud AM2 and AM3 owner, LOL.
 