AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post anything other than questions or information relevant to the topic and it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, and green teams and they will be deleted.

Enjoy ...
 
when did they do this?



AMD bragged about how having four native cores was better than Intel's two dual cores slapped together, and then lost that performance round. They also said 50% more throughput from 33% more cores, and later revised that to "up to 35% more performance from 33% more cores." Again an overestimation, when it ends up being less than 20% on the server side of things, and sometimes even less performance than a 12-core Opteron.
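Just to put rough numbers on that claim, here is a quick back-of-the-envelope sketch. The 1.5x and 1.2x inputs are simply the marketing figure and the sub-20% estimate mentioned above, not benchmark results:

```python
# Back-of-the-envelope per-core scaling for the "50% more throughput from 33% more cores" claim.
# Inputs are the marketing claim and the rough estimate quoted above, not measured data.

old_cores = 12           # 12-core Opteron (Magny-Cours)
new_cores = 16           # 16-core Interlagos, i.e. ~33% more cores
claimed_speedup = 1.50   # "50% more throughput"
observed_speedup = 1.20  # "less than 20%" aggregate gain

core_ratio = new_cores / old_cores  # ~1.33x the cores

# Implied change in per-core throughput = aggregate speedup / core-count ratio
print(f"Claimed per-core change:  {claimed_speedup / core_ratio - 1:+.1%}")   # about +12.5%
print(f"Observed per-core change: {observed_speedup / core_ratio - 1:+.1%}")  # about -10.0%
```

In other words, hitting the 50% figure would have needed each new core to be faster than a Magny-Cours core, while a sub-20% aggregate gain implies each individual core is actually slower.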


Watch this

http://www.youtube.com/watch?v=G_n3wvsfq4Y

And then read this

http://blogs.amd.com/work/2010/08/02/what-is-bulldozer/

So I say it's much better to underestimate than overestimate: when you underestimate and the product comes out better than expected, you have no angry stockholders and fans! Not to mention it throws off the competition!

Read this

http://www.tomshardware.com/news/amd-fx-8150-cpu-launch-marketing,13701.html

"I remember Intel as being extremely conservative with its performance estimates. In the end, Conroe arrived with a performance boost that was 80 percent higher than the company initially had promised. It was Intel's conservative communication that caught AMD completely on the wrong foot. Just three months before the launch of Core 2, AMD told me in interviews just how confident they were that the Athlon X2 would keep or immediately regain the performance crown from Intel, should Intel in fact capture it. I was not sure if AMD already knew how strong Conroe was and their answers were marketing bubbles, or if they actually fell into Intel's trap."


 

I own both generations of Phenom, and FPU performance is flat regardless of clock per core :s
They keep making mistakes that are driving the industry into the muck, and it is us consumers who pay the price in the end. It is 2012 and people are still paying over $100 USD for dual-core products and often over $200 for quad cores.
 
Those benches were done at 1920x1080 with 4xAA and 8xAF and a high-end discrete GPU. That completely defeats the purpose of an APU to begin with: half its die space is devoted to a GPU that you're just disabling. Redo those benches with the built-in GPU vs the HD3K on the SBs and watch the APUs smoke them, which is the entire point of the APU in the first place. You won't be doing high-end gaming with an APU; it just won't work. It's good for mobile gaming and low-power budget PCs. That was a good article for desktop decisions, as it shows that a decent discrete GPU greatly diminishes the APU's worth.

You are aware the entire point of the article was to bench CPUs? All it showed was that Llano was hopelessly CPU-bottlenecked compared even to $80 processors in almost every major title, even when paired with the best GPU on the market.

My point being, Llano will never be suitable for gaming at any significant graphical settings, simply because the CPU side is crippled. Now, there's nothing wrong with having a low-power IGP for laptops/netbooks that is actually halfway decent, but as I've said before, I expect tablets/smartphones to make those two form factors obsolete within a decade. I simply don't see APUs as a major factor going forward.

While I'm a bit late, I'm gonna stop you here and now.

I said,
That's not quite right. We're talking mobile gaming, meaning the Sabine platform, not Llano, although most lump them into the same category.
Maybe I should have been clearer and said Sabine and not Lynx.

http://en.wikipedia.org/wiki/AMD_Fusion#Sabine_.28Mobile.29

You are trying to take a desktop scenario with a monster GPU and apply it to a lightweight mobile scenario; that is what makes your statements slightly dishonest.

You're also incredibly wrong about the gaming potential; I actually own one, do you? What you're doing is called "moving the goalposts": define "significant graphical settings". Is it 1366x768 (a common resolution on cheap/light laptops) or is it 1920x1080 with 4xAA and 8xAF? By not defining the intended metric you can make any statement without fear of being wrong. You declare the CPU a "cripple" when it's just a K10.5 Stars core with more L2 cache (1MB vs 512KB per core) but no shared L3.

Attempting to bench an APU with a moderate to high-end GPU is a waste of time. The entire point of the APU is to play games without a discrete GPU, or with ACF (asymmetric CrossFire) and a low-end budget card, and as AMD isn't making many budget cards anymore that last scenario is dying out. An APU would crush an i3 and its HD3K. That bench just showed that if you're building a desktop PC, you shouldn't be using an APU if you plan on putting a big GPU into it. And honestly, APUs make absolutely no sense in medium or larger desktops. They're for light notebooks, Internet kiosks, HTPCs or really small form factor desktop PCs. And BTW, the budget market is bigger than the mainstream/enthusiast markets. Enthusiasts don't typically shop at Dell / IBM / Sony / Samsung for their laptop/computing needs.

About the biggest thing I'd pair an APU with would be a 6750M / 6770M (the 6770M really pushing it). Note those are M products, meaning mobile chips. For desktop, midrange cards are too cheap to really bother with an APU. Now, I could see a Sabine-platform 3530/3550MX APU on a mini-ITX board put inside a small case. Those things are so small and have such a limited power budget that adding a discrete GPU would be nearly impossible. The VIA Nano used to be the performance leader in this low-power field, but its onboard graphics unit is beyond ****. It makes the HD3K look like a beast. I could see AMD moving in and taking the lead; they just need to get the mini-ITX board manufacturers into the game.
 
The quote was for server workloads, where the 16-core Interlagos has up to 50% more throughput than the 12-core Magny-Cours, and it was pretty much true on integer-based tasks. New architectures sometimes losing to old ones in some tasks isn't anything new; it happens every now and then, which is why you research what your needs are.

For all AMD knew, their dual cores were going to beat Intel's, so it doesn't really show much there. Intel didn't want to say their CPU would be 80% faster because nobody would have bought the Pentiums if they did; Intel was still selling a lot of CPUs even when AMD held the crown. The Core 2 Quads were not great compared to what they could have been as true quad cores; their success came from the Core 2s being so good.

The projection for Phenom I was supposed to be decent, but they got hit with bugs in the design as well as low clock rates. In a perfect world they would have been very competitive, but they weren't. The core scaling was still better than the C2Q's, if it's any consolation, due to the chip being a true quad core.

Intel projected that the i7 920 would be much more powerful than it was; it was decent, but it was still overhyped. They also said Hyper-Threading was going to give the same performance as dual cores. Overall, any company can make performance projections with an "up to" attached. The only reason Intel doesn't hype nearly as much anymore is that they are sitting pretty and far, far ahead of AMD. I don't mind hype as long as it has some substance behind it; now we can wait and see about Trinity.
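On the Hyper-Threading point, here is a minimal sketch of why SMT was never going to equal a real second core. The ~25% SMT uplift and 90% scaling efficiency are assumed ballpark figures for illustration only, not Intel specs or measurements:

```python
# Hypothetical throughput comparison: one core with Hyper-Threading vs. two physical cores.
# The SMT uplift and scaling efficiency below are illustrative assumptions, not official figures.

single_core = 1.00
smt_uplift = 0.25           # assumed typical gain from the second hardware thread
parallel_efficiency = 0.90  # assumed scaling efficiency for a well-threaded workload

one_core_with_ht = single_core * (1 + smt_uplift)       # ~1.25x
two_real_cores = 2 * single_core * parallel_efficiency  # ~1.80x

print(f"1 core + HT : {one_core_with_ht:.2f}x")
print(f"2 real cores: {two_real_cores:.2f}x")
```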
 
Software is still catching up, which is why we are on faster duals and quads now instead of slower 6- and 8-core chips.
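A rough way to see why: a quick Amdahl's-law sketch. The 50% parallel fraction and the clock speeds are made-up inputs for illustration, not measurements of any particular chip:

```python
# Amdahl's-law sketch: why a faster quad can beat a slower 8-core when software
# only scales across a few threads. All inputs below are illustrative assumptions.

def relative_perf(parallel_fraction: float, cores: int, clock_ghz: float) -> float:
    """Performance relative to a 1.0 GHz single core, per Amdahl's law."""
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

p = 0.5  # assume only half the workload can use extra cores (typical desktop software of the time)

print(f"4 cores @ 3.5 GHz: {relative_perf(p, 4, 3.5):.2f}x")  # ~5.60x
print(f"8 cores @ 2.8 GHz: {relative_perf(p, 8, 2.8):.2f}x")  # ~4.98x
```

Until the parallel fraction gets much higher, the faster quad wins, which is roughly where mainstream software still sits.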
 
@gamerk316: imo, this is exactly what 'they' refuse to understand. that article showed how a sub-$200 desktop cpu would perform for gaming. that's it. there was no mobile issue, no igpu issue. llano igpu's superiority was already demonstrated, but its ability to drive a gaming gfx card was not.
and who's to say what cpu should or shouldn't be tested? that's like ordering someone to abide by a restricted pc configuration (apple, anyone?).
when llano came out in june of last year, i really, really wanted to build a llano pc. my plan was to use the igpu for a few months, then add a radeon hd 6850 or gtx 560 ti right around now and turn it into a cheap gaming pc. since the a8 3850s could be overclocked to 3+ ghz, i figured it'd be great for a cheap (compared to the other 'cheap' quad-core cpu available at that time - the 2.8 ghz core i5 2300), single-gfx-card gaming pc. plus, i'd have a cheap 32 nm quad-core cpu for multitasking and stuff. i knew that the apus were entry level, but they should at least be able to drive a mid-range/upper-mid-range gfx card like cheap, entry-level pentiums or slightly costlier core i3 cpus can. i am a bit relieved that i didn't go that way. i still want a capable cpu + an igpu like llano's. llano set the bar for igpus, for desktop and mobile.


Why in the hell would you ever build a gaming desktop with an APU in mind? It's mind-blowing. There is no "they" here; you're taking a product and trying to put it in a role it was never designed for, then when it fails that role you call the product cheap and write it off entirely.

APUs on a desktop are generally a bad idea, period. There is a small segment in low-profile HTPCs and that's about it.

The APU's intended competition is the ... drum roll ... Atom. AMD is only offering them in the desktop segment for OEMs to play with.
 
Standard Oil was limited to the US; Intel would be a worldwide monopoly. As for ARM, name a desktop or laptop computer that's capable of running today's high-end software, much less games, on ARM.

I see. However, one must understand that Intel would not be a monopoly. Many vendors other than Intel produce SoC devices. Calling Intel a global monopoly is a logical fallacy; it competes at so many levels. On the basic consumer desktop... yes, I agree. On other things like servers, smartphones, portable devices, calculation devices, and so forth? No. I must disagree with you at 90%.
 
I think the other major issue with our underthreaded software is the average person who goes and buys a dual core now. Two cores look really nice to them, and if two more threads became the standard three years later, they would be upset that their dual core is not as good anymore, in the sense that they would be limited to only two cores.

Basically, the market doesn't really want to have to have more cores in its CPUs.
 

And those other vendors' SoC devices are incompatible with Windows and 99% of the world's software.

Having AMD around keeps Intel honest; it prevents them from pulling another Rambus debacle. Intel has been trying to be the only producer of x86 chips for two decades now, ever since IBM's PC design became popular. If there were no competition, Intel would stop producing new products and stop lowering prices on old ones. They would then leverage their monopoly in the PC market to force OEMs to build what Intel wants them to build. Case in point: Intel's old partnership agreements that required OEMs to sign exclusive contracts with Intel or face inventory shortages, as Intel would prioritize their competitors over them.

That was one of the practices Intel was found guilty of, BTW. That lawsuit and its resulting verdict are why OEMs can now offer AMD components in their products.
 
I thought you got the message?

Link please

For you? Nah, no need to. People know that is what AMD was trying for. They missed the mark. Probably GF's fault for screwing up 32nm.

I assume you're referring to the X-bit labs article, http://www.xbitlabs.com/news/cpu/display/20111026223104_AMD_Expects_Trinity_to_Offer_20_30_Performance_Increase.html


The 20% improvement represents AMD's projections "using digital media workload", and the actual performance advantage over the currently available Fusion A-series "Llano" will vary depending on the applications and usage models.

Maybe instead of getting bent out of shape because someone made an attempt to analyze what a CPU physics score means for Trinity, stick with the facts instead of twisting them to mean what wasn't said. Then again, maybe you just read the title of the book, I don't know. Trinity isn't 20% overall, and it's not the end of the world.

No, I wasn't referring to any article, mainly to the link I posted with estimates based on that information. And again, I said it was a rumor, not fact.

Thanks for assuming, but, well, you know what it means.


Why is it that when you compare it to something viable, people don't want to see it? They only want the limelight.

Llano, as I said many a time, is great for low-end gaming with an IGP, but with a discrete GPU the CPU is just, well, crap.

Is that why $1000 parts languish on the shelf?
Only an Intelidiot would buy chips for more than $400.
But you have a point: there are a lot of Intelidiots.

Thanks for the insider info, since you seem to have Intel's specific sales numbers.

PD and Trinity are not mature products.
We will have to wait until 2014 to see a mature product with Excavator.
In the meantime, Trinity and Piledriver will be an improvement over Llano and Bulldozer:
the A10-5800 over the Llano A8-3870,
with higher IPC and higher clocks.
The A10 as a product will be vastly improved over the top Llano,
and PD will be an improvement over Bulldozer.
I made my prediction:

A10-5800: CPU 25-30%, IGP 60-80% compared to the Llano 3870,
using higher IPC and clock speed.

PD: 10-15% per core over the first iteration of BD (IPC and clock increases).
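For what it's worth, per-core gains from IPC and clock combine multiplicatively, which is how a figure like 10-15% can fall out of two smaller improvements. The 6% and 7% inputs below are purely hypothetical, chosen only to land in that range, not leaked specs:

```python
# Per-core performance scales roughly as IPC x clock, so the two gains multiply.
# The example inputs are hypothetical, picked only to land in the predicted 10-15% range.

def per_core_gain(ipc_gain: float, clock_gain: float) -> float:
    """Combined per-core improvement from an IPC uplift and a clock uplift."""
    return (1 + ipc_gain) * (1 + clock_gain) - 1

print(f"{per_core_gain(0.06, 0.07):.1%}")  # about 13.4%
```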

I find it odd that the people who question it are unwilling to make their own predictions known. Instead they act like adolescents with their panties in a wad.

2014... that's when the 14nm shrink of Haswell will occur. Not sure Excavator will do well unless they at least hit 22nm by then.

In history? WOW!!! :lol:

Prices would double and development would come to a screeching halt. Intel would just sit back and rake in the cash.
Pricing would surely rise somewhat in the short term, but thankfully there is the ARM threat to keep innovation ticking along.

I guess the Federal Reserve doesn't count.

Trinity isn't GCN but the older VLIW4.

They keep saying it's GCN, but early AMD slides have said VLIW. Guess we will see when it hits.

Didn't Tom's have an article showing a modded rig running Win8 on ARM?

edit:
http://www.tomshardware.com/news/WOA-Windows-On-ARM-OFfice-15-Metro-Style-Steven-Sinofsky,14670.html
Windows 8 on ARM will support desktop Office apps in addition to desktop tools like File Explorer, Internet Explorer 10 and more

right, nothing high end..

Games would still have to be coded for ARM, though. Sure, you could probably port Android/iOS apps to Windows 8 on ARM, but games like Skyrim etc. would need a rework of the game engine to run on ARM instead of x86.
 

Um, a Phenom II is still only on par, per core and per clock, with a first-gen C2Q...
 
Incorrect.

HP was selling AMD during the period covered by the lawsuit; it was Dell who wouldn't come to the party.

Prior to the filing of that lawsuit, HP did not offer AMD products for the consumer market. The Intel partnership agreements were only for consumer devices, PCs / laptops and such. Tier 1 manufacturers did not offer AMD products because doing so would violate their partnership agreements, resulting in revocation of the OEM rebates and reduced scheduled inventory.

This is all known. Michael Dell testified in court to this effect, and he also provided emails between his office and several Intel VPs / directors about the arrangement. HP also testified that Intel had threatened it when it tried to offer AMD products; it was only allowed to remain in the partnership agreement if it ensured no product would place AMD in a better competitive position price-wise. This was all put out there in the court hearings in the USA. Eventually Intel and AMD settled the issue, but not before lots of dirty laundry got aired. Intel had been abusing its dominant position in the consumer market to lock AMD out of high-volume sales. This was all in the 2001~2006/2007 time period. When they did settle, Intel had to sign a legal agreement as part of the settlement terms saying they would never enter into those kinds of partnership agreements with any manufacturer and would not attempt to *dissuade* manufacturers from using AMD components in their offerings. It's very easy to see when this happened, as suddenly several manufacturers had AMD offerings where before they didn't.

It's not that Dell didn't want to offer AMD products; they asked Intel several times to allow them to. It's that Intel refused to let them offer the products. The punishment for violating Intel's edicts was a sudden increase in your costs and a reduction in your inventory, with the extra inventory going to your competitors. Intel was playing the tier 1 OEMs off against each other.

Next you'll tell me that Intel didn't offer Rambus millions in stock if they could force manufacturers to sell P4 + Rambus? What do you think that whole patent-trolling campaign by Rambus against every DDR/DDR2 memory manufacturer in the world was about? It was Intel trying to prevent AMD products from launching by locking out the motherboard manufacturers. Intel's and Rambus's plan was to put high royalties on all production of DDR memory, which would force that memory to be more expensive than Rambus's own RDRAM product, while RDRAM would only be licensed to work in authorized Intel motherboards. This would eliminate the price advantage many AMD chips had at the low end (at that time) and simultaneously force the Taiwanese motherboard manufacturers to play by Intel's rules or have a higher royalty forced on them.

For over 20 years Intel has been scheming to remove AMD's x86 license; they've filed several lawsuits and lost each time. They're a company with a dirty history (business-wise) second only to Apple and Oracle.
 
Thing with that is, quad cores today are probably a lot slower than the quad cores of three years from now, so while you may be more "future-proof", you will still have a rather slow computer when the cores are being utilized. I do believe we are moving into the quad-core era, though, as quads are becoming the standard.
 