AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
This is the way I feel too, though I'm thinking a little worse: the CPU portion of Trinity might even be slower than Llano at certain things. Like I said before, the scaling of a BD module is about 80%, so multiply that by 4 and you effectively have a 3.2-core processor. Not to mention IPC is easily 10-15% lower on BD (note how the 6-core BD is about even with the 4-core Phenom in multithreaded tests). Unless AMD can fix these things I don't see it being any faster than Llano's CPU. I do feel, however, that Trinity will have at least 30% better graphics performance, and better performance per watt compared to Llano. Also, I'm strictly talking about the laptop versions of Trinity and Llano, because on those it's harder to just clock the CPU really high due to power consumption and heat.
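To put rough numbers on the claim above (a quick back-of-the-envelope sketch; the 80% module scaling and the 10-15% IPC deficit are the figures asserted in the post, not measured values):

```python
# Sketch of the post's arithmetic: a Bulldozer module's shared front-end
# means a core only contributes ~80% scaling, and per-core IPC is claimed
# to be ~10-15% below Phenom II. These numbers come from the post above.

def effective_cores(physical_cores, scaling=0.80):
    """Physical cores discounted by the claimed ~80% module scaling."""
    return physical_cores * scaling

def relative_throughput(physical_cores, scaling=0.80, ipc_penalty=0.125):
    """Multithreaded throughput vs. the same count of 'full' Phenom-class cores,
    applying the claimed scaling and a mid-range (12.5%) IPC penalty."""
    return effective_cores(physical_cores, scaling) * (1 - ipc_penalty)

print(effective_cores(4))       # ~3.2: four BD cores act like a 3.2-core chip
print(relative_throughput(6))   # ~4.2: a 6-core BD lands near a 4-core Phenom
```

Which is exactly why the 6-core BD only roughly ties a 4-core Phenom in multithreaded tests, if you take the post's figures at face value.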

Frankly, after the BD hype, most posters here have learned to wait for actual third-party benchies before passing judgment. I've seen far too much BS from AMD itself and from rumor sites over the past year to believe any of them. Some of the posters here bought the hype first and the AM3+ mobos second, well before BD launched, and are stuck using their Phenoms in the new mobos because BD wasn't an upgrade for them. I'd imagine that if they could, they'd bill their expenses to AMD and/or a certain community reporter who seems to be MIA these past few months 😛.

This is one of the consequences of overhyping an unreleased product that then underperforms relative to the built-up expectations: a huge loss in credibility for years afterwards. So having lost the enthusiast market for the most part (though I doubt AMD really cares, seeing how small a share of the total market we are), AMD is now pinning its hopes on mobile.
 
Just a little while back AMD made an announcement that they were no longer trying to compete with Intel:
http://www.tomshardware.com/news/AMD-APU-Z-Series-ARM-Tegra-3,14114.html

That is older news but it fits the discussion.
AMD is not trying to lead in the desktop CPU field anymore,
and they are actually being very smart:
finding new markets, or taking the lead in more profitable and growing markets, is a smart strategy.
How many people do you know who don't even own or really use a desktop?
Between tablets, iPads, netbooks and laptops,
any of these mobile solutions pretty much does what most computer users need:
check email, Facebook, play simple games like Angry Birds or Bejeweled, browse.
Using all your resources to compete in the desktop CPU market, especially when you have fallen far behind, doesn't make sense.
While the desktop is not going away, it is not a large enough market to justify the expense.
 
If you ask me, as long as AMD is pricing their chips the same as Intel, they are competing with them! I also think it's just another excuse for lower performance. Sorry, but even with the Phenoms AMD had price/performance; now with BD it's all gone. The 8150 should cost only $199.99 based on its overall performance!

And AMD will keep falling behind if they can't make powerful server CPUs.
 
I agree with you jdwii,
their new desktop CPU pricing is not good.
At least before you could pick up an AMD quad-core, OC it,
and pay less than for an Intel,
but now with the lower-end Sandy Bridge Pentium series and i3s,
and with BD pricing not set right,
an 1155 is making more sense for a budget build now.

AMD also used to do well with their Opterons in the server market,
but they have fallen behind there,
and that is a bigger market IMHO than desktop CPUs.
That is where you get larger orders and turn a higher profit;
you can have a higher margin on a server CPU since the large companies buying them don't worry about a $50 price difference the way a home user does.

But AMD is smart for not trying to take the lead in desktop performance CPUs;
the amount of R&D money needed to catch up to Intel just doesn't make sense.
And AMD EOLing a lot of Phenom II AM3 CPUs I don't think was a good idea;
if you look on Newegg
you can really only find Athlon II 640s and PHII 9XX series.

Soon the days of an AMD quad-core for $75 USD will be gone.

When it came to budget quad-core systems AMD really had that market
for the past few years.
I have an OC'd PHII X4 925 @ 3.4 I am very happy with,
but now they are losing that market.
I know somebody who just got a 2500K for $179 USD brand new,
plus a $50 mobo and $30 RAM;
AMD doesn't have anything to compete with that.
Even what is left of the Thubans isn't priced competitively.
 

I think you're right; with Ivy Bridge and Trinity lurking, Sandy Bridge is cheap.
 
I find it hard to believe that Trinity is going to be clocked so high at base. In too many ways that is not good for AMD:
1. It shows they are having great difficulty improving IPC on the BD architecture, which only drives it further into the ground.
2. They either (a) are not concerned about people mixing it up with higher-level CPUs, or (b) are truly dropping out of the high-end desktop market.
3. Power draw. Unless this is all they worked on, I don't find it possible to clock 4 cores at 3.8 GHz with a few hundred Radeon shaders and stay under 65 or 100 W.
4. Meeting demand. They had issues keeping up with Llano demand. We have also seen rumors of BD chips unable to reach their rated clock speeds being sold as a different model, along with their continuing issues with GF.

Thinking about it, I just don't think it makes any sense to clock Trinity that high.

It makes perfect sense if the CPU is anything like BD, meaning lower IPC than the previous gen. The only reason BD did any good against Phenom II was its higher clocks. Trinity will probably do the same, but on a per-clock level it will probably not compete.

It will be another marketing twist, it seems.

All current IGPs lack the raw compute power to play modern games at high settings, unless you consider well under 30 fps OK 😛.

Anyway, it'll be interesting to compare the IB, Trinity and Haswell IGPs over the next 15 months or so, though perhaps not for gaming enthusiasts.

I am interested because it's, well, interesting. It shows the future, possibly for UMDs and such.

I agree with you jdwii
their new desktop CPU pricing is not good
[...]
even what is left of the Thubans aren't priced competitively

And yet a Q6600 @ 3 GHz is still viable even at 5 years old. AMD has a lot of ground to make up, and they won't get it by trying to grab the low end; they did that for years before the Athlon 64 and never moved. They need to truly focus on creating a CPU worth buying.
 
Not on-die memory: die-on-top-of-die memory, aka 3D stacking.
This is the future of the CPU/GPU/APU/SoC. The first-generation parts are being made today, but primarily for the mobile/smart TV market.

JEDEC Wide I/O is a 512-bit memory interface.

http://www.cadence.com/Community/blogs/ii/archive/2011/12/15/an-update-on-the-jedec-wide-i-o-standard-for-3d-ics.aspx

Then there is the Micron/IBM Hybrid Memory Cube.

http://www.theregister.co.uk/2011/12/01/ibm_micron_hybrid_cube_memory/

"IBM and Micron say that HMC devices can take up about a tenth the space of traditional 2D DRAM memory sticks and require 70 per cent less energy to transfer a bit of data from the memory chip to the CPU."

That's a huge power saving, meaning less heat and headroom for higher clock speeds.

It's rumored that Intel is adding new plants specifically for this, to break further into the mobile and low-end market by stacking 2-8 GB of RAM on the CPU.

I've heard Micron talk of stacking 20 or more dice as the process matures.
Moore's law runs out in 2D around the 10 nm mark, which is why stacking is the logical next step.

They could look at this after they hit the next roadblock, but you can't wave a magic wand at thermal load. Stacking more transistors on top of other hot transistors isn't a good idea. Ultimately heat is your enemy, until you get down to quantum sizes, where it's electron tunneling and superposition.

3D stacking is more about denser memory chips than about memory on top of processors. Instead of having to lay out a bunch of flat chips side by side, they can lay them on top of each other, which dramatically increases memory density. Take a good look at a 4GB stick of DDR3, then at a 2GB stick; now imagine if you could cut the stick in half and make only the chips 1.5x as high. That's basically the idea.
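The "cut the stick in half" idea above works out like this (a toy model with illustrative numbers only; the package counts and die capacities are made up for the example, not from any real DIMM):

```python
# Toy model of the stacking argument above: swap a row of flat DRAM
# packages for stacks of the same dice in half as many footprints.
# All numbers here are illustrative, not real DIMM specs.

def planar_capacity_gb(packages, gb_per_die):
    """2D layout: one die per package footprint."""
    return packages * gb_per_die

def stacked_capacity_gb(packages, gb_per_die, dies_per_stack):
    """3D layout: several dice stacked in each package footprint."""
    return packages * gb_per_die * dies_per_stack

flat = planar_capacity_gb(8, 0.25)           # 8 flat packages of 0.25 GB dice
half_stick = stacked_capacity_gb(4, 0.25, 4) # half the footprints, 4-high stacks
print(flat, half_stick)                      # half the board area, double the GB
```

So a stick with half the packages but modest stacks beats the full-length flat stick, which is the whole density argument in one line of arithmetic.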
 
Yeah, Sun had a good run. They just weren't prepared for how quickly AMD/Intel ramped their 64-bit architectures, combined with Linux adoption for cheap compute farms.

It is good to see them shipping new SPARC designs. 64 threads/CPU is beastly for database work, which made a good fit for Oracle. It helps keep Intel honest in the high-end server market for sure.

The SPARC platform is still favored by the DoD for mission-critical C2/C4I systems. They're incredibly hard to kill; I still have a U60 sitting next to my desk that works perfectly even though it's been through hell. And Solaris itself is a very solid, redundant OS; it doesn't die. When you combine the two you get a platform that will keep working with minimal administrator interaction.
 
And yet a Q6600 @ 3 GHz is still viable even at 5 years old. AMD has a lot of ground to make up, and they won't get it by trying to grab the low end; they did that for years before the Athlon 64 and never moved. They need to truly focus on creating a CPU worth buying.

Okay then, what would you like to see AMD do specifically, besides make a better processor? That is a given. What about a specific feature they could improve, add or remove, or any new ideas or extensions?
 
The SPARC platform is still favored by the DoD for mission-critical C2/C4I systems. They're incredibly hard to kill; I still have a U60 sitting next to my desk that works perfectly even though it's been through hell. And Solaris itself is a very solid, redundant OS; it doesn't die. When you combine the two you get a platform that will keep working with minimal administrator interaction.

True, but since most DoD stuff is embedded, you more often see a PowerPC-based arch with some embedded realtime OS. Then again, I guess it depends on what task the HW is being used for...
 
"And yet a Q6600 @ 3GHz is still viable even being 5 years old. AMD has a lot of ground to make up. They wont get it by trying to grab the low end. They did that for years before Athlon 64 and never moved. They need to truly focus on creating a CPU worth buying"

That is the sad part.
There was an article on Tom's comparing single-core @ 3 GHz IPC performance, and the Core 2 series did better than the PHII.
AMD has been behind ever since Core 2 was released,
so that has been about 5 years now,
and AMD realized that making a desktop performance CPU to match Sandy Bridge would just cost too much in research and development.
I don't even think BD was meant to compete with SB;
their focus was on a modular design that would be easier and more cost-effective to fabricate,
almost like the Lego blocks of CPUs,
which for low-end OEM systems and possibly workstations/servers could be a big benefit.
And of course as thread handling improves with Windows 8 there will be a small improvement for BD;
the Win8 beta showed about a 10 percent improvement from my reading.
Combine this with stepping/revision improvements and BD still won't compete against Sandy Bridge, but will do better in IPC against its own older arch.
The fact is that AMD has stated it won't try to compete with Intel in the desktop performance market;
they basically admitted defeat in the AMD vs Intel performance race.
Why devote huge sums of money trying
when the desktop performance market won't support it?
The mobile market is a bigger and more profitable one for them;
in the desktop arena they just need to compete in the low-end OEM segment.
If OEMs are using AMD for sub-$500 machines, then AMD wins.
AMD is putting its R&D into IGP/onboard video for just that reason.
 
^^ But again, to add a little more salt to the wound: how many CPU/motherboard upgrades have AMD users gone through since C2Q came out? My point being, if you bought a Q6600 5 years ago and OC'd it, you'd still be at about the same level as a PII X4. I'd wager most AMD users here have done at least two CPU upgrades since C2Q came out [Athlon X2 -> Phenom I -> Phenom II X3/X4].
 
That is a good point.
It is at the point where even the AMD fanbois are either quiet or starting to admit that Intel is dominating.
I just recently went from a C2D @ 3 GHz to a PHII X4 @ 3.4,
so I don't favor Intel or AMD.
I will say that when it comes to single-threaded apps
I really don't see a difference;
it is only with the extra cores that I notice one,
which was AMD's strategy:
"moar cores", as the joke goes.

I do think we are starting to go somewhat off topic,
so I will either be quiet or post AMD/Piledriver info.
I don't want to make any of the nice mods mad :)
 
They could look at this after they hit the next roadblock, but you can't wave a magic wand at thermal load. Stacking more transistors on top of other hot transistors isn't a good idea. Ultimately heat is your enemy, until you get down to quantum sizes, where it's electron tunneling and superposition.

3D stacking is more about denser memory chips than about memory on top of processors. Instead of having to lay out a bunch of flat chips side by side, they can lay them on top of each other, which dramatically increases memory density. Take a good look at a 4GB stick of DDR3, then at a 2GB stick; now imagine if you could cut the stick in half and make only the chips 1.5x as high. That's basically the idea.

Some of that heat comes from driving the wide memory buses off-chip. Bringing them on-package is a power/heat saving because you're not having to drive signals across 6 inches of PCB, and the voltage can be lowered as well. It all adds up to a fairly big saving; by some estimates a high-end PC can use 40 watts just on the DDR3 bus.

3D stacking is mostly about putting memory on logic. They do this on the iPhone even, in some cases with multiple memory types: CPU + NAND flash + DRAM.
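Putting the two figures quoted in this thread together, the ~40 W DDR3 bus estimate above and the IBM/Micron claim of 70 per cent less energy per bit transferred, gives a rough sketch of the potential saving (a simplification: it assumes the whole energy reduction applies to the whole bus power budget):

```python
# Rough arithmetic combining two figures quoted in this thread:
# ~40 W spent driving the off-chip DDR3 bus (the estimate above) and
# HMC's claimed 70% lower energy per bit moved from memory to CPU.
# Simplifying assumption: the per-bit saving scales the whole bus budget.

def stacked_bus_power(planar_bus_watts, energy_reduction=0.70):
    """Bus power if every transferred bit costs 70% less energy."""
    return planar_bus_watts * (1 - energy_reduction)

remaining = stacked_bus_power(40.0)
print(remaining)         # ~12 W still spent on the bus
print(40.0 - remaining)  # ~28 W freed up for cores or clock headroom
```

Which is where the "less heat and higher clock speeds" point comes from: tens of watts of thermal budget handed back to the logic, if the claims hold.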
 
^^ But again, to add a little more salt to the wound: how many CPU/motherboard upgrades have AMD users gone through since C2Q came out? My point being, if you bought a Q6600 5 years ago and OC'd it, you'd still be at about the same level as a PII X4. I'd wager most AMD users here have done at least two CPU upgrades since C2Q came out [Athlon X2 -> Phenom I -> Phenom II X3/X4].
Athlon to Phenom I wasn't really an upgrade, so why bother? Going to an X4 could have been feasible given the clock speed advantage, but even that wasn't reason enough for me, since the programs I was using ran fine. Going from an Athlon X2 to the 8120 was a massive jump.

Even so, I could have changed just the CPU; I mainly couldn't afford it due to having a crappy job at the time. I ultimately upgraded to the FX because the motherboard stopped working after a bad storm.

http://us.msi.com/product/mb/K9A2-Platinum-V2.html#?div=CPUSupport
 
I find it hard to believe that Trinity is going to be clocked so high at base. In too many ways that is not good for AMD:
1. It shows they are having great difficulty improving IPC on the BD architecture, which only drives it further into the ground.
2. They either (a) are not concerned about people mixing it up with higher-level CPUs, or (b) are truly dropping out of the high-end desktop market.
3. Power draw. Unless this is all they worked on, I don't find it possible to clock 4 cores at 3.8 GHz with a few hundred Radeon shaders and stay under 65 or 100 W.
4. Meeting demand. They had issues keeping up with Llano demand. We have also seen rumors of BD chips unable to reach their rated clock speeds being sold as a different model, along with their continuing issues with GF.

Thinking about it, I just don't think it makes any sense to clock Trinity that high.

How about 100 W for 3.8/4.2 GHz?

http://www.fudzilla.com/home/item/25919-amd-trinity-lineup-detailed
 