AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 


(PRESS RELEASE) GLOBALFOUNDRIES Unveils FinFET Transistor Architecture for Next-Gen Mobile Devices

GLOBALFOUNDRIES today accelerated its leading-edge roadmap with the launch of a new technology designed for the expanding mobile market. The company's 14nm-XM offering will give customers the performance and power benefits of three-dimensional "FinFET" transistors with less risk and a faster time-to-market, helping the fabless ecosystem maintain its leadership in mobility while enabling a new generation of smart mobile devices.

The XM stands for "eXtreme Mobility," and it is the industry's leading non-planar architecture that is truly optimized for mobile system-on-chip (SoC) designs, providing a whole product solution from the transistor all the way up to the system level. The technology is expected to deliver a 40-60% improvement in battery life when compared to today's two-dimensional planar transistors at the 20 nm node.

The 14nm-XM offering is based on a modular technology architecture that uses a 14nm FinFET device combined with elements of GLOBALFOUNDRIES' 20nm-LPM process, which is well on its way to production. Leveraging the maturity of the 20nm-LPM technology will enable a smooth transition for customers looking to tap the benefits of FinFET SoCs as soon as possible. Technology development is already underway, with test silicon running through GLOBALFOUNDRIES' Fab 8 in Saratoga County, N.Y. Early process design kits (PDKs) are available now, with customer tape-outs expected in 2013.

"We have more than a decade of FinFET R&D to build on as we prepare to bring this technology to production," said Gregg Bartlett, Chief Technology Officer at GLOBALFOUNDRIES. "We are confident this foundation will enable us to lead the foundry volume ramp of FinFETs, just as we did with High-K Metal Gate (HKMG)."

Building on a Foundation of HKMG Expertise

The FinFET architecture takes the traditional two-dimensional transistor design and turns the conductive channel on its side, resulting in a three-dimensional "fin" structure surrounded by a gate that controls the flow of current. A key benefit of FinFET technology is its superior low-power attributes. The 3D transistor design intrinsically operates at a lower voltage with minimal current leakage, which translates into longer battery life for mobile applications or less power consumption for plugged-in applications such as networking chips in datacenters.

"Many people don't realize that FinFETs build upon the same fundamental mobile driving force as today's HKMG technology," said G. Dan Hutcheson, CEO and Chairman of VLSI Research. "While HKMG was a significant innovation in leakage reduction, FinFETs are a great leap forward in this value proposition that clear the way for many years of advancements. But to fully extract the value of FinFET technology, a company needs to be in volume production of HKMG. GLOBALFOUNDRIES has a head start in this area with almost two years of high-volume manufacturing experience with HKMG."

So to sum up, this is a press release - aka a marketing blurb designed to lure in more suckers 😛 - from a company that (1) doesn't even have production 22/20nm transistors yet, let alone working 14nm parts, and (2) thinks that it has "a head start in this area with almost two years of high-volume manufacturing experience with HKMG."

IIRC GF touted their gate-first HKMG as superior to TSMC/Intel's gate-last approach (which those companies had at 40/45nm - long before GF's 32nm), then screwed over AMD with 9+ months of delays on first Llano and then Bulldozer, with ramps and performance substandard (and likely below AMD's targets), and is now switching to TSMC/Intel-style gate-last HKMG at 22/20nm.

And using "maturity" in the same sentence as "well on its way to production"?? GMAB. Sorry but this is only "news" on Planet Noo-Noo. Maybe.
 


:lol:

Dunno, the 4320 could be a good chip at $130

I'm interested in the pricing. AMD had better bring some real improvement, or they will be starved for 8xxx sales with the 2500K/3570K selling for less. :/
 


Agreed, there is no way an 8-core Piledriver will be picked (by the average consumer) over an Ivy Bridge i5! I hope AMD isn't going to do what they did last year - Piledriver being only 15% faster while still being a 125W chip! There is no way it's worth 30%+ more than Bulldozer; like you said, it's not worth more than $199.99.
 

Very odd that the 8350 is only $11 more than the 8320. I think this is the same site that preordered BD at +$50 and got them after Newegg and Micro Center.

They are just trying to make extra money off idiots who buy from them before AMD even releases the chip, much less announces the actual price.
 



I sure hope so, because for a company that says it's not competing in the high end anymore, they sure expect a high-end price if this is true.
 


I reached 4.8GHz @ 1.425V and ran stable through the night. I am sure the lost 800 million transistors would have given it better stability at insane clock speeds.

As I said elsewhere, if you have a higher-end Thuban you don't really need to move on, but overall the FX is a better chip despite the process problems. Roll on Piledriver - with enough changes it seems a little bit closer to what Bulldozer was intended to be. I am most interested in the changes made to the memory interface and the front end, or should I say the modifications made.
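
Quick back-of-the-envelope on what 4.8GHz at 1.425V means for power, using the usual dynamic-power rule of thumb (P roughly proportional to V² × f). The stock 3.6GHz / 1.30V figures below are my assumed FX-8150 defaults, not numbers from the post, so treat the result as a ballpark only:

# Rough dynamic-power scaling estimate for an overclock, using P ~ C * V^2 * f.
# Stock clock/voltage below are assumed FX-8150 defaults, not measured values.

stock_clock_ghz = 3.6      # assumed base clock
stock_vcore = 1.30         # assumed stock voltage
oc_clock_ghz = 4.8         # from the post above
oc_vcore = 1.425           # from the post above

# Dynamic power scales roughly with voltage squared times frequency.
scale = (oc_vcore / stock_vcore) ** 2 * (oc_clock_ghz / stock_clock_ghz)

stock_tdp_w = 125          # FX-8150 rated TDP, used here as a stand-in for stock power
print(f"Power scale factor: {scale:.2f}x")
print(f"Ballpark package power at 4.8GHz/1.425V: ~{stock_tdp_w * scale:.0f} W")

That works out to roughly 1.6x stock, which is why a V8 or better is pretty much mandatory up there.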


 

there could be plenty of reasons -
the game's not optimized yet i.e. issues with the coding,
intel is ebil,
nvidia is ebil,
console ports favor intel,
techreport, borderlands 2, borderlands 2's publisher favor intel,
intel had a hand in designing borderlands 2,
the game was compiled using the intel compiler which cripples zambezi cpus,
the fps differences among cpus are not at all noticeable,
real gamers and enthusiasts (TM) don't care about a few more/less fps,
windows 8 will run the game properly on fx cpus,
fx8150 actually runs borderlands 2 better than core i7 990x but amd does not want you to know that. they don't care.
lastly, no one will notice the difference when you run borderlands 2 on similarly configured (e.g. with gtx 670 and such) amd and intel pcs. in a blind test sponsored by amd, people will pick the amd pc for its smooth borderlands 2 performance.
:pt1cable:
 


Wow it seems clear now haha 😀

Yeah, I think Intel is cheaper in the US, or maybe AMD is just cheaper in the UK, but an FX-6200 is £100 while an i5 2500K is £150. Also, I don't game too hard, but I do video editing etc., which I believe makes use of more cores?

Judging from the Piledriver specifications that are going around, I don't think I'll wait. The 6300 has a slower clock but lower energy consumption, which doesn't matter to me, so I won't be getting that. The FX-8xxx look good, but at that price I would probably go for Intel anyway :)
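
On the video editing point: yes, most encoders and filter chains split the clip into chunks and hand them to a pool of worker processes, which is exactly where an FX-8xxx's extra cores earn their keep. A minimal sketch of the idea in Python - chunk_filter and the chunk count are made up for illustration, not any real editor's API:

from multiprocessing import Pool
import os

def chunk_filter(chunk_id):
    # Stand-in for real per-chunk work (decode, filter, encode); here it just burns CPU.
    total = 0
    for i in range(2_000_000):
        total += i * i
    return chunk_id, total

if __name__ == "__main__":
    chunks = list(range(32))          # pretend the clip was split into 32 chunks
    workers = os.cpu_count() or 4     # one worker per logical core
    with Pool(processes=workers) as pool:
        results = pool.map(chunk_filter, chunks)
    print(f"Processed {len(results)} chunks on {workers} workers")

The more logical cores the pool can spread those chunks over, the shorter the wall-clock time - which is the one workload where the 8-core FX parts genuinely pull ahead of a quad.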
 

they pretty much answered it in the article.

Initially, we couldn't get Borderlands 2 to run with the AMD's FX processors as the game would cause our test system to throw a BSOD on loading. Turns out, the Asrock motherboard we were using -- as well as most AMD 900-series motherboards for that matter -- need a BIOS update to correct this issue.

obviously something going on there.

they even ended with this

The FX-8150 didn't respond quite as well to being overclocked, delivering just 51fps at 1920x1200. Based on what we have seen from other games, we believe there is something wrong here and hope that a patch is released shortly to help improve the performance of the AMD FX series parts.

Kinda reminds me of all those reviews where the FX chips wouldn't run Shogun 2: TW. Runs fine now.

That's one of the inherent problems with being the underdog CPU maker: your hardware gets tested afterwards most of the time, so your optimizations come when they get to it.
 


Here's my specs:

FX-8150 clocked at 4.0GHz, cooled with a Cooler Master V8
Asus DirectCU II 6950 2GB
16GB Kingston HyperX 1600MHz 9-9-9-24
1000W Seasonic Platinum PSU
500GB Seagate 7200RPM HDD, 32MB cache

I want to see this for myself. What program do you want me to bench it with? I've got Fraps and Dxtory, but I'll use anything - I don't believe these scores.
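
If you do bench it yourself, the frametime log is more telling than the FPS counter. Here's a rough Python sketch for crunching a Fraps-style frametimes CSV - I'm assuming a header row and a cumulative elapsed-milliseconds value as the second field, so adjust the column index to whatever your log actually contains ("frametimes.csv" is just a placeholder name):

import csv

def summarize_frametimes(path):
    # Read cumulative elapsed-time-in-ms values, one per frame (assumed second column).
    elapsed = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                              # skip the header row
        for row in reader:
            elapsed.append(float(row[1]))
    # Convert cumulative timestamps into per-frame times.
    frame_ms = sorted(b - a for a, b in zip(elapsed, elapsed[1:]))
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
    p99_ms = frame_ms[min(len(frame_ms) - 1, int(len(frame_ms) * 0.99))]
    return avg_fps, p99_ms

avg_fps, p99_ms = summarize_frametimes("frametimes.csv")  # placeholder file name
print(f"Average FPS: {avg_fps:.1f}, 99th percentile frame time: {p99_ms:.1f} ms")

Average FPS plus the worst 1% of frame times says a lot more about how the FX actually feels in-game than a single bar on a chart.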

 


It wasn't supposed to be 2 billion transistors. The module design was to save on transistors without adding full separate cores. Some marketing guy just got some bad numbers and waited a year to correct it.
 



They have 8 cores, they're just bottlenecked until Steamroller fixes them.
 

Defeats the purpose of what AMD is trying to do.
There are some people who say that disabling the second core in each module will boost gaming performance: disabling 2 cores in Thubans, or the second core per module in FX CPUs, would supposedly improve performance - close to the Core i5 and beating the pesky Core i3(!). If AMD publicly allowed this, they'd be admitting that their current architecture fails at multithreading across cores. They could get motherboard manufacturers to add a BIOS option for it, but then word would get out and the same thing would happen. AMD can't afford to lose further cred after hyping their modular architecture and their approach to multithreading and multicore.
They're betting on a future where the OS scheduler will efficiently allocate resources among cores, while improving their own architecture. Meanwhile users get these intermediate products, which by themselves are capable of just enough to get the job done, but nothing revolutionary. I think it will carry on till 2014.
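
For what it's worth, nobody needs a BIOS switch to test the one-core-per-module idea - you can pin a game from the OS side. A quick Python sketch using psutil, assuming the usual FX layout where logical CPUs 0/1, 2/3, 4/5 and 6/7 pair up into modules; "game.exe" is just a placeholder process name:

import psutil

TARGET = "game.exe"                  # placeholder name of the process to pin
ONE_CORE_PER_MODULE = [0, 2, 4, 6]   # assumes cores 0/1, 2/3, 4/5, 6/7 share modules

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET:
        proc.cpu_affinity(ONE_CORE_PER_MODULE)   # restrict it to one core per module
        print(f"Pinned PID {proc.pid} to CPUs {ONE_CORE_PER_MODULE}")

Windows' built-in start /affinity switch does the same job at launch time without any scripting.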
 