AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red and green teams and they will be deleted.

Enjoy ...
 
Hmm, OK so perhaps the GPU is merely 2X what the 6620G can do 😀.. I think I know which laptop would come out the winner in the gas-powered leaf blower contest 😗

What you fail to acknowledge is that AMD's IGP works in tandem with a discrete card while Intel's IGP doesn't.
You compare discrete to IGP, which is meaningless.

 
*Blink* *Blink* .... why are we comparing an IGP to a dGPU again? It's pretty clear a dGPU will (or should) always win; it has its own dedicated memory, for one (lower latency): DDR3-1800 (900x2) vs the DDR3-1333 you'll find in most APU-based notebooks (due to OEMs). The 540M has a 1344 MHz core clock while the 6620G is using a 400~444 MHz clock. And finally, the 540M has its own HSF cooling option while the 6620G must share with whatever APU it's glued to. That last point is the reason it's clock-locked at 444 MHz while the bigger desktop APUs have higher GPU clock rates.

Also, more AnandTech benches with no mention of what memory or CPU the 6620G was connected to, nor the conditions of the test. Just a number attached to a GPU model.
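To put rough numbers on that memory gap, here's a back-of-the-envelope sketch (theoretical peak per channel only; real-world bandwidth is lower, and latency isn't captured at all):

```python
# Peak theoretical bandwidth of one DDR memory channel.
# DDR3-1800 means 1800 MT/s (a 900 MHz I/O clock, two transfers per cycle),
# and a standard channel is 64 bits (8 bytes) wide.
def ddr_peak_gbs(mega_transfers_per_sec, bus_width_bits=64):
    """Peak GB/s for one channel: MT/s * bytes per transfer / 1000."""
    return mega_transfers_per_sec * (bus_width_bits // 8) / 1000

print(ddr_peak_gbs(1800))  # 14.4 GB/s  (DDR3-1800, one 64-bit channel)
print(ddr_peak_gbs(1333))  # 10.664 GB/s (DDR3-1333, one 64-bit channel)
```

So on paper the dGPU's memory has roughly a third more bandwidth per channel, before you even count the latency advantage of dedicated memory.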

Trust me on this one... the GT 540M is just a faster-clocked GT 425M, which my notebook has, and it can pull 1080p gaming like the 6620G from the A8, but slightly better (maybe thanks to the i3 in it). Maybe the GT 540M and the i5 variants (3.0 GHz) pull ahead notebook-wise overall, but it's still a tad more expensive than the A8.

What you fail to acknowledge is that AMD's IGP works in tandem with a discrete card while Intel's IGP doesn't.
You compare discrete to IGP, which is meaningless.

Credit where credit's due: it's just for DX10/11 titles. In DX9 it does squat for performance =/

That's why I said a few posts back that I don't know if Lucid might have something for notebooks as well; that hybrid solution for running dual VGAs from different vendors. Intel is such a klutz in this regard... They could perfectly well license something from nVidia (I'm sure money is not an issue, lol), slam it into some i5 variant and do the same thing as AMD for dual VGAs.

Cheers!
 
*Blink* *Blink* .... why are we comparing an IGP to a dGPU again? It's pretty clear a dGPU will (or should) always win; it has its own dedicated memory, for one (lower latency): DDR3-1800 (900x2) vs the DDR3-1333 you'll find in most APU-based notebooks (due to OEMs). The 540M has a 1344 MHz core clock while the 6620G is using a 400~444 MHz clock. And finally, the 540M has its own HSF cooling option while the 6620G must share with whatever APU it's glued to. That last point is the reason it's clock-locked at 444 MHz while the bigger desktop APUs have higher GPU clock rates.

Also, more AnandTech benches with no mention of what memory or CPU the 6620G was connected to, nor the conditions of the test. Just a number attached to a GPU model.

It's because Triny said that an AMD laptop at half the price could easily beat an Intel laptop. He did say above $1K, but still, showing a $900 laptop with a quad core, 4GB of RAM and a 540M is still impressive, as nothing from AMD at $500 can beat it.
 
It's because Triny said that an AMD laptop at half the price could easily beat an Intel laptop. He did say above $1K, but still, showing a $900 laptop with a quad core, 4GB of RAM and a 540M is still impressive, as nothing from AMD at $500 can beat it.

It doesn't have to beat it to sell well and be worth the money spent. Not everyone wants to cough up that much, let alone have to worry about a discrete GPU failing later on. I still remember all the G84 and G86 GPUs that made their way into many laptops only to fail.
 
It's because Triny said that an AMD laptop at half the price could easily beat an Intel laptop. He did say above $1K, but still, showing a $900 laptop with a quad core, 4GB of RAM and a 540M is still impressive, as nothing from AMD at $500 can beat it.

I said this before and I'll say it again: most laptops at Best Buy can't game until you get into the $1,000+ market with Intel, and since Best Buy is pro Apple+Intel, they don't sell any APU laptops in their stores, at least around Michigan where I live. At Walmart you can find an A8 laptop for $550 that can play almost anything at medium settings at 720p and output 1080p Blu-ray. 95% of all users don't buy online; most people buy at Walmart/Best Buy. So to me Intel is unreasonable under $1,000 at stores such as these, but then again AMD is not worth it past $650-$800. And I'm stretching that estimate a bit.
 
It doesn't have to beat it to sell well and be worth the money spent. Not everyone wants to cough up that much, let alone have to worry about a discrete GPU failing later on. I still remember all the G84 and G86 GPUs that made their way into many laptops only to fail.

I know nVidia has had past mess-ups; trust me, I get the HP DV9000 and other series with the failed chipset all the time.

My point is that he was trying to state Intel was more expensive and AMD was not. While you can game on the APUs, you cannot game nearly as well as you can on a discrete GPU, even in a laptop.

Add to that it's a quad core for $900; it's a nice deal.

I said this before and i'll say it again most Laptops at best buy can't game until you get into the 1000$+ market with Intel and since best buy is pro Apple+Intel they don't sell any APU laptops at their store or at least around Michigan where i live, At Walmart you can find a A8 laptop for 550$ that can play almost anything at med settings at 720P and output 1080P Blu-ray with Amd. 95% of all users don't buy online Most people buy at Walmart/Best buy. So to me Intel is unreasonable under 1000$ at stores such as these But then again Amd is not worth it past 650$-800$. And i'm stretching that estimate a bit.

I understand most people. And most people actually hate BB. One of my co-workers was head tech there and hated it. A lot of customers tell me they hate BB.

We try to sell best performance and value, which is why we mainly sell Asus laptops.

Still, for $900 it's not a bad deal. For most people though, the Intel HD 3000 IGP will even make the cut, as most people don't game on these laptops; rather, they buy Alienware or Asus ROG laptops for that.

AMD price cuts (FX-8120 & FX-6100):

http://techreport.com/discussions.x/22546

Probably to try to sell more before IB hits. Not a bad move on AMD's part. I am willing to bet the HD 7000 series will see a price drop before and after Kepler's release, to try to stem sales.
 
Trust me on this one... the GT 540M is just a faster-clocked GT 425M, which my notebook has, and it can pull 1080p gaming like the 6620G from the A8, but slightly better (maybe thanks to the i3 in it). Maybe the GT 540M and the i5 variants (3.0 GHz) pull ahead notebook-wise overall, but it's still a tad more expensive than the A8.

Credit where credit's due: it's just for DX10/11 titles. In DX9 it does squat for performance =/

That's why I said a few posts back that I don't know if Lucid might have something for notebooks as well; that hybrid solution for running dual VGAs from different vendors. Intel is such a klutz in this regard... They could perfectly well license something from nVidia (I'm sure money is not an issue, lol), slam it into some i5 variant and do the same thing as AMD for dual VGAs.

Cheers!
The point I was making is that AMD's IGP works in conjunction with a discrete video card, for today's games and going forward; it's a big point.
Trinity, Kaveri, etc. will increase this advantage going forward, until Intel can somehow do the same or Nvidia can make discrete cards that equal AMD's IGP + discrete card, a tall order.
It's a very cost-effective method for the customer that AMD created, and money alone will not solve it; time invested and money may.
Granted, Llano's IGP is restricted to only a few video cards it will dual with, but going forward, as the IGP gets stronger, the discrete cards it works with will be more robust. Compounding the problem, HSA will increase this advantage.
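For what it's worth, the "only a few cards it will dual with" restriction makes sense if you model Dual Graphics as alternate-frame rendering. A toy sketch, with made-up frame rates (not benchmarks):

```python
# Toy model of asymmetric dual-GPU scaling (AMD Dual Graphics style).
# Assumption: with alternate-frame rendering (AFR) the GPUs take turns,
# so the pair is paced by the slower one: combined ~ 2 * min(igp, dgpu).
def dual_graphics_fps(fps_igp, fps_dgpu):
    return 2 * min(fps_igp, fps_dgpu)

print(dual_graphics_fps(30, 40))  # 60: a well-matched pair nearly doubles
print(dual_graphics_fps(30, 90))  # 60: a far faster dGPU is held back by the IGP
```

Which is exactly why pairing only pays off with closely matched cards, and why a stronger IGP widens the list of discrete cards worth pairing with.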
 
My point is that he was trying to state Intel was more expensive and AMD was not. While you can game on the APUs, you cannot game nearly as well as you can on a discrete GPU, even in a laptop.

Add to that it's a quad core for $900; it's a nice deal.
Actually, all I said was that my friend's AMD HP was visually better than my Intel one; his was half the cost, and I paid north of $1,000.
I never said anything about discrete against IGP.
 
Probably to try to sell more before IB hits. Not a bad move on AMD's part. I am willing to bet the HD 7000 series will see a price drop before and after Kepler's release, to try to stem sales.

With that same logic, I suppose SB is reduced to sell more before Trinity arrives.
 
I hope PD has the same or more IPC than the Phenom II; if they can't beat that, something is wrong!
AMD probably stopped engineering BD around Q2-Q3 2010 and saw samples around Q4 2010, which means they had all this time to figure out the issues with BD, and I hope they did the right thing, put a team on it, and patched things up to create PD.
 
jimmysmitty wrote:

My point is that he was trying to state Intel was more expensive and AMD was not. While you can game on the APUs, you cannot game nearly as well as you can on a discrete GPU, even in a laptop.

Add to that it's a quad core for $900; it's a nice deal.

Except ... Intel is more expensive. Intel practically requires a dGPU to game, while the APU can do it just fine. All other components being equal, the AMD will win at the lower price points.

As many other posters have alluded to, anywhere you go to buy a notebook you get $1K+ Intels and $550 AMDs. You can scour the internet looking for some deal on a specific model in an attempt to prove otherwise. I just link to HP's website, where you can customize whatever model you want and adjust it to your own price range. They even sell "Intel" laptops; the DV6t series is a mostly direct comparison to the DV6q series. Now, I'm normally not a fan of HP, but this time they really made a quality product. A6/A8 @ 1366x768 for $550 USD; you can't really beat that. A6/A8 + 1920x1080 + 7690M + Blu-ray for ~$800; again, you can't really beat that offering. It's when you hit $1K+ that the Intel offerings start to be the better deal, especially if you're going into 17-inch territory.
 
I know I keep referencing the HP DV6; this is because I own one and can give first-hand experience with it. Samsung also makes a bunch of A6/A8 laptops; I see them at all the office/computer stores here in South Korea. I do not know if they're offered in other countries, nor what price points they would use there. Here they're kinda pricey, but they're also cheaper than their Intel counterparts (all electronics are kinda expensive here). Also, I think IBM and Sony might offer some, but I don't know their model numbers. I just like the HP because you can configure and customize it.
 
Piledriver may get DDR4 memory now that Samsung has produced the first DDR4 sticks.

No, it won't. First off, AMD hasn't had the time to make a DDR4 memory controller. Secondly, it would greatly increase the cost of their platform, costing them potential sales. Thirdly, there aren't consumer DDR4 chips for sale yet, making the use of DDR4 moot at this point.
 
And where did you read that from?

He's most prolly thinking: "OK, AMD needs faster memory, and since DDR4 chips are being produced right now (eng samples, prolly), they'll most likely use it with PD."

Like gamerk said, it is very unlikely, since AMD hasn't announced anything about a new IMC for PD using DDR4; even less have they said anything regarding a new socket. Not that they might not be on it...

Instead of Yoda, you need the Sword of Omens, mal 😛

Cheers!
 
What you fail to acknowledge is that AMD's IGP works in tandem with a discrete card while Intel's IGP doesn't.
You compare discrete to IGP, which is meaningless.

A: You typically don't see many people who have the IGP enabled go out and buy a discrete GPU, as it defeats the purpose of the IGP in the first place.

B: Discrete and IGP are operationally exactly the same, except for performance. Comparing performance between them is quite valid, especially at the mid/low range of the spectrum.
 
I hope PD has the same or more IPC than the Phenom II; if they can't beat that, something is wrong!
AMD probably stopped engineering BD around Q2-Q3 2010 and saw samples around Q4 2010, which means they had all this time to figure out the issues with BD, and I hope they did the right thing, put a team on it, and patched things up to create PD.

And... what? Are you seriously telling me AMD has had time to make a fundamental redesign of the BD architecture in just over, what, twelve months; get the first test chips fabricated; test them for performance; then put them into mass production?

The fact PD is coming out so fast tells me AMD was very well aware of BD's shortcomings, and since PD looks to basically be a speed increase [covering up its IPC issues], I'm not expecting any significant architectural changes, and thus about the same IPC. While faster by nature of clock speed increases, its IPC will still stink, which means it will continue to trail SB/Nehalem in anything that does not scale beyond 4 cores [95% of all SW].

I'll say it again: until someone finds a way to keep CPUs running cool above the 4.2 GHz mark or so, CPUs will be clock-limited due to thermal constraints. Thus, AMD's clock speed approach is doomed, with no headroom to actually increase clocks.
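The IPC-vs-clock argument above is easy to see with a toy throughput model (performance ≈ IPC × clock). The numbers below are made up purely for illustration, not measured figures for any real chip:

```python
# Single-thread throughput scales roughly as IPC * clock (GHz).
def perf(ipc, clock_ghz):
    return ipc * clock_ghz

bd = perf(1.0, 3.6)  # hypothetical low-IPC, high-clock design
sb = perf(1.4, 3.4)  # hypothetical higher-IPC design at a lower clock
pd = perf(1.0, 4.0)  # same low IPC with a ~11% clock bump
print(bd, sb, pd)    # the clock bump narrows the gap but doesn't close it
```

And if clocks are thermally capped around the same ceiling for everyone, the low-IPC design has nowhere left to go, which is the point being made.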
 
won't piledriver+ddr4 require a complete platform change for amd, which will result in increased cpu/chipset prices, reduced backwards compatibility, a new memory controller etc.?
if trinity supports ddr3, pd is unlikely to support ddr4, which i don't think will debut before late 2013-14. hell, pd(-compatible motherboard chipsets) might even skip pcie 3.0 support. ddr4 will cost more than ddr3 at launch.
edit: if samsung makes both the cpus and the ram for amd, it could be different. still, it seems unlikely with pd. their newest platform (am3+) isn't a year old yet.
 
won't piledriver+ddr4 require a complete platform change for amd, which will result in increased cpu/chipset prices, reduced backwards compatibility, a new memory controller etc.?
if trinity supports ddr3, pd is unlikely to support ddr4, which i don't think will debut before late 2013-14. hell, pd(-compatible motherboard chipsets) might even skip pcie 3.0 support. ddr4 will cost more than ddr3 at launch.
edit: if samsung makes both the cpus and the ram for amd, it could be different. still, it seems unlikely with pd. their newest platform (am3+) isn't a year old yet.
Considering how long it took AMD to move to DDR3 after Intel started using it, thinking PD will be using DDR4 is absolutely ludicrous.
 
Considering how long it took AMD to move to DDR3 after Intel started using it, thinking PD will be using DDR4 is absolutely ludicrous.

And AMD still hasn't used RDRAM! Crazy, huh? :pt1cable:

Come on, until AMD itself says so, it's not impossible for them to be working on DDR4 for PD, it's just very improbable. We all agree that a new socket (maybe new chipset) would not hurt them at all, but I'll just feel a lil' butt raped, lol. Plus, the APUs will be very happy to get DDR4 once they get the IMC working.

Cheers!
 
And... what? Are you seriously telling me AMD has had time to make a fundamental redesign of the BD architecture in just over, what, twelve months; get the first test chips fabricated; test them for performance; then put them into mass production?

And why exactly, in your expert, professional and informed opinion, would AMD need a *cough* "fundamental redesign of the BD architecture"?

They need to fix cache latencies and branching; this has been gone over already.

Or is this more of "SIMD isn't good for parallel and the integer pipelines are choking the SIMD units" talk?

AMD's engineers knew early last year there were problems with BD's performance. Engineers are highly paid professionals; they're not university students throwing darts at a wall to determine the source of a problem. I believe it's safe to say they've identified the source (or set thereof) and have been working to manually redesign those parts of the die that need it. It is not implausible that those same engineers would have already fixed some, if not all, of those problems and implemented the fixes in the next revision of the die. When we can get our grubby hands on a die for testing, then we'll know; until then I'll wait and withhold my judgement on the matter.

I realize the urge to bash and hate on anything with the "AMD" logo is strong, but a little rational and objective thought should be used. It's not easy designing a microprocessor; it's even harder to do so on a limited budget. So show a little respect for those engineers.
 