AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic; anything else will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
It looks interesting, but I still think reviews will be the best measure; a closed-door demo means they control everything you see, and there could be many other factors in play, such as the rest of the system's specs.

The claimed 29% better CPU performance is one thing I question, but note that on the CPU side it is being compared to an Athlon II, not a Phenom II.

After the Barcelona hype and the Bulldozer hype, I think most intelligent people will wait for reviews 😛..

I'll have to find the link, but a preview in late December pegged Trinity's CPU performance as averaging around 7-15% over BD's, which, if true, means AMD has not solved a lot of the architectural problems yet.

What are the latest rumors about Trinity's release date? I've seen anywhere from Q2 to Q4..
 
Don't mind at all.


On a maximum stable or reliable overclock? I can reach 5.1 GHz, but the temps and voltage required are more than I like: 1.46 V and 56C after an hour of Prime95. Even though that's still lower than some of Tom's testing, IMO it's not worth 400 MHz, but it is doable.


A stable OC, to me that's all that matters. What cooler are you using? I'm only running the 212+.
 
AMD's stock price is up nearly 50% in the past three months. That can't all be from graphics (the 7000 series); does anyone think they have some Trinity samples showing some nice numbers?

Also, completely off topic, did anyone see the GTX 680 article? I'm calling lies on that. No way is Nvidia undercutting AMD's power use that much.
 
Four! The Wide IO revision 1 spec is for a 1-to-4-die stack. Again, this is just one of many types of 3D memory technology; IBM, Micron, Intel, Hynix and others are working on their own versions.

Please share with us what you're smoking, because that sounds a whole lot like "640KB will be enough for anyone".

Funny that Palladin should quote one of Bill Gates' more infamous statements, while making his own prediction that stacked memory will never ever ever happen 😛..

Blanket statements that something is impossible or 'will never happen' in the big-R&D-budget, hugely competitive CPU design arena demonstrate both ignorance of current trends and a massive failure to learn from history 😀..
 
After the Barcelona hype and the Bulldozer hype, I think most intelligent people will wait for reviews 😛..

I'll have to find the link, but a preview in late December pegged Trinity's CPU performance as averaging around 7-15% over BD's, which, if true, means AMD has not solved a lot of the architectural problems yet.

What are the latest rumors about Trinity's release date? I've seen anywhere from Q2 to Q4..


I pray it's at least as good as Phenom per clock (PD would need to be 10-15% better per clock than BD), and is overclockable up to 4.8-5.2 GHz (on a heatsink) with the mesh tech. That would be great for Piledriver. Plus, I hope it has great price/performance. These may sound like crazy asks, but I don't think I'm asking for too much. Heck, I even bet Intel can get 10-15% faster per clock with Ivy, and I think it will probably overclock to 5.0-5.2 GHz with a good heatsink; Sandy does 4.4-4.6 pretty easily, and Ivy is going to have fantastic TDP numbers. But I feel AMD is being too quiet about PD, like they were with BD.
 
Funny that Palladin should quote one of Bill Gates' more infamous statements, while making his own prediction that stacked memory will never ever ever happen 😛..

Blanket statements that something is impossible or 'will never happen' in the big-R&D-budget, hugely competitive CPU design arena demonstrate both ignorance of current trends and a massive failure to learn from history 😀..

Talk about putting words into people's mouths.

I said, and you can go back and quote me, that it will never replace main memory. Caz said it would. This was in reference to desktop CPUs and possibly notebook/laptop CPUs.

I've stated that it will replace memory in the mobile phone / tablet markets.

But please continue to misquote and strawman all day long. We can all see you.

To Caz,
Samsung is already working on 16-layer stacks, going to 20+, and it's one die.

And you didn't know about horizontal stacking until I told you about it. I've been following this for quite some time now; I happen to know Samsung people. Your attitude would have had you releasing all those pictures the very moment I disputed your claim that it would replace desktop memory. Instead you've used your Google-fu to post stuff while pretending you knew about it all along. You can't retcon in real life, you know.

And you still haven't said what would run on this 512MB x86 CPU with Windows 7.

You can't use it on the "low end" because by definition it will cost more than a CPU without the memory and perform worse. Even if you move the stack to the side, you're still sharing TDP with the main CPU. So now you've got a more expensive, slightly slower CPU that can't expand its memory and doesn't have enough memory to run a modern OS or applications.

I can just envision it selling like hotcakes.
 
jdwii was right. Far more 32-bit CPUs are shipped per year than 64-bit ones.

Windows 7 is increasing the 64-bit numbers, but even so, close to half of Win7 clients are running 32-bit.

You're too focused on the desktop and server market.


Umm, this thread is about Trinity / PD / the next AMD CPU. We're discussing everything here in the context of the desktop/laptop world of x86. There hasn't been a 32-bit-only x86 CPU made in years; I think VIA made the last ones, possibly older Atoms.

Nearly every home PC has a 64-bit CPU, Windows 7 x64 is outselling the 32-bit OS. OEM's aren't even offering x64 versions anymore.

Office applications, web browsers, and media encoders are all offered in 64-bit versions. The only software that hasn't made the transition is video games, strangely the segment that could benefit from it the most.

64-bit isn't tomorrow, it's today. It's happening right now.

Like I said, what rock have you been living under?
 
Number 1: not everything needs to be in 64-bit,

Wrong. x86-64 offers several improvements over x86; more registers and a faster memory model are just a few. The NTx64 kernel offers several security features for 64-bit programs running in native space. NTx64 must run 32-bit programs in an emulated environment (WOW64) to preserve compatibility. This requires a mode change from long mode to compatibility mode every time a 32-bit thread is scheduled to run, and a mode change back when the thread has finished its time slice.
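
For anyone who wants to see that compatibility layer from the program's side, here's a minimal sketch (my own illustration, nothing official) of a Windows build asking whether it's running under WOW64, the layer the x64 kernel uses to host 32-bit processes:

/* My own minimal sketch: ask Windows whether this process is a 32-bit
 * one being hosted by the x64 kernel's WOW64 compatibility layer.
 * Build it as 32-bit and as 64-bit and compare the output. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    BOOL wow64 = FALSE;

    /* IsWow64Process reports TRUE only for a 32-bit process on a 64-bit kernel. */
    if (!IsWow64Process(GetCurrentProcess(), &wow64)) {
        printf("IsWow64Process failed: %lu\n", GetLastError());
        return 1;
    }

    if (sizeof(void *) == 8)
        printf("Native 64-bit process, no mode switching needed.\n");
    else if (wow64)
        printf("32-bit process under WOW64 (compatibility mode on an x64 kernel).\n");
    else
        printf("32-bit process on a 32-bit kernel.\n");

    return 0;
}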


And number 2: it takes too long to make a 32-bit program into 64-bit, and the end results are barely worth it half the time,

Very wrong, horribly, incredibly wrong. So wrong I now KNOW you have no clue about ISAs, specifically the x86 one.

You just recompile the app with a different compiler target. 90% of applications will cross-compile just fine. It only gets dodgy if the programmer was lazy and improperly used the long data type. That AMD document I linked demonstrates exactly what to look for when recompiling, namely poor usage of the long d-type, and it also lists ways a programmer can get more performance by tweaking certain data sets and loops for newer x64 instructions. One of my hobbies is getting x86 programs running on my SunBlade 2000 at the house. It's got two UltraSPARC III CPUs @ 1GHz inside it, and I'm constantly compiling code from common Linux programs. I've even got DOSBox 0.74 working (yes, an x86 DOS emulator working on a SPARC CPU) and even got Windows 3 to start. I've compiled Firefox, OpenOffice, SDL, and a ridiculous amount of common software for SPARC. NONE of it was written for SPARC, which is a 64-bit uArch, yet it magically compiled.
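
If anyone wants to see what that lazy-long problem actually looks like, here's a tiny sketch of my own (not an example from the AMD document) showing why code that assumes long is the same size as a pointer breaks when the compiler target changes:

/* My own toy example of the "lazy long" problem, not from the AMD doc.
 * The data model differs between targets:
 *   ILP32 (32-bit x86):            int=4, long=4, pointer=4
 *   LP64  (64-bit Linux/Solaris):  int=4, long=8, pointer=8
 *   LLP64 (64-bit Windows):        int=4, long=4, pointer=8  <- long stays 32-bit
 */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    printf("sizeof(int)=%zu  sizeof(long)=%zu  sizeof(void*)=%zu\n",
           sizeof(int), sizeof(long), sizeof(void *));

    int x = 42;

    /* Bad habit: long addr = (long)&x;
     * That compiles everywhere but silently truncates pointers on LLP64. */

    /* Portable fix: use the integer type defined to hold a pointer. */
    uintptr_t addr = (uintptr_t)&x;
    printf("address of x: 0x%llx\n", (unsigned long long)addr);
    return 0;
}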


And this is the major reason why: companies are cheap and some programmers are lazy.

Programmers are incredibly lazy; too many uses of "undocumented features" and too much corner cutting, thinking it won't hurt anything in the future. The good thing is that gcc is fairly smart and fixes most of the laziness when you do a compile.

Not to mention 32-bit can support 4GB of RAM, and usually only 3.25GB on Windows

Wrong. 32-bit on x86 gives you 2GB of memory, not 4GB. You need to rebuild with LAA support to use up to 4GB, and you lose the ability to access kernel memory when you do that. On a 64-bit OS, LAA will auto-magically work; the OS needs no special notification. On a 32-bit OS you can only get 3GB and are required to boot the OS with the /3GB boot option. *WARNING* doing this will limit your OS's kernel memory to whatever is between 3GB and the cap on your system. If you're at 3.25GB of usable memory, then your kernel must work inside a 256MB window; this is very dangerous and will often result in an unstable OS. If your limit is 3.5GB, then your kernel will run in 512MB of address space, which isn't good, but chances are you won't be crashing constantly. LAA is dangerous on 32-bit Windows, and that's the reason it's ill-advised by MS to use it.
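
If you want to check those limits yourself rather than take anyone's word for it, here's a small sketch of my own (just an illustration) that asks Windows how much user-mode virtual address space the current process actually gets: roughly 2GB for a plain 32-bit build, about 4GB for an LAA-flagged 32-bit build on a 64-bit OS, and terabytes for a native 64-bit build.

/* My own illustration: query the user-mode virtual address space limit. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    MEMORYSTATUSEX ms;
    ms.dwLength = sizeof(ms);

    if (!GlobalMemoryStatusEx(&ms)) {
        printf("GlobalMemoryStatusEx failed: %lu\n", GetLastError());
        return 1;
    }

    printf("Total virtual address space for this process: %llu MB\n",
           (unsigned long long)(ms.ullTotalVirtual / (1024 * 1024)));
    printf("Physical RAM visible to the OS:               %llu MB\n",
           (unsigned long long)(ms.ullTotalPhys / (1024 * 1024)));
    return 0;
}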
 
Actually... people/devs/we have known the advantages of coding for 64 bits for a long, long time, but most consumer-level applications are still stuck with 32-bit libraries, which kills most of the 64-bit momentum gained so far.

It's really expensive (REALLY REALLY expensive) to redo code for systems that are no longer in the dev cycle/scope. It's like building it all over again, and most companies won't take on that cost so easily.

That's why Java and C#, along with most of the interpreted languages (e.g. Perl), are becoming more and more important/popular. They let you focus on the problem at hand and they handle the "backend" of it (I think it's called an additional abstraction layer? Don't remember XD).

Well, that's from my own experience so far (only 3 years' worth) as a dev using mostly Java and C (yes, the original, lol).

If we take into account that moving from 16 bits to 32 bits took like 20 years, well... you can do the math on this one. The first fully 32-bit Windows was 2000/XP with NT 5.0, IIRC. I'm sure Palladin remembers it better, hahaha.

Cheers!
 
Tom's forum ate my post -.-

As an addendum to my above post: I've gone so far as to recompile Quake for UltraSPARC III. My SunBlade has an XVR-1200 PCI-X OpenGL GPU; it's basically two Wildcat IVs working in tandem with about 512MB of memory. I've installed a Sound Blaster Audigy II PCI sound card and used OSS, and it works wonderfully. The system has 8GB of memory (I think it's SDR, but I'll have to check; it was back when Sun made their own *special* memory) and two 10K FC-AL disks installed.

So yes, I actually have a Sparc "gaming" rig that I screw around on, just for the lulz.
 
Actually... people/devs/we have known the advantages of coding for 64 bits for a long, long time, but most consumer-level applications are still stuck with 32-bit libraries, which kills most of the 64-bit momentum gained so far.

It's really expensive (REALLY REALLY expensive) to redo code for systems that are no longer in the dev cycle/scope. It's like building it all over again, and most companies won't take on that cost so easily.

That's why Java and C#, along with most of the interpreted languages (e.g. Perl), are becoming more and more important/popular. They let you focus on the problem at hand and they handle the "backend" of it (I think it's called an additional abstraction layer? Don't remember XD).

Well, that's from my own experience so far (only 3 years' worth) as a dev using mostly Java and C (yes, the original, lol).

If we take into account that moving from 16 bits to 32 bits took like 20 years, well... you can do the math on this one. The first fully 32-bit Windows was 2000/XP with NT 5.0, IIRC. I'm sure Palladin remembers it better, hahaha.

Cheers!

*Shudder*

16-bit reminds me of DOS4GW and needing an extender to access memory over 1MB.

Recompiling a program from 32-bit to 64-bit isn't hard; it won't take full advantage of the 64-bit uArch, but it'll at least run natively in Windows and won't crash if it passes 2GB.

Here is a good link to a method to fix many older programs that don't watch their memory allocation rates.

http://www.techpowerup.com/forums/showthread.php?t=112556

You can patch any Windows executable for LAA so it won't crash to desktop. It won't do anything if the program doesn't go above 2GB of memory usage anyway.
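
For the curious, a patch like that basically just flips one bit in the EXE header. Here's a rough sketch of my own (not the linked tool; back up the file before trying anything like this) that sets the IMAGE_FILE_LARGE_ADDRESS_AWARE bit (0x0020) in the PE file header's Characteristics field. Visual Studio's editbin /LARGEADDRESSAWARE does the same job.

/* My own simplified LAA patcher sketch, not the tool from the link above.
 * It sets IMAGE_FILE_LARGE_ADDRESS_AWARE (0x0020) in the PE file header.
 * Note: signed executables will fail their signature check afterwards. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(int argc, char **argv)
{
    if (argc != 2) { fprintf(stderr, "usage: %s file.exe\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "r+b");
    if (!f) { perror("fopen"); return 1; }

    /* Offset 0x3C of the DOS header holds the file offset of the PE header. */
    uint32_t pe_off = 0;
    fseek(f, 0x3C, SEEK_SET);
    fread(&pe_off, sizeof(pe_off), 1, f);

    /* Sanity check the "PE\0\0" signature. */
    char sig[4] = {0};
    fseek(f, pe_off, SEEK_SET);
    fread(sig, 1, 4, f);
    if (memcmp(sig, "PE\0\0", 4) != 0) {
        fprintf(stderr, "%s doesn't look like a PE file\n", argv[1]);
        fclose(f);
        return 1;
    }

    /* Characteristics sits 22 bytes past the signature:
     * 4-byte signature + 18 bytes into IMAGE_FILE_HEADER. */
    uint16_t characteristics = 0;
    fseek(f, pe_off + 22, SEEK_SET);
    fread(&characteristics, sizeof(characteristics), 1, f);

    characteristics |= 0x0020;  /* IMAGE_FILE_LARGE_ADDRESS_AWARE */

    fseek(f, pe_off + 22, SEEK_SET);
    fwrite(&characteristics, sizeof(characteristics), 1, f);
    fclose(f);

    printf("LAA flag set on %s\n", argv[1]);
    return 0;
}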


My point with 64-bit is that it's happening now, not next year. All the libraries have already been compiled over; I know this because I have tons of 64-bit software installed on my home PC. Windows programs are actually among the easiest to port, as MS's compiler will take care of the improper usage of the long d-type for you. MS already provides 64-bit libraries for all their products and developer material.

Two years ago I would have agreed that we weren't at 64-bit yet, but not now. Flash and Java both have 64-bit libraries and plugins. IE9 and Firefox both have 64-bit versions available. Office 2010 is now 64-bit. OpenOffice (what I use at home) has been 64-bit for a while. x264 and various other media tools are now 64-bit. Heck, Adobe is now 64-bit on their programs. 7-Zip has been 64-bit for ages.

Of course all old programs are 32-bit; nobody's going to go back and recode/recompile a program from a few years ago. But newer programs are coming out as 64-bit. All except games, which would benefit from it the most.
 
Pics or it didn't happen =o

Cheers!

Dude you don't know the half of it.

I was trying to recompile Duke3D from the EDuke repository, and those guys are horrible: they used some inline asm in their code. Inline asm has become the bane of my existence, which is funny because I absolutely loved it many years ago. I loved being able to write a quick and simple function to perform some operation many times faster than C would natively do it. Unfortunately, asm doesn't translate from one ISA to another very well; x86 asm will not compile on a sparcv9 system. I have to go through their code and manually recode it from x86 to SPARC, and that is crazy annoying. Now I'm learning the error of my previous ways.
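
To give a feel for the kind of rewrite involved (this is my own toy example, not EDuke's actual code), here's an endian swap done once as x86-only GCC inline asm and once as plain C that builds on sparcv9 or anything else:

/* My own toy illustration of the portability problem, not EDuke code:
 * the inline-asm version only compiles for x86 with GCC, while the
 * plain C version builds on SPARC (or any other ISA). */
#include <stdio.h>
#include <stdint.h>

#if defined(__GNUC__) && (defined(__i386__) || defined(__x86_64__))
static uint32_t swap32(uint32_t v)
{
    /* x86-only: the bswap instruction via GCC extended inline asm. */
    __asm__("bswap %0" : "+r"(v));
    return v;
}
#else
static uint32_t swap32(uint32_t v)
{
    /* Portable replacement: shifts and masks, any ISA, any compiler. */
    return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
           ((v << 8) & 0x00FF0000u) | (v << 24);
}
#endif

int main(void)
{
    printf("0x%08x -> 0x%08x\n", 0x12345678u, swap32(0x12345678u));
    return 0;
}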
 
Are you torturing yourself with that? xD

And in the end, how did it go with Duke3D? Did you manage to redo all the assembler code? Is it running? 😵

Cheers!

Haven't finished it yet, not even close. Like I said, this is all a side project for me; it allows me to learn more and more about how these systems work. I'll eventually finish it, or I might get some support from the EDuke guys. I can get regular Duke3D to compile, but then it's the same old software 3D engine. I want the high-res 1280x1024x75 OpenGL video engine, complete with the "Hi-Res" graphics pack they released. No kidding, someone actually took the time to redo all the Duke3D textures at higher resolution and recode the engine to support advanced OpenGL features. Quake was easier as it already supported OpenGL. SDL was painful as it expects a Linux-like audio subsystem, while Solaris uses /dev/audio as a symlink into /devices/pseudo with a different core audio subsystem. The biggest reason I put in the SB Audigy was that the onboard EBus ES sound unit didn't work well with OSS and SDL.
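
For reference, SDL 1.2 will let you steer it toward a particular audio backend before init via the SDL_AUDIODRIVER environment variable. Here's a minimal sketch of my own (the "dsp" driver name for the OSS backend is my assumption, so check the driver names your SDL build actually ships with):

/* My own sketch: force SDL 1.2 toward a specific audio backend and report
 * which driver it actually picked. The "dsp" name for the OSS backend
 * is an assumption; verify it against your SDL build. */
#include <stdio.h>
#include <stdlib.h>
#include "SDL.h"

int main(int argc, char **argv)
{
    /* Must be set before SDL_Init(SDL_INIT_AUDIO). */
    setenv("SDL_AUDIODRIVER", "dsp", 1);

    if (SDL_Init(SDL_INIT_AUDIO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    char name[64];
    printf("audio driver in use: %s\n",
           SDL_AudioDriverName(name, sizeof(name)) ? name : "none");

    SDL_Quit();
    return 0;
}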
 
The SunBlade 2000 was made back in 2000~2002 and EOL'd in 2004. It was an expensive-as-hell workstation for 3D CAD and graphics. The XVR-1200 was a dual-slot GPU with two OpenGL 3D chips on it: 128MB of framebuffer memory, 256MB of texture memory and 32MB of display-list memory, so 416MB of total memory, not 512MB.

So it's a fun box to play with, with enough kick to actually do stuff even though it's now 10 years old. I got to build it from parts: got the base unit from some reseller who had no idea what they had on their hands, bought new CPUs from eBay along with the memory, and went to town.
 
I was thinking of the 70 or 80 series to look at first, but twice now it's come back to the 100 series. Is it that much better?

The 100 is much better: double the radiator size, better heat dissipation. Also, you can add another set of fans for push/pull. If you have a Corsair case, it will fit right in the top area; if not, it should fit in most cases that can take a 240mm water-cooling radiator.

Umm, this thread is about Trinity / PD / the next AMD CPU. We're discussing everything here in the context of the desktop/laptop world of x86. There hasn't been a 32-bit-only x86 CPU made in years; I think VIA made the last ones, possibly older Atoms.

Nearly every home PC has a 64-bit CPU, Windows 7 x64 is outselling the 32-bit OS. OEM's aren't even offering x64 versions anymore.

Office applications, web browsers, and media encoders are all offered in 64-bit versions. The only software that hasn't made the transition is video games, strangely the segment that could benefit from it the most.

64-bit isn't tomorrow, it's today. It's happening right now.

Like I said, what rock have you been living under?

I think you meant 32bit.

And that's not quite true:

http://www.newegg.com/Product/Product.aspx?Item=N82E16834200453

http://www.newegg.com/Product/Product.aspx?Item=N82E16834200437

http://www.newegg.com/Product/Product.aspx?Item=N82E16834110482

While it's hard to find, I come across quite a few laptops with 32-bit 7 and 2-4GB of RAM. All of those have 7 32-bit and are second-gen Intel Core i CPUs.

We do get a few people who ask for 32-bit; mostly they are idiots, as they think they will have compatibility issues when they won't, but we still install it when they ask.
 
The 100 is much better: double the radiator size, better heat dissipation. Also, you can add another set of fans for push/pull. If you have a Corsair case, it will fit right in the top area; if not, it should fit in most cases that can take a 240mm water-cooling radiator.



I think you meant 32bit.

And that's not quite true:

http://www.newegg.com/Product/Product.aspx?Item=N82E16834200453

http://www.newegg.com/Product/Product.aspx?Item=N82E16834200437

http://www.newegg.com/Product/Product.aspx?Item=N82E16834110482

While it's hard to find, I come across quite a few laptops with 32-bit 7 and 2-4GB of RAM. All of those have 7 32-bit and are second-gen Intel Core i CPUs.

We do get a few people who ask for 32-bit; mostly they are idiots, as they think they will have compatibility issues when they won't, but we still install it when they ask.

I said OEMs, as in you go to the Dell or HP website and try to order a product. And while they have it as an option on a few models, the default is Windows 7 Home x64 and the 32-bit isn't any cheaper. Basically they're not defaulting to 32-bit, and unless the customer specifies that they want a 32-bit version, they're going to get a 64-bit one. We're power users; we will pick and choose what we want. The majority of the world will just take what they're given.

The whole point is that 32-bit desktops are shrinking, and shrinking fast. As a big-box salesperson you should already know this; most of your sales should now be 64-bit systems just because they come that way from the OEM. Compare this to what was on the shelves two years ago.

64-bit computing is now, not tomorrow.
 
I'm willing to bet you're one of the few people on earth that would know what to do with any kind of Solaris/SUN hardware as a home system, lol.

Hell, when I got into University we had the SPARCstation 10 and 20 rooms. That was my first time messing around with Solaris and I didn't like it TBH. Damn those things were cool but hard to manage. Then we got the Blades... Damn those things were fast and cool.

Cheers!
 
I said OEMs, as in you go to the Dell or HP website and try to order a product. And while they have it as an option on a few models, the default is Windows 7 Home x64 and the 32-bit isn't any cheaper. Basically they're not defaulting to 32-bit, and unless the customer specifies that they want a 32-bit version, they're going to get a 64-bit one. We're power users; we will pick and choose what we want. The majority of the world will just take what they're given.

The whole point is that 32-bit desktops are shrinking, and shrinking fast. As a big-box salesperson you should already know this; most of your sales should now be 64-bit systems just because they come that way from the OEM. Compare this to what was on the shelves two years ago.

64-bit computing is now, not tomorrow.

I never disagreed with you that 64-bit is the current standard, just said that it is still possible to get 32-bit. I think anyone who goes 32-bit is full of it, just like those who say 7 is a bad OS.

Had a guy at the shop the other day say 7 was horrible and that the company he works for won't upgrade to it, even though MS is stopping support for XP.

I think he was full of it, since he said it was a multi-billion-dollar company, but who knows.
 
I'm willing to bet you're one of the few people on earth that would know what to do with any kind of Solaris/SUN hardware as a home system, lol.

Hell, when I got into University we had the SPARCstation 10 and 20 rooms. That was my first time messing around with Solaris and I didn't like it TBH. Damn those things were cool but hard to manage. Then we got the Blades... Damn those things were fast and cool.

Cheers!

I take it you're talking about SunBlade 100s or 150s? Interesting devices; Sun tried to do the HP thing. They're UltraSPARC IIs but with IDE drives, and if you look inside them they look just like a regular flat desktop PC. Their one big flaw was that DMA didn't work with their IDE drives, so intense disk access would slow the system down to a crawl. We put SCSI HBAs in ours with SCSI disks to solve that problem. Of course we later decommissioned all of those in favor of NT clients, a smart decision.

Solaris is archaic as hell. Solaris 8 and prior are really bad; trying to get around the OS to actually get something done is difficult, and you just end up using a terminal for everything. Solaris 10 tried to fix it: they use the JDS (Java Desktop System), and Sun really pushed people to use Java to build all their desktop applications and user interfaces. Everything OS-related is still command-line driven, including mounts. At least their graphical file browser understands SMB network file systems, so you can actually browse to a Windows file share and copy/paste files to and from it. You can also configure the box to use LDAP for user authentication, so you can authenticate against an Active Directory account. At home I log into my Sun box using my home AD and do magic from there.

Like I said, this is a big hobby of mine, used as a learning tool.
 
I never disagreed with you that 64-bit is the current standard, just said that it is still possible to get 32-bit. I think anyone who goes 32-bit is full of it, just like those who say 7 is a bad OS.

Had a guy at the shop the other day say 7 was horrible and that the company he works for won't upgrade to it, even though MS is stopping support for XP.

I think he was full of it, since he said it was a multi-billion-dollar company, but who knows.

Yeah, he's full of it. Companies held off on Vista, but most want to move to Windows 7. It just takes time, lots of time.

Where I'm at, our NT clients are still XP SP3; that's due to our approved/certified/tested/hardened client images being based on XP SP3. Our next full release will be Windows 7 x64, but that's not slated till early to mid next year. Right now our primary client application consumes 1GB+ of memory, and with what they want to add it will consume more. The 4GB client memory limit is kicking us in the nuts. I'm glad I only have to worry about the server back end, where we don't have any of those arbitrary limitations.
 
So a Corsair case is most optimal for the Corsair Hydro series?

Meh, any mid-sized tower case can do that. Just make sure there is a big enough rear fan mount. Actually, try to get measurements from the fan mount to the CPU so you can be sure of clearance.

Are you only trying to cool the CPU? Those "all in one" type packages are usually crap compared to what you can build yourself.

Also, whatever you do, do NOT use premixed coolant/dye; it'll gum up the waterblock eventually. Use pure distilled water with a silver kill coil. If you can't fit a kill coil inside your line/res, then get a silver-lined fitting for the CPU block. It'll ensure that no bacteria or algae develop inside your cooling loop.
 
So a Corsair case is most optimal for the Corsair Hydro series?

In a way, yes. Pretty much all the Corsair case series are fitted to carry the Hydro series with no issues. My 500R will mount it right up top with no real work unless I want a push/pull setup.

But most mid-sized cases will work; it's just that the Corsair cases were made around other Corsair products.

And the Hydros are crap compared to real water cooling, but they also cost half the price. We have some nice high-end water-cooling kits at work, but they start at $230 while the H100 is $119.

But I think I will stick to air: tried and true, and very little maintenance. I wanted to go to the Zalman 12X, but it doesn't seem to outperform the 9900MAX enough, or in some reviews, at all.
 