AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
If more people ran Linux, there would be a market for games on it.

Demand creates supply. No significant demand for games on Linux means no production of a significant number of games on Linux. Basic market economics.

And I honestly can't believe people consider Ubuntu to be a serious OS...

I want to say that the Linux community cannot support the latest hardware and that that is why games aren't made to run on Linux, but I do not know whether that is actually true. At one time, drivers could not be made because the hardware manufacturers would not supply the needed information. Maybe it is the other way around now? I doubt it. I think that, as gamerk316 has said (and I usually disagree with him), the game developers do not see profit in making their games run on Linux. This does not quite make sense to me, because were they to do so, a market would develop. My best guess right now is that perhaps the Linux community needs to reach out somehow. This is why Microsoft has an advantage: it's their job to see that games happen on their OS, and they are paid for it.

Linux is wonderful. I know this from a user perspective and a developer perspective. My conclusion has been to run both Windows and Linux, because there are good programs for each. It is worth saying, I think, that people who run UNIX and Linux understand what a computer is for: lots and lots of things. Computers are made and sold with Windows and used as appliances, and this has led to a misconception of what a computer is. How people can think so shallowly is beyond me, but it is clear that this has occurred. Some have even said that the computer as we know it will be replaced. I see that happening only via something that offers more power than what we know now as the desktop computer/workstation, certainly not via appliances and "The Cloud." I know Linux users will never give up the power they have.

Why does this have a place in this discussion? OK, if I have to explain: Bulldozer was made for a computer, not an appliance. That's why I like the design. Too bad it didn't turn out so well; maybe Piledriver will. Most disturbing is AMD's attitude of "OK, we give up."

BTW, does DirectX work on Linux? That's what needs to happen. The reasons it doesn't are the reasons we don't game on Linux. If I'm wrong, please tell me. It's been a few years since I looked at this.

Finally, good software engineering makes porting to other architectures possible. I'm not saying good software engineering is easy; I'm saying it's the right thing to do and the responsible thing to do. Just because you can quickly write code that works doesn't mean it's the right way.
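
To make that concrete, here's a minimal sketch of my own (not from any real codebase) of the kind of thin platform layer that makes porting practical: isolate the OS-specific part behind one small function and keep the rest of the program portable.

[code]
/* portable_sleep.c -- toy example of isolating platform-specific code.
   Only the block above main() changes per platform. */
#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
static void sleep_ms(unsigned ms) { Sleep(ms); }         /* Win32 API  */
#else
#include <unistd.h>
static void sleep_ms(unsigned ms) { usleep(ms * 1000); } /* POSIX API  */
#endif

int main(void)
{
    puts("waiting 500 ms...");
    sleep_ms(500);   /* the portable code only ever calls the abstraction */
    puts("done");
    return 0;
}
[/code]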
 
It didn't perform to the hype and doesn't perform to the price.

The fact that Phenom II beats it in many benchmarks is the primary reason it gets so much hate. Sure, it will get the job done, but so did Phenom II. What's the incentive to upgrade when it's not really an upgrade? Anytime a company releases a product that doesn't beat the performance of its previous products, there will be those who call it a bust.

If you want a general-purpose PC, what's the point in going BD? To support AMD. That's about it.



Agreed, but I think people will be fine with an A8 or a new Trinity; 90% of people don't need an 8-core BD or an i5, and an i3 is enough. And I don't care what people tell me: more of the general public would want better graphics than a better CPU. I will always feel this way! The A8 in my laptop is plenty for me, but an i3 wouldn't come close, because I play games and such. Plus, I feel AMD's graphics look better than Intel's on videos, which is what I do most on this laptop! I like the settings options in the software a lot.
 
Agreed, but I think people will be fine with an A8 or a new Trinity; 90% of people don't need an 8-core BD or an i5, and an i3 is enough. And I don't care what people tell me: more of the general public would want better graphics than a better CPU. I will always feel this way! The A8 in my laptop is plenty for me, but an i3 wouldn't come close, because I play games and such. Plus, I feel AMD's graphics look better than Intel's on videos, which is what I do most on this laptop! I like the settings options in the software a lot.

I'm excited to see how Ivy Bridge vs Trinity plays out. I wonder if IVB's graphics will be "Good Enough" like Llano's are now.
 
I'm excited to see how Ivy Bridge vs Trinity plays out. I wonder if IVB's graphics will be "Good Enough" like Llano's are now.

Read the reviews. I think the new Ivy Bridge graphics will be good enough for many uses. Ivy Bridge is going to be great. My guess is that Trinity's graphics will be better and its CPU component not as good (compared to Ivy Bridge).
 
32nm Phenom III/III+ w/RCM
I cringe every time I think of what should have been if they had just kept BD for the server line and continued a true desktop chip.

Llanos are basically Phenom IIIs. They're referred to as K10.5, haha.

I would like to see AMD take the GPU out, put in an inclusive L3, and use the RCM technology to get it over 4-5 GHz. It's already at 32nm with 1MB of L2 per core. Instead they used the BD uArch, which makes me nervous. BD works well for servers needing a wide processing profile (many independent threads), especially if it's running Linux / Solaris, which don't have psychotic ADD schedulers. Maybe future NT schedulers will try to minimize the constant shifting of threads across different cores.
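
For what it's worth, here's what "keep a thread on one core" looks like from user space on Linux today: a minimal sketch using the real pthread affinity API, pinning the calling thread to core 0 so the scheduler stops bouncing it around (the choice of core 0 is arbitrary, for illustration).

[code]
#define _GNU_SOURCE        /* required for pthread_setaffinity_np */
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);      /* allow this thread to run on core 0 only */

    /* Pin the calling thread; the kernel will no longer migrate it,
       so its L1/L2 cache contents stay warm. */
    int rc = pthread_setaffinity_np(pthread_self(), sizeof set, &set);
    if (rc != 0) {
        fprintf(stderr, "pthread_setaffinity_np failed: %d\n", rc);
        return 1;
    }
    printf("pinned to core 0\n");
    return 0;
}
[/code]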

My prediction is that AMD will eventually drop the core BD uArch in favor of a hybrid Stars / BD / Fusion uArch. Something most people don't know is that GCN and the latest SIMD ISAs can process fast integer math, which makes logical compares the only x86 instructions that can't be offloaded to another device. It would be interesting to see the "GPU" inside a Trinity / ~whatever~ being turned into a coprocessor when a dGPU is added.
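
For a taste of what that means, here is a toy OpenCL C kernel of my own (purely illustrative, nothing AMD-specific; the kernel and argument names are made up): every operation in it is 32-bit integer work of the kind a GCN part could run on its integer ALUs without touching the FPU.

[code]
/* Toy OpenCL C kernel: pure 32-bit integer work (xor, shifts, add, compare). */
__kernel void int_math(__global const uint *a,
                       __global const uint *b,
                       __global uint *out)
{
    size_t i = get_global_id(0);
    uint x = a[i] ^ b[i];            /* integer xor                  */
    out[i] = (x << 1) + (x >> 3)     /* integer shifts and add       */
           + (uint)(a[i] > b[i]);    /* integer compare -> 0 or 1    */
}
[/code]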
 
I cringe at Ubuntu. It's nice for beginners, but any time real work needs to be done, it's on Red Hat or CentOS (at least where I work).

Ubuntu does have a nice GUI I suppose.


CentOS is my Linux of choice at home. I use it for my router and whatever other projects I have going on. I like that it has so many enterprise-class features out of the box and is extremely customizable.

For a "real man's OS," Solaris, hands down. Archaic and hard to understand, but once you get a grip on it you realize how powerful it is: ZFS, zones, Sun Link Aggregation, FSS, and how the SMF works. Glad it's on x86 now so newer people can try it out and play with it.
 
Wine is ~meh~. It's just emulating the NT environment with binary DirectX DLLs and an OpenGL wrapper. Much better to have games coded in OpenGL as native POSIX binaries. I would absolutely love gaming on a Linux platform; there's so much more I could do with it.

Drivers are no longer an issue with Linux. That was a problem back in the days of "Winmodems" and specialty USB drivers for devices. Now almost everything uses a generic standard, which makes drivers easy to build / port. Video devices are now supported directly by ATI (AMD) / Nvidia. Nvidia, for one, is always releasing binary drivers for Linux / Solaris (x86 and SPARC).

Yes, Nvidia actually makes the video cards in the newer SPARC systems.
 
DERP
Wine is the Windows game emulator, duh
I have been trying to remember the name since I posted that
I haven't tried it yet
Ubuntu is a challenge for me, LOL
I don't have a brain for path lists and coding
I am just a simple hardware and Windows installation tech
Right now I don't have the hard drive space for a proper Ubuntu install, but when I do I will dual-boot it again
I do have Ubuntu 11.04 on a bootable USB flash drive so, in case I have a corrupted Windows, I can still get to my files
 
Llanos are basically Phenom IIIs. They're referred to as K10.5, haha.

I would like to see AMD take the GPU out, put in an inclusive L3, and use the RCM technology to get it over 4-5 GHz. It's already at 32nm with 1MB of L2 per core. Instead they used the BD uArch, which makes me nervous. BD works well for servers needing a wide processing profile (many independent threads), especially if it's running Linux / Solaris, which don't have psychotic ADD schedulers. Maybe future NT schedulers will try to minimize the constant shifting of threads across different cores.

My prediction is that AMD will eventually drop the core BD uArch in favor of a hybrid Stars / BD / Fusion uArch. Something most people don't know is that GCN and the latest SIMD ISAs can process fast integer math, which makes logical compares the only x86 instructions that can't be offloaded to another device. It would be interesting to see the "GPU" inside a Trinity / ~whatever~ being turned into a coprocessor when a dGPU is added.

That was actually one of the more interesting things about it to me. It's almost Larrabee-like: while not a bunch of x86 cores, it's still able to do more CPU-like work than before, which makes it interesting.

Of course, it being useful depends on many factors, like software devs actually taking advantage of it, much like Quick Sync.
 
CentOS is my Linux of choice at home. I use it for my router and whatever other projects I have going on. I like that it has so many enterprise-class features out of the box and is extremely customizable.

For a "real man's OS," Solaris, hands down. Archaic and hard to understand, but once you get a grip on it you realize how powerful it is: ZFS, zones, Sun Link Aggregation, FSS, and how the SMF works. Glad it's on x86 now so newer people can try it out and play with it.

I've been using Gentoo since 2006, after Red Hat 7.2 died on me, lol. I think I earned a degree in fixing broken dependencies with emerge, hahaha.

And like I told you, Solaris didn't leave a good impression the first time, so I never got into it. I've got friends that love it though.

That was actually one of the more interesting things about it to me. It's almost Larrabee-like: while not a bunch of x86 cores, it's still able to do more CPU-like work than before, which makes it interesting.

Of course, it being useful depends on many factors, like software devs actually taking advantage of it, much like Quick Sync.

Since AMD bought ATI, I was hoping for faster development of the "Fusion" technology. A lot of time has passed since then; too bad AMD doesn't have the R&D budget Intel has.

Well, at least it seems they're onto something big with GCN and their HPC layer (or was it another acronym? lol).

And no one has expressed amazement at that video! I mean, come on! It's FIRE in your desktop. FIRE, I tell you! And no, it's not Fermi nor an RV600 inside, lol.

Cheers!
 
Well, unless you've lived with Solaris for a few years, doing anything on it is hard. Some commands and syntaxes are similar, but lots of things are different. I started my "unix" life working on RHEL and HP-UX systems, then moved on to Solaris.

Well, the next step in Fusion would be to have the front-end decoder / scheduler dispatch x86 / SIMD instructions to the internal GPU instead of the ALUs (the FPU would be merged with the GPU). No software recoding would be required, though software optimizations would be desired for maximum performance. The linchpin would be the front-end scheduler, or the OS scheduler, deciding which sets of code should run on the fast / narrow ALUs or the wide / shallow GPU.
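
A crude user-space analogy of that narrow-vs-wide decision, purely my own illustration (in the fused design described above, hardware or the OS would make this call): route tiny jobs to the scalar path and big ones to the wide SIMD path.

[code]
#include <emmintrin.h>   /* SSE2 intrinsics, baseline on all x86-64 */
#include <stddef.h>

/* Narrow path: fine for short arrays, no setup cost. */
static void add_scalar(int *dst, const int *a, const int *b, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = a[i] + b[i];
}

/* Wide path: 4 ints per instruction, wins on long arrays. */
static void add_simd(int *dst, const int *a, const int *b, size_t n)
{
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128i va = _mm_loadu_si128((const __m128i *)(a + i));
        __m128i vb = _mm_loadu_si128((const __m128i *)(b + i));
        _mm_storeu_si128((__m128i *)(dst + i), _mm_add_epi32(va, vb));
    }
    for (; i < n; i++)           /* scalar tail */
        dst[i] = a[i] + b[i];
}

/* The "scheduler": pick the narrow or wide unit for the work. */
void add_ints(int *dst, const int *a, const int *b, size_t n)
{
    if (n < 64)                  /* threshold is arbitrary, for illustration */
        add_scalar(dst, a, b, n);
    else
        add_simd(dst, a, b, n);
}
[/code]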
 
Remind me of something, palladin...

How do drivers interact with the CPU, instruction-wise? Driver -> Kernel -> Scheduler -> Instruction -> bus call to the "thing"?

I have this big open space in my mind regarding the driver layer, hahaha. And this is related to the front-end decoder being shared by the GPU and CPU in the APU.

Cheers!
 
Are we talking binary machine code or shader / 3D language? Different types of code take different paths before they get processed.

Binary code goes through the Windows kernel; it never touches drivers unless it makes a call to that particular subsystem. The Windows kernel scheduler is responsible for placing binary code on a processing target and also for triggering interrupts to take control of that target. This all dates back to when having only one processor was common and OSes had to ensure that nothing could lock the OS out of the CPU.

Program / Library -> NT Executive -> Kernel -> Scheduler -> CPU

For 3D and OpenCL stuff that isn't native binary code:

Program -> Library -> NT Executive -> Kernel -> Driver -> Hardware (GPU / APU)

Here's a better explanation of the NT Executive and how Windows treats things:

http://en.wikipedia.org/wiki/Architecture_of_Windows_NT
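
To make the second path concrete, here's a hedged sketch of how a user-mode program reaches a driver on Windows. CreateFileA and DeviceIoControl are the real Win32 entry points; the device name "\\.\MyDevice" and the IOCTL code 0x222000 are made-up placeholders, not a real device.

[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Open a handle to a (hypothetical) device object exposed by a driver. */
    HANDLE h = CreateFileA("\\\\.\\MyDevice", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "open failed: %lu\n", GetLastError());
        return 1;
    }

    /* DeviceIoControl crosses user mode -> NT Executive -> I/O manager
       -> driver, matching the Program -> ... -> Driver path above. */
    DWORD bytes = 0;
    char out[64];
    if (!DeviceIoControl(h, 0x222000, NULL, 0, out, sizeof out, &bytes, NULL))
        fprintf(stderr, "ioctl failed: %lu\n", GetLastError());

    CloseHandle(h);
    return 0;
}
[/code]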
 
Llanos are basically Phenom IIIs. They're referred to as K10.5, haha.

I would like to see AMD take the GPU out, put in an inclusive L3, and use the RCM technology to get it over 4-5 GHz. It's already at 32nm with 1MB of L2 per core.

They did release a 32nm Athlon II, but the few people I read about trying to overclock it didn't get very far. Locked multipliers and such.

AMD Athlon II X4 638
AMD Athlon II X4 641
AMD Athlon II X4 651

This PassMark bench shows the 651 (32nm / 3.0 GHz) tied with the 955 (45nm / 3.2 GHz). A slight IPC increase: matching scores at 3.0 GHz versus 3.2 GHz works out to roughly 3.2 / 3.0, about 6-7% more work per clock, assuming the benchmark scales near-linearly with clock.

http://www.cpubenchmark.net/cpu_lookup.php?cpu=AMD+Athlon+II+X4+651+Quad-Core

I'm guessing the GPUs were too defective to be fully disabled, so they still draw some power. Otherwise these should be the most overclockable chips AMD has.

Doesn't look like AMD will have any more powerful Llano/Bobcat chips out this year either. They're stuck at 40nm for cost reasons/problems with TSMC.

TSMC capacity at the cutting edge is being bought out. Meanwhile Intel is ramping up 22nm production for their low end chips.


 
The Athlon IIs were just defective desktop Llano CPUs that had their GPU part disabled. They were clock-locked and didn't feature turbo boost.

Turbo boost is important only because it allows you to dynamically control the CPU's clock speed. My 1.9 GHz 3530MX was able to run all four cores at 2.6 GHz for the entire CB 11.5 benchmark. It got really hot, but it didn't crash or seize up. The biggest limiting factor would be the laptop's cooling characteristics. I would say the Athlon II's GPU component is drawing full power, given its high TDP. The DV6 doesn't exactly have a big PSU, yet I'm running the GPU and all four cores @ 2.6 GHz without overloading it.

Cut out the GPU and replace it with an inclusive L3 to stave off the psychotic-scheduler issue. Add the RCM technology to further reduce power draw and ramp up clocks.
 
Now that you mention it, if Microsoft actually does what it's considering, the dynamic scaling will use a lot of resources, making the CPU useless for rendering fluidly in real time.

Compiz on Linux has demonstrated that a GPU can do pretty neat stuff on the desktop as eye candy, plus some useful features for power users (like the smooth Android-style transitions and such). If Microsoft can do that as well, but using Kinect and GPU processing for it, I'd say our GPU needs will skyrocket.

It's up to nVidia and AMD to make MS think about such possibilities... although this is a very far-fetched thought, hahaha. More like a dream at this time. And yes, I've watched Minority Report a lot 😛

Here's what I mean:

http://www.youtube.com/v/MN5VIVNSJ5I?version=3&feature=player_detailpage

Cheers!


I thought that was a really great video.
That is what a modern OS should be doing.
 