AMD Piledriver rumours ... and expert conjecture

Page 107 - Tom's Hardware community forum
Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team and they will be deleted.

Enjoy ...
 
This does not quite make sense to me because were they to do so, a market would develop

*might* develop. And again, with Linux at <1% of the market, explain to your manager why it would make sense to recode the game in OpenGL, port it over to Linux, retest the game under all viable hardware configurations, and the like. There's no money in it. Sure, a market may develop in the future, but that doesn't benefit the game you just released. It's a money loser.

If anything, you *might* see a revival in Mac gaming in the near future, especially if OS X converges with the iPhone OS. Even that's a hard sell though, as you'd basically have to have the industry move back to OpenGL again.
 
As I said earlier, most colleges rarely offer game-development courses covering Linux and OS X, leaving the game-development studios to do the training. Almost everything most students are trained on is DX; maybe one or two OpenGL project assignments, but that is about it. That is the first big reason why gaming is so limited to Windows and a few consoles. Companies rarely train their employees for anything outside Windows. I hope that changes within the decade and most migrate to Linux for gaming. I am tired of how bloated and dated Windows is, and of how poor security is on OS X, yet I'm bound to Windows by applications and older games. Gaming in emulators isn't the real thing.
 
Companies train their employees for where they are going to develop. That's typically Windows-based OSes. Until significant share develops, companies will continue to target Windows-based OSes, because that's where the money is.

The issue is one of market share, and nothing else. No market share, no attention. I still think the Amiga OS was at its time FAR superior to everything else out there, but at the end of the day, they couldn't push enough product, and support dropped off.

I'll say it again: Windows isn't getting replaced until either we run virtualized OSes, or x86 gets totally replaced by a new arch AND Windows drops compatibility with its older x86-based applications. Until either happens, we're stuck with Windows.
 
true
but OEMs putting Ubuntu on some of their netbooks have helped the situation
What I would like to see is the craigslist hustlers putting Ubuntu on their machines instead of selling non-legit Win7 installs LOL
I did see a few guys selling their towers that way
it is a shame that most people don't realize Ubuntu is free and, for a beginner, usable right out of the box for common tasks like browsing, email and word processing
no reason to spend hundreds on OSes or pirate them
really for most of my customers I could do an Ubuntu install that would do everything they need easily
just that Microsoft Windows brand image is so powerful
 
^^ I got news, the "general user" doesn't know how to debug problems. While Ubuntu is kinda stable, the first time they go to a linux support forum and hear about compile time switches or recompiling the kernel, they're done.

And frankly, I simply don't see Ubuntu as a serious OS. Too much window dressing, not enough functionality.
 
window dressing is what the "general user" wants
very true, if they have a problem Ubuntu will be tougher for them
but also imagine if a "general user" has to do a registry edit or even msconfig LOL
Ubuntu isn't there 100 percent yet, but it is headed the right way to be a user-friendly OS
and as far as "serious OS" goes
I don't have a power user in mind when I talk about Ubuntu
sounds like most of your power users will be on other systems like Solaris, Red Hat, Gentoo, etc.
and though Ubuntu is not a "serious OS", it still beats Windows 7 in most benchmarks :)
 
most people don't like trying new things unless they have money invested. Not many people would use OS X if it were free to download on Windows machines and you could choose to buy Macs with Windows on them.
"I paid $400 extra for it, it must be good." Then the placebo effect kicks in and they just can't stop defending their purchase.

Linux isn't any less stable than OS X or Windows. It has all the functionality, and the only thing stopping people from using it is that every computer you buy comes with an OS. People also perceive Linux as old and non-functional. OEMs don't want to risk trying to sell Linux, probably because they want to keep good relations with Microsoft. They also don't want to deal with people calling in and asking where the Start button is.

I have seen no problems with Ubuntu. The only things tying me to Windows are gaming and familiarity.
 
they have been working on making Ubuntu "out of the box" friendly
and that gets it some hatred, I guess you would call it
a lot of hard-core professionals and enthusiasts kind of cherish Linux as the operating system of the elite
which I understand
there is this cool feeling of using Linux and feeling like you are in a special class of user
but as Ubuntu progresses through new versions it has gotten more and more user-friendly as far as the GUI is concerned
though being user-friendly seems to go hand in hand with losing more powerful features
that is what is so awesome about Linux: there is a distro for every kind of user or need
 
Bulldozer really isn't as bad as people say it is. It's not amazing, but it performs decently where people need it. The only place it doesn't do well is gaming.
People act like you can't even game with BD; truthfully it's 1-3% slower than PhII in a few games [strike]and 5-10% faster in others[/strike], but mostly dead even. A 1 fps difference could be that particular run or something the OS is doing in the background. Even running multiple passes on Metro 2033, I can vary 1-3 fps with the same CPU, and up to 5 fps depending on whether it's a fresh boot or after being in Windows for 2-3 days. People ignore the good aspects and only focus on the negative.
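That run-to-run spread is exactly why single-pass benchmark numbers are shaky. Here's a quick sketch (with made-up fps figures, not real measurements) of how to tell a real gap from noise: average several passes and compare the difference in means against the per-run spread.

```python
import statistics

# Hypothetical frame-rate results (fps) from repeated benchmark passes
# on two CPUs -- illustrative numbers only, not real measurements.
fx_8120_runs = [46.8, 48.1, 47.5, 49.0, 47.2]
phenom_ii_runs = [47.9, 48.6, 47.3, 48.8, 48.1]

def summarize(runs):
    """Return (mean, sample standard deviation) for a list of fps runs."""
    return statistics.mean(runs), statistics.stdev(runs)

fx_mean, fx_sd = summarize(fx_8120_runs)
ph_mean, ph_sd = summarize(phenom_ii_runs)

# If the gap between means is smaller than the run-to-run spread,
# the "difference" is indistinguishable from noise.
gap = abs(fx_mean - ph_mean)
noise = max(fx_sd, ph_sd)
print(f"gap={gap:.2f} fps, run-to-run spread~{noise:.2f} fps")
print("within noise" if gap < noise else "likely a real difference")
```

With these example numbers the half-fps gap between means is smaller than the roughly 1 fps run-to-run spread, so a reviewer quoting a single pass could "show" either chip winning.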

I don't just play games, but I do in my spare time, especially on weekends. I have yet to find a game that forces the CPU (FX-8120) into a constant bottleneck (100% usage), even after upgrading to two 6970 PCS+ cards in CrossFire. I really wish reviews would start adopting TechSpot's CPU usage charts when benching; they show so much more information about how a game handles multi-core CPUs and multiple graphics cards.

One interesting aspect of that is Metro 2033, for example. Running one graphics card I had 2 cores at ~75% usage and the other 6 at ~20-30%. I thought it was a "dual-core optimized game" until I upped to CrossFire; now it runs 8 cores at ~60%, with the same 2 cores running 75% as before (95 fps vs 48). Conclusion: no wonder Metro 2033 bottlenecks GPUs; it's a very well-coded game.
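For what it's worth, per-core usage charts like those boil down to simple arithmetic on counters the OS already exposes. A minimal sketch of the Linux version: /proc/stat publishes cumulative per-core jiffy counts, and utilization is the busy delta divided by the total delta between two samples. The snapshot strings below are made-up example data standing in for two reads of the real file.

```python
# Per-core utilization the way monitoring tools derive it on Linux:
# sample the cumulative per-core jiffy counters in /proc/stat twice
# and compare the deltas. These two snapshots are fabricated examples.

SNAP_T0 = """\
cpu0 1000 0 500 8500 0 0 0 0
cpu1 2000 0 800 7200 0 0 0 0
"""

SNAP_T1 = """\
cpu0 1300 0 600 8600 0 0 0 0
cpu1 2100 0 850 7550 0 0 0 0
"""

def parse(snapshot):
    """Map core name -> (busy_jiffies, total_jiffies)."""
    cores = {}
    for line in snapshot.splitlines():
        name, *fields = line.split()
        vals = list(map(int, fields))
        idle = vals[3] + vals[4]          # idle + iowait columns
        cores[name] = (sum(vals) - idle, sum(vals))
    return cores

def utilization(t0, t1):
    """Per-core busy fraction between two snapshots."""
    a, b = parse(t0), parse(t1)
    return {c: (b[c][0] - a[c][0]) / (b[c][1] - a[c][1]) for c in a}

print(utilization(SNAP_T0, SNAP_T1))
```

Sampling the real file once a second while a game runs gives exactly the kind of per-core trace those review charts plot.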

But I do agree to an extent: poorly written games have a tendency to run better on Intel systems.
 
I run a PhII Deneb at 3.5GHz, but if overnight the "Computer Fairy" swapped out my CPU for an FX-8120 I wouldn't freak out LOL
it was way overhyped
it really is more of a workstation/server part than an enthusiast gaming CPU
if marketed properly, without the hype, for the purposes it could be used well on
(for example workstation apps like Maya, virtual machines, encoding/rendering video, etc.)
then it might not have left such a bad taste in enthusiasts' mouths
really AMD should have made two lines
a Phenom III with clock mesh, a larger L3 (12MB) and a faster memory controller for enthusiast desktops
and a BD/PD with clock-mesh arch for servers/workstations and general use
but I guess AMD's R&D budget couldn't handle having too many different arch designs


 
I only keep two distros on hand, Mint and Suse.
 
If I can quote myself: "that is what is so awesome about Linux: there is a distro for every kind of user or need"

let's hope Ubuntu or another user-friendly distro takes off with the general user
unlikely, but it would really benefit the Linux community
if it happened then maybe we would see some major developer games for Linux

 
...slow day @ THG I assume?
Got a few months of waiting for anything new now. GCN and Kepler are out, and starting to wage a price war I'm sure. Trinity, Ivy, and Piledriver are all a few months out, if not more (well, not that long for Ivy). News is slow as usual. AMD seems to be keeping quiet about Piledriver, but there is much talk of Trinity's improvement over Llano. The waiting game never gets old.
 
While we are on this Linux sidetrack: I've played with several of the distros (Ubuntu, SUSE, CentOS), and while they were fine, one major problem across all of them was drivers. Up until one of the newest kernels, I think, there was a major driver bug where Linux would load the wrong wired Ethernet driver, rendering the wired connection useless. It wasn't difficult to fix, but it was annoying (it took out my internet, so I had to use a second computer). Also, USB wireless adapters are generally a disaster (it varies by kernel version; 3.2+ is OK). And finally, the release kernel (what actually ships in the distros) is generally 6+ months behind new hardware. I know it took forever to get AMD APUs working in Linux, though that was mostly AMD's fault.
 
Intel learned with Itanium that you can't just build a better chip.

You need a mix of accelerating what compilers already spit out and adding new instructions to really increase performance. New instructions are slow to be adopted; even with Intel's pull, it took a while for Microsoft to make HT useful.
 
"

AMD Embedded G-Series APU Platform Adds Real-Time Operating System Support

AMD today announced a collaboration with Green Hills Software, the largest independent vendor of embedded software, that brings its industry-leading INTEGRITY real-time operating system (RTOS) to the AMD Embedded G-Series Accelerated Processing Unit (APU) platform. The combination of INTEGRITY RTOS with the AMD Embedded G-Series APU creates a high performance, reliable and secure embedded computing solution for use across a range of applications including industrial control systems, consumer, networking, military/aerospace and medical.

The INTEGRITY RTOS offers support for multi-core x86 CPUs in the AMD Embedded G-Series APU with its v10.0.2 SMP release. The AMD Embedded G-Series APU offers an advanced, low-power, multi-core x86 CPU and a discrete-class DirectX 11-capable GPU on a single chip. This specific family of APUs was created expressly for the requirements of embedded systems, many of which require the precise, deterministic timing of an RTOS"
source- http://www.techpowerup.com/tags.php?tag=APU


not very useful for us desktop users but it is all the latest news I could find
 
Maybe AMD Piledriver inside...

http://kotaku.com/5896996/the-next-playstation-is-called-orbis-sources-say-here-are-the-details
That means AMD got a clean sweep of the console GPUs, right? From what I remember, I believe that's true.
Add that they have the CPU in one of them, and it's looking good for some more $ for AMD.
 
Itanium was absolutely horrible. A VLIW architecture is great at doing vector math but absolutely sh!t at doing anything else.

Intel designed it on the promise that their compiler would be able to do branch prediction at compile time; this never happened, and you're stuck with a chip that executed 4+ operations per cycle with half of them needing to be discarded. Even their semi-recent updates don't fix this: the chip can now reorder instruction blocks, but not the instructions themselves.

Whatever it is you do, do not ever think that the Itanium was a good chip.
 