AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green teams and they will be deleted.

Enjoy ...
 
http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/

With the current wave of Playstation 4 leaks, it is time to spill some of the beans on what we know about the console. The short story is that it completes the AMD clean sweep of all the next gen consoles.

Yes, you heard that right, multiple sources have been telling SemiAccurate for some time that AMD won not just the GPU as many are suggesting, but the CPU as well. Sony will almost assuredly use an x86 CPU for the PS4, and after Cell in the PS3, can you really blame them? While this may point to a very Fusion/Llano-like architecture we hear that is only the beginning.

I am an Intel fan, but I really love the PlayStation and all I want is to see it survive. I really hope AMD succeeds with this.
 
I again stress that the whole "console porting" thing is a myth made up by unhappy PC gamers who feel the need to be the center of attention. The massive increase in graphical quality some are predicting when we get new consoles will not happen, unfortunately. [I doubt we'll see much improvement until we move to ray tracing, which is still a few years away.]

Like publishers really care about graphics beyond bragging and marketing rights... You of all people should understand what it means to have a known platform to move around (port). The license for the Unreal Engine is quite expensive, so building your own in-house framework is usually cheaper when you're an indie and want to make your game for multiple platforms. I'm pretty sure most C or C++ devs will love using x86 when programming a game for a console and then having their bosses tell them to port it to PC. My point is not only technical; it's mostly business-based.

Cheers!
 
I wonder how they won. If Sony was impressed with BD, then they must not see the truth. Maybe it's the 8-core thing...

Hopefully AMD does succeed, but it depends on who gets what out first. If MS gets the next Xbox out first again, then Sony may have trouble again.
 
Dunno if I'd call it bad news. AMD seems to be covering more bets, such as possibly using large ARM arrays for servers, similar to what Intel supposedly wants to do with Atom-based servers. Anyway, this should give them more options, especially in the 'non-competing with Intel' arena.

The problem is they still need a low-power server CPU to put in a product like this. The SeaMicro technology is neat, but it's just a custom PCIe interconnect for virtualizing peripherals.

http://semiaccurate.com/2012/03/01/why-purchasing-seamicro-was-important-for-amd/

"From there, the breathless pundits don’t understand what a SeaMicro server is, a BBOSNS (Big Box Of Shared Nothing Servers). This means that a SeaMicro server isn’t really a server, it is hundreds of servers in a box, and they don’t talk to each other any more than two laptops sitting next to each other do. They can talk over a network, but there is no direct interconnects between the sockets like a two or four socket server."


The purchase is being hyped up, but you'd think that if it was really THAT good, a company like IBM/Oracle/Dell/HP/Google would have paid twice as much for it. Google has cash to burn, and they're keen on "green" compute farms too.

All these companies have "blade"-type servers that minimize peripheral power per CPU. Intel's Haswell is going to be a single-chip solution, with the "chipset" pulled into the CPU itself, so before the end of 2013 you'll only need a CPU + RAM for a blade anyway. Boot-from-network technology has been available for over a decade if you want to run cheap Linux boxes.

It will help combat the tide of ODMs that have been scaling back on Opteron solutions in favor of Xeons, but what they really need are more skilled engineers in the APU/CPU field.
 
The Cell was canned by IBM. I don't know the exact reasons, but it was generally regarded as a failure, and nearly nothing uses it outside of the PS3, mostly because it was just too darn hard to program to get effective performance out of it.

Edit: well, apparently it was used effectively in supercomputers to push peak performance and had very good performance per watt, but its time has come and gone, it seems.


They had a 32-SPU version, which was canned, but the Cell processor lives on.
It's gone through three die shrinks: 90->65->45->40nm. Those all just went to power savings, to make the slim versions of the PS3/Xbox 360.
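Rough back-of-the-envelope on why shrinks matter (my own idealized scaling math, not anything IBM published): die area scales roughly with the square of the feature size, so

    (40nm / 90nm)^2 ≈ 0.20

i.e., the 40nm Cell would ideally be about a fifth of the original 90nm die, with power dropping along with it. Real shrinks fall short of that ideal, but it's where the slim consoles' power and cost savings came from.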

I'm sure they could get quite a bit more performance out of it on a 22/28nm process if they wanted to for 2014. IBM had shipped POWER7 chips running at 5GHz back in 2010.

BTW, the PS2's CPU had only 10.5 million transistors. Now we're dealing with CPUs with over a billion.
 
"AMD and the Visual Studio 11 Beta..."
http://blogs.amd.com/developer/2012/03/01/amd-and-the-visual-studio-11-beta/

"Tying It All Together..."
http://blogs.amd.com/work/2012/02/29/tying-it-all-together/
 
I agree that more cores look good. We still have to get past the software engineers who don't think programs can be written to use many threads. But we will.

Another problem is that their thinking is often half a decade or more behind when it comes to threading their applications. They often end up producing highly error-ridden bloatware that barely runs on the highest-end machines of the day. At least on consoles they are forced to cut some of the fat to work within the limited system RAM available. Dual- and quad-CPU machines have been around in various forms for decades, yet many applications, iTunes for example, still don't go beyond a single core.
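To make that concrete, here's a minimal sketch (my own illustration, not anyone's production code) of the kind of data-parallel split that plain C++11 threads already make easy, on a hypothetical workload of summing a big array:

    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        const unsigned nthreads =
            std::max(1u, std::thread::hardware_concurrency());
        std::vector<int> data(1000000, 1);       // stand-in workload
        std::vector<long long> partial(nthreads, 0);
        std::vector<std::thread> pool;

        // Each thread sums its own contiguous slice; no shared mutable
        // state means no locks are needed.
        for (unsigned t = 0; t < nthreads; ++t) {
            pool.emplace_back([&, t] {
                const size_t begin = data.size() * t / nthreads;
                const size_t end   = data.size() * (t + 1) / nthreads;
                partial[t] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0LL);
            });
        }
        for (auto& th : pool) th.join();

        const long long total =
            std::accumulate(partial.begin(), partial.end(), 0LL);
        std::printf("sum = %lld using %u threads\n", total, nthreads);
    }

Each thread owns its slice of the input and its own output slot, which is exactly the pattern that scales to however many cores the machine happens to have.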
 
@g4144rd0 - good reads, very interesting, though your avatar freaks me out, man.
I hate clowns, especially the ones from "Attack of the Killer Clowns".
I just want to shoot you in the nose LOL :)

Try watching the movie It when you're like 5, like I did. I just hate clowns. Not afraid of them, just hate them.

Then again, fear is not something I often feel. Probably because my mom let me watch movies like It, Friday the 13th, and A Nightmare on Elm Street when I was pretty young. Desensitized.

That is true about devs being behind the curve. The one benefit of consoles is that developers are forced to work with what they're given and don't have to worry about the thousands upon thousands of hardware configs out there like on PC. But then again, that also holds PC gaming back, unless the company develops for PC first and then ports to consoles, like VALVe and a few others.

We haven't seen a major graphics jump since HL2 came out, though. That was a huge jump: from blocky characters with low-res textures to smooth characters with high-res textures, realistic physics, etc.

I doubt we will ever see that again. Even Crysis, while it was a nice jump in some ways, wasn't in others (it still had pretty bad facial animations). And then there's Crysis 2, another example of consoles holding back games: at release it had textures at half the resolution of Crysis, despite running on a "newer" engine. Yeah, the HD texture pack was released afterwards, but probably only because PC gamers kept complaining.

If software devs would get their act together we could probably push things further. I mean, a Core 2 is still a viable processor, and that's 5+ years old. A 2500K will probably stay viable for at least 5 years due to the slow pace of software development.

I guess we shall see.

While AMD may not have been releasing amazing CPUs in the past few years, they are at least well ahead of the software. An FX CPU will probably hit its prime in a few years, when it performs best with the software of that time. Or maybe not. It's hard to say, really.
 
You are right. I remember when Firefox used to run on anything and still be almost butter smooth; now look at it. An excellent example of bloatware, with massive memory leaks that easily suck up 1.5-2GB of RAM and then another 1-2GB+ in the page file. Many developers just don't care, and most doubtfully even bother to look over their code. You could probably do better with a room full of toddlers or blind monkeys pounding away on the keyboards.

Most of the good talent has either left the industry or retired from old age or exhaustion. I guess you could say all or most of this is just one more symptom of the subpar quality of the education available today, but on the other hand, sweatshop conditions don't make for quality either. Back in the day people enjoyed writing good, efficient code, or the incentive was at least great enough to make sure it worked.

Looking at games today, most are just graphics with little or no storyline at all. Just point and shoot, with no objectives to work towards, then forget it. It's like what Atari did towards the end of its life in the early '80s, when game after game was nothing but crap.
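To put a face on the leak complaint, here's a minimal hypothetical C++ sketch (my own illustration, nothing to do with Firefox's actual code) of the leak-prone pattern next to the RAII fix that modern C++ makes nearly free:

    #include <memory>
    #include <vector>

    struct Image { std::vector<unsigned char> pixels; };

    // Leak-prone style: any early return or exception loses the allocation.
    Image* load_image_leaky(bool ok) {
        Image* img = new Image;
        if (!ok) return nullptr;   // oops: img is never deleted
        return img;
    }

    // RAII style: the smart pointer frees the memory on every exit path.
    std::unique_ptr<Image> load_image_safe(bool ok) {
        auto img = std::unique_ptr<Image>(new Image);
        if (!ok) return nullptr;   // img is released automatically here
        return img;
    }

With unique_ptr, ownership is spelled out in the type, so the "forgot to free on the error path" class of leak simply can't happen.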
 
Pretty much. But at least some games are still good: Skyrim, Batman, and most VALVe titles. Games like that are getting rare, though, and going the way of the dodo thanks to companies like EA.
 
Yep, I've got a few favorites on my list too, but they're old enough that most people nowadays never knew they existed. I miss the Drakan series, and it's a shame that after Sony bought the rights they pretty much made sure it never saw the light of day again.

Anyone still remember Westwood and their games?

http://www.youtube.com/watch?v=acBnLMRIAD0
 
I was playing C&C on DOS and Win 95 way back in the day over 28.8K modems. Loved Westwood. Then they got bought out by EA, and now C&C is dead, at least the original series.

Hell, for fun I ran the C&C Win 95 edition on my Q6600 under Windows 7. It had no frame limiter, so it ran uber fast.
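That's the classic symptom of a game loop with no frame limiter: simulation speed is tied directly to CPU speed. A minimal sketch of the usual fix (my own hypothetical illustration, not Westwood's code), using C++11 chrono to cap the loop at ~60 updates per second:

    #include <chrono>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_time = std::chrono::microseconds(16667); // ~60 FPS

        auto next = clock::now();
        for (int frame = 0; frame < 600; ++frame) {   // ~10 seconds of "game"
            // update_world(); render_frame();        // stand-ins for real work
            next += frame_time;
            std::this_thread::sleep_until(next);      // sleep off the leftover time
        }
    }

On a faster CPU the work finishes sooner, so the loop just sleeps longer; the game runs at the same speed on a 486 or a Q6600.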
 
LOL, I've got an old comp that I pulled from a dumpster while taking out the trash. Turned out it was an old Super Socket 7 build, so I put in a K6-III+, upgraded the RAM to 512MB, upgraded the boot drive to 40GB, replaced the PSU, and best of all, installed a 3dfx Voodoo 5 5500 AGP. Runs all those classics without a hitch 😍
 
The ol' bait and gouge. I guess all those people who bought it early feel a bit peeved. Much like the ones who jumped on the HD 7970 will feel when the 600 series is released by Nvidia.

It's been over two months since the 7970 was released, and the 600 series will arrive around late Q2. So if the releases are about six months apart, why would someone care that the 600 series is slightly better when they had to wait almost half a year for it? And will people who buy the 600 series be mad when AMD's 8000 series comes out?

Where is the logic?
 
I was talking mainly about price, which the HD 7900 series will see a drop in when Kepler hits.

As for the HD 8K series, there's no real info on it, but I'm willing to bet it's just a refresh of the HD 7K series with a few tweaks. The major change in the HD 7K was 28nm, which allowed for a larger memory bus and more SPUs; that accounted for the majority of the performance gains over the HD 6K series.

The new arch probably wouldn't have made any difference in gaming anyway, as it looks to be geared towards stream processing, much like CUDA is.

And honestly, I feel the HD 7K is overpriced. $550-$600 is way too much for a top-end GPU when the HD 5870 wasn't anywhere near that at launch.
 