AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
If IPC means more than cores, whatever the scenario, then why doesn't intel make a 1 billion transistor single core, and just improve IPC on that one core every generation?

Sure, if of course Intel miscounted that billion transistors to the tune of, say, 600 million 😀.

Seriously though, it's a matter of diminishing returns. For example, Intel or AMD could stick in more than 4 decoders/pipelines into each core, but might realize only a small improvement, not worth the extra transistor and power budget. Intel has previously stated a policy of not sticking in any features unless it realizes at least a 2x performance increase for every 1x increase in power usage. AMD, I dunno what their policy is but I would think something similar.
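To put rough numbers on that kind of screen, here's a quick Python sketch (the 2:1 ratio is just the figure quoted above, and the gains/costs are made-up examples, not anything from Intel or AMD):

```python
# A minimal sketch of the "performance gained vs. power spent" screen the post
# describes. The 2:1 ratio is the figure quoted above, treated as an assumption;
# the real criteria Intel/AMD use internally are not public.

def feature_worth_adding(perf_gain, power_cost, ratio=2.0):
    """True if a feature gains at least `ratio` times as much relative
    performance as the relative power it costs."""
    return perf_gain >= ratio * power_cost

# A hypothetical wider decoder: +5% performance for +8% power -> rejected
print(feature_worth_adding(0.05, 0.08))   # False
# A hypothetical better prefetcher: +6% performance for +2% power -> accepted
print(feature_worth_adding(0.06, 0.02))   # True
```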

That's not to say BD would be a good product if only software were ready for it. No, BD is bad, and making up excuses for why it is bad is just a waste of time. It has been said a hundred times before, but it is true: AMD should have made a product that would be good now, not only once software allows it to be.

Historically AMD has missed the current market by a wider margin than Intel, at least since the Netburst days anyway. For example, AMD touted the 'native' quad-core Barcie with HyperTransport as being better than the 'double cheezeburger glued-together' MCM-plus-front-side-bus approach Intel took with C2Q. However, Intel's approach worked just fine for desktop and even up to 2-socket servers, until the front-side bus became too limiting and no longer let performance scale with additional cores or sockets the way Barcie's HyperTransport did. Intel had the advantage of working closely with the software devs and was probably far better able to measure what software typically needed at the time, and to model accordingly.

In contrast, AMD's insistence on native quads cost them delays and then performance issues when Barcie did release. Sorta like BD 4 years later, with now-current software unable to make proper use of the design in too many instances. At any rate, Intel had the quad-core CPU market all to themselves for about a year, in view of Barcie's delays and then AMD having to stop shipments due to the TLB bug. That's not particularly good execution 😛, esp. when AMD had just gone into the hole for ATI...

I speak to those who believe that BD will always be bad because software will not go beyond 4 cores.

By the time the software and OS is tuned to take advantage of BD's strengths, both AMD and Intel will have increased core count. You can always buy a Sandy Bridge Xeon 8-core if you need 8 cores and have kangaroo pockets full of cash 😀..
 
I will interject here and say yes.

http://images.anandtech.com/graphs/graph4083/35047.png
http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/20

There are 4 of 10 cases where HT hurts performance. 40% of games actually appear to run better with HT off, so why would any gamer run an HT machine...

Rather than adding another reply, I will edit here: yes, it's a fairly minor difference in performance, but $100 isn't a minor price difference.

And 60% of games run better with HT. Almost anybody would agree that the 2500K and 2600K are neck and neck for gaming. But anybody who uses their computer to make money will tell you that the 5 to 10 to 20 seconds that you save here and there adds up to well over $100 worth of productivity.

I guess what it comes down to is how valuable your time is. If all you are doing on your computer is playing games, your time isn't very valuable.
 
And 60% of games run better with HT. Almost anybody would agree that the 2500K and 2600K are neck and neck for gaming. But anybody who uses their computer to make money will tell you that the 5 to 10 to 20 seconds that you save here and there adds up to well over $100 worth of productivity.

I guess what it comes down to is how valuable your time is. If all you are doing on your computer is playing games, your time isn't very valuable.
I wouldn't necessarily say better

(image: 35041.png)


That is only a clock speed advantage; really there are only 3 clear cases of HT actually helping. About the only major benefit with HT, truthfully, is in 3D rendering and encoding, which, interestingly enough, is the same place BD is useful.
 
And 60% of games run better with HT. Almost anybody would agree that the 2500K and 2600K are neck and neck for gaming. But anybody who uses their computer to make money will tell you that the 5 to 10 to 20 seconds that you save here and there adds up to well over $100 worth of productivity.

I guess what it comes down to is how valuable your time is. If all you are doing on your computer is playing games, your time isn't very valuable.
FYI, 90% of games with more threads than physical cores run worse with hyperthreading, and you don't save much time with hyperthreading anyway; it really only helps in very specific programs that a general user will never touch. Unless you do rendering/video editing, you won't be saving any time with hyperthreading. And the time you do save isn't really cutting into your money if you do video editing, since you would generally be running those jobs overnight.

I guess what it comes down to is how much you know what you are doing. If you know nothing about how computers work, you would think spending an extra $100 gets you better performance.
 
Radix-8 division: the secret of the third-generation "Bulldozer" architecture!?
(image: radix16.jpg)


The Steamroller architecture is still largely at the rumour stage, but a paper recently published by David M. Russinoff of AMD's research department confirms that Steamroller will use a radix-8 SRT floating-point divider, raising the quotient bits retired per cycle from the current radix-4 unit's two to three.

Steamroller thus changes the design of the divider unit in the CPU; those who are interested can refer to the wiki explanation of SRT division. David M. Russinoff was involved in the design of the Llano APU, whose divide unit differs from the previous-generation K10's hardware divider design. Bulldozer inherited the K10 design, so the division capability of its FMAC (fused multiply-accumulate) units is limited. Steamroller's design is similar to Llano's, though of course not 100% the same, since it uses radix-8 instead of Llano's radix-4, going from two quotient bits per cycle to three.

Compared with Intel's pace of progress, AMD is still behind in many respects: Intel already adopted a radix-16 divider back in the Penryn Core architecture, jumping from two to four quotient bits per cycle. The resulting lower latency benefits both the floating-point and integer units.

Analysts believe the reason AMD compromised on radix-8 rather than going to radix-16 is that, in the modular structure where two integer cores share one FP unit, a radix-16 divider would be too complex and too costly.

Translated (Google Translate) from expreview.com
Source: http://www.planet3dnow.de/cgi-bin/newspub/viewnews.cgi?id=1330556976
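For what it's worth, here's a back-of-the-envelope Python sketch of why bumping the radix matters, assuming a simple iterations-plus-overhead model (the overhead constant is illustrative, not a figure from Russinoff's paper or from AMD/Intel documentation):

```python
# A radix-2^k SRT divider retires k quotient bits per cycle, so a 53-bit
# double-precision mantissa needs roughly ceil(53 / k) iterations plus some
# fixed overhead. The overhead constant below is an assumption for illustration.

from math import ceil

MANTISSA_BITS = 53   # double precision, hidden bit included
OVERHEAD = 4         # assumed setup/rounding cycles, purely illustrative

for name, bits_per_cycle in [("radix-4  (K10 / Bulldozer)", 2),
                             ("radix-8  (Steamroller, per the paper)", 3),
                             ("radix-16 (Intel, since Penryn)", 4)]:
    cycles = ceil(MANTISSA_BITS / bits_per_cycle) + OVERHEAD
    print(f"{name}: ~{cycles} cycles per double-precision divide")
```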

 
So we should pretend that what intel did didn't happen and give them a big hug? :ange:

Time can never be recouped; sure, they can focus on R&D now, but they are still on a delayed timetable from where they should have been had intel not broken the law. AMD can't go back in time and undo their fab spin-off.

The truth of the matter is that under the cross-license agreement that AMD made public, they pay royalties to intel on every cpu sold, and their reward is to get stabbed in the back for figuring out how to make a better cpu than intel.

I will never forget that this is how intel does business; what's to stop them from doing it again?

You can remember it, but using it as an excuse to not buy a better product? That's what I find strange. Why would you buy an inferior product? Buying it just because of this or that makes no sense. If you can get a better product for the same price, get it. Supporting a company that is not producing better products stifles innovation.

AMD needs a kick in the pants, now more than ever, to make a better product. They can do it, they just seem to not want to.

AMD has had their share of mistakes. GF was a big mistake in my opinion, as they lost control over the process, and when that happens there is no guarantee that it's going to be good. AMD couldn't delay BD again to wait for the process to get better, so they launched on a bad process and now will rely on whatever gains PD makes, which I still bet will be due to clock speed more than IPC, to compete.

It is not a great strategy. So if I have $500 for a mobo/CPU/RAM and am looking at it, will I buy an FX 8150 or a 2500K? The 2500K. It's an easy choice for most who don't put blinders on.

So no, don't forget what Intel did. But just don't let it stop you from getting the best performance for your price range.

BTW, if that's how you feel, you might want to stop buying a lot of products, pretty much anything from national brands, as they also like to fight dirty.

 
FYI, 90% of games with more threads than physical cores run worse with hyperthreading, and you don't save much time with hyperthreading anyway; it really only helps in very specific programs that a general user will never touch. Unless you do rendering/video editing, you won't be saving any time with hyperthreading. And the time you do save isn't really cutting into your money if you do video editing, since you would generally be running those jobs overnight.

I guess what it comes down to is how much you know what you are doing. If you know nothing about how computers work, you would think spending an extra $100 gets you better performance.


I get paid to fix/repair/build/upgrade and do video work (home videos VCR-to-DVD with titles, transitions, fades, etc.)
there are times a video project needs to be done during the day, you know
In my personal opinion as a computer professional, HT might not always benefit, but it definitely doesn't hurt to have
unless you spend all day gaming......


PS your OS is multithreaded
 
You can remember it, but using it as an excuse to not buy a better product? That's what I find strange. Why would you buy an inferior product? Buying it just because of this or that makes no sense. If you can get a better product for the same price, get it. Supporting a company that is not producing better products stifles innovation.

AMD needs a kick in the pants, now more than ever, to make a better product. They can do it, they just seem to not want to.

AMD has had their share of mistakes. GF was a big mistake in my opinion, as they lost control over the process, and when that happens there is no guarantee that it's going to be good. AMD couldn't delay BD again to wait for the process to get better, so they launched on a bad process and now will rely on whatever gains PD makes, which I still bet will be due to clock speed more than IPC, to compete.

It is not a great strategy. So if I have $500 for a mobo/CPU/RAM and am looking at it, will I buy an FX 8150 or a 2500K? The 2500K. It's an easy choice for most who don't put blinders on.

So no, don't forget what Intel did. But just don't let it stop you from getting the best performance for your price range.

BTW, if that's how you feel, you might want to stop buying a lot of products, pretty much anything from national brands, as they also like to fight dirty.
That's all fine if you have no moral obligation to do what's right. Buying based on performance is a tool people use to justify themselves. Who's to say what I'm using my computer for doesn't work at 100% efficiency for me? Just because you feel that everyone should follow the flock off the cliff, sorry, I will go my own direction.

BD doesn't work for you, that's fine with me. I couldn't care less what everyone else in the world is doing; I have my own moral obligation not to support breaking the law, and that means not buying their products.

Ya, I may not get the best, but who cares about a 5% loss in some benchmark that has nothing to do with what my use is.

As for your $500 question, Microcenter is no longer exclusive to only offering intel deals. Find a 2500K + higher-end MB for $219.
http://www.microcenter.com/specials/promotions/AMDbundlePROMO.html

Just for the record, I've never recommended the 8120 to anyone who is afraid to overclock it to at least 8150 speeds.
 
That's all fine if you have no moral obligation to do what's right. Buying based on performance is a tool people use to justify themselves. Who's to say what I'm using my computer for doesn't work at 100% efficiency for me? Just because you feel that everyone should follow the flock off the cliff, sorry, I will go my own direction.

BD doesn't work for you, that's fine with me. I couldn't care less what everyone else in the world is doing; I have my own moral obligation not to support breaking the law, and that means not buying their products.

Ya, I may not get the best, but who cares about a 5% loss in some benchmark that has nothing to do with what my use is.
I thank you for your sacrifice.

A newbie who is asking for help should never be led astray and directed towards a dud processor because of someone's pseudo-philosophical position, but you know what you are costing yourself, so there is no problem with you making the decision you have.
 
I thank you for your sacrifice.

A newbie who is asking for help should never be led astray and directed towards a dud processor because of someone's pseudo-philosophical position, but you know what you are costing yourself, so there is no problem with you making the decision you have.
I also know for a fact that there isn't a game I have that forces my cpu to run at 100% even on a single core, so why not recommend it? BD is not a cpu bottleneck.

The only dud is in your own mind, because that's what you want it to be. People generally believe what they want to, so for BD, you can only see the negative reviews. The good reviews are fake because you don't believe them.
 
I get paid to fix/repair/build/upgrade and do video work (home videos VCR-to-DVD with titles, transitions, fades, etc.)
there are times a video project needs to be done during the day, you know
In my personal opinion as a computer professional, HT might not always benefit, but it definitely doesn't hurt to have
unless you spend all day gaming......


PS your OS is multithreaded DERP!
Depending on the software, it can hurt. Hyperthreading often lowers performance in scientific and professional software; you have to make sure what you are using works with it.

The time saved on things isn't much either, with best cases being 20% and generally less than 10%. You'd not notice much

unless you spend all day watching the progress bar.....

PS your OS doesn't schedule onto the virtual cores unless it has to DERP!
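If you do hit one of those programs that HT hurts, you don't necessarily have to disable it in the BIOS; pinning the process to one logical CPU per physical core does roughly the same thing. A rough psutil sketch, with the core numbering being an assumption you'd want to verify on your own box:

```python
# Pin the current process to one logical CPU per physical core instead of
# disabling HT system-wide. Uses psutil (pip install psutil). The "every second
# logical CPU is a distinct physical core" layout is common but still an
# assumption -- check your own topology before relying on it.

import psutil

proc = psutil.Process()                     # current process; pass a PID to target another
logical = psutil.cpu_count()                # logical CPUs (hardware threads)
physical = psutil.cpu_count(logical=False)  # physical cores

if logical and physical and logical == 2 * physical:
    proc.cpu_affinity(list(range(0, logical, 2)))   # one logical CPU per core (assumed layout)
    print("Pinned to logical CPUs:", proc.cpu_affinity())
else:
    print("No 2-way SMT detected (or unknown topology); leaving affinity alone.")
```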
 
Depending on the software, it can hurt. Hyperthreading often lowers performance in scientific and professional software; you have to make sure what you are using works with it.

The time saved on things isn't much either, with best cases being 20% and generally less than 10%. You'd not notice much

unless you spend all day watching the progress bar.....

PS your OS doesn't schedule onto the virtual cores unless it has to DERP!

The benchmarks we see out there are for short clips or time periods. 20 seconds on a 1-minute clip is quite a bit if you consider an hour: that means it takes 20 minutes less to encode a 60-minute video. If it's a 2-hour film, that's 40 minutes less.

Sure, people won't be sitting and watching the progress bar, but still, the faster the better in those markets. Same with servers.

Same with rendering. In those businesses, the faster you get things done, the faster you can move to the next project. Time is money in those arenas. In gaming, a 1 FPS difference is nothing for the most part. 20 is a bit, but it's rare these days as most games are normally GPU-limited and only lower resolutions will show the CPU bottleneck.
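Same arithmetic in a quick Python sketch, using the 20-seconds-per-minute figure from above (an example, not a benchmark):

```python
# If the faster chip shaves ~20 seconds off each minute of footage encoded, the
# saving scales linearly with the length of the job. The 20 s/min figure is the
# example used in the post, not a measured result.

SAVING_PER_SOURCE_MINUTE_S = 20  # seconds saved per minute of source video

for minutes in (1, 60, 120):
    saved = SAVING_PER_SOURCE_MINUTE_S * minutes
    print(f"{minutes:>3}-minute video: ~{saved // 60} min {saved % 60} s saved")
```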
 
If IPC means more than cores, whatever the scenario, then why doesn't intel make a 1 billion transistor single core, and just improve IPC on that one core every generation?

So, yes, cores are important. I think that anyone who says software will not scale past 4 cores is crazy. It will take time, but we cannot sit on 4 cores forever. People probably thought the same thing when they went to dual cores. Contrary to what was probably marketed ("Two cpu cores is like having two cpus!"), people who knew a lot about cpus probably thought the idea was ridiculous.

I think so anyway; I can't say for sure, I was pretty young back then and didn't know what a cpu was. The only reason I say this is because of what people say now, which is that going over 4 cores is not going to happen, as it is not necessary.

Of course, what happens after we move on from transistors is up in the air, but software will scale past 4 cores. In fact, it already does.

That's not to say BD would be a good product if only software were ready for it. No, BD is bad, and making up excuses for why it is bad is just a waste of time. It has been said a hundred times before, but it is true: AMD should have made a product that would be good now, not only once software allows it to be.

I speak to those who believe that BD will always be bad because software will not go beyond 4 cores.

It seems some people can't read very well...So I'll say it again:

The majority of software is linear in nature. And since the majority of people out there run no more than one heavy application at a time, you quickly run into diminishing returns.

Now, for embarrassingly parallel problems, yes, more cores = more performance. Problem is, those tasks are all better suited to a GPU-like architecture. Hence why rasterization, video encoding, and [to a lesser extent] physics are all gradually being offloaded from the CPU in the first place.

I'm speaking as a Software Engineer: the overwhelming majority of software simply will not scale well in most cases. After about 3-4 cores, any performance increase one would expect from using more cores is basically eaten up by synchronization/locking overhead or other IO bottlenecks.
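To put rough numbers on that, here's a minimal Amdahl's-law sketch in Python (the parallel fractions are illustrative, not measurements of any real program):

```python
# Amdahl's-law style illustration of why "just add cores" hits diminishing
# returns once synchronization/IO keeps part of the work serial. The parallel
# fractions below are made-up examples, not measurements.

def speedup(parallel_fraction, cores):
    """Ideal Amdahl's-law speedup for a workload with the given parallel fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (0.50, 0.80, 0.95):          # fraction of the work that parallelizes
    scaling = [round(speedup(p, n), 2) for n in (2, 4, 8, 16)]
    print(f"{int(p * 100)}% parallel -> speedup on 2/4/8/16 cores: {scaling}")
```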
 
Depending on the software, it can hurt. Hyperthreading often lowers performance in scientific and professional software; you have to make sure what you are using works with it.

The time saved on things isn't much either, with best cases being 20% and generally less than 10%. You'd not notice much

unless you spend all day watching the progress bar.....

PS your OS doesn't schedule onto the virtual cores unless it has to DERP!


hee hee
you know I am just messing with you for fun, don't you :)
nothing personal
and if you look I edited out my DERP on that post yesterday LOL

okay
Windows has been optimized for HT since XP
Windows 7 will run over 500 threads on average
just look at Task manager to see how many threads your Windows is running
How can the ability to handle more threads be anything but an advantage?

and to stay on topic
having an FX 8-core or a new Piledriver with 8 cores would also be beneficial for running an OS
having 8 cores to handle all those threads should make it smoother
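If you want to see the thread count for yourself without opening Task Manager, here's a quick psutil sketch (third-party module; protected system processes may report nothing and are simply counted as zero):

```python
# Count every thread the OS is currently juggling, roughly what Task Manager
# shows. Uses psutil (pip install psutil).

import psutil

total_threads = 0
for proc in psutil.process_iter(attrs=["num_threads"]):
    total_threads += proc.info.get("num_threads") or 0   # None if access denied

print(f"Processes: {len(psutil.pids())}, total threads: {total_threads}")
print(f"Logical CPUs: {psutil.cpu_count()}, physical cores: {psutil.cpu_count(logical=False)}")
```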
 
What about a 10-12 core PD running at 4-4.5GHz!
That would surely replace Sandy Bridge-E, as those are the CPUs of choice for users running multiple heavily threaded tasks (and intel is not going to release any Ivy Bridge-E soon).
Don't you think?
 
personally I am a big fan of more cores
my next CPU upgrade will be a Thuban six core on my AM3 main rig
while single apps are not optimized for more than 2 cores usually
I use software like Handbrake, CyberLink PowerDirector and other video encoding/rendering apps that do use more cores
also I like to multitask and have many apps open at a time
I will have Photoshop, various MS Office apps, PowerDirector, Handbrake, browsers etc all open at the same time
so having more cores will only help

there was a time when they said you didn't need a dual core for gaming
that a really fast single core would work better
then they said you didn't need a quad core for gaming
that a really fast dual core was better
and now they say you don't need more than a quad core
well I bet 5 years from now
that a quad core will be the minimal system requirement
just my opinion
I could be wrong....
 
I thought the 5-module PD was axed.

It was. They decided to stick with the G34 socket and improve the cores rather than just adding more. And by the looks of it there are lots of areas to tune the performance.

Improving the cache latencies and adding the clock mesh technology will help IPC and TDP, which were the bigger hurdles.
 
"Sony Playstation 4 will be an x86 CPU with an AMD GPU..."
http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/

I very much doubt it; there's a reason why PPC has been the dominant console architecture for so long. The only reason to go x86 is if you want tighter integration with the PC market, which makes far more sense for MS than it does for Sony.
 
I very much doubt it; there's a reason why PPC has been the dominant console architecture for so long. The only reason to go x86 is if you want tighter integration with the PC market, which makes far more sense for MS than it does for Sony.

I'm guessing it's a business call. Developing on an x86 platform is way cheaper to do and then port from. I kinda like the approach, since we'll have better-ported games in the future, lol.

Cheers!
 
Cancelled, and the whole MORE CORES concept is also out the window...


A fan of more weak-to-mediocre cores..?
Why?

Not speaking about the Thubans, but more about FX.
Piledriver needs to be 4 to 8 HARD HITTING cores.


I completely agree
I would consider a Thuban later on
but not current FX series

If PD can just show a 10 percent improvement in IPC over Phenom II at the same core speed,
then it will be a success, since with its potential for clock speed it would be good.
If a PD @ 3GHz vs a PhII at 3GHz is just 10 percent better,
then with the clock mesh tech and their potential to run at 5GHz,
they would be competitive as long as pricing is okay.
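Here's that argument as quick back-of-the-envelope arithmetic in Python (the +10% IPC and the 4.5-5GHz clocks are the post's hypotheticals, not measured Piledriver numbers):

```python
# Rough "performance ~ IPC x clock" arithmetic, with a Phenom II at 3 GHz as the
# baseline. The +10% IPC and the clock targets are hypotheticals from the post.

BASELINE_IPC = 1.00     # Phenom II IPC, normalized
BASELINE_CLOCK = 3.0    # GHz

def relative_perf(ipc, clock_ghz):
    """Single-thread performance relative to the Phenom II baseline."""
    return (ipc * clock_ghz) / (BASELINE_IPC * BASELINE_CLOCK)

for clock in (3.0, 4.5, 5.0):
    print(f"PD with +10% IPC at {clock} GHz: {relative_perf(1.10, clock):.2f}x Phenom II @ 3 GHz")
```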
 