AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post questions or information relevant to the topic; anything else will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team and they will be deleted.

Enjoy ...
 
Still, though, it is Intel's compiler. I can't expect them to optimize it for AMD's CPUs, nor would I expect AMD to optimize for Intel if they had their own compiler, or vice versa.

Do a lot of companies actually use the Intel compiler though?

I thought Microsoft compilers were more widely used, or GNU GCC.
Especially now with Visual Studio Express being free.

 
Do a lot of companies actually use the Intel compiler though?

I thought Microsoft compilers were more widely used, or GNU GCC.
Especially now with Visual Studio Express being free.

Based on my observation, Visual Studio is used by most businesses, which makes sense given that most of them already have deals in place with Microsoft. [Remember, these are the same companies still holding onto Visual SourceSafe for dear life, despite the fact that VSS is horrid version control software.]
 
I fail to see why this compiler issue is such a big deal. Am I missing some key component that puts Intel firmly in the wrong?
Misleading customers into believing Intel CPUs are far superior (in gaming).

I've had this discussion several times, but for some reason it gets labeled a conspiracy when you tie it to games.

http://game-on.intel.com/eng/games/default.aspx
http://en.wikipedia.org/wiki/Havok_(software)

and the results: http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-6.html

All AMD chips suck in Intel's money-backed games. This is a much better explanation of why, though.

http://media.bestofmicro.com/G/P/324601/original/StarCraftII.png


Thus review sites love to use the list of Intel-optimized games to prove how badly AMD sucks.

More interestingly, http://www.tweaktown.com/reviews/4350/amd_fx_8150_bulldozer_gaming_performance_analysis/index.html

The only game that sucks on AMD is Far Cry 2, and it's the only game there that's on the Intel lists.
 
Misleading customers into believing Intel CPUs are far superior (in gaming).

I've had this discussion several times, but for some reason it gets labeled a conspiracy when you tie it to games.

http://game-on.intel.com/eng/games/default.aspx
http://en.wikipedia.org/wiki/Havok_(software)

and the results: http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-6.html

All AMD chips suck in Intel's money-backed games. This is a much better explanation of why, though.

http://media.bestofmicro.com/G/P/324601/original/StarCraftII.png

Thus review sites love to use the list of Intel-optimized games to prove how badly AMD sucks.

More interestingly, http://www.tweaktown.com/reviews/4350/amd_fx_8150_bulldozer_gaming_performance_analysis/index.html

The only game that sucks on AMD is Far Cry 2, and it's the only game there that's on the Intel lists.

You have listed a series of articles and benchmarks that I've seen before. None of this is surprising to me.

Intel has spent a great deal of time with developers to optimize games for IA, and as a result, they perform better on IA. This is evident in the superior performance of Intel parts in the games listed on Intel's "game-on" page. The Tweaktown benchmark reveals exactly what I would expect. You overclock the two CPUs to high hell so that you are now benchmarking the GPU. Guess what? Congratulations, we have discovered that the GTX 580 performs roughly equal to the GTX 580. The key thing to take away is that the 8150 doesn't really bottleneck the GTX 580 when you overclock it to the limit (not even in Far Cry 2). This proves that AMD made a decent chip.

I don't see how this is misleading at all. Maybe instead of being angry at Intel for working with software devs to optimize for IA, you should be angry at AMD for not doing the same.
 
You have listed a series of articles and benchmarks that I've seen before. None of this is surprising to me.

Intel has spent a great deal of time with developers to optimize games for IA, and as a result, they perform better on IA. This is evident in the superior performance of Intel parts in the games listed on Intel's "game-on" page. The Tweaktown benchmark reveals exactly what I would expect. You overclock the two CPUs to high hell so that you are now benchmarking the GPU. Guess what? Congratulations, we have discovered that the GTX 580 performs roughly equal to the GTX 580. The key thing to take away is that the 8150 doesn't really bottleneck the GTX 580 when you overclock it to the limit (not even in Far Cry 2). This proves that AMD made a decent chip.

I don't see how this is misleading at all. Maybe instead of being angry at Intel for working with software devs to optimize for IA, you should be angry at AMD for not doing the same.
how about the review sites that only use those specific games that make AMD look bad?

"GTX 580 when you overclock it to the limit (Not even on Farcry 2). "

umm .. no

http://www.tweaktown.com/reviews/4350/amd_fx_8150_bulldozer_gaming_performance_analysis/index11.html

Now think about market position and how much it costs.

Intel approaches a game dev and offers up $100M (that's chump change to Intel) to optimize the game solely for Intel CPUs. The dev says, "What about AMD?" Intel: "What about it? That's only 5% of your gamer market." Dev: "Oh yeah, done."

AMD approaches the game dev: "How much would it cost to optimize this game solely for AMD CPUs?" Dev: "$500M. Why, you ask? Because we would lose 95% of our sales, since you only own 5% of the gamer market."

That's why it's misleading and wrong. It's the same thing Intel did with HP, Dell, Compaq, etc., just on a smaller scale so far.
 
how about the review sites that only use those specific games that make AMD look bad?

"GTX 580 when you overclock it to the limit (Not even on Farcry 2). "

umm .. no

http://cdn5.tweaktown.com/content/4/3/4350_28_amd_fx_8150_bulldozer_gaming_performance_analysis.png

Now think about market position and how much it costs.

Intel approaches a game dev and offers up $100M (that's chump change to Intel) to optimize the game solely for Intel CPUs. The dev says, "What about AMD?" Intel: "What about it? That's only 5% of your gamer market." Dev: "Oh yeah, done."

AMD approaches the game dev: "How much would it cost to optimize this game solely for AMD CPUs?" Dev: "$500M. Why, you ask? Because we would lose 95% of our sales, since you only own 5% of the gamer market."

That's why it's misleading and wrong. It's the same thing Intel did with HP, Dell, Compaq, etc., just on a smaller scale so far.

Your picture did not link correctly, so I will not address that issue.

In terms of market position: that's exactly the point. Intel is in a far better position to optimize games for their CPUs, so many games run better on them. End of story. I don't see what's so wrong with that. Should Intel give up and stop increasing performance and working with software devs so that AMD can catch up? No, that doesn't make any sense. AMD may be facing an uphill battle, but that doesn't mean Intel should slow down and wait for them to catch up. Being in the lead doesn't put Intel in the wrong.

Competition is great and AMD needs people to root for and support them, but that doesn't automatically put Intel in the wrong.

EDIT: I meant at higher resolutions. At 2560x1600 the performance difference is ~10%. Somewhat significant, but I wouldn't exactly call that a bottleneck.
 
It's important to note that you can enable the compile-time switch /QxO to force SSE2/3 support for non-Intel architectures. And the SSE limitation was clearly defined in the documentation.

Not on Intel's compiler you can't. You have to write your own method of checking feature flags and choosing code paths; Agner built sample code to do exactly that. GCC and MS will both build binaries that attempt to run SSE3+ on AMD CPUs; Intel's will only check the vendor ID and force AMD chips onto the SSE2 path. Agner did a thorough analysis of the different functions and parts of not only Intel's compiler but their common libraries too. He determined exactly what level of code the compiler would run. He even built his own benchmarks to test instruction timing and see what optimizations were being done. It's very damning to Intel. So damning that Intel was forced not only to update their documentation but also to compensate the costs of any customers who wanted to recompile their code with AMD-compatible optimizations.

Intel's documentation most certainly did NOT mention that their compiler wouldn't even try to optimize for anything without an Intel vendor string. They had to update their documentation AFTER the litigation, not before.
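For anyone who wants to see what "checking the flags yourself" means in practice, here is a minimal sketch in the spirit of Agner's examples (my own illustration, not his actual code; it assumes GCC or Clang, which ship <cpuid.h>). It queries the CPUID feature bits directly and dispatches on those, rather than on the vendor string the way Intel's runtime did:

/* Minimal sketch of vendor-neutral dispatch (illustrative, not Agner's code).
   Build with: gcc -O2 dispatch.c -o dispatch */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang wrapper for the CPUID instruction */

/* Read the 12-byte vendor string from CPUID leaf 0 (EBX:EDX:ECX order). */
static void get_vendor(char vendor[13])
{
    unsigned int eax, ebx, ecx, edx;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
}

/* SSE3 is reported in CPUID leaf 1, ECX bit 0, on Intel and AMD alike. */
static int cpu_has_sse3(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (ecx & bit_SSE3) != 0;
}

int main(void)
{
    char vendor[13];
    get_vendor(vendor);
    /* Fair dispatch keys off the feature bit; the complaint against Intel's
       runtime was that it keyed off the vendor string instead. */
    printf("vendor: %s, SSE3 path: %s\n",
           vendor, cpu_has_sse3() ? "yes" : "no");
    return 0;
}

An "AuthenticAMD" chip that reports the SSE3 bit takes the fast path here, which is all anyone was asking of Intel's dispatcher.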

If you had actually taken the time to go read what Agner wrote, you wouldn't have posted what you just did.
 
how about the review sites that only use those specific games that make AMD look bad?

"GTX 580 when you overclock it to the limit (Not even on Farcry 2). "

umm .. no

http://cdn5.tweaktown.com/content/4/3/4350_28_amd_fx_8150_bulldozer_gaming_performance_analysis.png

Now think about market position and how much it costs.

Intel approaches a game dev and offers up $100M (that's chump change to Intel) to optimize the game solely for Intel CPUs. The dev says, "What about AMD?" Intel: "What about it? That's only 5% of your gamer market." Dev: "Oh yeah, done."

AMD approaches the game dev: "How much would it cost to optimize this game solely for AMD CPUs?" Dev: "$500M. Why, you ask? Because we would lose 95% of our sales, since you only own 5% of the gamer market."

That's why it's misleading and wrong. It's the same thing Intel did with HP, Dell, Compaq, etc., just on a smaller scale so far.

Noob, stop it, seriously. You're just embarrassing yourself at this point.
 
Noob, stop it, seriously. You're just embarrassing yourself at this point.

Not really, although neither Intel nor AMD pays companies money for their support. Intel just provided a better SDK and developer support than AMD did. It took AMD too long to realize that they needed to work directly with AAA game studios to ensure their games ran well on AMD products. They're finally putting effort into it, but it's going to be a long, hard battle for them to win over developer loyalty.


And yes, when it comes to game performance, it's more about developer brand loyalty than anything else. Key pounders (our name for coders) are a finicky lot; they develop loyalty to certain products and techniques, and once rooted they are nearly impossible to change. If a particular lead developer is familiar with and trusts the Intel compiler, then they'll only work with that compiler unless forced by management to change. This isn't so different from my world: engineers have loyalties to various brands and platforms, and once we prefer a certain way of doing things, we don't like change.
 
Noob, stop it, seriously. You're just embarrassing yourself at this point.
What do you think would happen if Intel convinced every software developer to use their compiler? It would be no different than bribing vendors not to sell AMD products.

How is that line of thinking wrong?

I only used the money to show how easy it is for Intel to do something AMD can't afford. That will never change, hence why AMD is "no longer competing with Intel."
 
Not really, although neither Intel nor AMD pays companies money for their support. Intel just provided a better SDK and developer support than AMD did. It took AMD too long to realize that they needed to work directly with AAA game studios to ensure their games ran well on AMD products. They're finally putting effort into it, but it's going to be a long, hard battle for them to win over developer loyalty.
Fanbase tends to help drive this loyalty. As I stated, if a dev is building a game for a 95%-to-5% market-share split, it's obvious who they will program for. AMD needs to be able to do their part as well.

The problem is still the review sites that cherry-pick which games they use in their articles.
 
Fanbase tends to help drive this loyalty. As I stated, if a dev is building a game for a 95%-to-5% market-share split, it's obvious who they will program for. AMD needs to be able to do their part as well.

You say 95%-5%, but it's closer to 73%-27%.

I think you are taking this too personally. Are you suggesting that these software developers should ignore "95%" of their customers to cater to the other "5%"? That makes no sense.

The problem is still the review sites that cherry-pick which games they use in their articles.


Click on the link you posted and look at the games TH benchmarked:

http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-6.html

Not one of those games is on the list of games you posted:

http://game-on.intel.com/eng/games/default.aspx

EDIT: In your other link, 3 of the 6 games are on the Havok list. But do you really think that TH should ignore 3 of the biggest titles because they are optimized for some set of hardware?
 
You say 95%-5%, but it's closer to 73%-27%.

I think you are taking this too personally. Are you suggesting that these software developers should ignore "95%" of their customers to cater to the other "5%"? That makes no sense.




Click on the link you posted and look at the games TH benchmarked:

http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-6.html

Not one of those games is on the list of games you posted:

http://game-on.intel.com/eng/games/default.aspx

EDIT: In your other link, 3 of the 6 games are on the Havok list. But do you really think that TH should ignore 3 of the biggest titles because they are optimized for some set of hardware?
Do you really think the reason AMD sucks at game benchmarks is solely the CPU and not the software code? That's the conclusion everyone draws, but what is the more realistic reason?

AMD CPUs run perfectly fine against SB in any game that's not "optimized for Intel," so the problem must be _________.

But the conclusion is "AMD sucks"
 
Do you really think the reason AMD sucks at game benchmarks is solely the CPU and not the software code?

What? What are you talking about? Have I even said one word about CPU hardware performance? Even one word? No.

The closest I came was "AMD makes a decent chip" which is the opposite of what you are claiming I said. Go back and read our discussion. It's like you are quoting me but replying to somebody else.


That's the conclusion everyone draws, but what is the more realistic reason?

AMD CPUs run perfectly fine against SB in any game that's not "optimized for Intel," so the problem must be _________.

The problem is that the games in the benchmark that use the Havok engine are CPU-bound. Intel chips would perform similarly to AMD's, except that Intel has provided the developers an SDK optimized for Intel CPUs. It's the combination of CPU-bound games running optimized code that allows Intel to perform so far above AMD.
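To make that concrete, here is a toy sketch (my own illustration, nothing to do with Havok's actual source) of why the code path an SDK hands you matters in a CPU-bound workload: the same kernel in a plain scalar version and an SSE version, with the SDK's per-CPU choice reduced to a function pointer.

/* Toy illustration: the same physics-style kernel, scalar vs. SSE.
   Which version a vendor's SDK selects for your CPU is the whole argument.
   Build with: gcc -O2 -msse kernel.c -o kernel */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

#define N 1024

/* Plain scalar fallback path. */
static float dot_scalar(const float *a, const float *b, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* SSE path: four multiplies per iteration (n assumed divisible by 4). */
static float dot_sse(const float *a, const float *b, int n)
{
    __m128 acc = _mm_setzero_ps();
    for (int i = 0; i < n; i += 4)
        acc = _mm_add_ps(acc, _mm_mul_ps(_mm_loadu_ps(a + i),
                                         _mm_loadu_ps(b + i)));
    float out[4];
    _mm_storeu_ps(out, acc);
    return out[0] + out[1] + out[2] + out[3];
}

int main(void)
{
    static float a[N], b[N];
    for (int i = 0; i < N; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    /* An SDK picks one of these per CPU. Handing a capable chip the
       scalar path costs real frames in a CPU-bound game. */
    float (*dot)(const float *, const float *, int) = dot_sse;
    printf("dot = %.1f (scalar gives %.1f)\n",
           dot(a, b, N), dot_scalar(a, b, N));
    return 0;
}

The scalar loop chews through one element at a time while the SSE loop handles four; hand the slow path to one vendor's chips across a whole physics tick and the benchmark gap stops being mysterious.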
 
That was pertaining more to everyone who thinks exactly what I said; I only replied to you to keep track of the discussion I'm talking about.

But you yourself do come across as thinking it's fair to pick and choose benchmarks based on software optimizations for one company, and to imply that the problem lies only with AMD. AMD can't control what they don't own or what is owned by their direct competition (Havok).

Is it fair to cherry-pick programs to prove a point? To an extent: if it's your website, do what you want. I seek more than just numbers; I seek the underlying problems.
 
the problem lies only with AMD.

Yes, it does.

AMD can't control what they don't own or what is owned by their direct competition (Havok).

Yes, they can. Let me link this again; read it this time. AMD absolutely can work with Intel to optimize the Havok engine for AMD, and they have been doing so for years for their GPUs. This is an article written by AMD and posted on the official Havok website in 2011.

http://www.havok.com/news-and-press/releases/amd-and-havok-optimize-physics-gaming

Is it fair to cherry-pick programs to prove a point? To an extent: if it's your website, do what you want. I seek more than just numbers; I seek the underlying problems.

No, it's not fair to give a biased review and claim that it is unbiased. You just quoted two different reviews, and neither of them is cherry-picking programs at all.

In the TH review, 6 games were benchmarked. Only 3 of them used the Havok engine. 2 OF THOSE GAMES WERE STARCRAFT 2 AND SKYRIM. Are you seriously trying to tell me that SC2 and Skyrim are obscure games that TH cherry-picked to show that Intel performs better? Hell no. Those are two of the most popular games on the planet, rightfully chosen for their CPU-bound nature. It's equally unfair to specifically choose games that have not been optimized for Intel. It's AMD's responsibility to work with Bethesda and Blizzard to make sure these games run well on their chips. Intel has ZERO moral or ethical responsibility to NOT help Bethesda or Blizzard optimize for their CPUs just so AMD won't look bad.
 
If AMD wants to keep some customers (loyal customers), the release should come before IB or around the same time, and, more important than that, it should perform similarly or better.

Heh, many of us have been giving AMD similar advice for years now, especially getting Bulldozer out before Sandy Bridge. Unfortunately, AMD depended on GF and wasn't able to deliver on time. Not that it would have mattered much, given the performance gap between SB and BD..

Sometimes I think AMD purposely goes after the 'sympathy' vote: the much smaller company valiantly struggling against the overpowering ogre, suffering many painful blows, with everybody secretly hoping for a miraculous recovery and then a death blow delivered to the ogre. But then, I spend far too much time playing D&D RPGs.. 😛
 
thank you FOS

been reading the other links I think viridian gave me

at this point the FX-8120 matches up against the i5-2400 at the USD 189 price point
so I find that an interesting comparison from the workstation point of view
though the one caveat is that workstations are usually not a budget-based buying decision
if you have projects with serious money riding on them, then going cheap on workstations is foolhardy

I saw reviews of the FX-8150 against the 2500K in workstation apps
and the 2500K is beating the 8150 in most productivity benches

but the 8150 also supports some new instruction sets related to encoding/rendering that devs aren't using yet
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=831&Itemid=63&limit=1&limitstart=14
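Assuming those new instruction sets are Bulldozer's XOP and FMA4 extensions (my reading, not something the article spells out), here is a quick sketch of how software would test for them before using them, using GCC/Clang's <cpuid.h> and the bit positions from AMD's CPUID documentation:

/* Sketch: detect Bulldozer's XOP and FMA4 via CPUID leaf 0x80000001.
   Build with: gcc -O2 amdext.c -o amdext */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    /* __get_cpuid() returns 0 if the extended leaf isn't supported. */
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        puts("extended CPUID leaf not available");
        return 1;
    }
    printf("XOP : %s\n", (ecx & (1u << 11)) ? "yes" : "no");
    printf("FMA4: %s\n", (ecx & (1u << 16)) ? "yes" : "no");
    return 0;
}

Until mainstream apps ship checks like this with code paths behind them, those extensions sit idle, which is the point about devs not using them yet.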

also, in that article most of the benches show the FX-8150's IPC matching a 1100T in workstation apps

hmm need to do more reading LOL

for a home user or small-business user on a budget who wants to run workstation apps plus multitask
BD might be viable, but it's still hard not to recommend Intel or going with a Thuban six-core

if PD is done right and at a reasonable price point it might save the day

I wonder if AMD is thinking very long term or just screwed up with BD

my opinion on that is still open

Dunno about PD - time will tell all, maybe this quarter 😛..

However, the 39xx and 38xx are the Intel enthusiast or 'workstation' CPUs, with quad-channel memory controllers, 40 PCIe lanes, etc., so that is what you should be looking at to contrast with BD for productivity purposes. Or you could look at the two-socket Xeon reviews, for that matter..

Personally I decided last fall not to wait on AMD anymore and will be building an IVB gaming PC in May. Waiting on AMD is like waiting on Godot 😛..
 
I wasn't around reading the hype on BD before it came out
I have no bias towards AMD or Intel
I own an AMD (PhII Deneb 3.5GHz) now and before that an Intel (C2D 3GHz)
I want AMD to do well just so Intel has competition

if pricing were more in line with performance, then I doubt people would complain as much about BD
now looking at this

http://www.reuters.com/finance/stocks/chart?symbol=AMD.N


since BD came out, AMD's stock price has doubled and is close to last year's level of $9 a share

so if you bought shares of AMD in Oct '11 at $4.50, you almost doubled your money in six months

not sure if the rising stock price is directly tied to BD sales
would love to see sales figures for BD

because really, whether BD is a success is not measured by IPC performance or BF3 benches
it is sales and stock price that matter in the end
what we think of as a failure in our eyes could be a success in stockholders' eyes

I wouldn't tie AMD's stock price to much of anything, except the fact that the stock market in general has climbed pretty well since Nov. 2011.

However, if I had to attribute their stock performance to anything, I'd credit the graphics end and Llano sales before BD sales, as well as AMD's return to modest profitability after a long string of huge losses. I think AMD's server market share is going to drop well below the current 5% with the new E5 SB-based Xeons, and that is the most profitable segment by far. It will be interesting to compare Intel's and AMD's Q1 earnings reports due out in about three weeks...
 