AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post questions or information relevant to the topic; anything else will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team and they will be deleted.

Enjoy ...
 

*hint, hint* adblock.
 

I will agree with the single-player testing; hell, even a dual-core Phenom II can compete. I still say it's stupid to test a multiplayer game in single-player mode. I don't care if you have to play a MP match for 5, 10, or 30 minutes, post the numbers.

[Attached image: Battlefield3.png]


Person A buys a Phenom II 560 because the single-player benchmark says it works fine, joins a MP map, and runs 10-20 fps on a Radeon 6990 ... THAT'S NOT A GPU BOTTLENECK.

BF3 is very CPU-bound in multiplayer and will scale to as many cores as you toss at it. My 8120 at 4.7 GHz is pushing 90-120 fps on ultra with 2x 6970 1 GHz GPUs (a bit faster than the 6990).

Interesting question: how are they testing World of Warcraft? I bet it's not online during the demo. I don't care about a best-case scenario; that isn't a real benchmark. I want to see some worst-case numbers during online play if it's an online game.

 

I agree, it's getting to where I can't even go here on my non-Apple Android phone.
 
Tossing out one bench for another is pointless. Fact is, in BF3 with various equipment I managed to get very good frame rates and very little lagging at rather extreme settings, and for many that should be enough. I actually prefer GPU-bound titles; they are the ones that milk the eye candy and actually put the expensive cards you own to the test. Skyrim with a 2500K and an 8800GTX can play at med-high at 16x10 and the quality looks the same as on a 7950 (obviously not the playability). The fact is, the GPU should be the be-all and end-all in gaming; if it isn't, it's probably a detail-inept title ... cue Skyrim. Can't believe Bethesda dropped us like that; the game is barely better than Oblivion.
 


It's not just about "good enough"; it's about "good enough ... at the right price".

And THIS highlights PD's problem:

For general purpose apps:

[Image: value-scatter.gif]


In games:

[Image: gaming-scatter.gif]


And let's face it: who are the people most likely to purchase an AMD processor? Enthusiasts. Everyone else just gets a pre-packaged Dell/HP.
 


Using Intel's function library results in a ~25% performance penalty for AMD processors in 64-bit mode (generic AVX path vs. Intel's, tested using Bulldozer). Mind you, this is apparently due to a simple CPU brand check.

http://www.agner.org/optimize/blog/read.php?i=209
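For anyone curious what a "CPU brand check" amounts to in practice, here's a minimal sketch in C of reading the CPUID vendor string and branching on it. This is only an illustration of the idea Agner describes; the dispatch logic in Intel's actual libraries is more involved.

```c
/* Minimal sketch of a CPU vendor check, roughly the kind of test described
 * in the linked post. Real library dispatchers are more complex; this is
 * illustration only. Requires GCC or Clang on an x86 machine. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0 returns the vendor string across EBX, EDX, ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    if (strcmp(vendor, "GenuineIntel") == 0)
        printf("Intel CPU: a vendor-gated library would take the fast path\n");
    else
        printf("%s: a vendor-gated library would fall back to the generic path\n", vendor);

    return 0;
}
```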
 
Looks like the Phenom II X4 980BE and FX-6200 (preferably the 6300) are the best bang for the buck according to price and all those charts. Even the rare, hard-to-find Phenom II X3 740 with its fourth core unlocked can hold up today, and it could be found for as low as $40.
 


But the enthusiast argument works both ways... Regular econobox Joe is going to get a marketing-slides CPU or a "good" HP/Dell box pre-configured somewhere; maybe some will be guided by enthusiasts to get a good price/perf box, but those people are basically enthusiasts themselves, haha.

Anyway, my point is that for a regular person looking for a "computer", if the overall consensus in forums and such places is "at this price, Intel or AMD will do fine for you", I'd say it's just 50/50, not "AMD is dooooooomed!". With the current lineup, if the person is not worried about the extra power consumption in some cases, PD actually IS a good choice, even if you're not an enthusiast. That's the good news this time for AMD with Vishera. If an enthusiast can help you OC your system within respectable power figures, AMD becomes an even better deal at the lower end. A way better deal.

Either going for an A6, A8, A10, FX-43x0, or FX-63x0. The i3s are getting a run for their money, and I'd say the i5s are now as well. That's good news indeed. Think about what I saw at Microcenter: the i5 3570K at USD $189 is a good sign of AMD doing its job. Hell, the 2700K at USD $220 is also a good indication of that. And I'm sure Intel will have to lower prices here and there across the spectrum now.

Cheers!

EDIT: I mixed up your point now that I re-read your post, gamerk, lol. You're kinda right; the i5 34xx looks very good price-wise. Maybe just the enthusiast-oriented econo-buyers will get a PD when looking at $130+ CPUs.
 


I agree, more cores are the future. However, I would like to take into consideration the fact that computer science is in its infancy. I am not referring merely to hardware, no indeed. Considering that time as we know it is an illusion, and considering the potential of quantum computing, I would like to think that the issues we see today in writing code which uses hundreds of CPU cores simultaneously can be solved. A new mathematics, perhaps, would facilitate this. A new way, perhaps, of modelling the problem and allowing for different communication between threads. I love programming because there are so many possibilities. It is a mathematician's playground, a way to create models of systems that then function. In a sense, turning programming into a job ruins it because the programmer is so much in a rush. Essentially, a good programmer must overcome the limitations of time and resources, somehow. You have to apply intelligence, not force. It is a creative process, and I believe there is a solution for every problem, certainly the problem of how to use hundreds of cores.

I believe now is the time for AMD and Intel to produce desktop/workstation cpu architectures with 16 cores. Some smart programmer will figure it out.
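Just to ground the "use all the cores" idea a bit, here's a minimal sketch in C with pthreads that splits a trivial job across however many cores the machine reports. It only illustrates the basic decomposition; real workloads are much harder to divide, which is exactly the open problem being discussed.

```c
/* Minimal sketch: divide a trivial summation across all reported cores
 * using pthreads. Compile with: gcc -O2 -pthread example.c */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define N 100000000UL

struct job { unsigned long start, end; double partial; };

static void *sum_range(void *arg)
{
    struct job *j = arg;
    double s = 0.0;
    for (unsigned long i = j->start; i < j->end; i++)
        s += (double)i;
    j->partial = s;   /* each thread writes only its own slot: no sharing */
    return NULL;
}

int main(void)
{
    long cores = sysconf(_SC_NPROCESSORS_ONLN);
    if (cores < 1) cores = 1;

    pthread_t *threads = malloc(cores * sizeof(*threads));
    struct job *jobs = malloc(cores * sizeof(*jobs));
    unsigned long chunk = N / cores;

    for (long t = 0; t < cores; t++) {
        jobs[t].start = t * chunk;
        jobs[t].end   = (t == cores - 1) ? N : (t + 1) * chunk;
        pthread_create(&threads[t], NULL, sum_range, &jobs[t]);
    }

    double total = 0.0;
    for (long t = 0; t < cores; t++) {
        pthread_join(threads[t], NULL);
        total += jobs[t].partial;
    }

    printf("%ld cores, sum = %.0f\n", cores, total);
    free(threads);
    free(jobs);
    return 0;
}
```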
 


Until fairly recently I allowed ads to run in my browser. One day I was annoyed by video ads that would pop up when I rolled my mouse over text. I could not move the video window or stop it until a certain amount of time had passed. From that point on I have been running Adblock Plus and see no ads from any website. I view it as self-defense.
 
How does AMD make money on consoles? Do they sell the APUs to the people who put them into the consoles, or do they get money on the sale of the console from the console publisher to the consumer/store?

Or both with some sort of royalty system?

We know AMD is making the GPU for the Wii U; does anybody know if it is officially making the CPU and/or GPU for any of the other consoles?

All that info would make a big difference in conjecturing how AMD is moving forward money-wise, I think. Hundreds of thousands of console sales is a lot of money.

Although, since the PS3 is making a new comeback in Europe with the latest (and last) slim model, maybe they won't release the PS4 there anytime soon because of saturation... Won't matter much if the PS4 is not PS3 compatible, though. Harhar wah 🙁

Edit: AMD is not making the CPU for the Wii U.
 


PD is a nice step up from BD, and it finally, in most areas, takes on and beats the Phenom II series.

That said, it's not what is needed for AMD to do better. That, and the fact that some AM3+ mobos won't be able to upgrade to it due to an issue with the BIOS, is kinda disappointing. And it still doesn't beat out Intel; it's still weaker than Sandy Bridge in some areas.

I do find it a tad strange they didn't include Windows 8 benchmarks, seeing as it's out now and the RC and RTM should be pretty close performance-wise.

It's too bad drivers can't do for CPUs what they do for GPUs. The 12.11 betas gave the HD 7 series another performance boost over the 12.7 betas, making it quite a monster and making me glad I got it instead of a GTX 680.

Still, the next year for AMD is crucial; they have to hit their marks and compete with Intel, not themselves, in the CPU market, be it DT or server. They also need to push hard into the UMD space, as Intel is already there and will only get better as time goes on. Plus, with Haswell coming up in about 6 months and supposedly having much better power management, it might be hard for AMD to compete again unless Intel screws up again (like with the P4), which I doubt they will.
 


The thing is, that still goes hand in hand with performance per clock. If it takes 2x the clocks to do the same job, it's not going to have better performance per watt or per dollar.

It all ties together. The Pentium 4 had some areas where it did shine vs. the Athlon 64, but normally it was also running at almost 2x the clock speed and used more power to do so, so that's why the Athlon 64 shone so much. It cost less overall to do the same job, especially in the server market.
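As a quick back-of-the-envelope illustration of that point (all numbers below are made up for the sake of the arithmetic, not measurements):

```c
/* Performance per watt = work done / (time taken * power drawn).
 * Hypothetical figures only: two chips finish the same job in the same
 * time, but one draws much more power to get there. */
#include <stdio.h>

int main(void)
{
    double work = 1.0;                              /* same job for both   */
    double hi_clock_power_w = 115.0, hi_clock_time_s = 10.0;
    double lo_clock_power_w = 67.0,  lo_clock_time_s = 10.0;

    printf("high-clock chip perf/W: %.5f\n",
           work / (hi_clock_time_s * hi_clock_power_w));
    printf("low-clock chip  perf/W: %.5f\n",
           work / (lo_clock_time_s * lo_clock_power_w));
    return 0;
}
```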
 


The problem with multiplayer games is control. One site could use the same hardware as another and get very different results due to the server, amount of players, latency etc.

If there were a way, I would love to see it, but multiplayer results can only be had by the individual, not everyone.
 

True, but the problem is no one actually even tries. As I said, a PII 560 can handle single player fine, but MP ... forget it. It doesn't matter about lag, server, or latency; it won't work. But according to Tom's, because they never even tried, it works fine.

The thing is, building a machine based on one single aspect alone doesn't give a clear picture of the unknown. Case in point:



PD's problem: Tech Report posted numbers based on one game and solely on minimum fps (they called it 99th-percentile frame latency; it's minimum fps), but for apps they averaged it all together.

Let's face it, who buys a computer based on one program alone and only looks at the one single frame that lagged the longest? Forget averages, forget highs, only look at the one slowest frame in only one game. After all, that's how they gave Intel an even greater advantage. Skew the results as much as possible.

Why not Batman?
[Image: arkham-99th.gif]
BF3
[Image: bf3-99th.gif]
or Crysis 2
[Image: crysis-99th.gif]


Answer: because Skyrim is the only one that gives Intel a bigger lead.

That's how you do an anti-AMD review.
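For anyone not familiar with the metric being argued about, here is a rough sketch in C of how a 99th-percentile frame time can be computed from per-frame render times. This is my own simplified illustration with made-up frame times, not Tech Report's actual methodology.

```c
/* Rough sketch: compute a 99th-percentile frame time from per-frame render
 * times. Frame times below are invented for illustration only. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    /* made-up frame times in milliseconds */
    double frames[] = { 16.7, 17.1, 16.9, 18.2, 33.5,
                        16.8, 17.0, 16.6, 25.1, 16.9 };
    size_t n = sizeof(frames) / sizeof(frames[0]);

    qsort(frames, n, sizeof(frames[0]), cmp_double);

    /* index of the frame time below which ~99% of frames fall
     * (very rough with this few samples; real runs log thousands) */
    size_t idx = (size_t)(0.99 * (n - 1));
    printf("99th percentile frame time: %.1f ms (~%.0f fps)\n",
           frames[idx], 1000.0 / frames[idx]);
    return 0;
}
```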

 

IIRC we have heard AMD is making the GPU in all three next-generation consoles, and we saw rumors of the CPU in the new Xbox.
 


Yep, I've read that as well. My question is, exactly how much of a problem is this if only 5% or so of all apps are affected by using Intel's compiler? IOW, AMD complaining about this would be a red herring since it wouldn't really affect them much; plus, they could always develop and release their own compiler.



 