AMD CPU speculation... and expert conjecture

Page 512
Status
Not open for further replies.

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
The "analysis" didn't change the conclusions of the paper in the slightest. The authors had ties to Cisco, Broadcom, Google, Intel and others, 3/4 of them making/using ARM chips. But Intel in all their wisdom was able to bribe them, for a paper that maybe 1000 people will ever read. The "analyzer" barely passed the Occam's razor giggle test.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


No doubt they be giants.

"Samsung's revenue was equal to 17% of South Korea's $1,082 billion GDP"

 

colinp

Honorable
Jun 27, 2012
217
0
10,680


Why does everybody here have a buddy at AMD apart from me?

So what exactly did you ask your contact at AMD and what exactly did (s)he respond? No paraphrasing.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Yes, they are giants. I want to add that Samsung could spend an additional 10 billion on a "price cut".
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Your non-technical comments didn't change the analysis of the paper in the slightest :p

Not only is the head of the group associated with Intel, but one of his coworkers works for Intel Labs.

By your logic, Intel couldn't have bribed Dell not to use AMD chips because Dell also used AMD chips, yet that did happen.

I have read papers from those guys where the acknowledgement section mentions that the work was sponsored by an Intel grant. I have yet to find a paper from them sponsored by Google or the others you mentioned.

Sponsoring a research paper that will be read by only a few is not uncommon. It is called "sponsored research". Nvidia likes it. You can find research papers explaining how great CUDA is or why Nvidia GPUs are needed for X, and at the end you find a disclaimer stating that the work was sponsored by Nvidia.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
A friend of mine with some internal info offers additional preliminary details about the post-Excavator architecture.

I assumed that by "high-frequency" Keller meant at least 3GHz. My friend claims that the architecture will hit up to 4GHz.

I predicted AMD would drop CMT and scale up Jaguar. My friend claims that the new x86 core will be a higher-IPC version of Jaguar/Puma+ scaled up to 2x higher frequencies.

I predicted that the next high-performance APU would include HBM stacked RAM. My friend claims that HBM memory on package will be offered as an option.

An interesting note here about "option". My friend says that the post-Excavator architecture will be lego-like, with interchangeable parts (e.g. ARM cores as the base but the option to use x86 cores instead). Memory options will range from cheap DDR3/DDR4 support to expensive HBM; caches and other parts are also interchangeable.

This lego-like design is motivated by AMD's transformation into a SoC/APU company with 50% of revenue coming from semi-custom projects. Instead of designing a custom SoC for each client, AMD will offer different combinations of pre-designed components that fit together, reducing design costs. I like this approach; it seems based on/derived from ARM's approach.

Other elements include different L2 cache options and an optional L3 cache. The new FPU will also be lego-like and will offer different SIMD configurations for consumer and business parts.

E.g. I can imagine a 256-bit wide SIMD for a consumer laptop APU and a 512-bit wide SIMD for a workstation APU. Time will tell.
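Since SIMD width directly sets peak arithmetic throughput, the consumer-vs-workstation split can be sketched with back-of-envelope math. All figures below (core counts, clocks, FMA support) are hypothetical illustrations, not AMD specifications:

```python
# Back-of-envelope peak single-precision throughput for two hypothetical
# APU SIMD configurations. Every number here is an illustrative assumption.

def peak_gflops(cores, ghz, simd_bits, fma=True):
    """Peak GFLOPS = cores * clock * lanes * (2 flops per FMA)."""
    lanes = simd_bits // 32                  # 32-bit floats per SIMD register
    flops_per_cycle = lanes * (2 if fma else 1)
    return cores * ghz * flops_per_cycle

# Hypothetical consumer laptop APU: 4 cores at 3 GHz with 256-bit SIMD
consumer = peak_gflops(4, 3.0, 256)          # 4 * 3 * 8 * 2 = 192 GFLOPS
# Hypothetical workstation APU: 8 cores at 3 GHz with 512-bit SIMD
workstation = peak_gflops(8, 3.0, 512)       # 8 * 3 * 16 * 2 = 768 GFLOPS

print(consumer, workstation)
```

Doubling the SIMD width doubles peak throughput per core without changing the core count or clock, which is why offering different FPU widths per market segment is plausible.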

I like how my broken crystal ball continues to work. :lol:
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
I predicted dGPUs will be replaced by APUs, because dGPUs don't scale up well (APUs will be faster). Some people here disagreed, and did so strongly (including personal insults). I reproduced a quote from the Nvidia Research Team agreeing with me that discrete GPUs will be replaced by GPUs on the same die as the CPU. The quote was ignored and/or deleted in replies.

I found another well-known HPC expert, Jack Dongarra, who agrees with me that traditional dCPU(socket)+dGPU(PCIe) or even dCPU(socket)+dGPU(socket) doesn't scale up (I guess that he has done the math, like I did):

Another problem that GPUs present pertains to the movement of data. Any machine that requires a lot of data movement will never come close to achieving its peak performance. The CPU-GPU link is a thin pipe, and that becomes the strangle-point for the effective use of GPUs. In the future this problem will be addressed by having the CPU and GPU integrated in a single socket
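Dongarra's "thin pipe" point can be illustrated with rough arithmetic. The bandwidth and FLOPS figures below are round illustrative values (roughly PCIe 3.0 x16 and a mid-range dGPU of the era), not measurements:

```python
# Rough sketch of the "thin pipe" argument: time to move data over PCIe
# versus time the GPU needs to compute on it. All figures are round
# illustrative numbers, not measured values.

def transfer_time_s(bytes_moved, link_gb_s):
    """Seconds to move data over a link of the given GB/s bandwidth."""
    return bytes_moved / (link_gb_s * 1e9)

def compute_time_s(flops_needed, peak_tflops):
    """Seconds of compute at the GPU's peak rate."""
    return flops_needed / (peak_tflops * 1e12)

data = 1e9                                    # 1 GB of input data
t_xfer = transfer_time_s(data, 16.0)          # ~16 GB/s PCIe 3.0 x16: 62.5 ms
# Suppose the kernel does 10 flops per byte on a 4 TFLOPS GPU:
t_comp = compute_time_s(data * 10, 4.0)       # 2.5 ms of actual compute

print(t_xfer, t_comp)
```

In this sketch the transfer takes roughly 25x longer than the compute, so unless a kernel has very high arithmetic intensity, the PCIe link, not the GPU, bounds effective performance.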
 

colinp

Honorable
Jun 27, 2012
217
0
10,680


And this buddy of yours specifically said that Carrizo / Excavator was still in the pipe? If so, then I'll stand corrected, and then let's wait and see. Maybe AMD should update their roadmap, since there's obviously no reason to pretend that Carrizo isn't coming!

By the way, minor point of English (I realise that English is not your first language):

When you say something to the effect that "Jack Dongarra, who agrees with me..." it's not quite correct, since Jack Dongarra has never heard of you; you're just a nobody posting anonymously on a forum; for all we know, not involved in the industry at all. All you do (all any of us do) is recycle what we've read elsewhere and interpret (and maybe misinterpret) what was said, like the armchair experts we are. Then some of us have nerd rage arguments where "I've always been right since the dawn of the semiconductor, and you've always been wrong" while the rest of us sit back, eat popcorn and wait for some real news.
 

harly2

Distinguished
Nov 22, 2007
124
0
18,680



Not directly, but from a business aspect, yes, ARM-powered devices are hurting x86 (PC sales), because people keep their old XP/Vista/7 machines and buy a tablet. The consumer does not know exactly what tablets are capable of until they buy one. Now you have PC sales bottoming out with the XP refresh, and tablet buyers being limited by their devices for a multitude of reasons. Plus the rise of the developing world actually needing affordable PCs for real productivity. So you see x86 holding ground but still shrinking; the bleeding has stopped for now. The next problem is ARM gaining 64-bit functionality in devices that can actually do some meaningful tasks, with iOS moving up in functionality along with Android shortly. ARM will continue to run these platforms even if Intel gives away free Bay Trails, because Bay Trail kinda sucks and has poor graphics compared to PowerVR and Adreno. Android has wider app functionality and better performance on ARM. Mobile is firmly in the grip of ARM; people still need PCs.

 

8350rocks

Distinguished


You cannot even copy paste right!!!!

TRY READING MY POST NEXT TIME.

1.) K12 is not ARM only, that was my point. Yes, they refer to the entire subsystem as K12 for BOTH uarches, but the new x86 cores are K12. FYI: x86 is still not going anywhere...

2.) HEDT is growing: http://www.rockpapershotgun.com/2014/02/04/the-pc-gaming-market-is-flipping-enormous/

3.) http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/14
At the end of the day, I'd say that Intel's chances for long term success in the tablet space are pretty good - at least architecturally. Intel still needs a Nexus, iPad or other similarly important design win, but it should have the right technology to get there by 2014. It's up to Paul or his replacement to ensure that everything works on the business side.

4.) I said specifically, "AMD will not use HTT". What part of that did you not understand? You said that they would. I can promise you they will not license HTT from Intel, in case your confusion is confounding you, let me link you to SMT: http://en.wikipedia.org/wiki/Simultaneous_multithreading. There are many types of SMT. DO NOT confuse HTT as being the only version of SMT, and CMT is a version of SMT.

5.) LOL...right...10 TFLOP APU...ok...I will believe that when I see it...maybe when they get to picometer nodes...(you still provided no source). Besides, when there is a 10 TFLOP APU, there will likely be a 40-50 TFLOP dGPU that uses only 2x power of the APU.

6.) No source from you...and my SDK says ALL 8 CORES are available for game loads...

7.) I said the patents registered mentioned the highest frequency on the APU to be 2.75 GHz. Additionally... the PS4 is clocked at about 2.0 GHz... so your prediction was wrong. Where is your crystal ball now?
 

harly2

Distinguished
Nov 22, 2007
124
0
18,680





Intel is losing less than 1 billion per quarter just in its mobile division, but is overall very profitable; their margins are so high that they might be able to keep this up for some time, even though I agree it won't work. In reality they might have an ARM design license themselves, and we just don't know.
 

lilcinw

Distinguished
Jan 25, 2011
833
0
19,010


I think the reason you are getting so frustrated while preaching your APU salvation to the heathens is that you are forgetting your audience. All of your arguments for the superiority of an APU over a dGPU are based off the comments of experts focused on HPC.

The average Tom's reader/forum lurker cares nothing about HPC workloads. They are concerned with whether or not their R9 780X will run BF7: Lunar Warfare. The Tom's editors know this which is why the System Builder Marathon every quarter devotes so much time to gaming benchmarks and analysis.

Gaming and HPC have different requirements and the PCIe bandwidth is not a bottleneck in typical games. Until this becomes a significant issue for games it won't make sense for an enthusiast to move to an APU. Yeah you crammed 3000 shaders with HBM cache into an APU but if you hadn't reserved 1/3 of the die for the CPU you could have 4500 shaders which will perform better.

Putting 250+ watts into a MB socket will present its own problems in the consumer segment and may require a break with the ATX form factor to adequately cool with acceptable acoustics. AMD does not have the resources required to create a new platform in a market they control less than ten percent of.

APUs are excellent for certain use cases; if you are concerned about total system power you probably should use an APU. Enthusiast class desktops are not one of those cases.
 

harly2

Distinguished
Nov 22, 2007
124
0
18,680



I think you're off on the average Tom's user........... way off. That's irrelevant though; this conversation has for some time turned into the future of computing (generally AMD specific). And it is clear that in 2016-18 you have integrated graphics from AMD and Intel beginning to kill off the dGPU significantly. The advancement of integrated GPUs at the expense of dGPUs is already heavily underway in laptops (gaming-capable mid- to upper-end laptops included). I agree with your concerns; likely the very upper end of graphics stays off chip until... '18-19 at the latest. But the "beasts" in dGPU become lower-volume specialty hardware due to the available power of built-in graphics.
 

con635

Honorable
Oct 3, 2013
644
0
11,010
I posted this answer on another thread but I'll post it here too, as I think it's relevant to this discussion; maybe you could comment on it. I think juan's right about a lot of stuff btw, more than he gets credit for, and I hope he does a blog on Carrizo soon. I don't know when the dGPU will be 'killed', but the near future imo is APU + dGPU; I think the dCPU is nearly dead now as we speak.

'There could well be an FX APU; even Intel hasn't improved massively since Sandy Bridge on the CPU side. The future seems to be APUs with GPU-assisted software. That's where we'll see the next big jump in benchmark and software performance. Let's also see what the game makers do with the PS4; it has an x86/GCN APU with hUMA.
People recommend (I thought the same) the FX-8xxx as future-proof because of its 8 CPU cores and the PS4 having 8 CPU cores, as that would mean games get better threaded, but that was possible with Mantle alone and DX12 soon. The PS4 uses the 8 cores for multitasking, e.g. 2 for the OS, and I now think AMD were trying with the PS4 to get developers to use HSA, not the many x86 cores. So if you wanted to *gamble* on future-proof, imo an APU like Kaveri/Carrizo with HSA/hUMA would be a cheap bet.'
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


wow ... just wow ... I remember someone trying to convince me that Jaguar was going to be replaced with ARM cpus and that Puma+ was going to be scrapped ... and now you predicted puma+ and its product line is the future ...

Your broken crystal ball only works because it's broken; when you're proven wrong you try to claim that's what you said all along.

This is why this thread has become useless.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


Only partially correct. PC sales are faltering because that 4-year-old system you bought is still good enough for any of today's programs. 10 years ago a 4-year-old system was ancient technology and had to be replaced. When was the last time we had an upgrade that offered double the performance? What's the point of buying a new system today for a measly 5-10% boost in processing power? Most overclockers would probably still opt for a 2600K over the 4770K.

ARM is in the segment that PCs were in 10 years ago. That phone you bought 2 years ago is ancient by today's standards. Sales are out of control because the tech becomes outdated in a very short period of time. What's going to happen when ARM advancement slows to a crawl? People will hold on to their current tablet/phone longer and not buy another one.

Tablets and phones will do the exact same thing that PCs are doing now. OMG THE MARKET IS DYING, let's start pushing computer implants.
 

lilcinw

Distinguished
Jan 25, 2011
833
0
19,010


Maybe I have been hanging out in the gaming sub-forum too much lately. The issue does come up almost every SBM, however: 'why did you sacrifice budget for productivity-improving hardware X when you could have spent more on gaming performance with GPU Y?'

I agree that AMD will kill off the lower tier GPUs, I still don't think Intel has the talent or IP necessary for their iGPUs to be relevant yet. I think that the majority of the GPU products below ~$200 are redundant considering that every $20-40 gives serious improvements in performance.

It would be interesting if AMD's 'lego block' K12 design was capable of removing the x86/ARM block and replacing it with shader blocks for a dGPU. I'm not sure if that would work with the bus, IMC, etc.
 

8350rocks

Distinguished


K12 != cat cores...not for ARM or for x86. They may have started with that in the beginning...but that is like saying a Volvo S60 is a Ford Taurus because both started life using the same 4 pieces of steel making the frame.

EDIT: removed information that I likely should not have posted in frustration at someone who is not worth time...*I know nothing about K12*
 

harly2

Distinguished
Nov 22, 2007
124
0
18,680
@Noob

I think you give the consumer too much credit; fewer people buy 2600K/4770K-level chips than you think. The majority of people don't have a clue what chip from what generation is in their machine. If it runs, then fine; a huge, huge chunk of the market is still on XP.

People only have so much disposable income. The crash of 2008 actually plays into this as well, but that's a side point. Before tablets/smartphones, a lot of that money was going into PC refresh, new laptops, whatever was new. That dried up, but people actually have to update PCs now; they can't run modern workloads, they are legitimately old. Also, the smartphone makers face trouble before that point of diminishing performance returns. Already the highest-growth markets are developing countries. They will simply not have the money to refresh high-end devices; they will hang on to them for a long time and/or refresh with very cheap devices. But yes, I agree in many ways, and I think we are already seeing good-enough performance from phones to hang on to them longer than the previous generation. Thus Apple trying to add more functionality to iOS, soon to be followed by Android. Windows mobile is a mess.
 

unlike fx, 2500k-4770k (and a10 5600k-5800k, 6400k-6800k) have deeper reach into oem pcs as well as retail buyers. it's partly due to intel's dealings and partly due to consumer mindset. that's why you'll see most fx owners being aware and well-researched while most core i owners raging why hd3k won't play bf4 or crysis 3, or why pentium g630 won't fit h97 motherboard (1 star and a scathing review(!) to some poor h97 motherboard on newegg).
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


"Support for this research was provided by NSF grants CCF-0845751, CCF-0917238, and CNS-0917213, and the Cisco Systems Distinguished Graduate Fellowship"

The National Science Foundation and Cisco (makers of ARM and MIPS chips, and they sell x86 too).

Yet you insist the paper should be ignored because of your completely unfounded assertion that Intel bribed the authors. You're essentially calling the authors liars with nothing more than mere speculation, because the paper doesn't agree with your view of ARM superiority. Yet it survived the review process by industry veterans, was presented in front of a group of their peers and was published. I take their credibility over yours any day of the week.

 


It is an IEEE paper; that alone is enough proof for me. Sorry Juan, but you're drunk on this one; go home with your argument. The paper explains very well the methodology used and how it proceeded with the data it collected.

Cheers!

EDIT: Something went wrong with quoting... lol
 

8350rocks

Distinguished


LOL!!! Yeah, that paper had far more vetting to get IEEE approval than juanrga's crystal ball...lmao.
 