Does AMD have a future?

Status
Not open for further replies.


I really doubt "engineers" have that kind of power in decision making. I'd put the blame directly on the CEO of that time (Ruiz) and the Board of Directors. Especially when the biggest sign they gave about not wanting to enter low power was the license deal with Qualcomm after buying ATI. I think Ruiz even said a couple of times that they didn't want to chase low power, but that would have to be confirmed.

I'm 100% sure the engineers knew about BD's shortcomings, but it was a strategic call to continue, not a technical call. I really dislike when you and everyone else put the blame on the engineers instead of the head managers and up.

Cheers!
 



You have inside information that the design team was fired? Care to give us a shred of evidence? These men who design microprocessors weren't "decent" engineers? Like the engineers who designed NetBurst? These men have more knowledge in their toes than you have in your whole body. But then again, you argue with Linus Torvalds; that is ample evidence of your being delusional.
 
Juan, I was under the impression that the engineers knew it wasn't a good idea back in 2005-2006, but management brought it back anyway. As for AMD engineers being unskilled in ARM designs, well, duh, they aren't ARM engineers; most probably took classes on it and that's about it. Jim Keller has experience with it.
 


Bulldozer was the result of a mixture of technical and management mistakes. It cost the CEO, most of the management team, the vice president of engineering, and several engineers their jobs.

The engineers who designed Bulldozer weren't the same team behind K8, but consisted of "people that had never achieved any success," according to one internal source.

You are 100% right that the engineers knew Bulldozer was not hitting its goals, but management decided to continue anyway and the marketing department did the rest.

Yes, selling the mobile graphics business to Qualcomm was another strategic mistake. But the decision not to push high-performance ARM was a consequence of a group of x86 engineers convincing their chiefs that ARM could never catch x86 on servers. I suppose it is the same group of x86 engineers that recently believed they had found an error in the ARM architecture, when the problem was in their tools.

Note: I cannot edit posts in the other thread, and recently I got the same problem in this one. Anyone else?
 
I showed before that AMD is reducing R&D (despite Lisa Su claiming otherwise). I showed data up to the year 2009, but then some claimed that in previous years (not shown in the graph) AMD was very competitive with only a small percentage of Intel's R&D. I have found longer historical records that discredit that.

[Chart: AMD vs. INTC R&D spending]

[Chart: AMD vs. NVDA R&D spending]


AMD was competitive in the past only when its R&D was close to its competitors'. Just before the year 2000, AMD spent nearly half what Intel did. Today, AMD spends less than 10%.
 


I read some posts written by AMD ex-employees, and according to them Bulldozer looked good in 2006. Around 2009 they discovered that real performance was much worse than planned.
 


Yes, check my reply to Yuka for more details.



Agreed; the problem is that those unskilled x86 engineers made silly claims about the evolution of ARM, and now AMD is late to the party thanks to them, because their claims were taken seriously by management.

Keller has experience with A32, Thumb... I am not sure if he has experience with A64. Apple's Cyclone is not his baby.
 
Manufacturing, for one. For example, AMD doesn't even know the exact number of transistors due to the complexity. Also, I didn't mean GlobalFoundries, but I'm guessing that makes things harder for them as well, since they don't own their own foundries anymore.

Here is a link about getting the transistor count wrong, and it's only an estimate:

http://techreport.com/news/22100/amd-corrects-muffed-bulldozer-transistor-count

Now the architecture has gotten WAY more complex as well.

Also, as for Juan getting the ratio wrong: fine, it doesn't really mean anything; they still had 1/3 the budget. I'd rather take 1/3 the value of my house than 1/10.
 


The charts show that your previous claim that AMD was competitive in the past with 1/10 of the R&D was just invented. In 1997/98 Intel spent about $2 billion, whereas AMD was close to $1 billion, and in previous years the gap was even smaller (AMD spent about half).

Also, Intel then spent R&D on other markets too, whereas AMD was entirely focused on CPUs/chipsets.

The chart clearly shows a company in decline, and nobody (except you) believes that with 1/10 of the R&D, AMD will catch (much less surpass) Intel.
 


Exactly, obtaining a 20% IPC gain was much simpler then than now. The designs are more sophisticated now and require much more money and time.
 
Meh, not really. I could see Microsoft and Sony ultimately funding them to the bare minimum to keep them producing processors and graphics for their consoles, but as for PC... I think it's the end. They are holding on to teenagers using their parents' money to buy something like an FX 4300 or A6, and the nostalgic '90s user. The rest of us have moved on. i5s and i7s are the future. The bleak, grim, and blue future...
 


Dude, even Pentiums and i3's are the (budget) future.
 


1.) Streaming services are only great if you have the bandwidth, much of the world still does not.

2.) Irrelevant? Heard of a Nook? LOL...yeah.

3.) AT&T now used to be Cingular, which was Pacbell, which then bought up SWBell, NBell, SBell, EBell, etc. AT&T now, is what AT&T was before Antitrust in 1982. Also, I am old enough to remember when they busted up the AT&T monopoly. How savvy are you on current events? Since Cingular became AT&T several years ago, all of the Bell companies have been reconstituted under one roof.

4.) How is 500 mbps internet for half the cost not relevant?
 


They would rather switch to Nvidia. Nvidia now has some experience with APUs (Tegra), and some future ARM chip could be fine for consoles. Well, breaking compatibility wasn't that painful.
 
3.) AT&T now used to be Cingular, which was Pacbell, which then bought up SWBell, NBell, SBell, EBell, etc. AT&T now, is what AT&T was before Antitrust in 1982. Also, I am old enough to remember when they busted up the AT&T monopoly. How savvy are you on current events? Since Cingular became AT&T several years ago, all of the Bell companies have been reconstituted under one roof.

To be fair, Verizon did survive on its own. It's the only AT&T spinoff not to get gobbled up, though.
 


Loongson's creator. Rather too small to buy AMD, but I bet they would be happy to use AMD's resources to develop their CPUs.
 


Do you mean as a semicustom client?
 


The biggest chunk of Intel's R&D is manufacturing; if you want to be fair, you would add in GF's R&D post-2009. Intel also competes in fields like SSDs, Ethernet PHYs, wireless communication, Wi-Fi modules, etc., where AMD doesn't participate. You won't find publicly released numbers on what AMD vs. Intel spend on x86 R&D, but I'd be willing to bet it'd be something like the 1:2 or 1:3 that it's always been.
 


If you wanna talk about stocks:

AMD sells for 2.61 dollars per share
Intel sells for 35.81 dollars per share

Intel folds AMD almost fourteen times over. Unless AMD keeps increasing their share price like you say, I'd still want to be the CEO of Intel.
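For what it's worth, the "almost fourteen times" figure is just the ratio of the two quoted per-share prices; a quick check confirms it (note this compares share prices only, not market capitalization, which would also need the outstanding share counts):

```python
# Per-share prices quoted above (USD).
amd_price = 2.61
intel_price = 35.81

# Ratio of share prices; NOT a market-cap comparison, since the
# number of outstanding shares differs between the two companies.
ratio = intel_price / amd_price
print(f"{ratio:.1f}x")  # roughly 13.7x, i.e. "almost fourteen times"
```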

EDIT: Wait, never mind, I didn't see the R&D post you were responding to earlier.
 


Which company corresponds to this analysis?

The company's weaknesses can be seen in multiple areas, such as its deteriorating net income, disappointing return on equity, poor profit margins, generally high debt management risk and generally disappointing historical performance in the stock itself.

Answer: DMA si ynapmoc ehT
 


Adding GF doesn't close the 10x gap. Yes, you can subtract fields like SSDs, Ethernet PHYs, wireless communication, Wi-Fi modules... but those are peanuts compared to the enormous investment and time you need to design an 8-issue superscalar OoO x86 core with wide FP/SIMD units.

Jaguar is 2-issue. Haswell is 8-issue. 8/2 = 4, but Haswell is not merely 4x more complex and difficult to design, because complexity rises non-linearly with issue width. Thus Haswell is about 6--8x more complex and difficult to design.
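The 6--8x range is consistent with a simple power-law model of complexity versus issue width; the exponent values below are purely illustrative assumptions (not published figures), chosen to show how a 4x width ratio can turn into a 6--8x complexity ratio:

```python
# Toy model: assume design complexity grows as width**alpha, alpha > 1.
# The exponent range 1.3-1.5 is an illustrative assumption, not a
# measured value; it maps a 4x issue-width ratio to roughly 6-8x.
def complexity_ratio(wide_issue, narrow_issue, alpha):
    return (wide_issue / narrow_issue) ** alpha

low = complexity_ratio(8, 2, 1.3)   # about 6.1x
high = complexity_ratio(8, 2, 1.5)  # exactly 8.0x
print(f"{low:.1f}x to {high:.1f}x")  # prints "6.1x to 8.0x"
```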

Precisely the reason AMD pushed CMT and "moar cores" is that it couldn't design a core as complex as the cores Intel was designing, and that was in the past, when AMD's finances and R&D budget were in a safe state. And now the same will happen: Zen/K12 will be a simpler core than Skylake due to lack of funds.

If you want to split hairs, then you would also subtract from AMD's budget: Seattle, Cambridge, K12, Mantle, HSA, semicustom... The result would not change significantly: Intel spends much more R&D on designing CPUs than AMD, and as a consequence AMD will not catch Intel, much less surpass it.
 