AMD CPUs, SoC Rumors and Speculations Temp. thread 2

Page 48 - Tom's Hardware community
Status
Not open for further replies.
The key is whether he said dGPU or not. WCCFTECH doesn't give quotes; they only report their own interpretation of what he supposedly said.

Seeing that WCCFTECH claims in the same article that AMD told them that APUs will have a "fast discrete graphics chip", I guess Raja was describing the future multi-die APUs. Those future APUs break the die into components: one die for the GPU, another for the CPU, another for I/O,...



And I guess the guys at WCCFTECH got confused and believed that the discrete die on the interposer is a dGPU. It wouldn't be the first time the WCCFTECH guys have been confused about tech stuff.
 


Nobody in this thread said "soon". In fact, we said long run, around 2020 or so. Nobody said this will happen at the 16/14nm node this year. And Koduri was talking about Polaris products that will be released this year on a FinFET node.
 
So...anyone see the AMD earnings report? Revenue continues to tumble, even as AMD gets its costs under control. Which basically means AMD is saving money through layoffs, but that won't matter if people stop purchasing their product.

AMD better not be late with Zen.
 


Call me when the products are out the door earning significant revenue for AMD.

At this point, AMD is basically gutting itself to stay alive until Zen. Even if Zen is a success, I don't see how AMD can compete in the future with so many cuts and declining R&D spending.
 


AMD has said in various ways that they are putting R&D only into areas where they can get a monetary benefit. Does putting money into CPU and GPU R&D not count as a monetary benefit?
It just means that there won't be anything flamboyant from AMD. What you can expect from the cuts is that you won't be seeing AMD chips driving cars anytime soon, or products like the Project Quantum PC shown last year.
http://www.kitguru.net/components/graphic-cards/anton-shilov/amd-unveils-project-quantum-ultra-small-pc-with-ultra-high-performance/
 


But here's the thing: those "chips in cars" are just derived from products NVIDIA has already developed. You have a situation where NVIDIA is spending 33% more on R&D than AMD, and Intel is spending 100% more. I don't see a situation where AMD can continue to compete against both NVIDIA and Intel without driving itself into bankruptcy in the process.
 


The R&D cut is in reality affecting the whole product lineup. It is the reason for:

the freezing of Mantle and the cancellation of Mantle for Linux,
the cancellation of the Steamroller/Excavator Opteron CPUs,
the cancellation of the Skybridge project,
the two-year delay of Seattle,
the delay (cancellation?) of K12,
the cancellation of Mullins and other Puma+ products for 20nm,
the outsourcing of some parts of R&D (e.g. chipsets) to third-party companies,
the cancellation of tablet plans,
the end of investment in "Customized Process Technology",
the exiting of the SeaMicro business,
the common socket (AM4),
the generalized absence of HSA applications,
and why most of the 300 GPU series was a rebrand,...
 
All of what you mention was due to market conditions, not R&D cuts.

Mantle wasn't cancelled. It was birthed as twins and adopted into the world as Vulkan and DX12.
SR/EX Opterons were cancelled due to insufficient processes to fabricate them.
AM4 is a leftover from Skybridge. A common socket.
Seattle/K12 was delayed due to having to update the platform ecosystem.
20nm designs were cancelled due to weak 20nm process.
Tablet plans cancelled due to other companies flooding the market (free Intel and cheap ARM subsidized by governments).
Investing in customized processes is something no fabless company does.
Outsourcing to 3rd parties for design is something ALL companies do already.
Seamicro was doomed to failure because of the likes of HP, Supermicro and others. (Horrible purchase by Rory)
The 300 series was largely unchanged because 28nm is at its limit. (A minor re-spin was involved)
HSA is a slow ramp requiring the tool chains to be built and matured. (Rome wasn't built in a day)



 


Raja said the demand for dGPU will always be there. always > 2020

 


I think Juanrga has a point actually. There were lots of planned releases that didn't happen as AMD have moved all their remaining R&D resources onto a few core products.

On the other hand, I think it's also worth keeping in mind that with Intel and nVidia, both companies spend much of their R&D budget on areas outside of the core products. So it may not be so far-fetched for AMD to compete. Intel develops multiple cores for all sorts of different segments (for example, did you know they make a range of microcontrollers for embedded applications and have been pushing programmable ICs for wearable tech?), it also develops chipsets, graphics, and the Xeon Phi accelerators, not to mention its process tech, and then they have all sorts of random side projects going on.

nVidia is also developing in multiple markets; much of their recent increase in R&D spend is on custom ARM cores and SoCs, and on entering new markets (e.g. cars).

My point is that even with the reduction in AMD's R&D spending, they could well be matching Intel and NV in the areas they are choosing to compete in. What's been happening is that AMD have simply reduced the number of markets they're involved with. This actually strikes me as a smart move. Intel have to move sideways into new markets as they are already dominant in PCs and servers, and those markets (particularly client side) have peaked and are starting to decline (although I don't buy the 'PC is dead' Kool-Aid going around; I think it's more a case of a saturated market coupled with the fact that *some* older customers are making do with tablets for media consumption). The same is true for NV: they have to spend money on moving into new areas as they have the lion's share of the discrete GPU market on PC.

This actually provides an opportunity for AMD: they have such a small share of both markets that they can double down on competing there, and they have lots of room to grow. A few percentage points' swing in their favour in either market could make all the difference to their financials.
 


"Soon" includes 2020-2025, probably even to 2030-2035.

In other words...I meant soon to be a widecast net into the future.

I will likely be retired with grandchildren in college before dGPUs go anywhere at all.
 


AMD will be bankrupt by 2020 if Zen does not deliver
 


Probably. There's a lot of optimism built into their stock right now, but if Zen doesn't both turn a decent profit (enough to cover their debt payments) and gain market share, the investors will panic and bolt. AMD NEEDS to raise $600 Million by 2018. They are literally on a two-year timer to raise essentially half their company's total worth. That's a very tall order in the best of times.

Remember, Zen can also perform well, but AMD still needs to sell it at a profit. And that could be the key point here.
 


Are we talking stocks again already? What about the architecture?

The new cache system and its implementation are quite interesting; if they pull it off, that could potentially mean something along the lines of a 10% improvement in performance alone.
 


To be fair, it's not exactly like they could possibly make their cache latencies worse than they were in BD. I remember when the slides leaked with the latency numbers for the L1/L2 caches; even I disregarded them because there was no way they could possibly have been that bad.

So yeah, AMD could quite easily improve their cache by a decent percent. Because it was HORRID last time around.
 
So, let's say AMD does go bust... What happens then? Does Intel get split up in some kind of antitrust action, or is it given a monopoly on x86 because the rationale is it has competition from ARM etc?
 


Between ARM, POWER, SPARC, and MIPS there are plenty of alternatives to x86. Besides, the government doesn't mind monopolies. It only minds monopolies that abuse their position too greatly.
 


Yeah but then again we *are talking about Intel here*.... if any large firm has a proven track record of abusing their position it would be Intel 😛

One thing I do know: if AMD goes under, it's going to be bad news for consumers. We will progressively get less and less for the money. Thankfully, with the spin-off of the GPU side back into a discrete business, it looks likely that in the event AMD does go the way of the Titanic, at least we will keep the Radeon Group, which should help keep nVidia honest.
 
AMD will not die 100%. For example, like Juan said a long time ago, they will break up the company; we might even still have the Radeon team. Its x86 division might go away, though. Either way, let's wait. I actually think they might be OK if they can also get K12 out and find a market for it. Intel will simply never get rid of its ego when it comes to x86, meaning AMD might be able to make a really great ARM CPU for the server market.

 


This. X86 is actually a minority when it comes to CPU sales. There's nothing stopping you from running a Linux distribution on an alternative CPU architecture after all; you do it with your mobile devices all the time.

X86 is limited pretty much to the Desktop/Server space at this point. Controlling one market segment does not make one a monopoly.
 
When you think about it, there are only two providers of x86 chips, and one of them is at a risk of going away. Maybe that is the reason why Microsoft decided to build an ARM Windows kernel, to get out of this fragile situation.

And maybe that is why AMD is (?) developing an ARM chip, to free itself from this bad deal. If AMD eventually leaves x86 (at some point in the distant future), Intel is left alone in an abandoned market.

Maybe that is very heavy speculation, but it makes sense that other companies don't want x86. AMD may have delayed K12 because of financial issues, but it may be their future escape from their dependency on Intel.
 
X86 is frankly a really poor processor architecture; there's a reason it's sometimes called "the 30-year stop-gap". The problem is, all its replacements have failed in the marketplace, mostly due to lack of Windows support. Classic PowerPC was a very efficient architecture, but IBM ran into a clock speed wall. Itanium was probably the best chance to replace X86, but AMD's X86-64 implementation won the day by simply extending the instruction set.

Also remember that Win NT used to support several CPU architectures (X86, MIPS, DEC Alpha, PowerPC, and later Itanium and X86-64), but as those architectures failed in the market, they stopped being supported. Lack of SW support was another contributing factor.

An ARM build made sense in order to jump on the mobile bandwagon, as ARM is more competitive at the low end. Of course, the same SW support issues reared their head, making Win RT not very attractive compared to the larger SW library available on iOS/Android.

At the end of the day, the one common thread here is X86's legacy SW support. The performance hit doomed Itanium on the desktop, and lack of compatibility killed pretty much every other CPU arch that attempted to replace X86 in the desktop space. That's why I've always been pessimistic that K12 is going to do anything for AMD.
 


Considering the lack of competition from AMD recently, I think Intel has done well in giving us what improvements they can. Currently they are hitting walls in process and performance, and even with a more competitive AMD they wouldn't be able to go much faster.



It really was just a stop-gap. Intel had a whole other uArch they were going to come out with after the 8008; the 8086 was just the in-between until they could get the iAPX 432 uArch out. But it proved so popular that it stuck, and nothing they tried could get them out of it, even when they had a better uArch.

If anything, we can also blame AMD, because instead of trying to push a fully 64-bit uArch, they extended x86's life by adding 64-bit extensions to it.
 


Hmm, push a uArch you have a license to produce...or push a uArch you have no rights to produce...?

I think, from a business perspective, we can see why AMD did what they did.
 