Discussion: AMD's last hope for survival lies in the Zen CPU architecture



Remember, that is the reason why Ruiz got the boot in the first place. He sold all of the low power assets from the ATI purchase to Qualcomm and basically said "we're not interested in low power devices" at the time.

When they realized the mistake, it was too darn late.

Cheers!

EDIT: Typos.
 
I think this is the key bit: "On the graphics side, the company can compete in the higher end market but it doesn't really have much new to offer in mainstream and entry level."

A lot of people do own AMD GPUs; however, I suspect they're not upgrading every 1-2 years the way Nvidia owners are because, high end aside, AMD is just rebranding old chips. The money is in the mainstream, not the high end.

A question, though: does AMD actually make a profit in the console market? They do have that market sewn up.
 

AMD's semi-custom division (the one that handles console SoCs) is the only part of AMD that still makes a profit, but it earned only about $27M net in the last quarter while AMD as a whole lost roughly $140M.

AMD does make a profit from console chip sales (around $10 per chip) but it is nowhere near enough to offset AMD's losses everywhere else.
 
AMD is not doing very well on the GPU side of its business. As of mid-2015 it controls only 18% of the dedicated GPU market, down from 38% in mid-2014. In one year AMD lost about 53% of the dedicated GPU share it once held.

http://www.dualshockers.com/2015/08/21/amd-owns-only-18-of-graphics-card-market-share-nvidia-rises-over-the-80-mark/
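
Just to sanity-check that 53% figure (a rough sketch of the arithmetic, using only the numbers above): the absolute drop is 20 percentage points, but relative to the 38% they held it works out to roughly a 53% decline.

```cpp
#include <iostream>

int main() {
    // Dedicated GPU market share figures quoted above (percent).
    const double mid2014 = 38.0;
    const double mid2015 = 18.0;

    // Absolute drop in percentage points vs. relative drop in the share held.
    const double absolute_drop = mid2014 - mid2015;              // 20 points
    const double relative_drop = absolute_drop / mid2014 * 100;  // ~52.6%

    std::cout << "Absolute drop: " << absolute_drop << " points\n"
              << "Relative drop: " << relative_drop << "% of the share they had\n";
}
```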
 


The numbers I've heard are roughly half of that, about $5 in profit per chip sold, before factoring in R&D and production costs. So I'd imagine AMD is pretty close to break-even there. Embedded components typically aren't high-yield items.
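
Back-of-envelope, using only the figures quoted in this thread (the shipment volume below is an implication of those figures, not reported data): if semi-custom netted about $27M in the quarter, then $5-$10 of profit per chip implies somewhere around 2.7-5.4 million console SoCs shipped, assuming console chips make up essentially all of that segment.

```cpp
#include <iostream>

int main() {
    // Numbers quoted earlier in the thread. Assumption: console SoCs account
    // for essentially all of the semi-custom segment's profit.
    const double semi_custom_profit = 27e6;   // ~$27M net for the quarter
    const double profit_per_chip_hi = 10.0;   // ~$10/chip estimate
    const double profit_per_chip_lo = 5.0;    // ~$5/chip estimate

    // Implied quarterly chip volume for each per-chip estimate.
    std::cout << "Implied chips at $10/chip: "
              << semi_custom_profit / profit_per_chip_hi / 1e6 << "M\n"
              << "Implied chips at $5/chip:  "
              << semi_custom_profit / profit_per_chip_lo / 1e6 << "M\n";
}
```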
 
I wish ATI had continued to exist as its own entity. People talk about Intel and Nvidia trying to put a stranglehold on the market, yet Intel didn't go snatch up Nvidia. What's worse is when a company buys up the competition just to tank it a few years later. Now if AMD goes down, its CPUs and GPUs go down with it. Had ATI remained its own company, at least it might still be around if that happens. Sort of like Yahoo running around snatching up various web services just to turn them into failed mediocrities.
 

If AMD goes Chapter 7, its GPU IP could still be liquidated as a set of assets separate from everything else. The problem is: who would buy ATI? Intel has its own in-house IGP design, and ARM chip designers use either ARM's Mali or Imagination's PowerVR, except for Nvidia, which uses its own.

Microsoft and Sony might team up to keep a skeleton crew of AMD's SoC designers around long enough to see the XBO and PS4 through the remainder of their market lifespan if they want a 14-16nm shrink, or just buy the mask designs and have the existing chips fabbed for themselves.
 


Well, if they did a die shrink they would want an upgrade: a smaller process node, more SPUs, and a better CPU.
 

The whole point of consoles is that the specifications that matter from the software development side remain essentially identical through hardware revisions. They can do the die shrink, integrate multiple chips into one to save power, and make the system smaller and cheaper, but they cannot change fundamental performance characteristics by much before running into cases where software behaves noticeably differently between the old and new models.

If they add major hardware features, those likely won't be allowed for use in games, since that would break compatibility with the standard console. They could support things like hardware encode/decode for H.265 and VP9, for example, which has no effect on gameplay itself but would let streaming services and apps use more efficient video encoding for recording and playback.
 

Isn't GloFo the reason they're still at 28nm?
 

IIRC, as part of the fab spin-off, AMD has committed to buying some volume of GloFo's wafer starts. Unless they use up their wafer starts before contracting other chip foundries, they end up having to pay GloFo for downtime.
 


Pretty much this. Trust me, something as simple as changing the bus timings by a few ns would show up in many games; the entire point of consoles is that, because you have a fixed HW spec, you can code to a VERY low level. If that assumption is ever broken, you WILL see titles just stop working. There will NEVER be a faster CPU or GPU for the XB1/PS4; the best you'll get is a die shrink.
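
To make the fixed-spec point concrete, here's a contrived sketch (made-up numbers and stand-in functions, not real console code) of the kind of timing assumption a title can bake in once the hardware is guaranteed never to change:

```cpp
#include <chrono>
#include <iostream>
#include <thread>

// Hypothetical latency tuned against ONE exact hardware revision.
// Console code can hard-code a number like this because the spec is fixed.
constexpr auto kGpuCopyLatency = std::chrono::microseconds(350);

// Stand-ins for "start an async GPU/DMA copy" and "consume the result".
void kick_async_copy() { std::cout << "copy kicked\n"; }
void use_copied_data() { std::cout << "data consumed\n"; }

int main()
{
    kick_async_copy();

    // No fence or completion check: the code simply waits the latency the
    // original hardware was measured to have. If a revised console changed
    // bus/GPU timings, this window might no longer cover the copy and the
    // same binary would start reading a half-written buffer.
    std::this_thread::sleep_for(kGpuCopyLatency);

    use_copied_data();
}
```

On the exact hardware the game was tuned for, that fixed wait always works; shift the timings and the identical binary misbehaves, which is why the shipped spec can never really change.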
 
Regarding the GPU market: do you guys think that maybe they are banking on the supposed superiority of AMD GPUs over NVIDIA GPUs when it comes to DX12, what with the first DX12 games coming out soon-ish? Could that work?
 
DX12 won't be a magic pill, just like DX10.1 wasn't for the HD4800 series.

Game development has become a very complex mess from the middleware point of view, so when talking about any underlying API you have to think about all of the other stuff devs use to actually create a game and how it all interacts. Long story short: using UE with DX12 capabilities is not the same as building your own in-house engine on DX12.

AMD needs to get a new GPU arch soon, or at least cut the fat from GCN to be back in the game for real.

Cheers!
 

AMD's dwindling GPU market share says most people do not care about hypothetical gains in future games. People are more interested in actual performance in the games they already own or will be playing in the very near future.

All my desktop GPUs have been ATI/AMD, starting with a 1MB ATI VGA Wonder, but I have a feeling my next GPU might be Nvidia, in part to spite AMD for rehashing mostly the same GPUs under a confusing rebadging scheme that mixes slightly newer and 3-4 year old chips.
 


Even with the lower-level coding, I would think that x86 and DX on the XB1 would provide some flexibility in the hardware, since Windows 10 on the XB1 is supposed to be very close to the desktop version.

Of course I wouldn't know but I wouldn't count it as impossible.

Yuka, DX10.1 doesn't go nearly as deep as DX12 does, though. While it won't be some insane boost, it will give developers a lot more room to work with and will increase performance more than 10.1 ever would have.

GCN is not a bad uArch, but it hasn't really been fundamentally changed since its introduction, just minor tweaks to the power config here and there. A new uArch would be nice, or at least a much more power-efficient GCN, because GCN is very power hungry compared to Maxwell.
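
To put a rough shape on the "more room to work with" point above (purely a conceptual sketch with made-up types and functions, not the actual D3D12 API): the headline DX12-style gain is that draw submission can be recorded on several threads and then handed over in one go, instead of funnelling every call through a single driver-heavy thread the way DX11 does.

```cpp
#include <cstddef>
#include <string>
#include <thread>
#include <vector>

// Hypothetical "command list": in a real DX12-style engine this would be an
// API object; here it is just a list of recorded draw names for illustration.
using CommandList = std::vector<std::string>;

// Each worker records its own chunk of the frame independently.
CommandList record_chunk(int first_draw, int count)
{
    CommandList cl;
    for (int i = 0; i < count; ++i)
        cl.push_back("draw_" + std::to_string(first_draw + i));
    return cl;
}

int main()
{
    constexpr int kWorkers = 4;
    constexpr int kDrawsPerWorker = 250;

    std::vector<CommandList> lists(kWorkers);
    std::vector<std::thread> workers;

    // DX12-style: record in parallel, no single driver thread in the way.
    for (int w = 0; w < kWorkers; ++w)
        workers.emplace_back([&lists, w] {
            lists[w] = record_chunk(w * kDrawsPerWorker, kDrawsPerWorker);
        });
    for (auto& t : workers)
        t.join();

    // Then submit all recorded lists to the GPU queue in one cheap call
    // (stand-in for something like ExecuteCommandLists).
    std::size_t total = 0;
    for (const auto& cl : lists)
        total += cl.size();
    return total == kWorkers * kDrawsPerWorker ? 0 : 1;
}
```

Lower per-draw CPU overhead is also why weaker CPUs tend to close the gap under DX12.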



I haven't owned an nVidia GPU in a very long time, but I am in the same boat as you. Disappointing performance returns have made me actually consider an nVidia GPU, and I never thought I would, TBH.
 

Have you actually seen the performance of the 290X in the Fable Legends benchmark compared to its nVidia counterparts?
 
The Fury X is the oddball. There's a reason I mentioned the 290X and not the Fury X: compare the 290X directly with the performance of its competitors between DX11 and DX12.

You people jump to the worst example without analyzing what's really going on. Want to know why I did not take the Fury X into account? I'll gladly tell you...

From that same link. Notice something? No? Well... The Fury X has a higher framerate with an i3 than with an i5.

[Attached charts: 1080p results with an i5 and with an i3]


The i7 is even worse.
What does that tell you?
 
Looking at the Nano (which, like the Fury and Fury X, looks more like a proof of concept than a proper product), it seems DX12 and the improvements in Powertune were what AMD really needed to catch Maxwell. Without any significant arch changes, the Nano is very competitive in power efficiency with its GTX 970/980 Maxwell competitors. I'm not saying GCN couldn't use some updating, just that it has aged very well when you put these things into context. Keep in mind that GCN, a rather old arch at this point, completely closed the power efficiency gap once AMD got the Powertune software working right and gave it HBM. Under DX12, GCN might not even need HBM to be competitive with Maxwell in efficiency. I'd argue things are looking good for AMD on the graphics front in the next generation.
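
For context on what "getting the Powertune software working right" amounts to (a toy model, not AMD's actual algorithm): power capping is essentially a feedback loop that estimates board power each interval and nudges the clock down, or lets it drift back up, to stay inside a fixed budget.

```cpp
#include <algorithm>
#include <cstdio>

int main()
{
    // Toy numbers for illustration only; not real GPU telemetry.
    const double power_budget_w = 175.0;  // board power cap
    const double watts_per_mhz  = 0.17;   // crude linear power model
    double clock_mhz = 1050.0;            // current engine clock

    // Feedback loop: estimate power, then step the clock toward the cap.
    for (int tick = 0; tick < 8; ++tick) {
        const double est_power = clock_mhz * watts_per_mhz;
        if (est_power > power_budget_w)
            clock_mhz = std::max(300.0, clock_mhz - 25.0);   // throttle down
        else
            clock_mhz = std::min(1050.0, clock_mhz + 10.0);  // relax back up
        std::printf("tick %d: %.0f MHz, ~%.0f W\n", tick, clock_mhz, est_power);
    }
}
```

How aggressively that loop throttles, and how accurate the power estimate is, makes a big difference to sustained clocks, which is presumably a big part of where the Nano's efficiency numbers come from.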
 