AMD Drops 24% After Low Revenue And Guidance, Crypto Crash


I believe the phrase you want is "in the black". Red ink is (sometimes) used to indicate losses, while black ink would be used to indicate profits.
 

I think that's true of their investors and boards, who ultimately make the decisions. However, Lisa Su's husband is a gamer. Many of their employees are gamers and PC geeks.

So, I sort of agree. They are going to generally behave in profit maximizing ways, and competition is a good means of keeping that in check. But there's a lot about their products, SDKs, etc. where you can tell that people do care and have tried to put forth their best effort.
 

The consoles just have two Jaguar quad-core modules with a custom number of GCN cores. He wasn't talking about having the exact same hardware, but about the hardware architecture being the same as on desktop while also being completely different from Nvidia's.
The reason the new consoles got x86 cores was to do away with porting work; nowadays we're lucky if we even get proper keyboard support...
None of this matters much, though, because on Windows you're forced to code for DirectX unless you want the game to run only on the version of Windows that was current when it was written. All the support and compatibility comes from DirectX (and the drivers written for DirectX).
 

I wasn't totally sure, so just covering all bases.


I did touch on that. It doesn't seem that porting is truly a thing of the past, but perhaps it's easier. Anyway, I'm not convinced x86 was really about porting so much as reducing hardware development costs and developer learning curve.


All 3 GPU makers also support OpenGL and Vulkan on Windows. Of course, XBox One uses DirectX, while PS4 uses some Sony proprietary API.

A lot of console games probably use an engine that, itself, already supports all platforms.
 

It is based on the same hardware as a PC, just in a customized and incompatible configuration.


I wouldn't go that far. It was cheaper than totally custom hardware/CPUs, as the design was pretty much already done. As for porting, it's more a means of simplifying the porting process. Keyboard support is still a bit of a mix of hardware-vendor and software support. Microsoft is just starting to recognize the demand and step up its game in supporting PC input devices. Game vendors historically had no reason to support them outside of a port to PC or a PC-native game.


This is incorrect. Windows offers two other graphics APIs via hardware drivers: the older OpenGL, and what has been touted as its replacement, Vulkan. Both are available on Linux as natively as they are on Windows. DirectX is, of course, proprietary and, unless you use WINE, not available under Linux or macOS. Apple, of course, would rather you use their own API, although the others still exist for their OS.
 


Unfortunately, the chips AMD puts in the consoles are EXTREMELY low-margin business. Sony and Microsoft squeezed so hard that Intel literally said "not worth it" and chose not to even compete for the platform.


(Not to mention that these Jaguar SoCs are extremely custom, which is NOT cheap to do.)
 

They are semi-custom. They already had most of the building blocks before they started designing them. They made money on the custom PS4/Pro and XB1/XB1X chips. Profit is profit, and AMD needs all the profit they can get.

Intel meanwhile didn't have the ability to poop out competitive graphics for a SoC. Maybe by the next gen they'll be able to offer something competitive. Nvidia was in a similar boat only with CPU cores. Their latest designs are a lot faster, but now AMD has Zen so if there was another console SoC bid, I'm pretty darn sure a Zen-based AMD SoC would be favored over Nvidia's Carmel.
 


Intel owns plenty of graphics IP... they could have "pooped out" a console-caliber SoC if they wanted to.

And all profit is not equal, especially when you have a finite number of designers, compute-grid resources, and EDA licenses to spread across multiple business units and products, some of which are far MORE profitable than others. It's ridiculous how many consoles are out there, given how relatively little profit AMD sees from them.

Heck, even Microsoft and Sony basically treat the hardware as a loss leader... how much profit can there be in that scenario? A quick Google search suggests not much... but, as you say, for a fabless semiconductor outfit in desperate need of sockets, it probably did make sense for AMD.
 

XBox One X has more heavily-customized cores.


I wonder how they bill for this. My guess is they bill an upfront service charge for the design work, and then get some small royalty on each SoC made.


I've actually wondered if this is where Intel's IRIS Pro graphics originated. The timing is about right, and MS has long had a thing for on-chip video memory, which ties in nicely with the Intel GPU's 128 MB of eDRAM. The XBox 360 and original One both had something like that, which they finally ditched with the One X.


In the benchies I've seen, Carmel is oddly lagging behind the Denver 2 cores of its predecessor (Parker SoC; AKA Tegra X2) and not up to par with modern x86.

I don't know if the "pull" factors are there, for Nvidia to get into the console business, at this point. They're so focused on AI, I was starting to think they regarded graphics as a distraction (until they intro'd the RT cores, that is).

Even Nintendo Switch was a case of taking an existing off-the-shelf SoC (Tegra X1) and plunking it down. I'm not aware of any customization Nvidia did for them. The same won't happen with Xavier, which is decidedly AI-focused and not economical to use for any sort of games console.
 


Yeah, if NVIDIA is successfully moving Turing GPUs at the price of 2 or 3 entire consoles, then I think we know where the math will lie. And their progress getting those chips into the datacenter is no joke.
 
Owning IP isn't the same as having a good design. See Larrabee, Intel HD graphics. Intel lacks graphics talent... well, they did. They're snagging all the graphics guys they can, and are even building a graphics R&D facility up in North York. THEY know they can't just slap some IP into the Gruntmaster Design-O-Matic and summon forth a fast GPU. That's why they're finally getting serious and in a couple years we might see the fruits of their labor.

AMD made enough money to make it worth their while. Look at their semi-custom income over the past several years and add that together. They didn't even have to design a fresh CPU or GPU core for any of the console SoCs they've done in the past couple gens. It doesn't matter if MS or Sony break even or even lose money on the hardware. That has absolutely nothing to do with the money they paid AMD, a fabless designer, for the design (and probably a little extra per-SoC).

Looking at a block diagram I'd say the original Xbox One SoC was more of a custom effort. What with the wide DDR3 controller, the eSRAM, and various custom blocks. Said custom blocks seem to have carried over to the One X design (Scorpio) rather than being original to it. In fact other than those carry-over blocks, Scorpio is a lot more similar to the PS4 / Pro designs than the original XB1 design. At any rate none of the designs utilized new CPU or GPU architectures.


Possibly, but even the most elite Iris chip lags FAR behind what even the GPU in the original XB1 could do. Maybe they had a larger proposed design coupled with Atom cores... but I can still see why AMD's SoC would get the nod. Maybe in the future Intel will have a really amazing GPU design, but that didn't mean much back during the design phase of these consoles.


Ooof. I had seen the opposite in leaked Linux benchies, but either way none of it is remotely competitive with the top x86 designs, which was my point. They would have to spend a lot more effort to build something competitive than AMD, who already has all the major pieces in place. That is what makes it less worthwhile for Nvidia to pursue, if they could even do it in the allotted time.