I wonder if this has anything to do with SoftBank needing to somehow cover the losses on their WeWork investments.
> I wonder if this has anything to do with SoftBank needing to somehow cover the losses on their WeWork investments.

That, and their stuff-up with Uber.
Which ultimately leads down the same road: higher total price and higher profits for Nvidia.
Is that a problem if you're getting more value out of your hardware? The RTX 3080 is pricey, sure, but you'll get your money's worth. That wasn't the case before the desktop GPU market consolidated. Back then, if you bought Nvidia's TNT, you missed out on all the games written for Glide. If you got a Voodoo, you eventually ended up with an obsolete piece of hardware. And if you were bargain-minded and picked up an S3 product, you might as well have thrown your cash into a trash bin.
> Apple is killing it right now in performance/watt. So much wasted potential. If their chip division was its own company, it would probably be crushing the mobile market.

That makes my Nuvia quote all the more relevant. Perhaps you're not aware that Apple's lead CPU designer quit in order to found it? Of course, that's also why Apple is suing them, but that's another matter.
> The licensees that will end up competing directly with Nvidia are all spinning their wheels in the sand anyway. ARM servers aren't really going anywhere.

I don't call Amazon "nowhere".
> There was some market interest a few years ago, when AMD was on the verge of bankruptcy. Now that healthy competition has returned to the x86 platform, that interest is waning.

x86 is still lacking long-term health, due to its inherent inefficiency (see the Nuvia perf vs. W chart from the part of my post you didn't quote). ARM just needed to get into the ballpark, and I think with Graviton 2, they've probably arrived at the gates.
> Nvidia doesn't compete with the larger, more successful licensees.

I meant current & likely-future licensees, among which I count Intel and AMD.
> In any event, conflict of interest is hardly an unmanageable problem. AMD has gotten pushed around by Intel for years. It's doing pretty well.

AMD was in a position where it stood more to gain by staying in x86 vs. going elsewhere. The question is how many markets are like that for ARM. I think server isn't there, yet. This move could really hurt ARM's efforts to further penetrate that market.
> It will make more sense for everybody to make the jump over to x86. Making smartphones compatible with Windows software will be a great selling point as well, and x86 has no problem running Android at no speed penalty.

Intel already tried x86-based smartphone chips and failed miserably. x86 just doesn't have the power-efficiency.
> I think that this may have been as big a blunder as when AMD bought ATi.

I'm not saying there weren't problems with that acquisition and how it was managed, but you seem to be missing that laptop and other markets requiring an integrated GPU are now an essential part of AMD's product portfolio. And you don't get that without either acquiring a GPU maker or spending $Billions to create your own, which inevitably won't be as good (see how long it's taken Intel to catch up).
> Microsoft doesn't really have a dog in this fight.

Microsoft has invested a huge amount in Windows on ARM, at this point. I'm sure they'd rather not see the industry take a left turn to RISC-V or POWER, right now.
> When AMD bought ATI in 2006, it led to 3 years of catastrophically high losses for AMD.

It's funny how you pin that squarely on the ATI acquisition, completely ignoring the fact that Intel launched Core 2 around then. The Pentium 4 years were the last time AMD actually had leading-edge performance, so of course they had nowhere to go but down.
> ...while Intel just slowly developed an iGPU.

Yeah, iGPUs that have maybe finally caught up, after like 1.5 decades? Brilliant!
> You forgot Apple.

It was a list of companies with both means and motive. I think Apple lacks the motive to justify the cost. They just want the ARM ISA and don't really want to be in the chip business, so it has limited strategic value for them. All they'd get is eliminating a potential pricing issue for licensing the ISA, and $40B is much too high an up-front price to pay for that.
> There's zero reason why they couldn't pick one supplier and say, "We'll give you 100% of all manufacturing if you give us a steep discount to make OEM cards for us, with our branding." To the big manufacturers (like Sapphire, EVGA, Foxconn, etc.), that's a workable deal: while they make less per unit, they capture a majority of the market, and Nvidia bears the risk of selling the cards, thus netting a higher profit while also handicapping its competition.

FWIW, their Quadro-branded cards are made by PNY. Not sure about the Founders or datacenter cards, though.
> What the hell did Intel and Nvidia know about their own interest?

It's a standard supplier dynamic: you want to maximize revenue, which is the product of volume * margins. If you try to pad your margins too much, then your customers' volumes will drop and you'll get less revenue.
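As a toy sketch of that tradeoff: the demand curve, `base_volume`, `slope`, and `cost` below are all made-up illustrative numbers, not real figures from any chip supplier.

```python
# Toy model of the supplier dynamic described above.
# Assumes a simple linear demand curve -- purely illustrative,
# not actual Intel/Nvidia/AMD data.

def gross_profit(unit_margin, base_volume=1000.0, slope=4.0, cost=100.0):
    """Total profit = per-unit margin * volume, where volume falls as price rises."""
    price = cost + unit_margin                       # padding the margin raises the price
    volume = max(0.0, base_volume - slope * price)   # higher price -> fewer units sold
    return unit_margin * volume

# Past a point, fatter margins shrink volume enough to reduce total profit:
for m in (25, 75, 125):
    print(m, gross_profit(m))
```

With these toy numbers, total profit peaks at a moderate margin and falls off on either side, which is exactly the "pad your margins too much and volume drops" effect being described.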
> Whelp, it's time for a few smart people to work on putting some "open source" CPU intellectual property together.

There are some open-source RISC-V cores. But you can't build entire chips with just open-source IP, due to things like memory controllers (there's a company that tried, and that's exactly what they ran up against).
> It's funny how you pin that squarely on the ATI acquisition, completely ignoring the fact that Intel launched Core 2 around then.

If AMD hadn't put all of their assets, plus 2 billion in debt on top of that, into acquiring ATI, they might have been able to make a better CPU that could compete with Core 2.
> Yeah, iGPUs that have maybe finally caught up, after like 1.5 decades? Brilliant!

It was brilliant because the iGPUs didn't have to be any good; they just had to be able to display the desktop, so OEMs wouldn't have to pay for chipset graphics and could save a few cents on the mobos. Instant favor towards making more models with Intel.
> Nvidia is not going to jack up prices of ARM licenses. What they'll do is make it more difficult to pair an ARM CPU with a non-Nvidia GPU.

That is precisely the kind of thing which gets the attention of regulators and attracts lawsuits. Plus, it's hard to see how you even do that.
> If AMD hadn't put all of their assets and 2 billion in debt on top of that into acquiring ATI, they might have been able to make a better CPU that could compete with Core 2.

They were already behind by that point. The CPU you're selling today is the one you started designing 3-4 years ago.
> It's funny how you accuse AMD of just not being able to compete with Core 2 at all.

But it's true! It took them until the Phenom II to release a CPU that could finally hold court with Core 2, but Intel was already onto Nehalem by then, and Sandy Bridge not long after.
> It was brilliant because the iGPUs didn't have to be any good, they just had to be able to display the desktop

For a lot of uses, sure. However, until recently, graphics was literally the only selling point AMD APUs even had. Take that away, and they're basically out of the market, like what happened with AMD in servers.
So, it's a reasonable question to ask whether MS was being too narrowly focused by squeezing its suppliers on margins. Not that I'm taking a position on the matter; I'm just explaining.
> AMD nearly went bankrupt while holding both the PlayStation and Xbox contracts. Sony was making record-breaking profits off software sales. Microsoft was making record-breaking profits off software sales. Did AMD see any of that? Nope.

Right. Because, as the saying goes: "once bitten, twice shy."
> An appeal to some sort of common interest in this cut-throat business is just nonsense.

It's not nonsense; it's a real thing. If your customer sells fewer systems, that volume drop translates directly back to your own parts sales.
> It's complete utter nonsense. We all know how the console market works. Machines are sold at a loss so money can be made through software.

That still doesn't mean console makers are bottomless money pits. They have projections of how much parts costs will change over time, and probably bake that into the financial plans that underlie their support for certain retail pricing levels.
> They were already behind, by that point. The CPU you're selling today is the one you started designing 3-4 years ago.

I'm not talking about necessarily better, just good enough to compete. If they'd had any money left during that time, they could have come up with at least something.
> Also, don't forget about fabs. Until their 14 nm troubles, Intel had long held a fab advantage.

Intel held the fab advantage on the desktop the entire time, until AMD switched to TSMC's 7 nm node, which only happened with Zen 2 last year.
> For a lot of uses, sure. However, until recently, graphics was literally the only selling point AMD APUs even had. Take that away, and they're basically out of the market, like what happened with AMD in servers.

I wasn't talking about users. OEMs can pick any Intel desktop CPU at all (instead of a very small selection of much more expensive CPUs) and make a system out of it without having to provide a GPU.
If anyone wants a rough figure for console margins, look no further than AMD's EESC (Enterprise, Embedded, and Semi-Custom) division results for the past few years. That division is responsible for console chip development and sales. Compare the operating income versus the revenue.
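The comparison above is just operating income divided by segment revenue. A minimal sketch, using placeholder numbers rather than AMD's actual EESC results:

```python
# Hypothetical figures for illustration only -- the real numbers are in
# AMD's quarterly/annual reports under the EESC segment.

def operating_margin_pct(operating_income, revenue):
    """Segment operating margin as a percentage of segment revenue."""
    return 100.0 * operating_income / revenue

# e.g. a hypothetical $150M operating income on $1,500M of segment revenue
print(operating_margin_pct(150, 1500))
```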
> I'm not saying there weren't problems with that acquisition and how it was managed, but you seem to be missing that laptop and other markets requiring an integrated GPU are now an essential part of AMD's product portfolio. And you don't get that without either acquiring a GPU maker or spending $Billions to create your own, which inevitably won't be as good (see how long it's taken Intel to catch up).

I think that you misunderstood me. I didn't mean that AMD's acquisition of ATi was a mistake in itself; I meant that the WAY it was done was a mistake. AMD paid something like triple ATi's value because ATi didn't want to be bought out. THAT was the mistake, not the acquisition itself. I'm sorry if I wasn't more clear.
See also: game consoles. That wouldn't have happened for AMD, without ATI being well-integrated. That stable revenue stream saw AMD through some pretty dark times.