News: Nvidia Announces ARM Acquisition for $40 Billion


Chung Leong

Which ultimately leads down the same road: higher total price and higher profits for Nvidia.

Is that a problem if you're getting more value out of your hardware? The RTX 3080 is pricey, sure, but you'll get your money's worth. That wasn't the case prior to the consolidation of the desktop GPU market. Back then, if you bought Nvidia's TNT, you missed out on all the games written for Glide. If you got a Voodoo, you eventually ended up with an obsolete piece of hardware. If you were bargain-minded and picked up an S3 product, you might as well have thrown your cash into a trash bin.
 

nofanneeded

Is that a problem if you're getting more value out of your hardware? The RTX 3080 is pricey, sure, but you'll get your money's worth. That wasn't the case prior to the consolidation of the desktop GPU market. Back then, if you bought Nvidia's TNT, you missed out on all the games written for Glide. If you got a Voodoo, you eventually ended up with an obsolete piece of hardware. If you were bargain-minded and picked up an S3 product, you might as well have thrown your cash into a trash bin.

The RTX 3080 is not pricey; it's on track. The RTX 3090 is the pricey card. But I still don't like Nvidia buying ARM... this will make all smartphones more expensive, knowing how Nvidia's business model works.
 

bit_user

Apple is killing it right now in performance/watt. So much wasted potential. If their chip division was its own company, it would probably be crushing the mobile market.
That makes my Nuvia quote all the more relevant. Perhaps you're not aware that Apple's lead CPU designer quit in order to found it? Of course that's also why Apple is suing them, but that's another matter.
 

bit_user

The licensees that will end up competing directly with Nvidia are all spinning their wheels in the sand anyway. ARM servers aren't really going anywhere.
I don't call Amazon "nowhere".

The problem ARM had, for so long, was their approach of designing cores that prioritized a low power envelope. That's what held them back in server usage scenarios, which are still more perf/W-sensitive than desktops, but not to the same degree as a cell phone SoC.

In the past few years, they've created a separate product segment that's focused purely on higher-power usage scenarios, like cell phone base stations and servers. We've only begun to see those initiatives bear fruit.

There was some market interest a few years ago, when AMD was on the verge of bankruptcy. Now that healthy competition has returned to the x86 platform, that interest is waning.
x86 is still lacking long-term health, due to its inherent inefficiency (see the Nuvia perf vs. W chart from the part of my post you didn't quote). ARM just needed to get into the ballpark, and I think with Graviton 2, they've probably arrived at the gates.

Nvidia doesn't compete with the larger, more successful licensees.
I meant current & likely-future licensees, among which I count Intel and AMD.

In any event, conflict of interest is hardly an unmanageable problem. AMD has gotten pushed around by Intel for years. It's doing pretty well.
AMD was in a position where it stood more to gain by staying in x86 vs. going elsewhere. The question is how many markets are like that for ARM. I think server isn't there, yet. This move could really hurt ARM's efforts to further penetrate that market.
 

bit_user

it will make more sense for everybody to make the jump over to x86. Making smartphones compatible with Windows software would be a great selling point as well, and x86 has no problem running Android at no speed penalty.
Intel already tried x86-based smartphone chips and failed miserably. x86 just doesn't have the power-efficiency.
 

bit_user

I think that this may have been as big a blunder as when AMD bought ATi
I'm not saying there weren't problems with that acquisition and how it was managed, but you seem to be missing that laptops and other markets requiring an integrated GPU are now an essential part of AMD's product portfolio. And you don't get that without either acquiring a GPU maker or spending billions to create your own, which inevitably won't be as good (see how long it's taken Intel to catch up).

See also: game consoles. That wouldn't have happened for AMD without ATI being well-integrated. That stable revenue stream saw AMD through some pretty dark times.
 

bit_user

When AMD bought ATI in 2006 it led to 3 years of catastrophically high losses for AMD
It's funny how you pin that squarely on the ATI acquisition, completely ignoring the fact that Intel launched Core 2 around then. The Pentium 4 years were the last time AMD actually had the leading-edge performance, so of course they had nowhere to go but down.

while Intel just slowly developed an iGPU.
Yeah, iGPUs that have maybe finally caught up, after like 1.5 decades? Brilliant!
 

bit_user

You forgot Apple.
It was a list of companies with both means and motive. I think Apple lacks the motive to justify the cost. They just want the ARM ISA and don't really want to be in the merchant chip business, so it has limited strategic value for them. All they'd get is eliminating a potential pricing issue for licensing the ISA, and $40B is much too high an up-front price to pay for that.
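
Back-of-the-envelope, the math is brutal. Here's a minimal sketch, where the annual license cost is a made-up placeholder (Apple's actual architecture-license fees aren't public):

```python
# Rough payback arithmetic. The license-fee figure is a
# hypothetical placeholder, not a real number.
purchase_price = 40e9                 # reported deal size
assumed_annual_license_cost = 100e6   # invented for illustration

payback_years = purchase_price / assumed_annual_license_cost
print(f"Payback: {payback_years:.0f} years")  # -> Payback: 400 years
```

Even if the real fees were an order of magnitude higher, the deal wouldn't come close to paying for itself on licensing savings alone.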

I was hoping MS or Google would step in, but I think we hit the same problem of insufficient motive. Still, as both are platform providers, ARM would've been in fairly good hands. Not that I love either of those companies, but I rate them well above Nvidia.
 

bit_user

There's zero reason why they couldn't pick one supplier and say "We'll give you 100% of all mfg, if you give us a steep discount to make OEM cards for us with our branding." To a big mfg (like Sapphire, EVGA, Foxconn, etc...) that's a workable deal: while they make less per unit, they capture a majority of the market, and NVIDIA bears the risk of selling them, thus netting a higher profit while also handicapping their competition.
FWIW, their Quadro-branded cards are made by PNY. Not sure about the Founders or datacenter cards, though.
 

bit_user

What the hell did Intel and Nvidia know about their own interest?
It's a standard supplier dynamic: you want to maximize profit, which is the product of volume and per-unit margin. If you try to pad your margins too much, your customers' volumes will drop and you'll end up with less profit overall.

So, it's a reasonable question to ask whether MS' suppliers were being too narrowly focused when they squeezed it on margins. Not that I'm taking a position on the matter, but just explaining @digitalgriffin's point, as I see it.
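
To make that concrete, here's a toy model; the linear demand curve and every number in it are invented purely for illustration:

```python
# Toy supplier model: profit = volume * margin, where volume
# falls as the per-unit margin rises (made-up linear demand).

def annual_profit(margin, base_volume=1_000_000, elasticity=15_000):
    volume = max(base_volume - elasticity * margin, 0)  # units sold
    return volume * margin

for margin in (20, 40, 60, 80):
    print(f"${margin}/unit margin -> ${annual_profit(margin):,.0f} profit")
```

Past a certain point, squeezing harder on margin shrinks the supplier's own total take.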
 

bit_user

Whelp it's time for a few smart people to work on putting some "open source" CPU intellectual property together.
There are some open source RISC-V cores. But you can't build entire chips with just open source IP, due to things like memory controllers (there's a company that tried, and that's what they ran up against).

The problem with chips is that while they have a lot in common with software, the designs that are cutting-edge in performance or efficiency need to be closely matched with a manufacturing process. And that involves large amounts of time, money, and energy that you need to recoup from that single generation. Also, just getting any chips made involves significant up-front costs and a few respins.

So, the economics of semiconductors is what really constrains the industry to proprietary designs. The closest we've really gotten is open ISAs (like ARM) and even royalty-free ISAs, like RISC-V, SPARC, and recently even POWER.

That said, IBM recently gave away one of its efficiency-oriented POWER cores.
 
It's funny how you pin that squarely on the ATI acquisition, completely ignoring the fact that Intel launched Core 2 around then.
If AMD hadn't put all of their assets, plus 2 billion in debt on top of that, into acquiring ATi, they might have been able to make a better CPU that could compete with Core 2.
It's funny how you accuse AMD of just not being able to compete with Core 2 at all.
Yeah, iGPUs that have maybe finally caught up, after like 1.5 decades? Brilliant!
It was brilliant because the iGPUs didn't have to be any good; they just had to be able to display the desktop, so that OEMs wouldn't have to pay for chipset graphics and could save a few cents on the mobos. Instant favor towards making more models with Intel.
 

bit_user

Nvidia is not going to jack up prices of ARM licenses. What they'll do is make it more difficult to pair an ARM CPU with a non-Nvidia GPU.
That is precisely the kind of thing which gets the attention of regulators and attracts lawsuits. Plus, it's hard to see how you even do that.

What they can do is offer bundle deals for ARM cores and Nvidia GPU IP. I assume ARM already does that with its Mali GPUs.

They can also offer discounts on things like NVLink IP, so that more CPUs end up with support for it.
 

bit_user

If AMD hadn't put all of their assets, plus 2 billion in debt on top of that, into acquiring ATi, they might have been able to make a better CPU that could compete with Core 2.
They were already behind, by that point. The CPU you're selling today is the one you started designing 3-4 years ago.

Also, don't forget about fabs. Until their 14 nm troubles, Intel had long held a fab advantage.

It's funny how you accuse AMD of just not being able to compete with Core 2 at all.
But it's true! It took them until Phenom II to release a CPU that could finally hold its own against Core 2, but Intel was already onto Nehalem by then, and Sandy Bridge not long after.

It was brilliant because the iGPUs didn't have to be any good; they just had to be able to display the desktop
For a lot of uses, sure. However, until recently, graphics was literally the only selling point AMD APUs even had. Take that away, and they're basically out of the market, like what happened with AMD in servers.
 

Chung Leong

So, it's a reasonable question to ask whether MS' suppliers were being too narrowly focused when they squeezed it on margins. Not that I'm taking a position on the matter, but just explaining

AMD nearly went bankrupt while holding both the PlayStation and Xbox contracts. Sony was making record-breaking profits off software sales. Microsoft was making record-breaking profits off software sales. Did AMD see any of that? Nope. An appeal to some sort of common interest in this cut-throat business is just nonsense.
 

bit_user

AMD nearly went bankrupt while holding both the PlayStation and Xbox contracts. Sony was making record-breaking profits off software sales. Microsoft was making record-breaking profits off software sales. Did AMD see any of that? Nope.
Right. Because, as the saying goes: "once bitten, twice shy."

Microsoft obviously learned from its experience with the 1st gen Xbox. They were careful to structure the agreement for the Xbox One so that AMD did the design work under contract, but Microsoft owned and controlled the resulting IP.

I believe the structure of Sony's agreement for PS4 was equivalent.

An appeal to some sort of common interest in this cut-throat business is just nonsense.
It's not nonsense - it's a real thing. If your customer sells fewer systems, that volume translates directly back to your own parts sales.

One problem is that when both Intel and Nvidia were squeezing MS, it was like a zero-sum game. Neither could afford to be nice, because the other would eat their lunch. So, perhaps they were engaged in a sort of brinkmanship that might not have happened if either had been the sole supplier to MS.

Perhaps that was another selling point in favor of MS and Sony both opting to get their CPU and GPU IP from the same supplier.
 

Chung Leong

It's not nonsense - it's a real thing. If your customer sells fewer systems, that volume translates directly back to your own parts sales.

It's complete and utter nonsense. We all know how the console market works: machines are sold at a loss so money can be made through software. Volume at zero or negative margin is not something you want unless you're desperate.
 

bit_user

It's complete and utter nonsense. We all know how the console market works: machines are sold at a loss so money can be made through software.
That still doesn't mean console makers are bottomless money pits. They have projections of how parts costs will change over time, and they probably bake those into the financial plans that underlie their support for certain retail price points.

Also, break-even is not a business plan. Sony and MS will have certain margin and profit expectations to which investors will hold them accountable. So, even when they're making bank on software and taking a loss on hardware, they still need to limit those losses. Otherwise, they could practically give away the HW to ensure market domination.

BTW, when I was a kid, I had no more than 8 or so console games. The rest I'd rent or borrow from friends. As a young adult, I mostly shopped bargain bins and eBay (mostly used) for my PS 1/2/3 titles. I only ever paid the full, new sticker price on a couple of PS3 games. So, not every console owner is a bottomless source of software revenue. There are limits to what consumers can and will spend.
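
Here's a rough sketch of that constraint, with every figure invented for illustration (real per-unit losses and royalty rates vary by generation and aren't public):

```python
# Toy console economics: the per-unit hardware loss has to be
# covered by software royalties, so the attach rate bounds how
# big a loss the maker can tolerate. All numbers are made up.

hw_loss_per_console = 100   # dollars lost on each unit sold
royalty_per_game = 10       # platform cut per game sold
games_per_console = 8       # lifetime attach rate of a light buyer

net = games_per_console * royalty_per_game - hw_loss_per_console
print(f"Net per console: ${net}")  # -> Net per console: $-20
```

A light buyer like that never pays back the hardware subsidy, which is exactly why those losses have to be bounded.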
 
They were already behind, by that point. The CPU you're selling today is the one you started designing 3-4 years ago.
I'm not necessarily talking about a better CPU, just one good enough to compete. If they'd had any money left during that time, they could have come up with at least something.
Also, don't forget about fabs. Until their 14 nm troubles, Intel had long held a fab advantage.
Intel held the fab advantage on the desktop the whole time, until AMD switched to TSMC's 7 nm node, which only happened with Zen 2 last year.
Since Intel hasn't released 10 nm on desktop, we can't be sure whether they still hold it; we can only be relatively sure about the nodes they have actually released.
Intel's 10 nm is quite a bit better than TSMC's HPC node, but until Intel releases it for desktop, it's still up in the air somewhat.
https://www.techcenturion.com/7nm-10nm-14nm-fabrication
For a lot of uses, sure. However, until recently, graphics was literally the only selling point AMD APUs even had. Take that away, and they're basically out of the market, like what happened with AMD in servers.
I wasn't talking about users. OEMs can pick any Intel desktop CPU at all (instead of a very small selection of much more expensive CPUs) and make a system out of it without having to provide a GPU.
 

GetSmart

If anyone wants a rough figure for console margins, look no further than AMD's EESC (Enterprise, Embedded, and Semi-Custom) division results for the past few years. That division is responsible for console chip development and sales. Compare the operating income versus the revenue. :unsure:
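
That comparison is a one-line calculation. The figures below are placeholders, not AMD's actual segment numbers; substitute the real ones from AMD's quarterly reports:

```python
# Operating margin = operating income / revenue. Both figures are
# hypothetical placeholders; pull the real ones from AMD's EESC
# segment reporting to get the actual ratio.

eesc_revenue = 1.0e9            # placeholder quarterly revenue
eesc_operating_income = 0.08e9  # placeholder operating income

print(f"Operating margin: {eesc_operating_income / eesc_revenue:.1%}")  # 8.0%
```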
 
If anyone wants a rough figure for console margins, look no further than AMD's EESC (Enterprise, Embedded, and Semi-Custom) division results for the past few years. That division is responsible for console chip development and sales. Compare the operating income versus the revenue. :unsure:

It's slim. It's very slim. BUT it's an important source of income and it pays for new IP development.
 
I'm not saying there weren't problems with that acquisition and how it was managed, but you seem to be missing that laptops and other markets requiring an integrated GPU are now an essential part of AMD's product portfolio. And you don't get that without either acquiring a GPU maker or spending billions to create your own, which inevitably won't be as good (see how long it's taken Intel to catch up).

See also: game consoles. That wouldn't have happened for AMD without ATI being well-integrated. That stable revenue stream saw AMD through some pretty dark times.
I think you misunderstood me. I didn't mean that AMD's acquisition of ATi was a mistake in itself; I meant that the WAY it was done was a mistake. AMD paid something like triple ATi's value because ATi didn't want to be bought out. THAT was the mistake, not the acquisition itself. I'm sorry if I wasn't more clear.