News Firm Estimates Intel's GPU Unit Losses at $3.5 Billion, Suggests Sell Off

Yes, that should have said pink slip.

Certainly Raja doesn't personally hire every person below him, but he absolutely has input and say on critical hires, if not the outright last word. And while I agree that not all of this is "easy", when you've spent five years and billions of dollars to do a thing, you'd better be capable of getting that thing done, because somebody is eventually going to get tired of the hemorrhaging money and the lack of significant results. Shareholders don't want to hear excuses. They want to see results. And while Intel is doing fine in other areas, we've clearly seen it flat out wipe the pieces off the board for other departments and ventures in the past, so it truly wouldn't be that surprising to see it do something to reset the game board here.

Or not. As you say, we don't really know what's going on behind the scenes. I think we're likely to find out before too long though.
 
If Intel wanted to sell off their fabs, why would they be building even more?
They need fab capacity to build their upcoming products, and the world doesn't exactly have spare capacity, if you haven't noticed. The idea would be that they keep investing in the business, but eventually IFS becomes its own entity and Intel becomes merely a customer.

Also, newer process nodes generally require new fabs. Sometimes, they retrofit old fabs for newer nodes, but that means taking them offline and perhaps they didn't have enough legacy nodes that weren't in demand.

It's an expansion to their business, just like Arc is. If they wanted to get rid of the fabs, they would downsize that department and make it more viable for a potential buyer.
No, nobody wants a semiconductor fab that's been gutted and stripped of its ability to compete going forward. And Intel still needs those fabs for producing some of its products. So, it really has no choice but to continue investing in them.
 
  1. Why?! Because otherwise all the people that need x86 PCs are going to stop needing them? Even if all of TSMC's output of the relevant archs were used for making AMD CPUs (which would never happen), how much of the market would they be able to supply?
  2. Intel has had AI on their iGPUs for like 5 years now. I don't know about datacenter, but it stands to reason that they would have it there as well.
The less competitive Intel's Xeons are, the lower they have to price them, which eats directly into their profits. We saw this with Ice Lake SP, where it was the first time in probably well over a decade that Intel introduced a newer generation of server CPUs with a lower price structure than what they replaced. Also, they did less market segmentation, in terms of features like AVX-512. In prior generations, lower-priced models had only one AVX-512 FMA enabled per core, even though the silicon had 2. In Ice Lake SP, all models had both enabled.

As for the iGPUs, the Xeon "Scalable" processors don't have those. Intel's plan was to use Xeon Phi for AI (see Knights Mill), but it got cancelled. So, then they added some specialized data types to AVX-512 (see Cooper Lake), and then AMX in Sapphire Rapids.

In the meantime, they bought Nervana (and killed it), bought Habana (to replace Nervana), and launched the dGPU initiative (including PVC).

Low-margin stuff pays for the high-end stuff; all the Celerons and Pentiums make it worth it to make CPUs.
They only really help if they sell in high-volume and with reasonable margins. However, Intel's low-end products are under threat from ARM, in the Chromebook market.

Optane failed because the customer-level stuff wasn't selling enough.
No, it failed because the server products weren't selling enough. You seem to think high-end isn't where the volume is at, but I guess you haven't heard of The Cloud or Hyperscalers.

You might want to spend some time looking over Intel's financials and see which business units bring in how much money. It might cause you to revisit a few of your assumptions.
 
If AMD is gaining market share then that's great, but if they only sell to businesses that wouldn't buy Intel anyway, then how is Intel losing market share?
Why on earth do you think AMD is only winning accounts that would refuse to go with Intel? That is quite a claim!

Anyway, let's say AMD is gaining market share by soaking up the expansion in the market, leaving Intel with similar absolute numbers to what it had before. That's still bad for Intel, because % matters. If Intel isn't growing its revenue at least as fast as the overall market, then it's losing market share! And that's a trend that could accelerate, because with dominance comes influence.
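To put some rough, purely made-up numbers on that (a minimal sketch; the shipment figures below are invented for illustration, not actual Intel or market data):

```python
# Illustrative only: invented shipment numbers, not real Intel/market figures.
market_before = 100   # total shipments in some quarter
intel_before = 60     # Intel's shipments that quarter

market_after = 120    # the overall market grows 20%
intel_after = 60      # Intel ships the same absolute number as before

share_before = intel_before / market_before   # 0.60
share_after = intel_after / market_after      # 0.50

print(f"Intel share: {share_before:.0%} -> {share_after:.0%}")
# Intel share: 60% -> 50%
```

Same unit volume, yet ten points of share gone, and share is what drives pricing power and influence over the ecosystem.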
 
Got to love those Wall Street analysts. They'll do or say anything that will make a short-term buck.
They're an independent market research firm. I don't exactly know what their angle is here, but at least it's not like those analysts who work for big investment banks that are themselves trading in the stocks their analysts are rating.
 
Would've been cheaper to buy Imagination Technologies Group than follow Raja's delusional money pit.
Well, Imagination didn't exactly have dGPUs, or at least competitive ones, in the recent past.

On top of that, you'd be talking about trying to integrate a completely different company, culture, tooling, and architecture into Intel. And I'm sure Intel really wants the same basic GPU architecture from their iGPUs all the way to their datacenter products, because it lets them share lots of driver code and other parts of their software stack. So, an acquisition like that would've made zero sense, IMO.

So, it's really Apple who should've bought Imagination. I think they probably would've, except that Imagination (probably) just asked too much and Apple didn't want to set a precedent for any of their other suppliers.
 
Why on earth do you think AMD is only winning accounts that would refuse to go with Intel? That is quite a claim!

Anyway, let's say AMD is gaining market share by soaking up the expansion in the market, leaving Intel with similar absolute numbers to what it had before. That's still bad for Intel, because % matters. If Intel isn't growing its revenue at least as fast as the overall market, then it's losing market share! And that's a trend that could accelerate, because with dominance comes influence.
I'm just saying that they are used for different things: you can't use ARM or a GPU for something that needs x86, and there are things where Ryzen makes much more sense and things where Intel makes much more sense.

We are not still back in 1999, when number crunching was the only thing a data center would do, even though most people here seem to think so. It's still a big part of it, but nowhere near the only thing. All the major players are going to have a mix of all the different available techs, because they will have a need for them.

If AMD is great at something Intel isn't chasing after, then the % and influence don't change, because they are parallel markets and not the same one, even if they are both selling to data centers.
 
ATI/AMD has had awful drivers for 20+ years. If it was a simple case of hiring the right people, why has it taken them so long to get to their current workable position?
AMD started to have financial troubles from pretty much the time they bought ATI until about 3 years ago. For quite a while, it looked like a place that could actually go under. That doesn't exactly put you in a position to attract the best talent.

In contrast, Intel was raking in simply massive profits, for most of that time. They had the resources to hire in pretty much anyone they really wanted. Remember when they managed to get Jim Keller, for a couple years?
 
The VP in charge of drivers was never Raja, but Lisa Pearce. Raja, no matter how much you guys hate AMD or him, is not responsible for the driver mess. He can be responsible for many things, but not that one. He also can't freely point fingers, because when you're under the same umbrella, throwing shade sideways is always a bad look from the outside.

I am pretty sure there are some tough conversations happening inside of Intel around this, but this is not Raja's exclusive mess, or even Pat's or Lisa's. This is one of the things, I'm pretty sure, that Pat Gelsinger is trying really hard to change.

Regards.

I've analyzed all of the videos and presentations Raja puts together. He demonstrates a linear mindset. While it can be laser focused, his circle of thought is limited to what he knows, without taking outside considerations into account.

Be it Raja's fault or not, he sits at the top and he is now in charge of several delayed, overly expensive, and underperforming GPU projects. Would you place a bet with that kind of track record?
 
I've analyzed all of the videos and presentations Raja puts together. He demonstrates a linear mindset. While it can be laser focused, his circle of thought is limited to what he knows, without taking outside considerations into account.

Be it Raja's fault or not, he sits at the top and he is now in charge of several delayed, overly expensive, and underperforming GPU projects. Would you place a bet with that kind of track record?
I can say this much: he is the visible face of the Arc initiative, so all darts from the outside will be aimed at him (most, I'm sure, fueled by bias). From the inside, I'm sure there's plenty more to unravel from this nasty ball of a mess. I am betting on that, even though I don't really bet. If Raja leaves (firing or whatever else) and no one else takes the blame with him (effectively making him a scapegoat), I'm 150% sure Intel will kill the division altogether.

Regards.
 
AMD started to have financial troubles from pretty much the time they bought ATI until about 3 years ago. For quite a while, it looked like a place that could actually go under. That doesn't exactly put you in a position to attract the best talent.

In contrast, Intel was raking in simply massive profits, for most of that time. They had the resources to hire in pretty much anyone they really wanted. Remember when they managed to get Jim Keller, for a couple years?
So AMD's problems have been because of money. Except when Raja was there. Then it was exclusively his fault because he was the one in charge. And then after he left, the problem was money again, until recently. Perfectly logical thought process there.
 
If Raja leaves (firing or whatever else) and no one else takes the blame with him (effectively making him a scapegoat), I'm 150% sure Intel will kill the division altogether.
I think I heard that, in the world of tech startups, the investors typically try replacing the CEO once. If the second CEO can't save the company, that's when they pull the plug.

Of course, that's dependent on market conditions and whether the tech seems credible. However, if the problem seems as simple as a failure to execute, then restructuring or simply replacing the upper/middle management seems an appropriate solution.
 
I think I heard that, in the world of tech startups, the investors typically try replacing the CEO once. If the second CEO can't save the company, that's when they pull the plug.

Of course, that's dependent on market conditions and whether the tech seems credible. However, if the problem seems as simple as a failure to execute, then restructuring or simply replacing the upper/middle management seems an appropriate solution.
Hm... I kind of understand what you're talking about, but Intel is most definitely not a startup, and there are several executives with (supposedly) YEARS of experience in their respective markets and fields. So, taking the bold text into account, I would say that is the crux of the problem and what I'm saying: it's not as simple as blaming Raja, Lisa, or Pat. What Arc has shown is simply that Pat has some work left to do inside of Intel to make sure all divisions are aligned and working effectively. From the outside, I can say there's been a big communication problem within Intel: both to the sides (other departments) and up and down the chain of command. That's as far as I can speculate, based on the rumours floating around.

Regards.
 
Same goes for Gelsinger, who is 18 months in. If Koduri is performing poorly, fire him before the new CEO grace period is over, or get fired.

It's simple to fire a person. But for that to improve organizational performance, you have to have not just a better replacement, but a replacement so much better that they can quickly get up to speed and overcome the disruption caused by the firing of the leader.

They're an independent market research firm. I don't exactly know what their angle is here, but at least it's not like those analysts who work for big investment banks that are themselves trading in the stocks their analysts are rating.

If an analyst is publicizing their "insights" rather than just taking private action on them for themselves or their paying clients, I wonder why.

I'm sure AMD and Nvidia (and their investors) also agree that Intel should get out of the GPU business.
 
If an analyst is publicizing their "insights" rather than just taking private action on them for themselves or their paying clients, I wonder why.

I'm sure AMD and Nvidia (and their investors) also agree that Intel should get out of the GPU business.

Free analysis on anything having to do with the markets is, in my experience, worth exactly what you paid for it.

And TBH, more and more I'm thinking this is overblown, way overblown.

The driver issues are software. The problems are mostly limited to DX10 and DX11 titles, so we know it isn't hardware since it works fine on DX12. Software can be fixed, and it won't take that long.

This A750, for example, looks just fine with DX12 titles and matches up to a 3060 in performance. Given a 3060 goes for $340 minimum right now, if it were, say, $250, it'd be an OK deal even with some lost performance on DX10/11. Below that, it becomes a steal. If they fix their drivers.
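Quick back-of-the-envelope sketch of that value argument (the 3060-parity figure and the 20% DX10/11 penalty are assumptions for illustration, not benchmark data):

```python
# Toy perf-per-dollar comparison using the prices above; performance is
# normalized to the RTX 3060 (= 1.0). The DX10/11 deficit is assumed.
cards = {
    "RTX 3060":       {"price": 340, "perf": 1.00},
    "A750 (DX12)":    {"price": 250, "perf": 1.00},  # assumed rough parity
    "A750 (DX10/11)": {"price": 250, "perf": 0.80},  # hypothetical 20% deficit
}

for name, card in cards.items():
    value = card["perf"] / card["price"] * 100   # relative performance per $100
    print(f"{name}: {value:.2f} perf per $100")

# Output:
# RTX 3060: 0.29 perf per $100
# A750 (DX12): 0.40 perf per $100
# A750 (DX10/11): 0.32 perf per $100
```

Even with the assumed older-API penalty, a $250 A750 comes out ahead on paper, which is why the whole thing really does hinge on the drivers getting fixed.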
 
"Wasn't Kepler a bit "meh"? I remember it was pretty shocking how they managed to deliver 2x performance with Maxwell, which was made on the same process node."

Ehhh, not really, compared to the competition: the GTX 480 ran way too hot and struggled versus ATI's (yes, still ATI) equivalent Radeon.
 
Free analysis on anything having to do with the markets is, in my experience, worth exactly what you paid for it.

And TBH, more and more I'm thinking this is overblown, way overblown.

The driver issues are software. The problems are mostly limited to DX10 and DX11 titles, so we know it isn't hardware since it works fine on DX12. Software can be fixed, and it won't take that long.

This A750, for example, looks just fine with DX12 titles and matches up to a 3060 in performance. Given a 3060 goes for $340 minimum right now, if it were, say, $250, it'd be an OK deal even with some lost performance on DX10/11. Below that, it becomes a steal. If they fix their drivers.

Yes, we can fix software. But it won't take long? Not sure about that.
 
Yes, we can fix software. But it won't take long? Not sure about that.

Keeping software working (not just avoiding crashes, but also being efficient) is a continuous process and requires a substantial investment of resources. Not a one-and-done situation.

In a continually changing worldwide environment, with new software and hardware always coming, I can see why this is a tough challenge.

But I think they have the resources, and they have to have a viable GPU solution to maintain their position as the default choice for enterprise.
 