News: Firm Estimates Intel's GPU Unit Losses at $3.5 Billion, Suggests Sell Off

Haha... When I first read that Raja was going to Intel, my first thought was wow!! Good luck to Intel. They actually hired him after the mess he created at AMD.

Today, we still have yet to see any decent discrete GPU from Intel. We have yet to see a single Bitcoin miner. Basically nothing... only delay after delay.

However, I do not think Intel should axe their graphics division, but they should realign its focus. CPUs are no longer adequate for high-end computing, and specialized processors like AI accelerators and GPUs are needed. So I would say Intel should just axe the discrete GPUs (basically no gaming GPUs) and focus only on the compute units.

That's what they did after Larrabee failed, but the problem is that Nvidia exists. Intel did not enter the GPU market because they saw big money in discrete gaming GPUs.
 
Wasn't Kepler a bit "meh"? I remember it was pretty shocking how they managed to deliver 2x performance with Maxwell, which was made on the same process node.
Kepler was Nvidia's attempt to make a more efficient GPU while still retaining its compute capability; Maxwell was a pure gaming architecture. That's why Nvidia ended up keeping Kepler as their compute solution for four years: Nvidia first released GK110 for FP64 compute in 2012, and its successor did not come until 2016 with Pascal GP100. Hence, to fight AMD's Hawaii at FP64, Nvidia ended up putting two GK210s (a further improved version of GK110) on one board (the Tesla K80).
 
Larrabee was never intended to be a gaming GPU. It was designed for HPC applications. The project eventually became Xeon Phi, which did make it to market.

It was more of Intel's reaction to Nvidia's Tesla, though initially it was also supposed to handle graphics-related tasks without being a GPU. If I remember correctly, they once demoed Quake 2 with ray tracing on the chip. The idea was to prove that x86 cores could handle graphics tasks as fast as a GPU without being a GPU. One developer involved in the project said that as time went by, Larrabee actually became more and more like a GPU, adding hardware that usually exists inside a GPU. It seems Intel's board of directors didn't like that, so they canned Larrabee completely as a product to compete with GPUs in gaming workloads and made it exclusively a compute card in the form of Xeon Phi. And Xeon Phi actually ended up being more successful than AMD's FirePro when it comes to compute accelerator market adoption.
 

Ogotai

Reputable
Feb 2, 2021
Larrabee was never intended to be a gaming GPU.
Sorry spongiemaster, but you seem to be wrong on this one. "The chip was to be released in 2010 as the core of a consumer 3D graphics card" sounds like it was meant to be a gaming GPU. This is from Wikipedia.
Also:
"Intel cancelled plans for a discrete Larrabee graphics card because it could not produce one that was competitive with existing GPUs from AMD and NVIDIA in current games" again sure looks like Intel also intended Larrabee to be a gaming GPU. That one is from AnandTech.
 

watzupken

Reputable
Mar 16, 2020
I guess Intel should have expected this problem from the get-go, although the person who made the decision to start their own dGPU department may not be the current CEO. Given how much they have dumped into this, and given that their competitors (AMD, ARM, and Apple) have the advantage of CPU and GPU integration, they need to pull this off for their own benefit in the longer term. Considering that Intel gave Raja a promotion some months back, I feel the product is here to stay despite the pain they are going through. And to be honest, Intel is in trouble on multiple fronts at this point in time, since their core products are all getting delayed. The GPU is just one of these problems.
 

watzupken

Reputable
Mar 16, 2020
Sorry spongiemaster, but you seem to be wrong on this one. "The chip was to be released in 2010 as the core of a consumer 3D graphics card" sounds like it was meant to be a gaming GPU. This is from Wikipedia.
Also:
"Intel cancelled plans for a discrete Larrabee graphics card because it could not produce one that was competitive with existing GPUs from AMD and NVIDIA in current games" again sure looks like Intel also intended Larrabee to be a gaming GPU. That one is from AnandTech.
I think so as well. Larrabee, as I recall, was meant to be a multi-core gaming GPU. Perhaps they had other plans for it as well, but it got axed prematurely.
 

Jimbojan

Honorable
May 17, 2017
Jon Peddie Research estimates Intel's GPU endeavors have resulted in $3.5 billion in losses and suggests Intel might want to sell the unit off after multiple failures.

Firm Estimates Intel's GPU Unit Losses at $3.5 Billion, Suggests Sell Off: Read more
I seriously doubt Intel will axe the graphics group; it is the future of Intel's business. It may have lost some money, yet it is bringing in $1B in revenue, while its auto business is bringing in $400M per quarter but cost Intel $17B up-front. Why doesn't Intel ditch the auto unit as well, based on your logic?
 

spongiemaster

Admirable
Dec 12, 2019
Sorry spongiemaster, but you seem to be wrong on this one. "The chip was to be released in 2010 as the core of a consumer 3D graphics card" sounds like it was meant to be a gaming GPU. This is from Wikipedia.
Also:
"Intel cancelled plans for a discrete Larrabee graphics card because it could not produce one that was competitive with existing GPUs from AMD and NVIDIA in current games" again sure looks like Intel also intended Larrabee to be a gaming GPU. That one is from AnandTech.
I stand corrected. That attempt was so poor I don't even remember it. That would still make Intel's current attempt only their second in the last 20 years, not their third.
 

watzupken

Reputable
Mar 16, 2020
I seriously doubt Intel will axe the graphics group; it is the future of Intel's business. It may have lost some money, yet it is bringing in $1B in revenue, while its auto business is bringing in $400M per quarter but cost Intel $17B up-front. Why doesn't Intel ditch the auto unit as well, based on your logic?
I feel Intel tends not to axe core businesses. The GPU may be new, but it is an essential piece for them because they are clearly lagging when it comes to the CPU+GPU advantage. AMD may not have the fastest CPU at this point, but the RDNA2-powered Rembrandt chips are selling very well, particularly in laptops, which are limited by cooling and may struggle with a dGPU. Intel's Xe GPU is no longer able to keep up with AMD's iGPU.
 

hannibal

Distinguished
I seriously doubt Intel will axe the graphics group; it is the future of Intel's business. It may have lost some money, yet it is bringing in $1B in revenue, while its auto business is bringing in $400M per quarter but cost Intel $17B up-front. Why doesn't Intel ditch the auto unit as well, based on your logic?

If they cut something, what Intel should do is concentrate on datacenter compute GPUs and forget gaming GPUs. But computational accelerators in the datacenter are such big business that Intel cannot leave that market to Nvidia and AMD. Gaming GPUs are so driver-dependent that it may be wise to keep building drivers and come back to the consumer market when the drivers are more ready.
Of course, if gaming GPUs start making a profit sooner, why not, but they cannot kill discrete GPUs because they need those for datacenter and scientific applications!
 

Deleted member 14196

Guest
Which is missing the point that it doesn't make sense to lay the blame for this mess at his feet. What exact effect does a Senior VP have on driver quality? Intel has been coding GPU drivers forever for their iGPUs. By anyone's standards, the gaming side of them has been terrible. Did Intel replace the entire driver team when Raja was hired? How would that even have been possible? There are only two companies that had personnel with the knowledge to help Intel, and AMD and Nvidia weren't about to let all their software engineers move to Intel. Raja is not the person who is going to teach Intel's software engineers to code proper DX11 drivers.
It's called Management 101. The boss is always to blame, good or bad. Looks like bad management to me, so stop making excuses for a poor actor. The whole team looks bad.
 
Is this the same firm that said Intel should go fabless, and now Intel is building fabs like crazy? Might be a good sign for the GPU division.
Jon Peddie Research estimates Intel's GPU endeavors have resulted in $3.5 billion in losses and suggests Intel might want to sell the unit off after multiple failures.

Firm Estimates Intel's GPU Unit Losses at $3.5 Billion, Suggests Sell Off: Read more
While you do lose the money you invest from your pocket, it is not "lost" until the thing you invested in is dead. Investments need a long time to pay off; closing the business before it can make money would be losing money.
 

Deleted member 431422

Guest
First round? I believe this is their third time trying in the past 20 years or so. Another problem is that consumers want a third player (or even more) in the market, but in the end they have no interest in buying a GPU other than Nvidia (and maybe a little bit of AMD). They only want another GPU player to increase competition so they can get their Nvidia or AMD GPU at a cheaper price. They have no interest at all in sustaining that competition going forward.
I'd buy an Intel GPU. I need one that's good at 1080p with reasonable FPS. I'm not going to buy an old, overpriced Nvidia GPU or a castrated AMD one. I'm hopeful Intel will deliver. Nothing else to do but wait.
 
Is this the same firm that said Intel should go fabless, and now Intel is building fabs like crazy? Might be a good sign for the GPU division.
Haha, you beat me to it!

I remember that as well. Obviously, Pepperidge Farm also remembers.

Well... For all that this area looks like a bad business decision in the short term, everyone (analysts and execs) knows it is a good decision in the long term (to keep them, that is). While the "big" bucks are made in the data center and all that, the consumer space is not pennies; it is huge still. Plus, all the potential diversification Intel can get from it once they get their feet in AMD's and nVidia's turf.

That being said, in order to have data center accelerators (ML or not) alongside your CPU offerings, you don't really need a "Graphics" division, which is what could sway this conversation toward Intel dissolving the group and absorbing it into other areas. Proof of that is Google, and even Apple to a degree, which have delivered great accelerators and even "graphics" with just generic IP blocks.

As a consumer, I want Intel to keep the dream alive, but I can't deny things are not looking too bright =/

Regards.
 

bit_user

Polypheme
Ambassador
I have such a hard time understanding the 'hire from outside' mentality at big companies.
There are plenty of examples where "hire from outside" has actually worked. At best, it can inject expertise and experience the company lacks. Intel has been making iGPUs for a long time, but they lacked experience scaling them up and dealing with dedicated GDDR memory on them. They also had very little experience with the datacenter or HPC market for GPU compute.

... another startup, which seem to be the only companies that realize the quickest path to getting ahead is to promote the people who know WTF they are doing.
There are also plenty of perils in promoting people into a different job function. They might lack the skills and experience to do the new job, or instead just keep trying to do their old job. I've seen a lot of both.
 

bit_user

Polypheme
Ambassador
And nothing you listed has any specific effect on driver quality. You're firing the coach of a sports team because the star player decided to have a terrible season. You can be the greatest strategist in the world, but in the end it is up to the players/engineers to actually do the work successfully.
Not a great analogy. In sports, you can play well but lose because the other team plays better. In the case of GPU drivers, it should be reasonably clear what it takes for them to be competitive, and so it's more a straightforward problem of execution. And that comes down to having a team with the right resources and competent management. That falls entirely in the remit of senior management.

Decades of experience. Jensen Huang said years ago that Nvidia was a software company, not a hardware company.
He wasn't only talking about drivers, though. He was talking about things like deep learning software, the CUDA ecosystem, and even their self-driving stuff. I'm sure the driver engineers at Nvidia are outnumbered by the hardware engineers.

ATI's drivers were terrible when AMD bought them, and they remained terrible for the years before Raja got there. They were terrible when he was there, and they were terrible for years after he left.
Even that much is enough to appreciate the problem and its significance.

I actually asked you what direct impact a Senior VP has on driver development.
Obviously, they don't. Nor do they go around telling the hardware designers how to design GPU register files.

the higher you get up in management above the actual developers, the less blame I would put on the person.
Okay, so essentially what you're saying is that the executives at the top of a failing company deserve no blame? You sure have some interesting ideas about how to run a business.

Like I said above, no amount of strategy and cheerleading and synergy meetings is going to make up for incompetence by the actual coders.
The main thing senior management does is to set a strategy and make sure the resources are in place to execute it. And if it's not going to plan, they make whatever changes are needed to fix it.

Talking about synergy and cheerleading tells me most of what you know about business is from watching TV and movies.
 

bit_user

Polypheme
Ambassador
Intel's only attempt at a gaming dGPU was the i740 released in February 1998. That's almost 25 years ago. There have been no attempts between that one and the current attempt.
I think there was an obvious Larrabee reference in there. That was certainly intended to be a gaming GPU (among other things), before they cancelled it and went exclusively down the Xeon Phi path.

Larrabee was never intended to be a gaming GPU.
This is demonstrably false. Even to make such a claim shows a serious lack of diligence.
 
It's called Management 101. The boss is always to blame, good or bad. Looks like bad management to me, so stop making excuses for a poor actor. The whole team looks bad.
Then it's a good thing Raja wasn't in charge of developing the drivers themselves. I do remember a tweet from him which explicitly mentioned the person in charge of that and wished them the best of luck. It was another VP, IIRC?

I'll try to find it; it was in a Tom's news piece as well.

EDIT: I couldn't find the tweet, but I did find this with the VP's name: https://www.tomshardware.com/news/intel-arc-gpu-driver-deadline-missed

It's Lisa Pearce. If someone needs, ahem, "disciplining", it's her.

Regards.
 

bit_user

Polypheme
Ambassador
Initially it was also supposed to handle graphics-related tasks without being a GPU.
No, it was intended to be a full GPU. It even had hardware texture engines and ROPs.

If I remember correctly, they once demoed Quake 2 with ray tracing on the chip.
I think you're confusing the ray tracing stuff with another project at Intel, which started even earlier. Indeed, they did show ray tracing on Larrabee, but that was really a sideshow, probably while they worked on their Direct3D drivers for it.

It seems Intel's board of directors didn't like that, so they canned Larrabee completely as a product to compete with GPUs in gaming workloads and made it exclusively a compute card in the form of Xeon Phi.
No, they killed it because it wasn't competitive with gaming GPUs of the day.

Xeon Phi actually ended up being more successful than AMD's FirePro when it comes to compute accelerator market adoption.
That's a low bar. It got cancelled because it couldn't compete with Nvidia's datacenter GPUs.
 

bit_user

Polypheme
Ambassador
Is this the same firm that said Intel should go fabless, and now Intel is building fabs like crazy?
I don't know, but they weren't wrong about the economics. The key point you're glossing over is IFS, which is the first step towards Intel spinning off its fabs, like AMD did almost 15 years ago. I'm not saying it's 100% going to happen, but the fabs are indeed a boat anchor around Intel's neck.

While you do lose the money you invest from your pocket, it is not "lost" until the thing you invested in is dead. Investments need a long time to pay off; closing the business before it can make money would be losing money.
There's some truth in this, as long as the investment remains on track and viable. However, at some point, it can turn into a sunk cost.
 

edzieba

Distinguished
Jul 13, 2016
My goodness, an analyst who has no idea that investing in a new product requires spending money, then releasing the product, and only then making money from the product? And thus that cutting the division for 'losing money' just at the start of the 'release product' phase would be the decision of a blithering idiot who cannot count but has been let loose near a copy of Excel?
Ah, wait... 'analyst', already covered that bit.
Larrabee was never intended to be a gaming GPU.
Not really; gaming was always a feature from the outset. Incidentally, Tom Forsyth and a lot of the Larrabee team have since returned to Intel to work on Arc.