News Intel Re-Orgs AXG Graphics Group, Raja Koduri Moves Back to Chief Architect Role

I hate to say it, but Tom from MLID wasn't wrong. At least, not yet. Splitting a division is not a good sign, no matter how they try to spin it and get everyone to drink the Kool-Aid. Credit where it's due.

As for the news itself... Well, as long as Intel keeps bringing GPUs with some sort of cadence and can compete, I think we all benefit. So that's good. As for the split, as mentioned above, well, whatever they need to do to keep the division afloat, I guess? I can't help but think this is starting to feel like a "division write-off". I hope I'm wrong and so is Tom (from MLID).

Regards.
 
Intel GPUs are fine. What they need is a 120-150W card. The A750 and A770 are too expensive and over my power budget, and the A380 is no better than my RX 570. Intel needs to work on the drivers and on a card between the A750 and A380.
 
Are the needs/requirements of discrete consumer graphics and datacenter/AI accelerators converging or diverging?

I can see that having a big effect on whether Intel finds it worthwhile to stay in the consumer space. If the development and production of discrete can no longer ride the coattails of the commercial space then the whole market segment may disappear.


I think that Intel won't be that effective in changing the graphics marketplace unless they are using their own manufacturing processes and bringing their enormous production capabilities to bear behind the product. That would seem to be their competitive advantage and a way to profitability.
 
Sadly that comes up against opportunity cost. They can make more CPUs at a known profit margin, or make GPUs at break even or losses.

I did my part. I put up with my A380 every day. Now if I can just get it to stop giving me internet connectivity notifications, I will be happy. If the WiFi drops out for so little time that I don't notice it, I really don't need to be told it has reconnected.

Going to dig through the registry later and see if I can't kill it off; nothing else has seemed to work.
 
Sadly that comes up against opportunity cost. They can make more CPUs at a known profit margin, or make GPUs at break even or losses.
But... economies of scale?! If they make a larger number of GPUs, they will be cheaper as a whole to produce, so they would make money from them?!
Also, the market for CPUs is not infinite; there are only so many CPUs being bought each year.
Not to mention that FTC rules probably forbid them from flooding the market with enough CPUs to cover 100% of the need.
 
Raja is smart, but a poor leader/manager.
What makes you say that? Look how well things ended up for RTG! Now AXG is following in its footsteps. 😏

I'm not sure anyone would buy the consumer graphics division, though, or whether Intel would sell it. I mean, Intel will for sure keep doing integrated graphics. Maybe this will eventually not be such a huge cost sink. Maybe. But without Intel's resources and the R&D happening for the data center, I don't see that Intel's consumer GPUs would be worth much. You'd maybe get the jump start plus Battlemage, but unlike Intel, whatever company bought this probably wouldn't have as many cross-licensing agreements.
 
I hate to say it, but Tom from MLID wasn't wrong.
Neither can anyone claim he is right yet. Let me know when they exit discrete Consumer space. I see Tom continuing to argue that Intel isn't competing at the high end and thus they have given up...

It won't be too much longer before we find out what the future looks like for Intel graphics, especially with the annual GPU cadence. Will we see Celestial in 2024? Druid in 2025? Battlemage, from the rumors, seems to be coming in 2023. Hopefully all future Intel GPUs benefit from the current driver optimization work, and I expect they can feed these newer GPUs better than the Radeon HD 5850 that Arc Alchemist currently matches (link)... That's the big issue not many are talking about.
 
But... economies of scale?! If they make a larger number of GPUs, they will be cheaper as a whole to produce, so they would make money from them?!
Also, the market for CPUs is not infinite; there are only so many CPUs being bought each year.
Not to mention that FTC rules probably forbid them from flooding the market with enough CPUs to cover 100% of the need.

Considering these are already mass produced, I'm not sure there is much margin left to gain from producing more. See EVGA.

The more fab time they consume, the more opportunity cost there is; that doesn't just go away. And when buying capacity externally it's even worse, since the fab can charge more when it has more customers than capacity.

The market for CPUs may not be infinite, but that demand fills their highest-technology fabs, and those are already full, just as TSMC has no trouble keeping its 7/6/5nm nodes pumping 24/7. If it were still competitive, Intel could use their older 14nm fabs to make GPUs, but those would also be pretty costly, since a lot of other Intel products are made there.

I don't think the FTC would get involved unless Intel did severe undercutting to kill their competitors. Simply making more of something won't hurt AMD directly, and as long as the alternative exists, it isn't a monopoly. (Not to mention that I would argue there are options outside of x86.) It also wouldn't displace existing orders and fabrication overnight.
 
I hate to say it, but Tom from MLID wasn't wrong. At least, not yet. Splitting a division is not a good sign, no matter how they try to spin it and get everyone to drink the Kool-Aid.
They split it to merge them back into more market-focused units. Intel's consumer graphics division isn't going away anytime soon since Intel needs them for their IGPs regardless of what happens to discrete.
 
Intel is setting up a roadmap for selling off consumer graphics.

Unlikely. A company does this kind of re-org in order to hide a division's dismal performance from shareholders. I think Intel has every intention to stay in the market. It just realizes that with its less-than-revolutionary product it's going to be stuck in unhappy third place for the foreseeable future. Got to shield the tech from short-sighted bean counters.
 
Very unlikely. While consumer-grade GPUs don't go far in terms of margins, they really do go a long way toward taking advantage of economies of scale and greatly improving margins on the datacenter-grade portfolio.

Even IBM learned this lesson the hard way when they sold off their System x portfolio, only to see their System z production costs shoot up very soon after 😉
 
Hopefully the reorganization goes well and allows them to focus on the gaming/consumer side of things. The Arc launch really showed how little emphasis was put on the realities of the market they were trying to enter. There were hardware design mistakes, and assumptions based on the prior IGP driver set which simply don't apply to add-in GPUs.

Intel will certainly not sell off the consumer graphics division since they're going to need IGPs and HPC for the foreseeable future. If anything they'd just write down inventory and sell off what they have should that time come.
 
Intel Arc Control is working for the iGPU in my 12700K now. It won't overclock like on my A750, but that 12700K is in a B660 ITX board that doesn't allow overclocking, so that might be why. The iGPU also doesn't have the AV1 option for capture format.

Intel's drivers are improving; they are already a bunch better than at release. Hopefully they keep that up and spread it around their other products. AMD has a significant driver advantage with their iGPUs that Intel could work to close.
 
Considering these are already mass produced, I'm not sure there is much margin left to gain from producing more. See EVGA.

The more fab time they consume, the more opportunity cost there is; that doesn't just go away. And when buying capacity externally it's even worse, since the fab can charge more when it has more customers than capacity.
Yeah, right now Intel is buying the chips from TSMC, which is more expensive, and Intel is also building new fabs like crazy. All of these new fabs will have to be doing something in the future, and a good part of that capacity will be used for GPUs, which will be cheaper for Intel.
The market for CPUs may not be infinite, but that demand fills their highest-technology fabs, and those are already full, just as TSMC has no trouble keeping its 7/6/5nm nodes pumping 24/7. If it were still competitive, Intel could use their older 14nm fabs to make GPUs, but those would also be pretty costly, since a lot of other Intel products are made there.
Yes, as I said, Intel is building fabs like crazy; they won't be able to use all of this new fab space just for CPUs. Even after adding GPUs there will be fab space left, which is why Intel planned to act as an external foundry for other companies as well.
I don't think the FTC would get involved unless Intel did severe undercutting to kill their competitors. Simply making more of something won't hurt AMD directly, and as long as the alternative exists, it isn't a monopoly. (Not to mention that I would argue there are options outside of x86.) It also wouldn't displace existing orders and fabrication overnight.
Look at what is happening with Microsoft's attempt to buy Activision Blizzard: that is far from a monopoly, but the FTC still has huge issues with it and wants to prevent it.
https://www.ftc.gov/news-events/new...oft-corps-acquisition-activision-blizzard-inc
 
Are the needs/requirements of discrete consumer graphics and datacenter/AI accelerators converging or diverging?

I can see that having a big effect on whether Intel finds it worthwhile to stay in the consumer space. If the development and production of discrete can no longer ride the coattails of the commercial space then the whole market segment may disappear.

I'd argue that they're diverging.

Can you explain why you would argue that they are diverging?
 
Can you explain why you would argue that they are diverging?
I don't want to answer on their behalf, but I believe it's simply that a GPU, at a high level, is a good ML accelerator by sheer coincidence. Once you start optimizing for specialized loads, the line between "a good GPU" and "a good ML/AI accelerator" gets drawn more sharply: those accelerators don't need to support the same specs/APIs/ISAs as GPUs, so they become different enough to need their own specialized marketing and tooling. AMD and nVidia have already realized this and now ship specialized hardware aimed at completely different markets, away from consumer. The divergence could be because of that.

Or that's what I think, at least.

Regards.
 
Also what their competitors are doing. AMD has a completely different accelerator line. Nvidia less so, but they still make specialized data center GPUs with their latest architectures.
 
I don't want to answer on their behalf, but I believe it's simply that a GPU, at a high level, is a good ML accelerator by sheer coincidence. Once you start optimizing for specialized loads, the line between "a good GPU" and "a good ML/AI accelerator" gets drawn more sharply
ML heavily favors tensor cores since it uses a lot of large matrix math between inputs and input weights, while gaming heavily favors conventional shaders to calculate the base color of stuff before feeding it into lighting/RT. Although the two may reuse many of the same fundamental building blocks, they scale in nearly opposite directions. Intel will likely retain some common core between the two groups to avoid duplicating too much work, or have one side borrow from the other where applicable.
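To make the matrix-math point concrete, here's a purely illustrative NumPy sketch (all shapes and names are made up): a single dense ML layer boils down to one large matrix multiply between a batch of inputs and the layer's weight matrix, which is exactly the kind of work tensor cores are built to accelerate.

```python
import numpy as np

# Toy dense layer: ML inference is dominated by large matrix multiplies
# (GEMMs) like this one. All shapes here are arbitrary, for illustration.
rng = np.random.default_rng(0)

batch, in_features, out_features = 64, 512, 256
inputs = rng.standard_normal((batch, in_features))          # batch of input vectors
weights = rng.standard_normal((in_features, out_features))  # learned weights

# One big matrix multiply: (64x512) @ (512x256) -> (64x256)
activations = inputs @ weights
print(activations.shape)  # (64, 256)
```

A game shader, by contrast, runs a small independent program per pixel or vertex; the arithmetic building blocks overlap, but the two workloads scale hardware in different directions.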
 
They split it to merge them back into more market-focused units. Intel's consumer graphics division isn't going away anytime soon since Intel needs them for their IGPs regardless of what happens to discrete.

And they said they want to monetize the division, which means consumer dGPUs.

They also said they want to monetize ALL divisions that are like iGPUs: areas where there's development work but no direct money coming from it.

@Bazzy 505 has got it right. Abandon the consumer market and the datacenter market will fail. The much greater volume and very picky nature of the consumer market float all boats. Look how all the datacenter-only GPU companies like 3DLabs have died; only the consumer-focused ones are still alive, and their GPUs are now being used in datacenters.

There are a few reasons why they have an incentive to continue:
-Pat Gelsinger got fired over the Larrabee debacle. He has said the GPU was one thing he didn't get to do.
-Consumer dGPUs monetize iGPU development.
-Consumer GPUs have more value than just the monetary reasons: they ship in much higher volume, and the discerning nature of customers in that space refines the whole category, which in turn makes for better workstation GPUs and iGPUs.
-Driver-wise, the Arc driver team was located in Russia and had to be flown out. They also reorganized the entire software team into one group led by their CTO, whereas previously it existed but was scattered all over the place. I think the very rapid turnaround of driver improvements is evidence of that.

Unlikely. A company does this kind of re-org in order to hide a division's dismal performance from shareholders. I think Intel has every intention to stay in the market. It just realizes that with its less-than-revolutionary product it's going to be stuck in unhappy third place for the foreseeable future. Got to shield the tech from short-sighted bean counters.

Yeah, then what's this?

In Q1 of this year, Intel reorganized its financial reporting into six business units. Despite the hierarchical adjustments, AXG will continue to report its revenue as one of the six Intel business units in the upcoming January earnings call.
 