News Intel Arc B580 Battlemage GPU allegedly surfaces on Geekbench — with 20 Xe Cores, 12GB of VRAM, and 2.85 GHz boost it falls short of the A580 desp...

vanadiel007

Distinguished
Oct 21, 2015
As I mentioned some weeks ago, they should spin off their GPU division. They simply do not have the performance nor the funds to be competitive in the discrete GPU market.
It's not a sound business decision to keep developing and marketing these GPUs.
 
Considering AI is basically eating Intel's lunch (Nvidia has gone from being worth less than Intel prior to 2020, to being worth 30X as much as Intel, in terms of market cap at least), I don't think Intel can just pretend GPUs aren't important. Intel needs some changes if it's going to stay relevant.

I'm not saying GPUs alone are the solution, but Intel ignored GPUs for 20 years and it's now paying the price. Or rather, it dabbled in GPUs a little bit (i.e. Larrabee) but was afraid it would hurt the CPU division. And now GPUs are indeed killing the CPU division... just not Intel's own GPUs.

Look at how many Chinese startups are getting major state funding to try and create competitive GPUs for AI. If China also sees this as important, why wouldn't Intel come to similar conclusions? And sure, Intel could go more for AI accelerators, but the point is it can't just give up on the non-CPU market, and GPUs are a good middle ground as Nvidia has proven.
 
Jul 2, 2024
They’re currently very competitive in edge and automotive, wdym?
It’s absolutely a sound investment given the status quo.
 

HideOut

Distinguished
Dec 24, 2005
If they're making money on them, why would they stop? And right now AI is the future, and that's mostly GPU-based...
 
Apr 8, 2024
I feel that this article is simply clickbait, or rushed out to attack Nvidia's competitors. The title has a clear opinion of "falls short of prior gen", but then within the second paragraph they state:
"Keep in mind that this is not an official benchmark, and that Geekbench OpenCL can be a terrible way of measuring performance, so reserve judgment until review units are available"

This article is doing nothing but making a statement about an OEM that's trying, and somewhat decently succeeding. In our modern world where attention spans are VERY short, the title has quite an impact on whether people even read the article or just start spouting nonsense because tl;dr. The 'AI market' isn't the whole world's commerce market; it's a portion of the tech industry. The fact that Intel has brought a video card to market that is quite useful in many aspects beyond its integrated graphics is amazing. We can now look to Intel Arc cards for video transcoding, including AV1, which has better compression than HEVC.
Let's also remember that driver tuning can bring quite a bit of performance - look at AMD's gains over the years, as well as Intel's - I won't even mention what Nvidia has been able to do; they're doing just fine.

P.S. - While I might be attacked as an Intel fan boi, I've been working in the IT industry for >15 years and I choose whichever OEM is the most effective for the use case. I wouldn't choose Intel right now for some products, but trashing their video cards because they aren't competitive for gaming feels like applying a blanket statement to a tool that has much more versatility.
 

JRStern

Distinguished
Mar 20, 2017
What you say is true, but I'm sure that within Intel so much of this is just considered history, and that puts them off trying very hard. They also have sufficient grief at this point with all of their mainline processor products; they just can't imagine taking on a new initiative ... one that they've failed at several times before, even including the Gaudi line.
 
The headline is supposed to draw readers, so there's always a balance between clickbait and boring. The important bit is that there's a B580 benchmark... even if it's flawed. Anyone who knows much about Geekbench OpenCL should know not to put much stock in it. I've routinely seen cases where a 4090 or whatever will massively underperform, or some company like Apple or Qualcomm will put out drivers that massively boost performance. Geekbench is a synthetic benchmark in every sense of the word, and that makes it ripe for abuse in terms of targeted optimizations.

It's nice that Intel has brought a GPU to market, I really hope Battlemage shows major improvements from Alchemist. But I'm also realistic, and Intel is still behind on drivers and other areas. Intel's gains over time have mostly been from DX11 and earlier games, incidentally. Plus gains in games that get benchmarked a lot. 🤔

And FYI, AV1 doesn't really have better compression or quality than HEVC. It's similar overall, perhaps fractionally better (like 0~5 percent). The main thing is that it's royalty free, which is what helped AVC (H.264) become so popular — not that H.264 was supposed to be royalty free, but it basically became that way. HEVC (H.265) required royalties and so most companies and software balked. Higher bitrate AVC can look as good as lower bitrate HEVC (and AV1) and so that was the accepted solution for many years.
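For anyone who wants to test the codec-efficiency claims themselves, a rough comparison is easy to run with ffmpeg. This is just a sketch: the input filename is a placeholder, the encoders assume an ffmpeg build with libx264, libx265, and libaom enabled, and CRF scales are not directly comparable across codecs, so the values below are only loose visual equivalents.

```shell
# Encode the same clip with AVC, HEVC, and AV1, then compare file sizes.
# CRF scales differ per codec; these values are rough visual equivalents.
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -preset medium out_avc.mp4
ffmpeg -i input.mp4 -c:v libx265 -crf 28 -preset medium out_hevc.mp4
ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 0 -cpu-used 6 out_av1.mkv

# Compare the resulting sizes (quality should also be checked visually
# or with a metric like VMAF, since size alone doesn't tell the story).
ls -l out_avc.mp4 out_hevc.mp4 out_av1.mkv
```

On hardware with Arc's media engine, swapping in the QSV encoders would test the hardware paths instead of the software ones.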
 

Jagwired

Distinguished
Aug 5, 2015
Intel is falling now because they don't have a GPU business for AI, and they missed out on the crypto boom too. ARM is also, predictably, slowly taking over the CPU market. I don't know why AMD, Nvidia, and Intel haven't tried to promote RISC-V more, or come up with a new joint open-source architecture with other companies.

What Intel should spin off is the foundry business. I don't see a lot of value in controlling the foundries at this point. In fact, it was putting Intel at a disadvantage for a while because AMD and Nvidia were able to use smaller manufacturing processes from TSMC.

It would be great for the electronics market if there was another independent competitor to TSMC. Spinning off the foundries would likely give Intel billions they could use to save the remaining business units like GPUs.
 
Intel's only advantage is that they are the only vertically integrated CPU company. If they divest the foundries, they will be forever trapped in the same profit-margin range as their competitors unless they discover some magical new design that lets them do more with less on the same process node.

As the industry has shown, no one has been able to really compete with TSMC. Their closest competitor, Samsung, is struggling and is being propped up by the rest of the Samsung empire. Same thing with Intel: their design house is the only thing keeping their foundry business going. If Intel Foundry gets spun off, it will go the same way as GlobalFoundries: an afterthought.
 

systemBuilder_49

Distinguished
Dec 9, 2010
What makes you think they will make money on Arc or Battlemage? They are not. Net margins on Nvidia GPUs were about 18% until 2018, when first crypto and then the AI boom really began taking off. The Arc A770 is FOUR YEARS BEHIND Nvidia. Yes, 4 years. It was released with the 3070's die size, 20% more transistors than the 3070, on a better VLSI node, and it was released TWO YEARS AFTER the 3070. Moreover, it produces 3060-class performance, which is one generation (2 years) behind. 2+2 = 4. Unless I am mistaken, nobody is making money selling new designs that are 4 years stale. The only way to make money on a 4-years-stale design would have been to start selling it 4 years ago!!

Intel is losing money on every card they ship.
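The "four years behind" arithmetic can be sanity-checked against public spec-sheet figures. The numbers below (die sizes, transistor counts, launch years) are approximate values from public databases, used purely as an illustration; by these figures the transistor gap is closer to 25% than 20%.

```python
# Rough sanity check of the "4 years behind" argument using approximate
# public spec figures (illustrative, not authoritative).
specs = {
    "Arc A770 (ACM-G10)": {"transistors_bn": 21.7, "die_mm2": 406, "launch": 2022},
    "RTX 3070 (GA104)":   {"transistors_bn": 17.4, "die_mm2": 392, "launch": 2020},
}

a770 = specs["Arc A770 (ACM-G10)"]
rtx3070 = specs["RTX 3070 (GA104)"]

transistor_ratio = a770["transistors_bn"] / rtx3070["transistors_bn"]
launch_gap_years = a770["launch"] - rtx3070["launch"]

# If A770 performance lands in the 3060 tier (roughly one generation,
# i.e. ~2 years, behind the 3070), the total gap is launch gap + perf gap.
perf_gap_years = 2
total_gap_years = launch_gap_years + perf_gap_years

print(f"A770 transistor budget vs 3070: {transistor_ratio - 1:+.0%}")
print(f"Estimated total gap: {total_gap_years} years")
```

The point of the sketch is just that the two gaps (launch timing and performance tier) add, which is the poster's 2+2 = 4.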
 

systemBuilder_49

Distinguished
Dec 9, 2010
Intel is failing now NOT because of what you are saying. Intel just doesn't have enough business. All the nodes below 10nm are too expensive to build a productive foundry with the business Intel has. The costs of VLSI foundries have been growing FASTER than the PC or even smartphone businesses for the past 20 years. The only way to build new foundries below 10nm is to aggregate designs from the entire worldwide VLSI market, like TSMC does.

Intel has tried repeatedly for 20 years to enter new markets, and they failed. Every. Single. Time. They stubbornly and stupidly refused to open up their foundry to outside customers. They bet against EUV, and ASML cleaned their clocks when it invented the shoot-the-tin-droplet-with-a-laser approach. These horrific mistakes guaranteed that their foundry business would falter.

What Pat Gelsinger promised (to put the pieces of Intel back together again) is impossible, but the investors are too dumb to realize it, and I'm not even sure Pat knows it's a lie; he's that clueless. Now it is time to pay the piper.

When you say "Intel's only advantage is that they are the only vertically integrated CPU company" - that is no longer an advantage; it's a boat anchor tied to Intel's neck, and it is choking the company to death. The company is run by process engineers who would rather scuttle the entire company and follow it to the bottom of the ocean than split the high-margin VLSI design business from their livelihood - the much lower-margin foundry business - thereby guaranteeing pay cuts. So they are holding Intel hostage and forcing the US government to subsidize their eternal greed ...
 

8086

Distinguished
Mar 19, 2009
Battlemage performance concerns are probably once again due to early driver issues. With time, just like Arc, Intel will get the performance up to where it should be.
 

LabRat 891

Honorable
Apr 18, 2019
Mmmmkay.

Navi 24 has less than half the shaders of Polaris 20, and a quarter the memory bus width. Yet the 6500 XT (at Gen4 x4) edges out the RX 580 in most scenarios.

Vega iGPUs in the first few generations of Ryzen APUs lost shaders but got faster (before RDNA was integrated into the lineup).

Not a perfect apples-to-apples comparison, but generational differences can mean a lot.
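The Navi 24 vs. Polaris 20 ratios above check out against public spec sheets. The figures below are approximate values from public databases, used only to verify the "less than half" and "a quarter" claims; the performance claim itself is the poster's, not computed here.

```python
# Generational comparison: Navi 24 (RX 6500 XT) vs. Polaris 20 (RX 580).
# Spec figures are approximate, from public databases.
navi24  = {"shaders": 1024, "bus_bits": 64,  "node_nm": 6}
polaris = {"shaders": 2304, "bus_bits": 256, "node_nm": 14}

shader_ratio = navi24["shaders"] / polaris["shaders"]   # ~0.44: under half
bus_ratio = navi24["bus_bits"] / polaris["bus_bits"]    # exactly a quarter

print(f"Navi 24 has {shader_ratio:.0%} of Polaris 20's shaders")
print(f"...and {bus_ratio:.0%} of its memory bus width")
```

Despite the cut-down shader array and bus, the newer architecture and node (plus much higher clocks) let the 6500 XT keep pace, which is the generational point being made.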
 
I doubt it. Intel rested on its laurels for a while and 'retired' a lot of key engineers before their time (who ended up at Apple and elsewhere...), but TSMC also got a lot of help from the Taiwanese government over the decades, did it not? And SMIC and various Chinese companies are absolutely being propped up by their government now. Intel will survive because the US government really has no other alternative. It needs domestic silicon production, period, in case of something like a China takeover of Taiwan / TSMC.

Why invest a bunch of money into TSMC — a non-US entity that just happens to do tons of business with the US — versus putting the money into Intel? And I really do think Pat can turn things around. Pat isn't the problem; the problem is the ten years of CEO 'leadership' prior to Pat. Brian Krzanich and Bob Swan dug a MASSIVE hole for Intel with their decisions and policies. It will likely take a decade to recover from all of that.

My bet is there will also be US incentives at some point for other companies (Nvidia, AMD, Apple) to use Intel foundries. And when that starts snowballing, Intel can close the gap with TSMC and perhaps even regain the lead. But it will inevitably be a very close race going forward, and I don't think any foundry company will continue to succeed without government funding to help out, simply because the costs for every new node are increasing too fast.
 

gg83

Distinguished
Jul 10, 2015
Nailed it. Intel's stock at $23 says it all. They needed GPUs yesterday. Hopefully they come up with a better solution for AI than straight GPUs, maybe.