News Intel Arc B580 "Battlemage" GPU boxes appear in shipping manifests — Xe2 may be preparing for its desktop debut in December

That can't be right, I kept hearing they were cancelled. Well, definitely cancelled after this gen, for sure this time.
JK, I plan to buy a Battlemage. Hopefully their high end will still come in 2-slot coolers. But to be fair, the novelty of seeing some things done worse and some done better is probably worth more to me than to many others, and I can always spend more on a 50-series card instead if I have to. But I do expect pretty good performance for the price.
 
BMG should have a better launch than ACM ever did, thanks to more established drivers. LNL with the BMG uarch has already been on the market, and the drivers have probably been cooking for much, much longer than that.
 
I will be happy with 4070-like performance, which is the old rumor. It will replace my EVGA 3080 Ti so that can go into my collection.

Also, someone poke EVGA until they start making monitors. I want a monitor that says EVGA because that is hilarious.
You're not getting that out of the B580 IMO; that would at the very best come from a G10 card like the B780, and I seriously doubt that will occur either. If you get regular 4070/7800XT performance, it'll be as good as it gets. It would need to be $400 max, but Intel can't really afford to give away these cards like they did with Alchemist.
 
Can we stop calling a graphics card (or "VGA") a "GPU", please? It's so freaking confusing and wrong; it's like calling a PC a CPU!
While you are technically correct, GPU is heavily implanted in the PC enthusiast lexicon, dating back to the late 90's. I highly doubt you will see this change. Personally I use "graphics card" as it's the most correct term. GPU is the processor on the card specifically, and VGA is an out of date term that was also a standard at the time (mid-late 80's), generally referring to the cards of the time, as the term "GPU" does now. VGA, VGA cable, or VGA resolution are the terms I recall. D-sub was/is also referred to as a VGA port.
 
While you are technically correct, GPU is heavily implanted in the PC enthusiast lexicon,
I think the issue is really that you can't buy a GPU separately from the graphics card it's on. Plus, virtually all functional aspects of the graphics card are determined by the GPU model, other than a little variation in clock speeds and sometimes a choice between two memory capacities. That's how they end up being virtually synonymous.

it's like calling a PC a CPU!
By contrast, a CPU is quite distinct from the motherboard or most other aspects of a PC. Telling me which CPU model you have doesn't tell me about the memory capacity, the secondary storage, its graphics card (if any), etc. So, a CPU is more distinct from a PC than a GPU is from a graphics card. Still, even if the analogy is imperfect, I get your point.

I agree that the authors of this site should all know well enough to say "graphics card" when they're talking about the entire card, and GPU when they're talking about just its processing unit.
 
It's "effectively cancelled".
THIS. The rumor wasn’t that no Battlemage card will ever see shelves; the rumor was that the flagship challenger die was scrapped, that Battlemage would have fewer models and less availability, and that Celestial was questionable for any desktop release at all and, if it came, would be even smaller than Battlemage. Basically: the rumor was that the decision had been made to wind things down and they weren’t in this for the long haul, but they weren’t going to immediately chuck out everything they were deep in development on.

And Pat Gelsinger said on an investor earnings call that they're simplifying the product line and offering fewer SKUs, and that they see "less need for discrete graphics in the market going forward", which seems to square with the rumor.
 
THIS. The rumor wasn’t that no Battlemage card will ever see shelves; the rumor was that the flagship challenger die was scrapped, that Battlemage would have fewer models and less availability, and that Celestial was questionable for any desktop release at all and, if it came, would be even smaller than Battlemage. Basically: the rumor was that the decision had been made to wind things down and they weren’t in this for the long haul, but they weren’t going to immediately chuck out everything they were deep in development on.

And Pat Gelsinger said on an investor earnings call that they're simplifying the product line and offering fewer SKUs, and that they see "less need for discrete graphics in the market going forward", which seems to square with the rumor.
They don't want to compete in a slaughterhouse market with Nvidia dominating and AMD picking up most of the scraps.

The future is much brighter for their iGPU tiles. Disaggregating it could be what's needed to make them larger, make a desktop APU, etc.
 
While you are technically correct, GPU is heavily implanted in the PC enthusiast lexicon, dating back to the late 90's. I highly doubt you will see this change. Personally I use "graphics card" as it's the most correct term. GPU is the processor on the card specifically, and VGA is an out of date term that was also a standard at the time (mid-late 80's), generally referring to the cards of the time, as the term "GPU" does now. VGA, VGA cable, or VGA resolution are the terms I recall. D-sub was/is also referred to as a VGA port.
In fact, my first monitor was a 17-inch CRT, which was HUGE back then, and it had the best connection available at the time: a 15-pin D-Sub. Later came DVI-I and DVI-D...
My point is that even "VGA" (Video Graphics Accelerator), which was used for years after the actual VGA (Video Graphics Array) standard, is much more correct than "GPU" when referring to graphics cards (graphics accelerators). When this news says "Intel is hard at work shipping these (GPU) boxes to OEMs", it should mean those boxes contain GPU chips, but you have to read more carefully to realize it is referring to whole graphics cards, not GPUs! And then I said F***! Am I still reading Tom's Hardware?!...
 
In fact, my first monitor was a 17-inch CRT, which was HUGE back then, and it had the best connection available at the time: a 15-pin D-Sub. Later came DVI-I and DVI-D...
It seemed so weird to me that EGA used a digital cable, while VGA (which came later) used analog. After "multisync" monitors came along, I could start to appreciate the versatility of using analog for this connection.

BTW, some of my monitors had RGBHV inputs, with a separate BNC connector for each of the 5 signals.
 
It seemed so weird to me that EGA used a digital cable, while VGA (which came later) used analog. After "multisync" monitors came along, I could start to appreciate the versatility of using analog for this connection.

BTW, some of my monitors had RGBHV inputs, with a separate BNC connector for each of the 5 signals.
I've never heard of RGBHV, just googled it, that is interesting.
Good old days, when my passion for computers was 100 times what it is today...
 
I've never heard of RGBHV, just googled it, that is interesting.
I think it was more popular on workstations, so high-end monitors would tend to have it and thus weren't limited to just PC usage. Some systems combined the H and V signals on a single cable, so you'd have just 4.

Good old days, when my passion for computers was 100 times what it is today...
Yeah, I think these are wild times to be alive, but my problem is that I just can't think of many interesting things to do with computers that aren't already commonplace.

My machines at home aren't very new or high end, just because there's no point. I did recently have a case where I wanted to try something with AVX-512, but then I just rented a small AWS EC2 instance for a couple hours and that was that.

At work, I spend enough time compiling software that it actually makes sense to have more cores (current CPU is 24 threads, more would be nice).
 
In fact, my first monitor was a 17-inch CRT, which was HUGE back then, and it had the best connection available at the time: a 15-pin D-Sub. Later came DVI-I and DVI-D...
My point is that even "VGA" (Video Graphics Accelerator), which was used for years after the actual VGA (Video Graphics Array) standard, is much more correct than "GPU" when referring to graphics cards (graphics accelerators). When this news says "Intel is hard at work shipping these (GPU) boxes to OEMs", it should mean those boxes contain GPU chips, but you have to read more carefully to realize it is referring to whole graphics cards, not GPUs! And then I said F***! Am I still reading Tom's Hardware?!...
Yerp, I agree. It also seems the vernacular differs depending on region. "Graphics accelerator" was also a term thrown around in media of the early 2000's. Just finished flipping through some old Maximum PC rags I had squirreled away from that period, simply because of your comment. One of the reasons I love this site (as much as it annoys me at times...) is all the old school enthusiasts that frequent it. On that note I thank you for the trip down memory lane, and I'm gonna hoard these old magazines for just a bit longer...

(edit: to qualify) Old school is relative; as for myself, I've been PC gaming since the early 286 days, with a brief stint on a C64 thanks to a lucky garage sale find around '91.
 
Yerp, I agree. It also seems the vernacular differs depending on region. "Graphics accelerator" was also a term thrown around in media of the early 2000's. Just finished flipping through some old Maximum PC rags I had squirreled away from that period, simply because of your comment. One of the reasons I love this site (as much as it annoys me at times...) is all the old school enthusiasts that frequent it. On that note I thank you for the trip down memory lane, and I'm gonna hoard these old magazines for just a bit longer...

(edit: to qualify) Old school is relative; as for myself, I've been PC gaming since the early 286 days, with a brief stint on a C64 thanks to a lucky garage sale find around '91.
Good old days... and physical printed magazines... and old Tom's Hardware... Peace mate.
 
I've never heard of RGBHV, just googled it, that is interesting.
Good old days, when my passion for computers was 100 times what it is today...
You've never heard of RGBHV because everyone called them BNC connectors. I had a Samsung 900NF CRT back in the day that had them. Just googled the manual, and they're even called BNC in the manual.

3. BNC Connectors(Option)
Connect the signal cable to the BNC signal port on the back of your monitor.
 
You've never heard of RGBHV because everyone called them BNC connectors. I had a Samsung 900NF CRT back in the day that had them. Just googled the manual, and they're even called BNC in the manual.
That would be because "RGBHV" just refers to the video signals carried over the cable, whereas BNC refers to the physical interface. It would not be incorrect to also refer to a VGA cable using a D-sub connector as RGBHV, since it also carries those signals, but no one would do that in practice, as it would only cause confusion.


In fact, my first monitor was a 17-inch CRT, which was HUGE back then, and it had the best connection available at the time: a 15-pin D-Sub. Later came DVI-I and DVI-D...
My point is that even "VGA" (Video Graphics Accelerator), which was used for years after the actual VGA (Video Graphics Array) standard, is much more correct than "GPU" when referring to graphics cards (graphics accelerators). When this news says "Intel is hard at work shipping these (GPU) boxes to OEMs", it should mean those boxes contain GPU chips, but you have to read more carefully to realize it is referring to whole graphics cards, not GPUs! And then I said F***! Am I still reading Tom's Hardware?!...
Yeah, because GPU chips get boxed up and sold individually all the time. Come on now, it's clear they were referring to the physical box itself that the GPU will be placed into for retail. The inference that they mean the whole graphics card is easy to make, considering that GPU chips are not sold in this manner.


It hadn't occurred to me that I've been referring to graphics cards as "GPUs" since, well, always. IIRC, the term "GPU" was first introduced by Nvidia as a marketing term to differentiate their graphics products that provided discrete hardware acceleration of 3D rendering from display-out cards that were limited to having the CPU render via software, and it quickly became a blanket term for the 3D-accelerated graphics hardware that began appearing around then. So, at least to me, GPU refers to both the individual chip doing the rendering as well as the collective device including memory, packaging, and outputs, which can also be called a graphics card. Apparently I'm not the only one with that mindset.
 
Yerp, I agree. It also seems the vernacular differs depending on region. "Graphics accelerator" was also a term thrown around in media of the early 2000's.
Back in the early 1990's, SVGA clones started appearing and differentiating themselves by featuring "Windows acceleration" capabilities. This included things like copying from one 2D block of video memory to another, solid fill, line & polygon drawing, font caching, etc. I think that's when the "accelerator" terminology started. Then, "3D accelerator cards" were the next evolutionary step, getting more towards the late '90s.
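If anyone wants to picture what those chips actually offloaded, here's a rough software sketch in plain C of two of those "Windows acceleration" primitives (a solid fill and a block copy/blit). The framebuffer layout here, 8 bits per pixel with a fixed pitch, is just an assumption for illustration, not how any particular card exposed these operations:

```c
/* Illustrative software versions of two classic "Windows acceleration"
 * primitives: a solid rectangle fill and a block copy (blit).
 * The framebuffer layout (8 bits per pixel, linear, 'pitch' bytes per
 * scanline) is an assumption for the sketch, not any real chip's API. */
#include <stdint.h>
#include <string.h>

/* Fill a w x h rectangle at (x, y) with a single color. */
static void solid_fill(uint8_t *fb, int pitch, int x, int y,
                       int w, int h, uint8_t color)
{
    for (int row = 0; row < h; row++)
        memset(fb + (size_t)(y + row) * pitch + x, color, (size_t)w);
}

/* Copy (blit) a w x h block from (sx, sy) to (dx, dy).
 * Assumes the source and destination rectangles don't overlap. */
static void blit(uint8_t *fb, int pitch, int sx, int sy,
                 int dx, int dy, int w, int h)
{
    for (int row = 0; row < h; row++)
        memcpy(fb + (size_t)(dy + row) * pitch + dx,
               fb + (size_t)(sy + row) * pitch + sx, (size_t)w);
}
```

The accelerator chips of that era did that sort of looping over video memory on the card instead of making the CPU do it, which is a big part of why GUI scrolling and window dragging got noticeably smoother.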

Here are two examples of the latter:

(edit: to qualify) Old school is relative; as for myself, I've been PC gaming since the early 286 days, with a brief stint on a C64 thanks to a lucky garage sale find around '91.
My first real computer game was Ultima VI, on an 8088 PC XT clone w/ Hercules video card and a monochrome (amber) monitor. The first time I played it on a color monitor was revelatory - it was almost unplayable on monochrome, not to mention on an 8088. By the time Ultima VII came around, I had a 386DX-25 w/ XGA graphics card (Tseng ET4000 chipset) and Sound Blaster Pro. That game really wanted a 486DX-33, though.

My other favorite games of that era were Castles, Civilization, the D&D games by SSI, Darklands, Descent, Doom, Lemmings, Populous, Quake, Wing Commander, and Wolfenstein.
 
the term "GPU" was first introduced by nvidia as a marketing term to differentiate their graphics products that provided discrete hardware acceleration of 3D rendering from display out cards that were limited to having the CPU render via software
No, the first mass-market 3D accelerator cards used chips by 3D Labs and Nvidia's ill-fated NV1, but that was in 1995, long before Nvidia started using the term "GPU".

They didn't start using the term GPU until 1999, when they launched the GeForce 256:

GeForce 256 was marketed as "the world's first 'GPU', or Graphics Processing Unit", a term Nvidia defined at the time as "a single-chip processor with integrated transform, lighting, triangle setup/clipping ...

Source: https://en.wikipedia.org/wiki/GeForce_256

I don't know exactly what the key distinction was that classified it as a GPU, but maybe it was because some of that was accomplished using firmware-programmable engines. If so, they weren't even the first mass-market 3D graphics accelerator to feature a programmable core - that distinction would probably go to the Rendition Verite (which I actually bought, except it got "lost in the mail").

Anyway, at the time of the GeForce 256, 3D accelerators still appeared to software as fixed-function devices. Programmable shaders didn't appear until the GeForce 3, which launched in Feb 2001. Personally, that's the threshold I think should've defined a GPU.
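For anyone wondering what "integrated transform, lighting, triangle setup/clipping" boils down to per vertex, here's a minimal sketch in plain C of the sort of math fixed-function hardware T&L took off the CPU: a 4x4 matrix transform plus a single diffuse (Lambert) light. The types, names, and one-light model are my own simplifications for illustration, not a description of the GeForce 256's actual pipeline; the point is that this is the kind of per-vertex math the GeForce 3's programmable vertex shaders later let developers write themselves.

```c
/* Rough per-vertex math of fixed-function transform & lighting:
 * a position transformed by a 4x4 matrix, plus a single diffuse light.
 * Types and the one-light Lambert model are simplifying assumptions. */

typedef struct { float x, y, z, w; } vec4;
typedef struct { float m[4][4]; } mat4;   /* row-major */

/* Transform a vertex position: r = M * v. */
static vec4 transform(const mat4 *M, vec4 v)
{
    vec4 r;
    r.x = M->m[0][0]*v.x + M->m[0][1]*v.y + M->m[0][2]*v.z + M->m[0][3]*v.w;
    r.y = M->m[1][0]*v.x + M->m[1][1]*v.y + M->m[1][2]*v.z + M->m[1][3]*v.w;
    r.z = M->m[2][0]*v.x + M->m[2][1]*v.y + M->m[2][2]*v.z + M->m[2][3]*v.w;
    r.w = M->m[3][0]*v.x + M->m[3][1]*v.y + M->m[3][2]*v.z + M->m[3][3]*v.w;
    return r;
}

/* Lambertian diffuse term: max(0, N . L), with N and L as unit vectors. */
static float diffuse(vec4 n, vec4 l)
{
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}
```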
 