News Intel Arc B580 "Battlemage" GPU boxes appear in shipping manifests — Xe2 may be preparing for its desktop debut in December

GPGPU was what was making the rounds in all of the textbooks in the mid-2000s. General Purpose Graphics Processing Unit. The idea was that you could now run non-graphics code on them, which was 'new' at the time.
A site I used to follow, back in the early 2000s, was gpgpu.org. Here's a snapshot from the Internet Archive, to stoke your nostalgia:

At a job I took back in 2002, they even got me an ATI Radeon 9700 Pro, with the idea that we could program it to accelerate some of the computational work we were doing. It never reached that point, but I think its GPU was the first to support floating-point shader programs (using a special 24-bit number format). The kinds of shader programs it could run were extremely limited, both in size and complexity, but it was still enough to make it interesting for a wide range of applications.
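
If you're curious what a 24-bit float even looks like, here's a rough Python sketch of decoding one. The 1 sign / 7 exponent / 16 mantissa split and the bias of 63 are just my guess for illustration; I haven't checked them against actual R300 documentation.

# Rough sketch of decoding a 24-bit float. The s1/e7/m16 layout and
# bias of 63 are illustrative guesses, not confirmed R300 specifics.
def decode_fp24(bits: int) -> float:
    """Decode a 24-bit integer packed as 1 sign, 7 exponent, 16 mantissa bits."""
    sign = -1.0 if (bits >> 23) & 0x1 else 1.0
    exponent = (bits >> 16) & 0x7F      # 7-bit exponent field
    mantissa = bits & 0xFFFF            # 16-bit mantissa field
    bias = 63                           # 2**(7 - 1) - 1
    if exponent == 0:
        # Denormal or zero: no implicit leading 1, minimum exponent
        return sign * (mantissa / 2**16) * 2.0 ** (1 - bias)
    return sign * (1.0 + mantissa / 2**16) * 2.0 ** (exponent - bias)

print(decode_fp24(63 << 16))                 # 1.0 (exponent == bias, mantissa == 0)
print(decode_fp24((1 << 23) | (64 << 16)))   # -2.0

Whatever the exact layout, it was well short of IEEE single precision, which people certainly noticed once they started doing non-graphics math on these things.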

The last thing I'll say about that card is that its fan failed after a year of low-intensity use. ATI was at least easy to deal with on the warranty, and cross-shipped me a replacement.
 
THIS. The rumor wasn't that no Battlemage card would ever see shelves. The rumor was that the flagship challenger die was scrapped, that Battlemage would have fewer models and less availability, and that a desktop Celestial release was questionable and, if it happened, would be even smaller than Battlemage. Basically: the rumor was that the decision had been made to wind things down and that they weren't in this for the long haul, but that they weren't going to immediately chuck out everything they were already deep in development on.

And Pat Gelsinger said on an investor earnings call that they're simplifying the product line and offering fewer SKUs, and that they see "less need for discrete graphics in the market going forward", which seems to square with the rumor.
yeah but MLID says that, and he's a clown. I don't care that it's a logical fallacy. He's a clown
 
No, the first mass market 3D accelerator cards used chips by 3D Labs and Nvidia's ill-fated NV1, but that was in 1995, long before Nvidia started using the term "GPU".

They didn't start using the term GPU until 1999, when they launched the GeForce 256:
GeForce 256 was marketed as "the world's first 'GPU', or Graphics Processing Unit", a term Nvidia defined at the time as "a single-chip processor with integrated transform, lighting, triangle setup/clipping ...

I don't know exactly what the key distinction was that classified it as a GPU, but maybe it was that some of that was accomplished using firmware-programmable engines. If so, it wasn't even the first mass market 3D graphics accelerator to feature a programmable core - that distinction would probably go to the Rendition Verite (which I actually bought, except it got "lost in the mail").

Anyway, at the time of the GeForce 256, 3D accelerators still appeared to software as fixed-function devices. Programmable shaders didn't appear until the GeForce 3, which launched in Feb 2001. Personally, that's the threshold I think should've defined a GPU.
thanks, exactly what i was referring to. kinda figured someone would be nice enough to look it up for me. really the key distinction that made it a GPU is that nvidia's marketing department declared it was a GPU, mostly to stress the whole hardware T&L thing.


GPGPU was what was making the rounds in all of the textbooks in the mid-2000s. General Purpose Graphics Processing Unit. The idea was that you could now run non-graphics code on them, which was 'new' at the time.
GPGPU was the term brought about as a result of programmable shaders. it was a bit of a surprise when i first started hearing about that as up until then graphics hardware had always been for, well, graphics.
 
GPGPU was the term brought about as a result of programmable shaders. it was a bit of a surprise when i first started hearing about that as up until then graphics hardware had always been for, well, graphics.
It goes back further than that, but you're right that programmable shaders are what made it popular. Even back in the mid '90s, you can find papers where people were finding hacks, such as using the blending operators, to harness graphics hardware for certain computational problems (a rough sketch of that blending trick is at the end of this post).

The best place I can think of to look for those would be an archive of old SIGGRAPH papers, since I'm sure there were some about that stuff, but I don't have an ACM membership.
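
To show what I mean by the blending trick, here's a little NumPy stand-in (not real GPU code, and the function name is just mine): with additive blending, every render pass adds its "texture" into the framebuffer, so drawing N full-screen quads effectively computes a per-pixel sum without ever reading the data back.

import numpy as np

def additive_blend_passes(layers: np.ndarray) -> np.ndarray:
    """Emulate N render passes with GL_ONE/GL_ONE additive blending:
    each pass adds one 'texture' into the framebuffer."""
    height, width = layers.shape[1:]
    framebuffer = np.zeros((height, width), dtype=np.float32)
    for layer in layers:          # one "draw call" per data slice
        framebuffer += layer      # dst = src * 1 + dst * 1
    return framebuffer

# Sum eight 4x4 "textures" per pixel and check against a plain sum
data = np.random.rand(8, 4, 4).astype(np.float32)
result = additive_blend_passes(data)
assert np.allclose(result, data.sum(axis=0))

Of course, the real thing ran inside the fixed-function pipeline, usually with 8-bit color channels, which is a big part of why it stayed a research curiosity until floating-point shaders showed up.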
 