abufrejoval
Reputable
My best "AMD" GPU was obviously the Mach8, the very first GPU I owned.
Before that I had had tons of graphics cards, the very first being an EGA in 1984, followed by probably dozens of VGAs over the next 16 years.
But the Mach8 in 1990 wasn't just a graphics card, it was a graphics processing unit (today you'd call it an accelerator) that could draw whole lines or even colored and stippled triangles with a single command!
That was simply awesome at a time when a passive memory-mapped frame buffer was the norm. For early workstations that meant a monochrome display operated via the terrific 32-bit BitBlt algorithm on the CPU at full RAM speed, which gave them this wonderful responsiveness even for pure pixel graphics.
But in the case of most VGAs, it was up to 1 megabyte of frame buffer memory being accessed across a slow 16-bit 8-MHz bus with wait states, through a 64KB window with bank registers: you could almost literally see the pixels being drawn. The "only smarts" these cards had was the ability to "color expand" data you wrote there, meaning that you could write a monochrome bitmap pattern and the VGA chip would expand it to 2, 4 or 8 bits per pixel. That logic never made it to true-color modes on VGAs, which meant that in true color you really could see the pixels getting drawn...almost.
In any case, true-color animation or even video (i.e. fluid motion at >24 Hz) seemed pure science fiction!
And then came the 8514/A, capable of processing a display list of graphics commands: it used bus-master DMA to read the list from CPU memory, then drew into ultra-fast (zero-filling and pattern-expanding) VRAM in its own address space! Wow, just wow!
And the Mach8 was the clone that brought this functionality to the masses who couldn't afford a PS/2 Micro Channel system!
(Funny I just learned that one of the 8514/A guys was an Nvidia co-founder)
I don't actually recall being blown away by the Mach8's speed, to be frank. And it was still an 8-bit (per pixel) pseudo-color device, if I remember correctly. Gaming was still very much a DOS and VGA thing, and I had very little time for that.
Mostly, I believe I was busy trying to make it work with X11R4 on Unix at the time. I was also porting X11R4 to a TMS34020-based GPU, which had a true-color visual at 32 bits per pixel, as part of my thesis.
From then on it was mostly a blur of dozens of GPUs, which would eventually gain true color, video, fluid-motion gaming, and high and even incredible resolutions, in many, many far-too-small iterations.
Today, the idea of a GPU doing bus-master DMA into system memory seems mostly like a security nightmare, but I hear it's quite common in both directions, and even from one GPU to another on systems like DGX workstations.