Ah, Don, the bridge soldering... With your avatar, how could we forget about that? Though you neglected to mention the whole reason the 9500pro came to exist in the first place: R300 chips that (initially, at least) apparently weren't deemed fully capable were binned and sold as 9500s... Or did there turn out to be nothing to that story?
On another note, I thought the GPU for the X1950pro was the RV570, not the R570. Some sources claimed the X1650XT used an RV570 as well, but I'm pretty sure that card used the RV560, unless there were X1650s built on locked-down X1950 GPUs.
Lastly, why is the 4870 listed there? While it's true there was a lot of excitement over how it tended to spank the considerably more expensive GTX 260 "dinosaur," almost all the budget-minded enthusiasts sided with the 4850... or two of them in CrossFire. That's what really started bringing CrossFire up to the same level as SLI, and it also brought dual-card setups into the mainstream enthusiast range instead of leaving them the purview of those with money burning a hole in their pockets.
[citation][nom]Achoo22[/nom]The author is too young and unqualified to write such an article[/citation]
One has to have lived when something was popular to know anything about it? That's news to me.
[citation][nom]Achoo22[/nom]I'm pretty sure there were some crucially important budget CGA/etc cards back in the 80s without which home computing for entertainment purposes would've never caught on.[/citation]
Also, note the presence of the "below $200US" qualifier. Last I checked, there weren't any VGA (or EGA) cards that came anywhere NEAR that price point for the first years of their existence; as an example, I recall ATi making a splash when they first entered the market with their "EGA Wonder" card (their first serious product), which MSRP'd at a whopping $399US. Eventually VGA cards did come down to scrape that price point or so (coincidentally around the time the bulk of PC games started calling for them, in id's heyday), but there were no individual models, as far as I recall, that stood out as a serious "bargain" over the others: the affordable VGA adapters in use were there because virtually ALL of the makers were able to sell them under that price point.
[citation][nom]Achoo22[/nom]Once upon a time, pretty much every version of BASIC was specific to a particular video card - much like all PC games before VGA/UNIVBE/Direct X.[/citation]
Um, no. There was really only one version of the BASIC programming language itself; there were multiple interpreters and compilers for it, though. But even then, they were NOT differentiated by what video card was used: they were differentiated by the CPU and the overall system architecture. None of the major 8-bit systems I can think of before the IBM PC really had that sort of issue with differing display adapters (that is, if they could switch the adapter at all), regardless of which 8-bit CPU they were based on, be it Zilog's Z80, MOS's 6502, or even Intel's own 8085 & 8080.
Again, you'd generally need a different BASIC compiler or interpreter for each system architecture, but that has nothing to do with video cards. What really changed the direction there was the introduction (and subsequent dominance) of the IBM PC architecture, which, ironically (at least in the context of your argument), allowed a much, much wider array of display adapters to be used while a single compiler could still be used to write anything for the system.
In fact, your claim doesn't even make sense: it's the CPU itself (and the system's architecture to a lesser extent) that tends to be the deciding factor in how a high-level language compiles or interprets down to binary. As a result, the registers used tended to be standardized within a single system: that's what made alternative add-in boards compatible with the main system in the first place. The only thing that really tended to be tied to the display adapter was the display itself; a lot of such adapters only worked with a single monitor and vice-versa. That didn't start to really change until the advent of the IBM PC's CGA.
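To put that concretely, here's a minimal sketch (assuming a 16-bit DOS compiler such as Turbo C, where dos.h provides int86 and MK_FP; the specific code is just my illustration, not anything from the article): on the IBM PC, the SAME compiled program could talk to whichever text adapter happened to be installed, because the toolchain targets the CPU and system architecture, and the adapter only changes an address.

[code]
/* Sketch only, assuming a 16-bit DOS compiler such as Turbo C
   (dos.h supplies int86 and MK_FP). One compiler, one CPU target,
   two different display adapters handled by the same binary. */
#include <dos.h>

int main(void)
{
    union REGS regs;
    unsigned seg;
    unsigned char far *video;

    /* BIOS INT 11h returns the equipment list in AX; bits 4-5 hold the
       initial video mode (11 = 80x25 monochrome/MDA, otherwise color/CGA). */
    int86(0x11, &regs, &regs);

    if (((regs.x.ax >> 4) & 0x03) == 0x03)
        seg = 0xB000;   /* MDA text buffer segment */
    else
        seg = 0xB800;   /* CGA text buffer segment */

    /* Write the letter 'A' with a normal (light gray on black) attribute
       to the top-left character cell. */
    video = (unsigned char far *) MK_FP(seg, 0);
    video[0] = 'A';
    video[1] = 0x07;

    return 0;
}
[/code]

Notice that nothing about the language or the compiler changes between the MDA and CGA cases; only a segment address does, which is exactly why the PC could host so many different adapters under a single toolchain.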
[citation][nom]Achoo22[/nom]As a result, you can argue pretty strongly that the cards back then were much more important, historically, than anything on this meager and ill-conceived list. Even if the focus was explicitly on modern cards, the choices were poor.[/citation]
While you'd have had a point if you'd claimed that video STANDARDS back then were more important than they are today in terms of advancing the state of computing and gaming (the shift to VGA meant more than DX11 ever will), that is, unfortunately, not what you claimed. If you go on the basis of individual cards (which is what this article is all about, and what you attacked), you are quite mistaken.
While there were a number of famous CPUs noted for being comparatively affordable for the performance they offered (the 8088 springs to mind in particular, and the 80386SX to a lesser extent), they are not display adapters.
As for the selection of cards presented, I personally recall how much of an impact they made when they were released... and what kind of reaction they got, both from enthusiasts and from less technically inclined gamers and other users. While some of those cards obviously got a bigger reception than others (I'd probably say the 6600GT got the biggest one), all of them made a significant impact in terms of bringing graphics horsepower to users affordably. Since you don't make any alternative suggestions, I gather you don't actually have any direct prior experience there.
[citation][nom]Achoo22[/nom]We are not impressed.[/citation]
To echo your initial claim, what qualifications do you have here? It's pretty clear that you haven't really lived in the enthusiast world of even the PAST few years, let alone that of a few decades ago, and you never made any alternative suggestions, instead just making some vague claims and attacks. Mr. Woligroski is, in fact, very well qualified to speak on the subject of video cards; he's one of the very few writers whose word on the topic I'd take without question.