Ah, Don, the bridge soldering... With your avatar, how could we forget about that? Though you neglected to mention the whole reason the 9500pro came to exist, as a result of (initially) R300s apparently not being deemed fully capable and hence binned for 9500s... Or did there turn out to be nothing to that story?
On another note, I thought the GPU for the X1950pro was the RV570, not the R570. Some sources claimed the X1650XT used an RV570, but I'm pretty sure that's what the RV560 was, unless there were X1650s using locked X1950 GPUs.
Lastly, why is the 4870 listed there? While it's true there was a lot of excitement over how it tended to spank the twice-as-expensive GTX 260 "dinosaur," almost all the budget-minded enthusiasts sided with the 4850... Or two of them in CrossFire. That's what really started bringing CrossFire onto the same level as SLi, and also brought dual-card setups into the more mainstream enthusiast range, instead of just the purview of those with money burning a hole in their pockets.
[citation][nom]Achoo22[/nom]The author is too young and unqualified to write such an article[/citation]
One has to have lived when something was popular to know anything about it? That's news to me.
[citation][nom]Achoo22[/nom]I'm pretty sure there were some crucially important budget CGA/etc cards back in the 80s without which home computing for entertainment purposes would've never caught on.[/citation]
Also, note the presence of the "below $200US" qualifier. Last I checked, there weren't any VGA (or EGA) cards that had come anywhere NEAR that price point for the first years of their existence; as an example, I recall something about ATi making a splash when they first entered the market with their "EGA Wonder" card (their first serious product) that MSRP'd at a whopping $399US. Eventually VGA cards did come down to scrape that price point or so, (coincidentally around the time the bulk of PC games started calling for them in id's heyday) but there were no individual models, as far as I recall, that stood out as a serious "bargain" over any of the others: the affordable VGA adapters in use were present because virtually ALL of the makers were able to sell them under that price point.
[citation][nom]Achoo22[/nom]Once upon a time, pretty much every version of BASIC was specific to a particular video card - much like all PC games before VGA/UNIVBE/Direct X.[/citation]
Um, no. There was only really one version of the BASIC programming language itself. There were multiple interpreters and compilers for it, though. But even then, they were NOT differentiated by what video card was used: they were more based on the CPU and the total system architecture. None of the major 8-bit systems I can think of before the IBM PC really had that sort of issue with differing display adapters, (that is, if they could switch the adapter at all) regardless of which 8-bit CPU they were based on, be it Zilog's Z80, MOS's 6502, or even Intel's own 8085 & 8080.
Again, you'd generally need a different BASIC compiler for each different system architecture, but that has nothing to do with video cards: rather, what really changed the direction there was the introduction (and subsequent dominance) of the IBM PC architecture, which, ironically (at least in the context of your argument) allowed for a much, much wider array of display adapters to be used, but could universally use a single compiler to write anything for the system.
In fact, your claim doesn't even make any sense: it's the CPU itself (and the system's architecture to a lesser extent) that tends to be the deciding factor in how a high-level language compiles or interprets to low-level binary. As a result, the registers used tended to be standardized on a single system: it's what made alternative add-in boards compatible with the main system in the first place. The only thing that really tended to be tied to the display adapter was the display itself; a lot of such adapters only worked with a single monitor and vice-versa. This didn't start to really change until the advent of the IBM PC's CGA.
[citation][nom]Achoo22[/nom]As a result, you can argue pretty strongly that the cards back then were much more important, historically, than anything on this meager and ill-conceived list. Even if the focus was explicitly on modern cards, the choices were poor.[/citation]
While you'd have had a point if you claimed that video STANDARDS then were more important than what they are today in terms of advancing the state of computing and gaming, (the shift to VGA meant more than DX 11 ever will) that is, unfortunately, not what you claimed. If you go on the basis of individual cards (which is what this article's all about, and what you attack) you are quite mistaken.
While there were a number of famous CPUs (particularly the 8088 springing to mind, and the 80386SL to a lesser extent) that were noted for being quite comparatively affordable for the performance they offered, they are not display adapters.
As for the selection of cards presented, I personally recall how much of an impact they made when they were released... And what kind of a reaction they got, both from enthusiasts, and from less-technically inclined gamers and other users. While, obviously, some of those cards got a bigger reception than others, (I'd probably say the 6600GT got the biggest one) all of them made a significant impact in terms of bringing graphics horsepower to users affordably. Since you don't make any alternative suggestions, I gather you actually don't have any direct prior experience there.
[citation][nom]Achoo22[/nom]We are not impressed.[/citation]
To echo your initial claim, what qualifications do you have here? It's pretty clear that you didn't even really live in the enthusiast world of the PAST few years, let alone that of a few decades ago. You never made any alternative suggestions, instead just making some vague claims and attacks. Mr. Woligroski is, in fact, very qualified to speak on the subject of video cards; he's one of the very, very few I would always listen to, because he knows what he's talking about.
I still have a 6600GT in my parts bag for testing, and I remember following Don's advice to get the 7600GT a year or so later. I was happy to see that most of my GPU selections (on up to a HD5770 and a GTX460) made this value list.
[citation][nom]nottheking[/nom]Um, no. There was only really one version of the BASIC programming language itself. There were multiple interpreters and compilers for it, though. But even then, they were NOT differentiated by what video card was used.[/citation]
Google HBasic. I was there, dude. I've been immersed in the culture for 30 years. I'm not making this crap up to bolster my ego, like you are.
I could break down your wall of text and argue point-by-point, but you're not worth the effort.
This is such a broad topic, you really need a reviewer like JC Dvorak or someone who has been reviewing boards for decades. This freaking list doesn't even have any 2d boards. It's a filler article, and we are not amused.
[citation][nom]Achoo22[/nom]Google HBasic. I was there, dude. I've been immersed in the culture for 30 years. I'm not making this crap up to bolster my ego, like you are.[/citation]
An SDK is not a compiler/interpreter. The differences they deal with don't even require a separate program; all they concern themselves with is helping the programmer stay within the system's limits. The resulting binary that has to be compiled remains the same, (evidenced by the fact that the ASSEMBLY used remains the same; as someone who enjoys doing Z80 and 6502 ASM, I can vouch for this) and the compiler remains the same: the GUI and other interface may change, but those aren't part of the programming language at all.
[citation][nom]Achoo22[/nom]This is such a broad topic, you really need a reviewer like JC Dvorak or someone who has been reviewing boards for decades.[/citation]
In all honesty, as someone who's read PC Magazine across decades, I think John Dvorak would write a pretty terrible article and completely miss the point. (He'd likely have, say, entirely disregarded the X800GTO from this list for not having DX 9.0c.) This is an article regarding personal economics, and something very deeply ingrained into enthusiast culture... That's a culture that Dvorak is entirely on the outside of, and Cleeve here is on the inside of.
A strong case here is simply looking over Toms' own history. You must remember that this is a hardware site, after all, and has published a LOT of articles on video cards. For the ones shown in the list, you can just look back at the original articles, which took note of how big an impression in terms of value many of these cards made; not just in the body text, but also the comments.
[citation][nom]Achoo22[/nom]This freaking list doesn't even have any 2d boards.[/citation]
As I'd mentioned (and you clearly opted not to actually truly read) there really weren't any EGA or VGA boards that both stood out in terms of value AND came close to $200US for their MSRP.
There's valid criticism that he left out a few solid, affordable and noteworthy 3D cards, such as the Voodoo Banshee or Riva TNT. But that's light-years more valid than claims over your favorite EGA or VGA board.
[citation][nom]Achoo22[/nom]It's a filler article, and we are not amused.[/citation]
"We?" Just what is with your multiple-personality act? When you presume to speak for others you come off sounding extremely arrogant... Yet you act as if I'm in this for my ego?
I LOVED this article! What a trip down memory lane. At one point or another, I owned most of the cards in this article. I still remember buying my first Voodoo 3 3000, and a few months later grabbing the Hercules GeForce 2 GTS 64 for $399 at Best. I've upgraded my graphics card with almost every new product generation since that first Voodoo card, and I don't think I've ever been as blown away as when I put that Hercules GeForce2 GTS 64 in my rig and played Half-Life at max settings doing 60 fps (previously unheard of). Those were the days when even running your card in 32-bit color mode could seriously affect your FPS. Man, how things have changed! Again, great article.
Great article, nostalgia all the way. Made me feel very smart too, every card I have bought new was on it (except the first, a Voodoo2):
GeForce 3Ti200 (still got, fan died but put a new one on)
GeForce 8600GT (dubious about that being on the list, although it was very cheap)
[citation][nom]belardo[/nom]0 - Amiga (Because back then... PC was pretty much 16 colors, if that)[/citation]
The Amiga's a whole different platform... That, and the OCS it packed in the 1980s only had up to 64 effective colors through HalfBrite mode. By the time the 256-color capable AGA came out in 1992, VGA/MCGA adapters were quite commonly used for gaming on PCs. (HAM mode, in spite of claiming impressive numbers on paper, wasn't flexible enough for gaming usage, hence why HB was seen in virtually all Amiga games)
[citation][nom]Achoo22[/nom]The author is too young and unqualified to write such an article[/citation]
Yawn. Another self important elitist blanket statement with nothing backing it up.
Regardless, you're making assumptions about what you think we *should* be writing about as opposed to what we *are* writing about. I specifically targeted the GeForce MX as a starting point for a reason; you might not like the judgment call, but that doesn't make you right, nor is there any way your opinion could possibly imply how old I am. Your attitude implies something about you, though, although it's not your age.
This list targets sub-$200 values, and we started with the GeForce MX because it pioneered sub-$200 value like no 3D card before it. IMHO.
You could make an argument for earlier models, but for me the MX was the card that started the modern 3D value era.
Had a GeForce 2 MX, then 8800 GT (which I still run), and recently a GTX 460 1GB. I was very happy with them. The 8800 GT was pricy when I bought it - ~$230 or so, but the recent buy of the GTX 460 was only $130. Fantastic card at that price, IMHO.
I have to disagree with the 8600GT/GTS. Worst Nvidia X600GT cards ever made. 6600GT: good. 7600GT: great. 9600GT: ultra great. 8600GT/GTS: an overpriced joke. Going under $100 many months later doesn't say anything. Really. Every piece of hardware loses its value over time. That doesn't make it any greater.
We'll have to agree to disagree. The 8600 GT/GTS and 2600 XT provided a distinguished record of low-cost value for a very long time, and deserved many a recommendation in their heyday. To me, any card that manages that feat deserves a recommendation. Most cards that get cheap as they hit end-of-life don't accomplish that, and are often phased out too quickly to be memorable in that role.
I remember the first dedicated GPU I bought was an X1950. Granted, the next generation of cards had already arrived (this was at the end of 2006). For my first PC build, I eventually bought an HD3870 and later overclocked it with an aftermarket cooler. That GPU lasted me a year and a half before I moved to the HD5770 I've now had for a little more than a year, and it still runs fantastic. And now I'm eyeing the HD7000 series cards as the next upgrade to my system.
Granted, there were a lot of cards that came out and some were better than mine, but it's hard to pay for these items when you're a college student. I'm glad that all of the cards I've bought are on the list and that I didn't make a wrong choice with them.