I've owned a number of ATi-made and nVidia-made cards. To be honest, I think they both make great hardware and acceptable software. (I'll probably never be pleased with the driver suite from either company.) Both have a lot of work to do, and both are making an effort.
I'd say we have a rather viable competition going here. And that's a good thing; nothing like a competitor to stop one company from deciding to sell a video card for $1,000US. Those who don't live in North America would appreciate this even more, as prices for this sort of thing are always inflated there, even BEFORE adding the VAT.
In more detail, though, I'll say that nVidia clearly has the better multi-card solution. CrossFire can be described as nothing short of "cumbersome." Who needs an external cable to transfer data? ATi makes a lot of integrated graphics chips, so why not simply make an asymmetric sort of CrossFire solution that can fit as many cards, of whatever types, as you want in a motherboard, and output through a dual-link DVI port on the motherboard itself? Instead, they have a cheap, inferior copy of nVidia's solution.
On the flip side, when it comes to image quality, as well as video input/output, ATi seems to be the winner here. I still can't get over the fact that nVidia CHARGES EXTRA for their solution, while AVIVO is free. Similarly, nVidia's been stuck with the same 4x "rotated grid" MSAA solution they've had for 4 years, while ATi's bested them with their 6x "sparse sample" MSAA, as well as augmenting it with later features, such as the ability to use it with OpenEXR and even Oblivion's default HDR.
As for the analyst's opinion, they're an idiot, I'll say that. I actually rather loathe such suits who think that they know everything, when it's clear that all they really know is what sort of market value the company has and how large a dividend it's paying out that quarter. These are the kind of people I dislike most: they have SOME brains, but they think that because of what they know, they must know EVERYTHING.
yeah like I'm gonna spend over $400 to get a 1900xtx to play at 800x600 or 1024x768......riiiiiiiiiiiiiiiiiight....especially now that i dropped almost a grand on my new flatpanel with an insanely high resolution on it.....riiiiiiiiight
i think MORE companies should aim their flagship graphics cards at the 800x600 resolution crowd...riiiiiiiiiiiiiiiiiiiiight
"Look mom I just popped $1500 on a new 2400x1800 flat panel and UBER-grpahics card....but i only play at 800x600 cause that's its performance "sweet-spot"'.......riiiiiiiiiiiiight
As the others have said, you should've actually bothered to pay attention when skimming over that post.
The truth is, because most people don't have that $1500US for that flat panel, most (>90%) people who are serious gamers play at one of two resolutions: 1024x768, or 1280x960/1024. Those stuck with even less expensive graphics cards MIGHT go for 800x600, but a larger number of people (close to 5-8%) run at 1600x1200. The number of people who actually go beyond that could be considered negligible; I'd estimate that at most 2% of serious gamers do.
A tug-o-war is a good analogy for their status in the market, I think. ATi is actually older, if I remember correctly. (Correct me if I'm wrong.) And innovation and being "first to market" w/ technology 'X' has always traded hands from the beginning.
<snip>
Correct; ATi, as I recall, was founded in 1985, compared to, if memory serves me correctly, 1993 for nVidia.
And you're quite on point when it comes to the analysis of the graphics wars; a number of "savage blows" have been dealt. Fortunately, we aren't seeing anything as embarrassing as ATi's card performance from 1999 through early 2002, or nVidia's from that point until 2004.
However, I'm afraid that nVidia might be setting themselves up for another fall, with their continued refusal to adopt an asymmetric core design, and the head-scratching inefficiency of the 7950GX2... Let's hope that my worries are not well-founded, for another "lame duck" season of the wars wouldn't be good.
They do suck, a whole load of air. ATI cards sound like jet engines.
ATI cards run hot, turning cases into easy-bake ovens.
ATI drivers are just plain horrible.
That depends on the card; I have an X800XT, and it's the quietest (active) part of my PC. And it's sure a heck of a lot cooler and quieter, with a far lower power draw, than its nVidia counterpart, the GeForce 6800 Ultra.
As for the drivers, I've used a lot of both ForceWare and Catalyst. Both, as far as the drivers themselves go, are fine. I'll say that the settings panels for both kinda suck, though. As for Linux, which I'm fairly certain you (or somebody) is getting at,
we all know how Linux-based games dominate the market. If you want to do professional 3D work, you don't buy a GeForce or Radeon; you go for a FireGL or QuadroFX, or possibly something made by another company like 3Dlabs.
Hmmm... That looks like it doesn't blow out the back. It also looks like an ATI-manufactured card. And it's an X1900 - that means they run a little on the warm side. I'm not saying you're wrong about that...the picture speaks for itself.
I think you knew that they were speaking about the two-slot cards, since those are the chief ones that nVidia fans are pegging for being too loud and hot-running.
Why are you talking about 3-year-old video cards? The 5 series cards are the worst of nVidia; the Ti 4 series performed better. I'm sorry you purchased a 5 series card, but you're talking about nVidia from 2003-2004, during the plague of the FX. This is 2006, and nVidia has the best performance per dollar right now.
Clearly ATI cards are already at 90nm like nVidia
http://en.wikipedia.org/wiki/Comparison_of_ATI_Graphics_Processing_Units
They run so hot because they are clocked too high in an attempt to keep up with nVidia.
The GTXs vent out the back; GTs, however, don't need to take up dual slots to keep cool, because they are basically GTXs with less memory and a lower clock. They run quite cool too, something you cannot say for any single-slot ATI high-performance card.
nVidia's cards are the best performance per dollar? How can you make such a generalization? Are you implying that you'll get more "bang for your buck" going with, say, a GeForce 7950GX2 than a Radeon X1900XT, which costs about half as much, and in some rather popular games, like
Oblivion, just barely falls short?
You can't make generalizations like that; yes, some cards provide better price/performance ratios than others. But both companies make them, and you won't EVER find them at the top of the line.
As for ATi's temperature/clock speed, that's a funny comment... In almost all benchmarks, ATi's X1900 posts pretty solid wins, ones that suggest that clock speeds have little to do with it... Especially since the X1900XTX is hardly clocked faster than the 7900GTX, at 650MHz and 600MHz, respectively.
That's interesting, but it's also worth noting that it's hardly the reference design; if that's the standard, I can show you Radeons that have water-cooling.
And I'd almost swear it's the same grille backing that MSI used for Radeon X850XTs and X1800/X1900 cards.
I heard you had to disable Crossfire to play games. :lol:
Good one!