Why Nvidia, Why ATI, screw Best Buy XD

Currently Nvidia has the edge with its 8800 series in performance and DX10 support - unless you don't want to spend £200 or more, in which case it's more complicated.

Before that, the X1XXXs were probably the better cards out there. They had similar if not superior performance, and could also do HDR + AA.

Before that, the Nvidia 6800s were probably better than the X8XXs, as they had HDR.

Before that, you had the ATI 9700 pro - one of the outstanding cards in recent times.

Further back you have the original GeForce / GeForce 2s - in those days Nvidia ruled.

Maybe ATI will respond to the 8800 with a superior card - who knows.

The thing is to evaluate all alternatives when you buy and get the best features + performance for your budget at that time.
 
Well... Technically there's the issue of ATI having the best AGP card out there period :)

True, but that's once again a case of availability more than the companies. If the GF8800GTS-320 came to AGP tomorrow at a reasonable price (relative to performance), then it'd be the same as the PCIe realm, where nV leads.

Of course for now ATi is the King of AGP (for luddites :twisted: ) ..... AND MATROX IS THE KING OF ISA and PCI-X! ...(and ATi is king of PCI and nV King of PCIe)

Gosh darn it all, you had me looking for an ISA slot on my motherboard. Then I looked at a really old mobo that I had in the back room and found one. Even still had a Matrox card in it.

As to Nvidia vs ATI, I've switched back and forth several times through the years. I can remember when I thought ATI was a bad joke. Then I bought a 9800 Pro and thought it was the best thing ever. After that came an EVGA 7800 GTX which was good for a few months, then an X1900 XTX Toxic, which is doing an RMA thing and relegating me back to my old computer running that old 9800 Pro. Thank God for having a spare computer that runs when the gaming one is down. I've seen the companies trade positions a lot of times and expect that to keep happening.

For the moment, I think Nvidia's 8800 is the best DX9 card that can be bought. The jury's still out from what I've seen when it comes to getting the thing to work with DX10. Will have to wait a few weeks to know if ATI's R600 will compete, win, or fail in the benches.
 
We can talk about this all day, but I think to sum this all up, there's no point comparing the two companies as wholes. When you're in the market for a graphics card, post your budget and you'll get a slew of answers. The good thing about video cards is that (currently) all the newest ones use the same slot, PCI-Express. This isn't true with CPUs, so slot is always a factor with those. When you go with a good company, it's hard to make a really bad decision with video cards.

Abyss, sarcasm is never obvious on the internet. Your post was quite confusing.
 
Gosh darn it all, you had me looking for an ISA slot on my motherboard. Then I looked at a really old mobo that I had in the back room and found one. Even still had a Matrox card in it.

Yeah man, ISA and PCI-X, MATROX... KING OF THE BIG SLOTS !! (hey that rhymes) 8)

ha ha :lol: you're a poet and didn't know it.
but your skis are longfellows :lol:
 
I like ATI cards for only 1 (petty) reason: I can get a red card from ATI. I was all over the 8800 GTX black cards, till they switched to green. I know it's small. But dang nabbit, I want a pimp-looking card!
 
ha ha :lol: you're a poet and didn't know it.
but your skis are longfellows :lol:

LOL!
 
Before that, the X1XXXs were probably the better cards out there. They had similar if not superior performance, and could also do HDR + AA.

That's not strictly true; the X1000s were not fully capable of HDR+AA either. ATi eventually came up with a driver hack to get it working in Oblivion, but to my knowledge it never worked in any other game.

I still remember my first ever gfx card... It was a Diamond Viper, VESA Local Bus (now THAT'S a big card), and the idiot at the store put too much RAM in it, so it had 2MB rather than the 512KB it was supposed to have.

I was so proud, although I always wished I could use like 1MB of that as system RAM so that I wouldn't have to have a "clean" boot disk every time I wanted to run Doom (1)...
 
That's not strictly true; the X1000s were not fully capable of HDR+AA either. ATi eventually came up with a driver hack to get it working in Oblivion, but to my knowledge it never worked in any other game.
FSAA+HDR works in Far Cry also, although from my own experience only 2xAA worked, so I'm still baffled when review sites use 4xAA+HDR in Far Cry. When I try that, FSAA gets disabled and the jaggies remain.

But don't blame ATI for the fact that the game companies don't build the support into their games. ATI's hardware could do it just fine had the option to enable both been built into the game. And the hack you speak of was just the Chuck patch, which is now implemented in the driver. I've been running HDR+AA in Oblivion ever since upgrading to an X1800XT and using the initial version of the Chuck patch. It's simply a much better way of playing Oblivion, IMO, as choosing either/or was a lose/lose situation to my eyes.
 
That's not strictly true; the X1000s were not fully capable of HDR+AA either. ATi eventually came up with a driver hack to get it working in Oblivion, but to my knowledge it never worked in any other game.

Well, like Paul said, you're wrong - it is strictly true. But there's also more to it: it's not just HDR+AA, which the GF7 can do, as can the GF6, X8, and R9700 series cards. The thing to know is what the limiting factor is - like the GF6 being able to do a particular kind of HDR, FP16 HDR (aka OpenEXR HDR), which can't be done in conjunction with hardware MSAA on the GF6 & 7 and remain FP16 throughout, unlike on the X1K, which can.
ATi does not have the same ROP limitation in the X1K as is found in the GF7. And BTW, the real feature is not generic 'HDR+AA', which can be implemented in many ways, but FP16 HDR + FP MSAA, or even just efficient FP16 HDR + AA. Also, not all GF6 cards can do FP16 HDR - it's not an SM3.0 feature. The GF6100, 6200, 6300 series (all but the crippled, rebranded GF6600s) are SM3.0 compliant but can't do the FP16 blending in the ROPs needed for FP16 HDR.

And Serious Sam 2 is another title that can do FP16HDR+MSAA on the X1K:
http://www.behardware.com/medias/photos_news/00/15/IMG0015941.gif
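
If you're wondering how an engine actually finds this out at runtime, here's a minimal D3D9 sketch (my own illustration, not code from any of the games mentioned) that asks whether the adapter can render to an FP16 (A16B16G16R16F) target, and then whether that target can be multisampled. The second check is the one that fails on the GF6/GF7 ROPs but passes on the X1K:

```cpp
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    // Create the D3D9 interface object.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Step 1: can the adapter render to an FP16 (OpenEXR-style) surface at all?
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);
    if (FAILED(hr)) {
        std::printf("No FP16 render target support at all.\n");
    } else {
        // Step 2: can that FP16 render target be multisampled?
        // GF6/GF7-class hardware reports failure here; the X1K reports success.
        DWORD quality = 0;
        hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
            TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &quality);
        std::printf(SUCCEEDED(hr)
            ? "FP16 HDR + 4x MSAA supported.\n"
            : "FP16 HDR yes, but no MSAA on it.\n");
    }
    d3d->Release();
    return 0;
}
```

In Oblivion's case the combination was simply never exposed in-game, which is why the Chuck patch had to force it from the driver side.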

The reason a 'hack' was required for Oblivion is that Bethesda removed the option on PC (the game launched alongside the X360 version) because of 'performance issues', despite the fact that the X1800XT does HDR+AA+HQAF faster than the X360. Perhaps it was due to there only being one-sided 'performance issues', and it just so happened to be on the side that was TWIMTBP - another example of sponsored titles not always being the best choice (and that plays both ways). Stuff like that hurts the consumer; it doesn't help.