Best Graphics Cards For The Money: Jan. '09


fledgling101

Distinguished
Jan 9, 2009
2
0
18,510
It's good to hear that the issues are being worked on; maybe you'll pull it off and manage to combine the best of the current and the older articles. Good luck with any ventures in that direction. :)
 

channelz

Distinguished
Dec 4, 2008
2
0
18,510
When you list the "best cards for the money," would it be useful to include not only the initial cash outlay, but also to factor in the differential cost of electricity to operate the card over its projected lifetime?
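For what it's worth, that kind of comparison is easy to sketch out. The wattages, usage hours, and electricity rate below are purely hypothetical numbers chosen for illustration, not figures from the article:

```python
# Rough sketch of the lifetime-cost comparison suggested above.
# Every number used here (wattage, hours, rate) is a made-up assumption for illustration.

def lifetime_cost(card_price, load_watts, hours_per_day, years, cents_per_kwh):
    """Purchase price plus the electricity the card draws over its projected lifetime."""
    kwh = (load_watts / 1000.0) * hours_per_day * 365 * years
    return card_price + kwh * (cents_per_kwh / 100.0)

# Hypothetical example: a $150 card drawing 110 W vs. a $130 card drawing 160 W,
# gaming 3 hours a day for 3 years at 12 cents per kWh.
print(f"Card A: ${lifetime_cost(150, 110, 3, 3, 12):.2f}")  # ~$193.36
print(f"Card B: ${lifetime_cost(130, 160, 3, 3, 12):.2f}")  # ~$193.07
```

With those made-up numbers, the cheaper but hungrier card ends up costing about the same over three years, which is exactly the sort of thing a buyer's guide could fold in.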
 

chewbenator

Distinguished
Jul 5, 2006
246
0
18,680
In a crazy price fluctuation today, Buy.com and its affiliate sites on eBay and Amazon cleared their stock of most GTX 280 models. The prices ranged from $180 AR with a coupon to $230 with a coupon, with free shipping available on some listings. I personally picked up an EVGA GTX 280 Superclocked for $190 AR with free shipping. Because of the semi-static nature of these articles you can't predict prices or keep them up to date, but at ~$200 the GTX 280 was a steal. Only time will tell whether today's prices will be seen again once the new Nvidia offerings arrive, but congrats to anyone who got in on this.

 
G

Guest

Guest
I was able to get the Radeon HD 4670 GDDR3 from NewEgg two weeks ago for $59.99 after a $20 mail-in rebate (plus shipping).
 

GeneralKarl

Distinguished
Jan 11, 2009
2
0
18,510
Last month for the AGP bus?? Nonsense. I have four AGP computers, all pending graphics card upgrades. It will be many, many years before I build a game computer that does not use an AGP bus, and I am far from alone. Virtually all of the gamers at my game portal use AGP computers to play the very latest games.

Do we get 100% of the capability of the new, inefficient game graphics engines? Nope, but it would be silly to buy several "upgrades" in motherboards and graphics cards just to transfer game developers' costs to us by letting them push out poorly written graphics code and half-developed graphics engines.

Looks like it will be the 3850 for me; not that the article included the price, but hey :D
 
G

Guest

Guest
Can we stop with the "tie"? Please?

It really doesn't help. Anyone can just go by price on Newegg and see that between $xxx and $yyy the cards are... surprise, the ones you picked. Great job on the research, not! Really though, if you are going to have a "best blah for the money," just pick one for each category. Ties really don't narrow anything down when there are only two competitors.
 
G

Guest

Guest
You really need to start categories for notebook GPUs and CPUs as well. There are plenty of gamers who prefer notebooks over desktops.
 

cleeve

Illustrious
[citation][nom]stinkmuffin [/nom]Can we stop with the "tie"? Please?
[/citation]

Well, I went down to Newegg today and told them they should arbitrarily raise the price on a graphics card if it performed similarly to another card of equal price. But you know what? They said no. Weird, huh?

Seriously, I'm not going to flip a coin to choose one. If it's a tie, that means I'm happy recommending either card. It also means you can't go wrong with either one. So if it bugs you, just read the first one from now on and pretend the second one isn't there... :)



 

cleeve

Illustrious
[citation][nom]GeneralKarl[/nom]Last month for AGP bus?? Nonsense, I have 4 AGP computers all pending graphics card upgrades. [/citation]

Whoa, dude! Now, I never said to take a handgun to your existing AGP build, but the facts are these: 1. nobody is buying a new AGP motherboard, and 2. the only two viable cards for an AGP upgrade have been the 2600 XT and 3850 for MONTHS now. The category is of limited usefulness in the guide because there's really not much left to buy out there anymore.

If they intro new AGP cards, I might revisit it for a month. AGP is a great bus, but its time has passed. Once again, I'm not suggesting you throw away your AGP system, just saying its place in a buyer's guide has become obsolete.
 

rener

Distinguished
Jan 13, 2009
14
0
18,510
What about the Nvidia 9800 GTX/9800 GTX+ 512 MB GDDR3? This card is kick-a**, especially in an SLI configuration. I just bought one yesterday for $187.95 with free shipping and no tax (out of state), and with the $30.00 mail-in rebate it will come out to $157.95.
 


cleeve

Illustrious
IMHO, the 9800 GTX+ isn't as attractive as the cheaper 4850, which also does extremely well in Crossfire configuration. And for the same price as a 9800 GTX+ you can often find a 1GB version of the 4850.
 
G

Guest

Guest
The hidden issue I see is the Nvidia "PhysX" engine, which requires dual-core cards in order to be used (a core can do 3D graphics or PhysX processing, but not both at the same time, from what I believe).

So if you have a dual-core card and enable PhysX, will your game frame rate drop, etc.?

Can anyone expand on this? Are all the top single cards dual-core (i.e., among non-SLI cards)?

Thanks
 
G

Guest

Guest
Please create a CPU Hierarchy Chart as well. These Intel naming conventions are becoming very confusing!
 

wt684

Distinguished
Jan 12, 2009
6
0
18,510
I'm trying to decide between getting an HIS 4850 IceQ4 Turbo 512 MB, or a Sapphire 4850 1GB (reference cooler).

The price is about the same (in Canada) since the IceQ4 has the better cooler.

Can someone tell me if the 1 GB card performs enough better that I can disregard the extra cooling?
To me, if the performance difference on the 1 GB card is only 5% or less, I would be inclined to go with the better cooling.
 

rener

Distinguished
Jan 13, 2009
14
0
18,510
Well, what I heard is that you should stick with Nvidia because all games will be made using their GPUs exclusively. Look at the deal between EA and Nvidia; it's already starting to happen. The graphics will look like caca, as they already do on ATI compared to Nvidia.
 

cleeve

Illustrious
[citation][nom]rener[/nom]Well, what I heard is that you should stick with Nvidia because all games will be made using their GPUs exclusively.[/citation]

Wow. That's a complete and utter fabrication.
You're treading deep into blind fanboy territory now.

As for the rest of your made-up 'facts', image quality between ATI and Nvidia cards is on par, and many folks will say that ATI's IQ is better, especially when it comes to transparency antialiasing.

And as far as EA is concerned, please let us know about an EA game that doesn't work on ATI cards, because I've never heard of one. Good luck finding one.

You're entitled to a brand preference if you like, but spreading falsehoods because you're being a fanboy is pretty lame... :p
 

rener

Distinguished
Jan 13, 2009
14
0
18,510
[citation][nom]Cleeve[/nom]Wow. That's a complete and utter fabrication. You're treading deep into blind fanboy territory now. As for the rest of your made-up 'facts', image quality between ATI and Nvidia cards is on par, and many folks will say that ATI's IQ is better, especially when it comes to transparency antialiasing. And as far as EA is concerned, please let us know about an EA game that doesn't work on ATI cards, because I've never heard of one. Good luck finding one. You're entitled to a brand preference if you like, but spreading falsehoods because you're being a fanboy is pretty lame...[/citation]

You better read up, bwoy, and get some knowledge in your little cabesa. Grab the February 2009 issue of Maximum PC, check out page 08, and put your noodle brain to work. It says that EA and Take-Two signed on to use the physics acceleration API, which means that if you own a GPU from the Nvidia 9800 series and up, PhysX will automatically be enabled, and if you own an ATI card, which doesn't support PhysX, you will not be able to enjoy the cloth effects and enhanced particle effects, and the graphics will be toned down. Sorry, buddy, to put you down about ATI, but hey, caca happens. And yes, I have two PCs, one Nvidia and one ATI, both running AMD.
 

cleeve

Illustrious
[citation][nom]rener[/nom]You better read up, bwoy, and get some knowledge in your little cabesa... It says that EA and Take-Two signed on to use the physics acceleration API.[/citation]

Hahaha. PhysX? You're talking about PhysX when you suggest that "all games will be made using their GPUs exclusively"?

Here are a few thoughts for *your* cabesa, fanboy: PhysX support does not mean that games will not run on ATI cards. I don't know who told you otherwise, or if you just made it up to suit your brand preference.

Were you aware that many current games support PhysX, and it makes pretty much no difference? Games like Mass Effect, Gears of War 1 & 2, Mirror's Edge, UT3, etc.? And all of these games are fully playable on ATI cards. In fact, the difference that 'PhysX' makes is so negligible that nobody really cares.

Heheh. PhysX... seriously. If we ever see a 'killer app' that makes PhysX a must-have, you can come back and not look so noobish. It might happen, but it's not happening yet. For now, PhysX support means some bits of cloth, plastic, and smoke added. I think you might be overestimating the importance of that a tad when you suggest that "all games will be made using their GPUs exclusively"...

By the way, did you know that Nvidia is helping get PhysX working on ATI cards?
http://www.tomshardware.com/news/nvidia-ati-physx,5841.html
If a lone dev can make this kind of progress, it looks like once that 'killer app' arrives there's a good chance PhysX will be working on ATI cards...

As far as which cards you own, is there a reason I should care? I don't think that impacts the misinformation you're spreading. If you just want to compare, I'm running dual 8800 GTS 512MB cards. :)
 

drewp29

Distinguished
Jan 7, 2009
66
0
18,630
I don't believe my 8800 GTS 512 is equivalent to a Radeon 4830. Every review I have read rates its performance closer to, or better than, the 4850. This is the reason I haven't upgraded yet... I'd have to go to a 280 to see major improvements, and THAT amount of money isn't warranted until Lionhead gets off their bums and jolly well puts out Fable 2 on PC. Which hopefully will be 3-4 months from now (fingers crossed), but that's unlikely. And at that point the 295 should be in my price range...
 