Nvidia's GeForce GTX 285: A Worthy Successor?

Guest
... Hmm... Just to make a point against the people in the dual-GPU-hating business... if you look at the scores for the Radeon HD 4870 X2 and the manufacturer-OC'd GTX 285, there is about a 27.6% difference in performance... and, would you look at that, about a 26.5% difference in price. So you can move from the GTX 280 to the 285 for a reduction in value... while between the GTX 285 and the Radeon HD 4870 X2 there is an INCREASE in value (which isn't common in the market these days). See the sketch below.

Also, for those wondering why HD 4850 X2 reviews are rare: the card is exclusive to Sapphire at the moment, and because there are no reference drivers from ATI/AMD, it has quite a few bugs right now. (I haven't extensively researched this, but that's what I last heard.)
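
To make the value comparison above concrete, here is a minimal performance-per-dollar sketch in Python. The prices and scores are placeholders chosen only to reproduce the ~27.6% and ~26.5% deltas mentioned above; the post does not give the underlying numbers:

[code]
# Hypothetical figures for illustration only: chosen to reproduce the
# ~27.6% performance and ~26.5% price gaps described in the post.
cards = {
    "GTX 280":      {"perf": 100.0, "price": 330.0},
    "GTX 285 (OC)": {"perf": 108.0, "price": 380.0},
    "HD 4870 X2":   {"perf": 137.8, "price": 480.7},  # +27.6% perf, +26.5% price vs 285 OC
}

for name, card in cards.items():
    # "Value" here is simply performance points per dollar.
    print(f"{name}: {card['perf'] / card['price']:.3f} perf/$")
[/code]

With these placeholder numbers, perf/$ drops going from the 280 to the 285 but rises slightly again at the 4870 X2, which is the "increase in value" the post describes.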
 

noobinberg

Distinguished
Jan 27, 2009
I too wish this article touched on OCing these cards... this is Tom's, after all, and don't we all OC everything? I'm pretty sure everyone will agree that stock speeds on anything are useless.
 

68vistacruiser

Distinguished
Jan 22, 2009
Since this is an article about Nvidia's 285, and whether it's a better choice than the 280, the 4870 X2 should not have been put in at all. For that matter, the 295 should have been pulled too, and replaced with a 260 (216). I'm interested in the store-shelf performance vs. $$$ this article meant to explore for the 285 compared to its siblings.
 

wavebossa

Distinguished
Sep 25, 2008
Good review but not great. And the reason why I can't call it great is simply because...

In this day and age, who honestly reviews a graphics card without testing its OC capability?

Not trying to sound ungrateful, just trying to make sense of your decision not to OC this card.
 

nerrawg

Distinguished
Aug 22, 2008
I have to say that I second wavebossa's opinion; I would like to have seen Tom's take on whether or not the 55nm process gives better OC headroom.

For those of you who want to read more about 285 OC ability look here:

EVGA: http://hardwarelogic.com/news/133/ARTICLE/5639/1/2009-01-27.html
BFG: http://www.driverheaven.net/reviews.php?reviewid=711
ASUS: http://vr-zone.com/articles/nvidia-geforce-gtx-285/6431.html?doc=6431
Leadtek: http://www.techpowerup.com/reviews/Leadtek/GeForce_GTX_285
Inno3D: http://www.hexus.net/content/item.php?item=16880

Hopefully these give you some reference for how the GTX 285 overclocks; there will of course be differences between brands and between individual cards.
 

novastorm

Distinguished
Jan 2, 2006
[citation][nom]Proximon[/nom]Perfect. Thank you. I only wished that you could have thrown in a 4870 1GB and a GTX 260+ into the mix, since you had what I'm guessing are new beta drivers. Still, I guess you have to sleep sometime[/citation]

In fact, that is the GTX 260+ they are talking about here... 216 stream processors, as opposed to the old, normal GTX 260, which has 192.
 

A Stoner

Distinguished
Jan 19, 2009
I overclocked my card yesterday. I have the vanilla XFX GTX 285, not factory overclocked, and I am pretty disappointed. I was hoping I would be able to get a 725 MHz GPU clock. I got it to 725, but after about three minutes it started throwing errors in ATITool, and I could not get rid of them until I let the card cool down; the temperature at which the errors appeared was 81°C. I was able to stay error-free up to a 702 MHz GPU clock. The memory remained stable at 1400 but completely crashed the system at 1415, so my final overclock was 702 on the GPU and 1360 on the memory. That was stable for 750 minutes, almost 12.5 hours.

I do not have the best power supply, so I am not sure whether the reason for my crashes is the card or a lack of enough juice for it. I was running my Q6600 overclocked to 3.00 GHz with Prime loading three cores while ATITool ran on the fourth, and my processor and memory are perfectly stable at that level for 24 hours in Prime. So, as I said, it is possible my power supply is the reason for my lower-than-expected overclock.
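
A Stoner's trial-and-error procedure (pick an optimistic clock, stress it, back off on errors, then soak-test the survivor) is essentially a search for the highest stable frequency. Here is a minimal sketch of that loop in Python; set_gpu_clock and run_stress_test are hypothetical stand-ins for whatever tuning and artifact-scanning tools you actually use (ATITool, etc.):

[code]
def find_max_stable_clock(set_gpu_clock, run_stress_test,
                          stock=648, target=725, step=5):
    """Step the core clock down from an optimistic target until the
    stress test passes. Both callbacks are hypothetical stand-ins for
    your real tools (e.g. an ATITool-style artifact scanner)."""
    clock = target
    while clock > stock:
        set_gpu_clock(clock)
        # Quick screen first, then a longer soak on anything that survives.
        if run_stress_test(minutes=10) and run_stress_test(minutes=60):
            return clock          # highest clock that held up
        clock -= step             # errors appeared: back off and retry
    return stock                  # nothing above stock was stable
[/code]

The 750-minute run A Stoner did is the long soak step here; a short screen first saves time on clocks that fail within minutes, like his 725 MHz attempt.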
 

hannibal

Distinguished
A Stoner: Try running the Q6600 at its normal speed, so there is more power available for the GPU. What power supply do you have? The 285 does not eat very much power... but of course there are also differences between individual GPUs, and not all of them reach the same speeds.
 

nottheking

Distinguished
Jan 5, 2006
Good to see some pretty thorough work done on this, though as totenkopf mentioned, I think most of us knew how it'd go for the most part, especially with just those cards chosen: it'd edge out the 280 consistently, and the dual-GPU cards would wipe the floor with it. More pertinent to what most enthusiasts were wondering was power consumption, which showed that the 285 is perhaps a slight disappointment; sure, it's better, but not by a whole lot.

One more question that came to mind that wasn't answered was noise: the cooler on the 285 looks similar, if not identical, to the one on the 280, but given the cut in TDP, I wonder whether the fan can now run slower and quieter.
[citation][nom]totenkopf[/nom] Maybe it's not that likely, but you know there are idiots out there with a 1200w PSU powering a dual core and a gtx260 getting terrible efficiency. It might be a pretty small window to actually get savings with a higher consumption card, but I'm sure that PSU efficiency mitigates the benefits of low power parts more than we think.[/citation]
Not exactly; that's not quite how electricity works. Power is "pulled" rather than "pushed": a PC, even at the PSU, won't draw more power than it needs. A 1200 W PSU does not by default draw any more power than a wimpy 300 W unit; the wattage rating merely indicates how much it can deliver without risk of bursting into flames. As a general rule, with the same parts, a higher-rated PSU will actually draw less from the wall than a cheaper one, since higher efficiency is usually needed to cut the conversion heat that can kill a PSU.
 

totenkopf

Distinguished
Dec 11, 2007
Nottheking, you missed my point. I know that a 1200 W PSU doesn't pull 1200 watts from the wall at all times; I was talking about efficiency in particular. For example, say Jack has a 750 W PSU that is 85% efficient when delivering 500 watts, while Jill's 1200 W PSU is only 70% efficient when delivering 500 watts of power. Jack is then pulling about 588 watts from the wall while Jill is pulling about 715 to deliver the same 500 watts to their machines.

Of course, those efficiencies are made up, but generally speaking it seems that PSUs are most efficient between 50-75% load. If that's true and your system's peak load were 500 W, you would want a PSU with a max output of about 650-700 W (as idle would draw less). With all this in mind, I was wondering if there is a point where your PSU is so inefficient that you would actually draw less power from the wall by buying a more power-hungry card, because it pushes the PSU into a more efficient band (a decrease in wall draw while delivering more power to the system). Like I said, I think it's possible... there's probably just a small window that requires some pretty specific circumstances and probably wouldn't amount to much money saved anyway.
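
The scenario is easy to play with numerically: wall draw = DC load ÷ efficiency, so a hungrier card only saves wall power if it pushes the PSU into a sufficiently better efficiency band. A minimal sketch in Python, with made-up efficiency curves in the same spirit as the figures above:

[code]
def wall_draw(dc_load_w, eff):
    """Watts pulled from the wall to deliver dc_load_w to the system."""
    return dc_load_w / eff

def efficiency(psu_rating_w, dc_load_w):
    # Made-up curve: a sweet spot at 50-75% of rated load, worse elsewhere.
    frac = dc_load_w / psu_rating_w
    if 0.5 <= frac <= 0.75:
        return 0.85
    if frac < 0.2:
        return 0.70
    return 0.80

# Jack: 500 W on a 750 W unit (67% load, sweet spot) -> ~588 W at the wall
print(wall_draw(500, efficiency(750, 500)))
# Jill: 500 W on a 1200 W unit (42% load) -> ~625 W at the wall
print(wall_draw(500, efficiency(1200, 500)))
# A hungrier card on Jill's PSU: 620 W (52% load, sweet spot) -> ~729 W
print(wall_draw(620, efficiency(1200, 620)))
[/code]

Even with these made-up curves the hungrier card still loses at the wall (~729 W vs ~625 W): the efficiency gain (0.80 to 0.85) is smaller than the extra load, which matches the hunch that the window for real savings is narrow.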
 

A Stoner

Distinguished
Jan 19, 2009
[citation][nom]hannibal[/nom]A Stoner: Try running the Q6600 at its normal speed, so there is more power available for the GPU. What power supply do you have? The 285 does not eat very much power... but of course there are also differences between individual GPUs, and not all of them reach the same speeds.[/citation]
Yeah, I was thinking about setting my CPU clock back; I just don't spend a lot of time on this. For the most part, the games I play would do just fine on a 6800 Ultra, which is what my laptop runs all the same games on. I was just trying to answer some people's questions about clocking the card. My power supply is a Thermaltake Toughpower 700 W, I think. It only has a single 6-pin and a single 8-pin PCIe power connector, so I had to use the 6-pin adapter that came with the card to get both connectors hooked up. I know 700 watts is more than my computer uses in total; my wall pull is about 480 watts at load and about 230 at idle. My concern is the 12 V rail or rails supplying the card: maybe one of them is also feeding one of the other high-power components, such as the motherboard/CPU. I am also running 6 hard drives, and while no single drive draws much, with 6 of them it may add up (see the rough budget sketch below).

I think I mentioned earlier that I replaced an overclocked 8800 GTS 640 with this GTX 285. I have not checked to see which card should pull more power.
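
One way to sanity-check the rail question is to rough out a 12 V budget and compare it to the per-rail limits printed on the PSU label. All wattages below are placeholder estimates, not measurements:

[code]
# Rough 12 V budget sketch; every figure is a placeholder estimate.
loads_12v_w = {
    "GTX 285 under load (est.)": 180,
    "Q6600 @ 3.0 GHz (est.)":    110,
    "6 hard drives (est.)":       60,   # roughly 10 W each when active
    "fans, misc (est.)":          20,
}

total_w = sum(loads_12v_w.values())
print(f"~{total_w} W on 12 V -> ~{total_w / 12:.0f} A total")
# A multi-rail 700 W unit splits its 12 V capacity across several
# rails (often around 18-20 A each), so a single rail can hit its
# limit even though the total wattage looks comfortable.
[/code]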
 

yipsl

Distinguished
Jul 8, 2006
[citation][nom]Daeros[/nom]In other words, no matter how well ATI's strategy of using two smaller, cheaper GPUs in tandem instead of one huge GPU works, you will still be able to say that Nvidia is the best. Also, why would most people who are spending $400-$450 on video cards not want a dual-card setup? Most people I know see it as a kind of bragging right, just like water-cooling your rig. One last thing: why is it so hard to find reviews of the 4850x2?[/citation]

ATI's solution is a two-card setup for all practical purposes. Two 4870 X2s take you into CrossFireX territory, and both triple SLI and CrossFireX are immature technologies. Newegg is shipping me a Sapphire 4870 X2 as a warranty replacement for my recently dead MSI 3870 X2, so I'll be happy for a year.

I've noticed, too, that most sites don't review the 4850 X2. Perhaps because it's not a reference AMD design and standard Catalyst releases don't work well with it? It's my understanding (correct me if I'm wrong) that you need drivers from the card manufacturer instead. It's rumored that AMD will be coming out with a reference design, so expect more coverage of the 4850 X2 in the future.

Can't wait to get my ASUS 21.5" 1920 x 1080 LCD and a Phenom II 940 next week to go along with the basically free 4870 X2 upgrade! As for Nvidia being the best: in monster single GPUs, yes, but as more titles support CrossFire and SLI, dual-GPU cards have a future.
 

mf_fm

Distinguished
Jun 30, 2006
Forget cards that cost over USD $250, all of them, period.

There is NO need to pay over USD $250 for ANY video card. It's 2009 already, and video card prices stay at almost the same inflated level EVERY YEAR.

Why does a slightly better spec cost USD $400-600? Every time! They've been trying to bleed us dry since the '90s.

SLI/CF is crap on top of crap; make a proper multi-core GPU already instead of squeezing us again. Stop telling us gamers need this hardcore stuff; it's lies and marketing.

If DDR2 can be dirt cheap, ANY part can be. It's not like they're seafood (there's no season for them).
 

iamtheone

Distinguished
Apr 8, 2007
I do not understand why anyone thinks multi-GPU is the way to go. A previous comment mentioned a problem called microstuttering, which adds latency for every additional GPU (if you game online), so why does this website play into the marketing? SLI and CrossFire are not better than a single-GPU solution except in games that produce enough FPS to compensate, and there are not many of those. A 60 FPS cap won't allow for the needed headroom, so if you spent all that money you have wasted it, unless you like bragging about the benchmarks you get in a game you only play by yourself, against stupid bots!
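
For anyone unfamiliar with the term: microstuttering is about uneven frame pacing rather than average FPS. A toy illustration in Python, with made-up frame times:

[code]
# Toy illustration of microstutter: same average FPS, very different feel.
even   = [20, 20, 20, 20, 20, 20]  # ms per frame: a steady 50 FPS
uneven = [8, 32, 8, 32, 8, 32]     # AFR-style alternation, also 50 FPS average

for name, times in (("even", even), ("uneven", uneven)):
    avg_fps = 1000 * len(times) / sum(times)
    worst = 1000 / max(times)      # the pace the slow frames feel like
    print(f"{name}: {avg_fps:.0f} FPS average, worst-frame pace {worst:.1f} FPS")
[/code]

Both sequences average 50 FPS, but the alternating one feels closer to 31 FPS, which is why benchmark averages can flatter multi-GPU setups.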
 

hannibal

Distinguished
Multi-GPU is a solution where it is easier to control the heat, and production efficiency is better because smaller GPUs are easier to manufacture (fewer dies that fail to meet spec; see the yield sketch below). The same small GPU can also be used in more diverse products (one core at the low end, two or more higher up).

The problems are, as you said, the stuttering and multi-GPU driver support.

That's why we need these tests, so we can see whether a multi-GPU solution is worth it. So far it has proven to be: dual-GPU ATI cards have been relatively good compared to equivalent single-GPU Nvidia solutions. But there are differences; in some cases one core has been better, in others the multi-GPU solution has been faster.
When ATI's last mega-size GPU did not fare well, they had to make changes, and they definitely did. But Nvidia has also proven that a single-GPU solution has its strong points. At the moment it seems a little cheaper to get more speed for the same money with a multi-GPU solution, but that can change in the future, and it depends on the applications you use.
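
hannibal's yield point can be illustrated with the standard Poisson die-yield approximation, yield ≈ e^(−D·A) for defect density D and die area A. The defect density below is a placeholder; the die areas are roughly those of a GT200-class monster chip versus an RV770-class chip:

[code]
import math

def die_yield(defects_per_cm2, die_area_cm2):
    # Poisson yield model: probability a die lands with zero defects.
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.4                  # placeholder defect density, defects per cm^2
big, small = 5.76, 2.56  # ~576 mm^2 vs ~256 mm^2 dies, in cm^2

print(f"big die:   {die_yield(D, big):.0%} yield")   # ~10%
print(f"small die: {die_yield(D, small):.0%} yield") # ~36%
[/code]

At the same defect density, the small die yields several times more good chips per wafer (and more dies fit on a wafer to begin with), which is the production-efficiency edge behind the two-small-GPUs strategy.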
 

mikeny1

Distinguished
Feb 12, 2009
Where the heck is the February 2009 update to Best Graphics Cards for the Money!? For crying out loud, it's already February 12th, guys!! Geez...
 

bad_code

Distinguished
Feb 20, 2009
Wow! I can't believe the interest in building an expensive machine just to get 30-40 frames a second. And then to overclock it, just to get that frame rate.
 
Guest
Stay away from the EVGA cards with the passive, heatsink-only cooling solution (no fans). In SLI they sit too close together and generate enough heat to shut down the system while playing Crysis in Ultra graphics mode. (It looked great for about 20 minutes, though.) Just a word to the wise from someone who is now replacing these cards with a single 295.
 