GeForce GTX 285 Gets 2 GB: Gigabyte's GV-N285OC-2GI


rooket

Distinguished
Feb 3, 2009
Could it be that the drivers or the games aren't coded to take advantage of the additional 1 GB of memory? I would usually expect some gains from more memory in video modes above 1080p. But oh well, you get more e-peen if you tell all your friends that you have a 2 gigabyte graphics card anyway, so it may be worth it just for that ;)
 

amdgamer666

Distinguished
Oct 24, 2008
We all know that the 2 GB is quite pointless in a single-card config, so... can we get a 2x2 GB SLI vs. 2x1 GB SLI comparison article? Pretty please with sugar on top?
 

anonymous x

Distinguished
Apr 22, 2008
[citation][nom]JonnyDough[/nom]What they need to do is fire a good chunk of their marketing dept. and stop focusing so much on sales and put more of that capital into engineers. They need to get on board with GDDR5 and lower power levels, and quit messing around with multiple cards, crappy sophisticated drivers, and GPUs that overheat. Who cares who owns the top end? Give us more competition in the mid ranged and actually stay competitive. NVidia is starting to hurt a little from AMD's manufacturing process/GDDR5/and targeting of the mid-low range masses. They need to refocus their efforts on making great GPUs, and stop worrying about their naming scheme. Their next card could be labled "5" and the next one "10". Variations that differ from the standard performance can be spaced in between these numbers. So for instance, they could create a standard for a "3" which might be a "5" with less memory or limited lanes.[/citation]
What you need to do is realize that Nvidia cards get the same memory bandwidth as the ATI GDDR5 cards, that ATI has multi-GPU cards as well (4850 X2 and 4870 X2), read that many people prefer Nvidia's drivers, and compare the idle/load temps for the 4800 series to GT200 (hint: GT200 cards are on average cooler). Competition in the mid range? 4830 vs. 9800 GT, 4770/4850 vs. GTS 250, 4870 vs. GTX 260, 4890 vs. GTX 275. Hurting? You do realize that Nvidia's latest cards are made on a 55nm process, just like the ATI competition, and that the 40nm 4770s aren't selling like hotcakes. What do you expect? Nvidia to start their own fab? That costs billions. Ditch TSMC and go to GlobalFoundries? Don't make me laugh. I don't see ATI using your naming scheme either. Who would have thought that a lower number like 4770 would be better than a higher number like 4830? It's not hard to use Google to search for benchmarks. Would you buy a car just because of its name? Nvidia does make great GPUs: the fastest you can get.
 

azone

Distinguished
Oct 9, 2008
They should have tested Grand Theft Auto IV. That game eats graphics memory.
I have a 4850 1 GB, and after I put the textures and shadows on high I'm at the 1 GB limit, and there are still more things I can turn up. 2 GB is needed for max settings in that game.
 

Spanky Deluxe

Distinguished
Mar 24, 2009
This is a great-sounding card... but only for the few. I.e., this is an absolutely superb card for people doing CUDA or OpenCL work. It's got more memory than an nVidia Tesla C870 and more memory per GPU than a D870 or S870. It's also halfway towards the amount of memory in a Tesla C1060 and is a superb bang-for-buck compromise unit.

Shove four of these into an Asus P6T7 motherboard with a standard Core i7 CPU and some standard DDR3 memory (no need for ECC for CUDA since the cards don't support it anyway), and you've got yourself an incredibly good value-for-money CUDA-based personal supercomputer. You could build that for about $2,500. A CUDA machine with 8 GB of super-fast on-GPU RAM (2 GB per card) and 960 stream processors at around 1.5 GHz, all for only $2,500... wow.
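
For anyone curious what a box like that looks like from the CUDA side, here's a minimal sketch (purely illustrative, assuming a CUDA toolkit is installed; the device names and counts are just whatever the runtime reports) that enumerates the boards and adds up their on-card memory:

[code]
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found.\n");
        return 1;
    }

    size_t totalBytes = 0;
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // totalGlobalMem is per board: four 2 GB GTX 285s would each report ~2 GB
        std::printf("GPU %d: %s, %zu MB, %d multiprocessors\n",
                    i, prop.name, prop.totalGlobalMem >> 20, prop.multiProcessorCount);
        totalBytes += prop.totalGlobalMem;
    }
    std::printf("Aggregate on-GPU memory: %zu MB across %d device(s)\n",
                totalBytes >> 20, count);
    return 0;
}
[/code]

Compile with nvcc and it lists each card separately; note the memory is per board, not one pooled 8 GB.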
 

rambo117

Distinguished
Jun 25, 2008
[citation][nom]roast_pork[/nom]I've had a 4870 X2 times 2 in Crossfire since last Oct. and they are worth the cash![/citation]
showoff :p
 

Kill@dor

Distinguished
Apr 28, 2009
I just wanted to let you guys know I did more tests on my machine yesterday. I found something very interesting on the GTX 275 @ stock (no OC)...

OK! So the Nvidia Control Panel has the tweaks you can adjust: fan speed, Core/Shader/Mem OC, AA tweaks, etc.

Changes: I switched my settings from multi-display to single display; from no AA to 2xAA and 2xAF (small, I know, but I can use 8xAF); triple buffering - on; occlusion - on; vsync - forced on; antialiasing setting - enhance the application; texture filtering - high quality; anisotropic filtering - on; conformant texture clamp - clamp.

Basically everything is turned on, except multi-display is now set to single display. I also overclocked my E6600 CPU from 3.0GHz to 3.4GHz (1.28 vcore, the lowest on my mobo) and the frames jumped from 29fps to 40-45fps. I usually run a constant 38-40fps. Certain areas in the game drop frames to 27-29fps (almost unnoticeable), with smooth frames at 1920x1080 @ 60Hz. I also have DDR2 1066, which helps a little. Everything is on very high, but I turned the view distance down to 40. If it's at 100% the frames drop 3fps... which I don't want.

I am so happy with the 1792MB GTX 275 so far that I am even considering paying the premium for a Core i7 later this month. But that is still yet to be decided... I have had my E6600 for some years and it has not disappointed me.
 
Guest
Thanks for the great review. My mind is clear now about memory amounts on these cards; I see that only memory speed can make the difference in performance. I think if they can work out what it would cost to use GDDR5 memory instead of GDDR3, they could challenge the GTX 295 in performance at the highest quality settings.
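
For rough context (numbers from memory of the published specs, so treat them as approximate): memory bandwidth is effective memory clock x bus width / 8. The GTX 285's GDDR3 at about 2.5 GT/s on a 512-bit bus works out to roughly 159 GB/s, while a Radeon 4870's GDDR5 at 3.6 GT/s only needs a 256-bit bus to reach about 115 GB/s. GDDR5 on a 512-bit bus would push that figure far higher, which is exactly the appeal.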
 

lsorense

Distinguished
Jan 9, 2007
What is wrong with Gigabyte? I don't want to ever see a VGA connector again. And HDMI isn't much use either unless it is type B (which can do dual link and hence handle a 30" screen), which it never is. Essentially they have made a card that can only run one dual-link display, rather than the two dual-link displays that have become standard on practically every mid-to-high-end card. The loss of S-Video is no big deal, but the loss of the second dual-link port is a big deal. Awful design choice. And this is supposed to be their top-of-the-line card?

I would much rather have to use a DVI-to-HDMI adapter (or just a cable), or DVI-to-VGA (if for some reason I ever need it again), than lose the option of using two dual-link DVI screens.
 

Crashman

Polypheme
Former Staff
[citation][nom]lsorense[/nom]What is wrong with Gigabyte? I don't want to ever see a VGA connector again. And HDMI isn't much use either unless it is a type B (which can do dual link and hence handle a 30" screen) which it never is. Essentially they have made a card that can only run one dual link display rather than the two dual link that have become standard on practically every mid to high end card. The loss of Svideo is no big deal, but the loss of the second dual link port is a big deal. Awful design choice. And this is supposed to be their top of the line card?I would much rather have to use a DVI to HDMI adapter (or just cable) or a DVI to VGA (if for some reason I need it ever again), rather than loose the option of using two dual link DVI screen.[/citation]

Yes it's an expensive card, but how many of its buyers have even one 30" display...maybe 10%? And how many will have TWO? I think Gigabyte is fairly safe on this one, so long as nobody who will never own two 30" displays listens to you.

As for anyone who does have or will have two 30" displays, I don't believe they'd overlook the lack of a second dual-link interface.

I think as long as buyers "keep it real" Gigabyte will notice the slight increase in sales to HDMI users more than they'll notice the tiny loss in sales to dual 30" display users.
 
Guest
[citation][nom]chovav[/nom]hey, how about GTAIV? they say that the maximal viewing distance of a 1GB card is 32%, and MSI claims for its 2GB card that it's possible to get 100% viewing distance. Is there any possibility to test this? this could actually be the only game that will use this amount of memory..[/citation]

My thoughts exactly! To me it would be such an obvious game to test for an article like this! The one game that claimed it required "future" tech and LOTS of video RAM to run. Now we have a card with 2GB of video RAM and can test Rockstar's claims as either being true or (as I still suspect) it just being a TERRIBLE port with zero optimization for PC. TEST IT! Please...
 

Crazy-PC

Distinguished
Mar 29, 2008
The difference between the GTX 285 1GB and 2GB is not significant, but how about when the graphics card is connected to dual monitors in clone mode or extended mode at 2560x1600?
 

Marcus52

Distinguished
Jun 11, 2008
[citation][nom]twisted politiks[/nom]ive had EVGA's GTX 285 for about two months now, nothing new in the memory department[/citation]

And it's EVGA, which I'd buy over Gigabyte in a heartbeat, since their (Gigabyte's) tech support told me not to overclock on their mainboard because it corrupted the RAID ROM whenever the BIOS got reset. Why in the world would I pay twice as much for a mainboard if I didn't want to OC? It wasn't even a stretch of an OC, either.

My other mainboards don't have that problem. Sure, I have to reset the BIOS for RAID, but the ROM stays intact, so I don't have to re-install the OS and everything else.
 

Crashman

Polypheme
Former Staff
[citation][nom]Marcus52[/nom]And it's EVGA, which I'd buy over Gigabyte in a heartbeat, since their (Gigabyte's) tech support told me not to overclock on their mainboard because it corrupted the RAID ROM if it re-set the BIOS. Why in the world would I pay twice as much for a mainboard if I didn't want to OC? Wasn't even a stretch for an OC either.My other mainboards don't have that problem. Sure, I have to reset the BIOS for RAID, but the ROM stays intact, so I don't have to re-install the OS and everything else.[/citation]

You have the same chance of RAID corruption on any Intel chipset board, when overclocked, regardless of brand. If you overclock anything, you risk corrupting data.

EVGA in particular is ESPECIALLY as bad as everyone else :)
 

+1, it's a chipset issue, not a board-maker issue. Luckily for most it's not a problem.

To be honest, I remember when boards didn't have PCI/AGP locks and all the problems that caused with overclocking.
 

MamiyaOtaru

Distinguished
Jun 19, 2008
[citation][nom]tpoke[/nom] Also (GTAIV) requires one card to be more then 1gb because it dose not recognise dual cards solutions "second" bank of memory. [/citation]

Because nothing does. Did you read the article? Two cards in SLI with 1 GB each count as 1 GB of video memory, since they store the same data.
 
Guest
Wouldn't it be nice to test something like EVE Online at these high resolutions and with multiple clients? I recently had to ditch my 8800 GTX for a GTX 285, as 2 x 2560x1600 displays and 4 clients (2 small windowed) were just overwhelming the old card.

Since each client is like a separate game with its own textures loaded onto the card and such, extra memory should be king.

 

cmdrdata

Distinguished
Jun 18, 2009
I’d like to see 2x2GB 285’s tested also, but perhaps GTAIV uses memory inefficiently? Who’s got the money to buy 2 or even 3 of these cards anyway?
 

steve_dallas

Distinguished
Aug 24, 2009
I would be interested to see what the results would be with two 24" displays. I actually use a 30" and a 27" Dell for my gaming. I run games in windowed mode, and three to four instances of the game running at a time. (Ex Multi-Box MMORPG player, now running them on one box.)

Since my game of choice (City of Heroes) does not like SLI or AMD cards, the nVidia 285 platform would be the logical choice at this time. When there are hundreds if not thousands of players all fighting in the same area at a time, with plenty of effects going off per player, a lot of graphics cards suffer at normal resolutions. My current nVidia 260 (Original, not 216) does a good job most of the time with 2 instances running, but the third can often slow things down a lot. I was wondering if having the extra memory would help. It is too bad that I can't run 2-3 cards and assign an instance or two to each one.

Please note that multiple instances of the game are not officially supported, never mind that I am running them on a box with Win 7 RC 64 bit with 8 GB of RAM, which is not officially supported either. My current rig does often crash with the current game client, as often as every 45 minutes or so. I was hoping that the increased memory might help reduce crashing.

I know the new 300 series chips are coming from nVidia sometime in the next 6-8 months or so, but I wonder if they will take a page from AMD's book and go multi-chip on the high end rather than going to a huge monolithic core again.

I am tempted to get a 2 GB 285 now, just in case they do go multi-chip with the 300 series. What do you folks think?
 