Nvidia GeForce GTX 260/280 Review

Page 5

area61

Distinguished
Jan 8, 2008
It took them this long just to match a dual-GPU card. The same goes for the dark side as well: isn't the HD 4870 about as fast as the HD 3870 X2? Put it this way: the HD 3870 X2 doesn't trail the GTX 260 by much, so if it's to be believed that the HD 4870 is as strong as the HD 3870 X2, the HD 4870 X2 will kill the GTX 260.
 
Guest
"It is going to get owned by the 4870x2"

And a dual 280 GX2 will destroy a 4870 X2. That argument is desperate ATI fanboyism. This GPU will have a longer shelf life in future games than the 9800 GX2 by a mile... or any other present GPU.
 

draxssab

Distinguished
May 12, 2008
[citation][nom]revparadigm[/nom]"It is going to get owned by the 4870x2"And a dual 280GX2 will destroy a 4870X2. That argument is desperation ATi fanboyism. This GPU will have a longer shelve life in future games than the 9800GX2 by a mile...or any other present GPU.[/citation]

It's really improbable that a 280 GX2 could even be made; get informed! It would melt its own plastic case! :p

Seriously, I think that YOU are the fanboy here; you can't face the truth that this round is probably owned by ATI.
 

jakemo136

Distinguished
Jan 10, 2008
C'mon, Tom's: at the beginning of the article you yourselves say that "GTX" is now the prefix of the card name, while in the charts you have it as the suffix... A little consistency, please?
 

harrycat88

Distinguished
Jun 18, 2008
ROFLMAO
I knew this was going to happen. Nvidia doesn't have any more companies to steal ideas from. All they have is their manipulation gimmicks, which have worked repeatedly for several years until now. I have purchased 3 different Nvidia-based cards in the past, only to be disappointed because my Voodoo 2, Matrox G400, and ATI X800 outperformed them by a considerable margin. All I buy now is ATI-based products, because with ATI you get almost double the frame rates from one generation to the next. With Nvidia, you get 5 to maybe 10 frames per second more per generation. I think it's a lot cheaper to just upgrade your CPU for those extra 5 to 10 frames.
Please note: the ATI HD 2900 series and 3800 series are the same GPU. ATI just made a little mistake with the HD 2900 series because they were pushed by unhappy fans.
 

redsmurf

Distinguished
Mar 12, 2003
Please do this same review, but add SLI pairings to your benchmark summaries. That would make this the definitive review for a complete grasp of the current graphics situation.

Because the big question now is: how will 2x 9800 GTX (or 8800 GTs) in SLI compare to a single GTX 260 or 280? And what kind of FPS do you get with 2x GTX 280 in SLI?
 
Guest
Very, very disappointing, and at that price... I actually got a second-hand 8800 GTS instead of a 9-series card because I expected to upgrade to this, but no way now.

Will get a cheap 8800 GTX instead... or stick with what I have.
 

crisUK

Distinguished
Nov 8, 2007
This card has a joke price in the UK: £418 is the cheapest I have found, which is about $823 US. They are on sale in the US at $649; at $1.97 to £1, even after adding VAT at 17.5%, that should still only be about £387.
I have seen these cards on sale at £457/$900 US (dabs.com). Damn, we get ripped off in this country :(
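For reference, the arithmetic behind that figure, using the quoted $1.97/£ rate (so treat the result as approximate; the exact number shifts with the exchange rate):

$$\frac{\$649}{1.97\ \$/\pounds} \times 1.175 \approx \pounds 329 \times 1.175 \approx \pounds 387$$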
 
Guest
The general disappointment is that this card does not beat the 9800 GX2. There are several things that need to be looked at.

First of all, it is important to understand the idea of beta drivers. Obviously, the drivers are not yet mature for the G200, and most likely do not tap the new architecture's full potential. The current drivers are improved versions of the 8000/9000-series drivers; they have not yet been written from scratch, as they usually are for a new architecture. For this reason, the chip is being driven much like an 8800/9800, which implies some limitations. There is no reason not to give nVidia a couple of weeks to mature their drivers. Remember the buggy Vista drivers? The same applies here. If the hardware is indeed capable, then nVidia's software team will exploit the power of the G200.

Second, most complaints pertain to the fact that the GTX 280 does not always beat the 9800 GX2, nor does it severely outperform the 3870 X2. However, one important fact is overlooked: a single chip is capable of surpassing dual-chip cards in some benchmarks, which means the GTX 280 has almost, if not more than, twice the processing power of the previous-generation chip; that's impressive. Also, nVidia can offer single-chip solutions, while ATI needs to put two chips on a card in order to stay competitive in the high-end segment.

So, before bashing nVidia for an utter failure, wait for mature drivers (which the previous-gen and ATI cards were privileged to have). You could look at it this way: a single-chip card with beta drivers is capable of outperforming the fastest dual-chip solutions. Does this say nothing about the G200's power?

What we can do at the moment is wait for mature drivers and more comprehensive benchmarks. Either way, this generation is a significant step up from the previous one.
 

crisUK

Distinguished
Nov 8, 2007
Additional:
Just seen an HD 4850 for £139, so CrossFire for £278, which gives roughly GTX 280 performance for a saving of at least £140. Thank God I bought an Intel chipset!
 

cisco

Distinguished
Sep 11, 2004
Another addition to the disturbing trend of "high end" mainstream cards. Looks like another weak high end card to me. I was expecting something that was a real jump in performance. Why would anyone upgrade from an 8800 to gain a couple of frames? WEAK. I hope ATI decides to make a fast card, looks like the high end market is up for grabs. I think this is a sign of the shrinking PC gaming market.
 

dark41

Distinguished
Mar 2, 2006
Seems to me the target market is people upgrading from older 6*** and 7*** series cards rather than the 8*** series. While the GTX seems better value than an 8800 or 9800 GX2 due to the energy savings, I'm still not convinced it's worth upgrading my 7900 GTX TDH Extreme. I'll wait for the next round, and possibly the one after, for the best bang for the buck.

It also appears that most of the posters on here don't pay their own electricity bills. I wonder, if their parents knew how much electricity their 8800 GTX/Ultras and 9800 GX2s use just sitting idle, whether they'd still be allowed to have computers. The GTX is a small step in the right direction, but I'd have to see quite a bit more improvement to justify a new top-of-the-line video card. At this point there's no good reason for manufacturers to stick with 65nm anyway.

But the only way they'll learn is if more people adopt my attitude, quit buying every upgrade just to have the latest benchmark bragging rights, and only buy when the technology provides a real impact. Methinks you people have no one to blame but yourselves for Nvidia and ATI lacking in technological advancement. ;)
 

madwheels

Distinguished
Jun 20, 2008
As far as this new info goes, it's not that great! The 3D numbers may not add much to the card's case, and I can't really feel it's going to be better. I think it's time to really wait it out, because some of the numbers may show improvement, but nothing that impressive. Nvidia is just playing with names: GTX 260/280, yeah, that'll work? lol, wake me when it's over.
 

oceanclub

Distinguished
Jun 20, 2008
I honestly don't see how these cards can be recommended, as they're a considerable disappointment. Normally, it's a no-brainer to upgrade every two generations - I've gone GeForce 2 --> Radeon 9800 --> GeForce 6800 --> GeForce 8800.

However, over the 8800, the GTX 280 offers only maybe a 25% improvement, while the GTX 260 offers even less! It doesn't make financial sense to upgrade to these cards from an 8800, at least. Seems to me that nVidia is in a hole - not that ATI is much better.

P.
 

sinephase

Distinguished
Jun 16, 2008
Synthetic benchmarks show it has a considerable gain over older cards, so I presume, as the article suggests, that the poor in-game performance gains are more than likely a driver issue.
 

thatmymp5

Distinguished
Mar 15, 2008
It is a slow card in some ways, because the core and shader clocks are lower than on the 8800 and 9800. What a funny way to launch a card: it actually ends up slower than the 8800 and 9800 series in some tests. lol... Anyway, the 280's power can't be ignored: slow but steady as she goes. What a unique luxury it is to be among the first to own a 280! Cool! Anyway, the price will drop to suit market demand in no time. haha :)
 

mikeinbc

Distinguished
Jun 21, 2008
The only mention of ATI's new cards from Tom's review (bottom line):
"and if AMD doesn’t manage to jump onto the stage".

AMD is going to jump onto the stage and CRUSH these cards with the 4870 & 4870x2.
 

Very_Frightened

Distinguished
Jun 21, 2008
Unlike dragoncyber, I won't be quite so polite with nVidia:

I bought an 8800 GTX back in late 2006, which I still have, as it is still a good card, but my initial disappointment came with the discovery that the 8800 GTS actually beats my card in some tests (WTF)!
What's with that policy of always selling two high-end cards that are almost identical?
As it stands, looking at the 9800 GTX and the 9800 GX2, I simply see no real leap forward in performance over my card; furthermore, AMD/ATI's offerings have impressed me more since the release of the 3870, 3870 X2 and now the 4850!
Needless to say, I am actually looking forward to the 4870 and 4870 X2!
At the end of the day I look at performance charts, and I at least used to always buy the winning card:
not anymore!
... because paying top dollar no longer means you get the fastest (!), it means you get the fastest on average, and you have to check the benchmarks for the individual games you actually play (WTF)!
nVidia, if I am to spend money on a performance crown, then I expect to get a crown ruling over at least its own subjects!
BTW nVidia, you sat on your lazy, fat arses for almost two years, doing absolutely NOTHING, until AMD/ATI's 3870 X2 actually took the performance crown from you, only to "JUST" beat it with your 9800 GX2 A FEW WEEKS LATER?!??
... are we really to believe THAT product only just finished in time for you to take back the crown within weeks? ... -> bullshit, guys!
Besides, you were after all just rehashing 'old' technology from your 8800-series cards and rebranding it with a new name!
As for the dip-shits at Tom's Hardware: it is increasingly obvious how you've finally realised which side your bread is buttered on ...
 

dobby

Distinguished
May 24, 2006
All this shows us is how much more powerful the 9800 GX2 is in comparison to the 3870 X2.

Don't worry, they'll come down in price when the 4850 comes out, and hopefully these cards will start to make sense when games begin to utilise physics.
 
Guest
To dragoncyber's comment:

"Now the biggest embarassment for Nvidia is that they are pushing CUDA technology and folding at home client CRAP!! These cards are designed for gaming!! Who gives a crap about how fast they can cure cancer or render 3D medical images and scans of the human body. Maybe 10% of the cards produced will be used for this purpose..the rest will be for GAMING!!"

The only thing embarrassing is your utter ignorance of GPGPU computing (http://en.wikipedia.org/wiki/GPGPU http://www.gpgpu.org/). And BTW, Stanford University's folding@home has been using ATI GPU-based clients for some time (http://folding.stanford.edu/English/FAQ-ATI2). It's just that nVidia's GT200 GPU is now being used as well, and with superb results (http://www.tomshardware.com/news/Nvidia-Stanford-Folding-Home,5654.html), beating all competing products at this type of simulation. Protein-folding simulation is something lots of pharmaceutical companies must do, since it's critical to drug synthesis. This segment of HPC has huge potential, and it's just a matter of time before GPUs become ubiquitous in this area. This is also the reason Intel is so desperate to launch their Larrabee GPGPU sometime in 2009/2010. So get real, and feel happy that your GPU can not only play games but also do serious advanced scientific simulations of immense commercial value.
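For anyone wondering what "GPGPU" actually looks like in practice, here is a minimal, illustrative CUDA sketch (not from the original post, and vastly simpler than a folding@home work unit): it just adds two large vectors on the GPU, one thread per element, which is the same data-parallel pattern these scientific clients exploit.

[code]
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// One GPU thread computes one element of c = a + b.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers, plus copies from host to GPU memory
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and spot-check it
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
[/code]

A real folding client replaces the trivial addition with force and energy kernels over thousands of atoms, but the host/device split and the massively parallel launch are the same idea.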
 