Nvidia GeForce GTX 260/280 Review

Status
Not open for further replies.

hannibal

Distinguished
Apr 1, 2004


Tom's Hardware used more AA, and AnandTech didn't use AA settings as often. The huge bandwidth advantage really helps the 280 when using higher resolutions and big AA settings. Whether it's worth it is a matter of personal taste.
 

nirgal

Distinguished
Oct 2, 2007
Why were the initial comparisons made against ATI 4xxx but the benchmarks against 3xxx? And why only one ATI card and multiple Nvidia cards (which furthermore were practically the same)?

What is the purpose of "Plateforme.jpg Test configuration:" on page 14?
 
Guest
What everybody is failing to see here is that while it's not a huge performance boost vs. the 9800 GX2, the 9800 GX2 is in fact technically two cards. Now imagine for a moment a GTX 280 in SLI and then do the comparison. That would be the fairest way.
 

gothminist3r

Distinguished
Jun 17, 2008
[citation][nom]doomsdaydave11[/nom]Keep in mind kids that this is one chip, not two. Remember that this card will trash any single chip currently on the market I am slightly disappointed, but not as much as you would think from the benchmarks. Any manufacturer could put 2+ chips on a board... at least Nvidia raised the bar for the fastest single retail chip.[/citation]
I understand where you're coming from, but it seems you're oblivious to the process being used by NV. Imagine what NV would charge for 1000+ sq. mm of die. Their process limits them to one chip per board, which ignores the pattern tech has been following for the last couple of years.
 
Guest
It is not really fair to compare the 280 to the GX2. Wait for a 200-series GX2 and then compare; measured against the 8800 and 9000 series, the 200 series is excellent performance.
 
lol, still can't play Crysis with max detail at 1920x1200. Too bad; the card doesn't seem to be what it was supposed to be. We'll see what ATI can do. Either way it should drive prices down. Wonder if nVidia plans to make dual-GPU cards from these....
 

spaztic7

Distinguished
Mar 14, 2007
[citation]It is not really fair to compare the 280 to the GX2, wait for the 200 series GX2 and then compare, the 200 series compared to the 8800 and 9000 series is excellent performance.[/citation]
From my understanding, Nvidia said they are not doing a GX2 version of the 280.
 

gothminist3r

Distinguished
Jun 17, 2008
[citation][nom]spaztic7[/nom]From my understanding, Nvidia said they are not doing a GX2 version of the 280.[/citation]
Yep. Look at the die size from a cost-per-transistor perspective and you will understand why.
The "compare apples to apples" argument, i.e. GTX 280 to 9800 GTX, doesn't hold true if the deciding factor is dollars and not FPS.
 
Guest
It has potential; it's all about the drivers. Right now the drivers are at an immature stage and cannot handle things as well. When new drivers come out it will outperform.
 

klawrence0

Distinguished
Jan 19, 2007
To the guy missing the point about CUDA:
1) you missed the point
2) dude, Folding@home has ramifications for everyone, not just those afflicted with cancer. We are talking about processes of degeneration that occur in EVERYONE over a long enough period of time. The thing is, most people die from causes other than cancer only because other types of damage kill us first. There are six or seven types of cell damage associated with aging, and the loss of control of cell programming (cancer) is just one of them. In order to increase lifespan, all of these will have to be solved. If you want to live a longer and healthier life, you had better be using all those extra CPU cycles and donating the extra power expense to causes like these! We are bred to accept age-related death, but it is not the only option. Please visit www.mfoundation.org and see what is going on!

I urge any young people thinking about what they want to study in college to consider going into biology to contribute to realizing this most important goal. And if you want money? Hey, the people who help are going to make out like bandits, I promise you.
 

thr3ddy

Distinguished
Feb 8, 2007
With specs like that you'll have to wait for better drivers before saying it's a "POS." I have no doubt that the performance of this card will increase tremendously as better drivers become available.
 

spaztic7

Distinguished
Mar 14, 2007


They should have had the drivers ready for launch. Shipping with bad drivers when you pay that much money is bad business.



Um... CUDA is not just for F@H, you do realize that, right? CUDA is a development kit that lets you take advantage of the processing power of the GPU. You can use it for F@H, but F@H is a small, insignificant part of what CUDA can do. PhysX is another part of CUDA that, again, is just an addition many people couldn't care less about. Havok is doing a more than good enough job. But PhysX can be better, and having a dedicated slab of hardware to take care of those calculations is always better than having to waste clock cycles on physics rendering.

So... maybe you missed the point about CUDA. Oh, where are your TPS reports?
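For anyone wondering what "taking advantage of the processing power of the GPU" actually looks like, here is a minimal, hypothetical sketch of a CUDA program (not from the review; it uses the later unified-memory API, `cudaMallocManaged`, for brevity rather than the `cudaMalloc`/`cudaMemcpy` style of the 2008-era toolkit). The kernel adds two vectors, with one GPU thread per element:

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

// Each thread handles one element; the GPU launches thousands of these in parallel.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)                                      // guard the tail block
        c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;                 // 1M elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);          // memory visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    // 256 threads per block, enough blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();               // wait for the kernel to finish

    printf("c[0] = %f\n", c[0]);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

This is the general-purpose compute model F@H and PhysX both sit on top of: the same shader processors the review benchmarks for games run these kernels instead.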
 
Guest
Um, gee, and the ATI X2 is sometimes a little faster. What about, I dunno, a GTX 280 X2?
 

grieve

Distinguished
Apr 19, 2004
It seems in every thread about the 280 I have read, everyone thinks it sucks...

After reading this entire review (plus others) I think the card is pretty good; it holds its own against dual-processor cards! If a 280 X2 is ever released it will kick as*!

There is no question I would choose a 9800GX2 at this moment though, as $600 for the 280 is absurd.
 

marraco

Distinguished
Jan 6, 2007
Since the main reason to sell different models is that the higher-end models' failed processors are disabled in the lesser models, and given the huge number of shader processors, many different models are to be expected. More than in the 8800 generation.

So having such a small number of different models (only the 260 and 280), with such a huge change in processor count between them, means that many good working processors have been artificially disabled. The 260 should have a huge modding/BIOS-editing and overclocking treasure untapped.
 

arrpeegeer

Distinguished
Apr 22, 2008
Argh, typed a whole long comment and I guess the arrow character I used is being interpreted as the end of an HTML comment somehow. Not retyping. FAIL! :)
 

draxssab

Distinguished
May 12, 2008
All this time waiting, all this talking and talking... for this. At the price of a GTX 280 you can probably get two 4870s, which will probably perform better. Now show us what you've got, ATI!
 

mr roboto

Distinguished
Oct 22, 2007
[citation][nom]spaztic7[/nom]From my understanding, Nvidia said they are not doing a GX2 version of the 280.[/citation]

From my understanding it's NOT POSSIBLE to do a GX2 version. LOL
 