GTX 280 Reviews/Benchmarks Compilation!

http://enthusiast.hardocp.com/article.html?art=MTUxOCwxMiwsaGVudGh1c2lhc3Q=

They seem to be very happy about the 280. The 280 offers more for those who use high resolutions and high AA. Look at those Age of Conan and Assassin's Creed results!

And look at these Crysis results. http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_280_Amp_Edition/10.html
The 280 beats the crap out of everything else when using high resolution and high AA.
http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_280_Amp_Edition/24.html
And the noise... There must be some BIOS-related bugs in the 280 at the moment that cause the differing noise results...
And the performance / dollar: http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_280_Amp_Edition/25.html
 

As I pointed out above, [H] has some weird stuff going on with the 9800GX2 and Crysis.

As far as TechPowerUp, they are testing DX9 high, which is fine for me (Win XP), but is really poor testing for these high-end cards. They should be testing Crysis DX10 very high IMO, as that's what many were hoping this card could handle. Those DX9 high results are very different from DX10 very high, where the 9800GX2 in most reviews easily beats the GTX 280 at any playable resolution.

Crysis:
http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-18.html
http://www.anandtech.com/video/showdoc.aspx?i=3334&p=11
http://www.legitreviews.com/article/726/12/
http://www.hexus.net/content/item.php?item=13736&page=12

 



You are going to get TGGA started again about how much he dislikes [H] VGA reviews :non:
 


Yep, the 9800GX2 is faster, but only when using low AA or none at all. If you look at the sites (even Tom's Hardware) that use AA, the 280 is better.
So if you have a very big monitor, or you want to use high AA settings, the 280 is better. If you don't use AA or you use lower resolutions, the 9800GX2 is better. We are both right in this matter.
 


Sorry, that was not my intent! I don't want to feed trolls...
I just wanted to point out that there are mixed results around, and to bring up situations where the 280 can demonstrate its strong point, which seems to be AA at high resolutions. It's far from an ideal gaming card, but it's not as bad as many claim.

 
Any of the current G92-based cards are gonna get smoked at 1920x1200 and above with ALL FILTERS TURNED ON, because their 256-bit bus gets saturated!!!

There's no point using or testing these high-end cards below 16x12, because that can't stress them enough. Which points out another thing: in those games the fps is equal between the GX2 and the 280 because the game isn't demanding enough to max out the 256-bit bus on the GX2. Wait for a graphics-card-killing game to come out and we will see what's what.
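(To put rough numbers on that bus-width argument, here's a quick back-of-the-envelope sketch, assuming the GDDR3 memory clocks from the spec sheets quoted later in the thread; treat the figures as approximate peaks, not real-world throughput:)

def bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    # Peak GDDR3 bandwidth in GB/s: bus width in bytes * doubled (DDR) data rate
    return (bus_width_bits / 8) * (mem_clock_mhz * 2e6) / 1e9

print(bandwidth_gbs(256, 1000))  # G92, per GPU on the 9800GX2: ~64.0 GB/s
print(bandwidth_gbs(512, 1107))  # GT200 / GTX 280: ~141.7 GB/s

With barely half the GTX 280's peak bandwidth per GPU, it's easy to see why the G92 cards would fall off once high resolutions and heavy AA start hammering memory.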
 
It's funny how the 10+ reviews I read all got different results with similar setups in the same game. Clearly their drivers are a bit different. How different? I think you can see for yourself. And how much improvement are drivers gonna make? Look at the results and predict. Take a look at the spec sheet as well.
 

By big monitor, you mean a 30", as at 19x12 more often than not the 9800GX2 is winning. FSAA I will admit to, but notice I said playable settings. What good is an FSAA victory if it's averaging 26 fps in a timedemo? It will not be playable in game anyway. It seems the GTX 280 can pull ahead at high res with high AA, which makes sense really. But unless it's playable, it's a pointless victory. I mostly glance past any results that average well under 30 fps.
 


Yep, but the problem is we already have a graphics-card-killing game out right now, and the GTX 280 is unplayable at 19x12 DX10 very high while trailing the 9800GX2 in most reviews.
http://www.tweaktown.com/articles/1465/7/page_7_benchmarks_crysis/index.html
http://www.driverheaven.net/reviews.php?reviewid=579&pageid=14
http://www.legitreviews.com/article/726/12/
 

Yeah, that does jump out. Keep in mind though that not all sites test the same timedemo or game area. [H]ardOCP and Driverheaven play the game and use FRAPS; results won't be the same since it's not the same manual run-through and tester. Some sites use the Crysis GPU bench, some use the CPU bench, some may use a custom script. We need to keep in mind that 30 fps in a timedemo does not equal 30+ fps while actually gaming, especially in the second half of Crysis.
 
Well, unless the 4870 can justify my buying a Crossfire board (I hope so; these damn Nvidia POS boards are getting on my nerves), it seems I might be going the 2x 9800 GTX route, since they OC well and don't have insane power and heat requirements like the GTX 260.
 
What I don't get is that the reviews are largely positive, and yet people are saying stuff like "This sux!". As far as I can see, the real negatives are noise, the shafting on DX10.1, and power consumption, but mainly price. People have short memories - the 8800GTX wasn't exactly cheap when it came out. nVidia is just being sensible - they know that loads of people will shell out the cash for bleeding edge, and then they can work their way down. It's exactly what would happen if they were all on eBay, and it maximises profits: the first ones will go for £400, then later it would settle around £300, before ending up around £200 after a year or so.

The comparisons with the 9800GX2 seem odd to me. Given that there are big scaling issues for 4-way SLI, the relevant comparison seems to me to be between 1 GX2 and 2 SLI'd GTX 260s. Granted there's a big price difference here, but again that's due to their stage in the release cycle - the 260 will be available for under £200 very soon. I don't know why anyone would want 2 SLI'd 9800GX2s really, unless they'd already got them, that is (which is panning out to be a sound investment). As physics starts to get used, as games start to use the raw computational power of the cards, and as drivers mature, I'm pretty sure we'll see the new series come into their own.

The exciting stuff from my point of view is the CUDA stuff, and the excellent idle power consumption. These features do appeal to me, and make the new series much more appealing than previous gen stuff. Gone are the days when we'll simply ask "Yeah, but will it run Crysis on very high at 1920x1200?" (Crysis is, I'm informed by people who know about such things, a poorly-coded dog, in any case). GPUs need to be assessed under more criteria than benchmarks on existing titles - i.e. taking into account their other features, and the expected performance on future titles. The _reviews_ are taking this into account, and I get the feeling most people out there are not really heeding this message.

The conclusion I'm taking from the reviews is that the sensible purchase is the GTX 260. I'm buying 2 of them and putting them in SLI. I may get a third in due course, depending on whether scalability improves, but for now I think 2 of these babies will handle anything I throw at them for a good old while.
 


People's expectations were too high. Performance is about what I expected and pretty good for a single GPU. Price and noise are disappointing. Overall, pricing is much better now than back when the 8800GTX came out; back then we didn't have near-top-of-the-line single-GPU cards like the 8800GTS G92 for $160 after rebate, or the 8800GT 512MB for $130 AR. Also, the 8800GTX crushed the 7950GX2, while the GTX 280 is at best trading blows with the 9800GX2. The GTX 280 can't claim to be the all-out best gaming card you can stick in a single PCI-e slot like the 8800GTX did, yet it's priced way higher than the 9800GX2. Anand shows SLI 8800GTs a good match for the GTX 280, and on Newegg a pair of 512MB 8800GTs is as low as $270 shipped after rebates, compared to $650+ for this new beast. The GTX 280 price isn't out of line for a top-of-the-line launch, but it is out of line for the times and the amazing price/performance we have been seeing lately.
 
We were all just spoiled by the G92/38xx series price war and the low prices/good performance that we got from those cards. I told my friend several months back that I expected the GTX280 to run Crysis at my full res (1680x1050) with 4x AA and high settings around 40+ FPS, which it does (a lucky guess tbh). It even stays around 30fps in Vista 64bit, very high DX10, which we all know takes a heavy performance hit. My current card would cry if I tried to make it do that.

I am waiting on the 4870 to see what it is like, but I want a fast single card solution (as I do not like dual card solutions at this point) and the 280 is the fastest at this point. If the 4870 is within 10% of the 280 and scales well in crossfire then I may consider crossfire (but like I said, am trying to avoid).

Best,

3Ball
 


If we're talking about actually playing through all of Crysis, I suspect 16x10 4xAA DX10 very high will at times make even the GTX 280 come to an annoying crawl.

Look over these links if you have not (already provided above).

12x10 no fsaa is barely above 30 fps -
http://www.tweaktown.com/articles/1465/7/page_7_benchmarks_crysis/index.html

Actual gameplay at 12x7 no fsaa DX10 very high here:
http://www.driverheaven.net/reviews.php?reviewid=579&pageid=14

16x12 very high 4xaa looks to be a long way off:
http://www.digit-life.com/articles3/i3dspeed/0508/itogi-video-cr3-wxp-aaa-1600-pcie.html
 


Indeed, at very high I suspect it will struggle in a few areas of the game (especially the later parts), but my current setup dips near 15 fps in some places in Crysis (1680x1050, all on high, in XP with no AA). I will be pushing the processor a bit further and getting a new hard drive and video card to stop that, but I am most likely still going to play in XP, though I run a dual boot. Actually a tri-boot if you count my Ubuntu Linux. So who to believe? These show it doing well... in one case closer to 50 fps! Some are running Vista, some are running XP. XP seems to be 40-50 fps while Vista is 30-40 in the reviews. One of the reviews has no AA in it at 1680x1050 and it hits 55! Alas, I digress...

http://www.anandtech.com/video/showdoc.aspx?i=3334&p=11
http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_280_Amp_Edition/10.html
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=179&Itemid=1&limit=1&limitstart=8
http://www.insidehw.com/Reviews/Graphics-cards/nVIDIA-GeForce-GTX-280/Page-4.html

Best,

3Ball


 
From all that testing, it seems the GTX 280 isn't better than the 9800GX2 by a lot... sometimes the 9800GX2 even has better performance... I don't know whether this is a stupid idea, but could it be a driver problem? What I mean is, maybe the current driver isn't made for the GTX 260 or GTX 280, and it needs another version of the driver to unleash the beast... could that be?
 


Well, the reviewers did use the newer beta drivers that were designed for the 260/280. Yes, driver improvements will help performance, but not to the point of making the card significantly better imo. The problem is that the 9800GX2 is just a fast card. They have improved the dual-card solution, which makes it hard to build on top of for the next gen of cards. Before, they were just building on top of a single-card solution (the 7950GX2 notwithstanding... its early design just wasn't as mature as the 9800GX2).

So if they were just competing against the 8800GTX, then the card would look much better than it currently does. The price would also be more justifiable (even though still high) if the 9800GX2 were not around, but they kinda shot themselves in the foot with the 9800GX2 because it did well for the type of solution that it is. I suspect that once they have a die shrink we will see a dual-card solution based on the 260, but I could be wrong... just what I think they would do.

Best,

3Ball
 
I think you people forget that the GT200 is an architecture optimization, improving on the G92 core, and the GTX 280 is simply raw power packed in there. Clock for clock, the GT200 is unbeatable by any card.

The 9800GX2 has more SPs than the GTX 280:
256 vs 240
and a higher shader clock than the GTX 280:
600/1500/1000 vs 602/1296/1107 (core/shader/memory, MHz)

When AA isn't turned up to 16x, memory doesn't come in as a limiting factor; it's just pure shading power. All that shader and clock speed "advantage" only lets the GX2 lead the GTX 280 by a frame or two, and if the GTX 280 were OC'd it could probably pass the 9800GX2 with an improved driver.
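(As a rough sanity check on the "pure shading power" point, here's a quick peak-GFLOPS estimate from the clocks above; it assumes the usual 3-FLOPs-per-SP-per-clock figure both chips were marketed with, so treat it as a theoretical ceiling, not a real-world number:)

def peak_gflops(sps, shader_clock_mhz, flops_per_clock=3):
    # Theoretical shader throughput: SPs * shader clock * FLOPs issued per clock
    return sps * shader_clock_mhz * flops_per_clock / 1000

print(peak_gflops(256, 1500))  # 9800GX2 (2 x 128 SP @ 1500 MHz): ~1152 GFLOPS
print(peak_gflops(240, 1296))  # GTX 280 (240 SP @ 1296 MHz): ~933 GFLOPS

On paper the GX2 has the higher shader peak, which fits the low-AA results; the GTX 280's edge only shows up once bandwidth, not shading, becomes the bottleneck.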
 


Yeah... I see...
And now I hear they are reducing the price of the 9800GTX from $229 to $199 to make it more competitive with ATI's 4850. It looks like a good idea, but are they shooting themselves in the foot? Because people may buy two 9800GTXs for SLI, and that's even cheaper than a GTX 280 while giving better performance...
 


TBH, I am unsure what that company is thinking at this point. I am considering going over to the red camp, which doesn't actually bother me since I have always preferred them. The P45 boards look really appealing, so it is possible that I may be making my first attempt at a dual-card solution (even though I am not sure that I really want to). At least if I use CrossFire instead of an X2 solution and I have scaling problems, I can use just one card instead of the two. Alas... I digress! lol

Best,

3Ball
 


Actually, it's a good idea. The 9800 is now a midrange card. If you want real power, you use the 280 in SLI or triple SLI... if you have the money; if you don't... take the midrange cards.
I think it's cheaper to produce 9800-series cards to compete with ATI's 4000 series than to try to sell 200-series cards that are extremely expensive to produce... They have a flagship 280 that very few can afford; the real buck they are gonna make is with their midrange cards like the 9800GTX and the 9800GTX+, which is a 55nm shrink. Most probably we will see some "stripped", even cheaper versions - a 9800 GT? A 9700? - also made on the 55nm process... so they are cheaper to produce.
 
Now the 9800GTX is going to be reduced to $199, but the new 9800GTX+ will sit at the old price point of the current 9800GTX. Nvidia is pushing the limit of their G92 chip, as they clock it at 738/1836, which is quite close to its limit. And a better-optimized driver has been released, so the gap between the 9800GTX(+) and the HD4850 will shrink at resolutions up to 1920. Above that, with filters turned on, the 256-bit bus width really suffers!