Nvidia GeForce GTX 275 Preview: A Well-Timed Retaliatory Strike?

Status
Not open for further replies.
[citation][nom]bounty[/nom]. If Stalker: Clear Sky supports 10.1 I believe you should have at least posted the 10.1 results, then maybe added the DX10 results if you had the time. You took time to demonstrate the Physx advantage, why not the 10.1?-Bounty (8800GT and 4830 owner)[/citation]

Bounty, check out the 4890 story. =)
 
[citation][nom]cangelini[/nom]Bounty, check out the 4890 story. =)[/citation]


Even better, you already have 10.1 results. ;>
 
[citation][nom]leo2kp[/nom]Maybe I'm just lost or confused and maybe it's been answered before, but why are you running your RAM at 1066 and not 1600?[/citation]

Leo, fixed. That should be 1600 MHz. Thanks!
 
So, I'm still waiting for a card that is reasonably priced so I can finally upgrade my trusty 8800GT.
The 260 was too expensive until recently, and I felt it wasn't that big an upgrade, not in every test at least, though it might be useful at 1920*1200, the native resolution of my monitor. So when the price was dropping on the 260 I was almost ready to buy one, and then this one came out. Now the question is whether it's worth waiting a while longer for the 275 to come down a bit in price.
There are a couple of games I could play in 1920*1200 that I can't right now. Maybe the 260 is enough anyway; performance/price must be much better for the 260 than for the 275, right?
 
You've got to love benchmarks and how different they can be. This site, http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/16365-sapphire-radeon-hd-4890-1gb-review.html , has the 4890 mostly winning against the 275 and pretty much keeping pace with the 285. They also managed to overclock it to:
Final Core Overclock: 1 GHz
Final Memory Overclock: 1200 MHz (4800 MHz effective)
I wish all the major sites would get together and design an I.S.O. (International Organization for Standardization) benchmark. Xposted to 4890
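As an aside on those overclock numbers: the "effective" memory clock is just the base clock multiplied by the number of data transfers per cycle, which is why 1200 MHz GDDR5 gets quoted as 4800 MHz. A minimal sketch of that arithmetic (the multipliers of 4 for GDDR5 and 2 for GDDR3 are the standard signaling rates; the specific card pairings are my own illustration):

```python
def effective_rate(base_clock_mhz, transfers_per_clock):
    """Marketing-style 'effective' memory clock: the base clock
    times the number of data transfers per clock cycle."""
    return base_clock_mhz * transfers_per_clock

# GDDR5 (as on the HD 4890): quad data rate, 4 transfers per clock
print(effective_rate(1200, 4))  # 4800 "MHz effective"

# GDDR3 (as on the GTX 275): double data rate, 2 transfers per clock
print(effective_rate(1225, 2))  # 2450 "MHz effective"
```

So the two vendors' "effective" numbers aren't directly comparable unless you also know the memory type and bus width.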
 
From a pricing point of view, things are looking up for consumers again. 🙂

But I still have to wonder if there will really be ample supply of GTX 275s. I mean, it seems like Nvidia could have released these cards much earlier; it looks just like a GTX 285 with a ROP disabled, and hell, they could have done the same with GTX 280s. Maybe it took them this long to stockpile enough "defective" GTX 285s? Then again, I can't see Nvidia intentionally making GTX 275s out of GTX 285s, and at the same time I would think Nvidia wants to increase the yield of GTX 285s...
 
[citation][nom]Pei-chen[/nom]Like someone already said, this card feels like another 8800GT. I could see myself buying this card as GTS 350 for $120 next year.[/citation]
Clearly you did not RTFA. This is nothing like the 8800GT. Read before you fan the fanboy flames.
 
I would buy the 4890 instead; it's a much better overclocker. Reviews talk about it reaching 1 GHz on the core clock.

I want to see an overclocked 275 vs. an overclocked 4890, with no hand-picked cards.
 
On Stalker: Clear Sky, you write,

"In this comparison, all cards are set to use DirectX 10.0 to keep performance comparison consistent."

I don't feel like you should stop a card from using DirectX 10.1 when it's able to. It's an extra the card has to offer, and it shouldn't be disabled just because NVidia can't measure up and offer it on their cards. That's the same thing as saying, "We'll disable one of the cores on the HD4870 X2 because the GTX 285 only has one core."

If ATi's hardware can use it and NVidia's can't, that doesn't mean you turn it off; it's a pro for the ATi cards over the NVidia cards. There are also many games designed around an architecture that clearly favors NVidia cards, but I don't see benchmark tests leaving those games out and choosing different ones. It really makes the benchmark invalid, because if I were to go out and buy an ATi card, I might get better or worse performance with DirectX 10.1, and since you don't allow a card to do what it was made to do just because the competition can't do it, you don't give accurate results. I'm not a fanboy (and if I were, I'd be a complete NVidia fanboy), but that's not a good way to test cards. If one card is better, then it's better; that doesn't mean you make it worse so it compares to the competition on a more even level. Next thing you know you'll be saying, "We overclocked the NVidia card and underclocked the ATi card so they'd match up."
 
AnandTech's review was completely at odds with this one and gave the title to the HD 4890. I have also seen it clocked at 1 GHz, and I think that will blow the overclocked GTX 275 out of the water. And the GTX 275 is $280 while the HD 4890 is $230, or $250 without rebates, so for the $50 difference, HD 4890 for the win.
 
If we're really going to pick between them it's really down to personal preference. Both cards perform roughly the same, both cards cost roughly the same; so you're a happy bunny if you're an AMD or nVidia fanboy. End of story.
 
Does anyone know if there's more info on the supposed Nvidia optimizations that make the 275 run lower-quality AA and AF in certain games to beat the 4890?
 
I use my GPU for Folding@Home far more often than I use it for gaming. This seems to be one area in which nVidia has a big advantage. I'd love to see F@H points per day included among future benchmarks.
 
[citation][nom]Pei-chen[/nom]ATI and Nvidia should fix their prices and price their cards better; $50 increments from $50 to $600. I don't like the idea a $300 card could beat the previous gen's $500 card because of competition. Ford and GM for the win.[/citation]

Ummm... are you crazy? Screw that idea! I would much rather be able to upgrade next time around for $300 than have to say, "Aw, the next model is also $500, guess I'll have to wait a while!" Who the hell wants cards to stay at $500? The cheaper the better, I say! Heck, I remember when I bought my first Riva TNT for $169 (or was it $179?) and I thought, "Damn! This is an expensive card!" Well hell, what I wouldn't do for a GTX 285 at $169 now!

For that matter, why do you think 8800 Ultras were $700 and higher? Because there was NO competition. If ATI had had an answer, that card would have come down in price a lot. Granted, that card stayed at the top of the heap for a long time because there was no competition, but again, that is also why the cost stayed so high. At least the length of the 8800 Ultra's reign of supremacy made the cost a little easier to swallow.

Think about this too: IF competition stays healthy and ATI doesn't go through any droughts (and neither does nVidia, for that matter), there WON'T be any $500 cards from the year before getting beaten by $300 cards the next year, because they will keep each other's prices low and in check!

And as it is, I seriously doubt there is as huge a market for reselling video cards as there is for cars. I have said it in comments before: people need to leave the car comparisons out.
 
[citation][nom]ferrari911[/nom]On Stalker: Clear Sky, you write, "In this comparison, all cards are set to use DirectX 10.0 to keep performance comparison consistent." I don't feel like you should disable a card from using DirectX 10.1 when it's able to... [/citation]


Dude, get a clue! They did test 10.1 in the ATI review! This is the nVidia review. Why the hell would they compare two different versions? They are trying to compare apples to apples, not apples to oranges. Geez, who rated that comment up? If someone is interested in 10.1 and wants it, they are not going to care about the nVidia offerings, period. But maybe people who are interested in the nVidia card would see how ATI does at the same settings the nVidia cards can run, and reconsider if there were a huge difference in ATI's favor. I wonder where the common sense is sometimes.
 
This article says that PhysX is proprietary, but has anyone tried an Ageia PhysX PCI card and tested it alongside one of these ATI Radeons?
 
thegh0st, calm down :) - he just doesn't understand experimental design. A surprising portion of the general populace doesn't understand experimental design - that's why there are whole college courses on it. It might sound like common sense to you (and me), but that doesn't mean it is to everyone.

The point of making an unbiased comparison is to compare things on as level a playing field as possible: remove all the variables except the specific thing being tested. In this case they weren't comparing DirectX versions, they were comparing the cards' relative performance. So everything else must be controlled as tightly as possible and normalized to prevent unknowns in the data. This is why they pick standardized platforms, standardized tests, and limit capabilities to the lowest common denominator.

It's good experimental design to leave DirectX (in this case, by establishing a baseline) out of a comparison that doesn't directly address DirectX.

That being said, I think this is a pretty good reinforcement about how powerful the GTX 285 really is, considering it's a single card. The Radeon 4870 x2 I would have expected, as that thing is a beast in and of itself. But the 285... as a single GPU... is nothing short of amazing in my opinion.

I'm going to keep my 9800 GX2 for at least another year. Might get the 295 eventually, but that depends on what the next generation is like...

Instead of offering a sum of framerates, would it be better to have "weighted" framerates? E.g. the highest-scoring card gets 100% for that game's fps, and the rest scale down from there. That way you can average out a scaled score that prevents bias from high-framerate games and correspondingly adds weight to low-framerate games. So instead of a "framerate sum" you'd have an "average percentage of performance, relative to the highest card".

If you didn't want to use one of the cards' fps ratings to establish the 100% mark, you could always use 120 fps or something arbitrary like that, and score against it.
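A rough sketch of how that weighting could work. The card names and fps figures below are made up purely for illustration; the only mechanism assumed is the one described above (scale each game's results against the best card, or an arbitrary cap, then average the percentages):

```python
def relative_scores(fps_by_card, cap=None):
    """Scale each card's fps to a percentage of a reference:
    the top card in that game, or an arbitrary cap like 120 fps.
    This stops one high-framerate game from dominating a summed total."""
    ref = cap if cap is not None else max(fps_by_card.values())
    return {card: 100.0 * fps / ref for card, fps in fps_by_card.items()}

def average_relative(per_game_results, cap=None):
    """Average each card's per-game percentages across all games."""
    totals = {}
    for game in per_game_results:
        for card, pct in relative_scores(game, cap).items():
            totals.setdefault(card, []).append(pct)
    return {card: sum(p) / len(p) for card, p in totals.items()}

# Hypothetical numbers for illustration only
games = [
    {"GTX 275": 120.0, "HD 4890": 110.0},  # high-fps title
    {"GTX 275": 28.0, "HD 4890": 32.0},    # demanding title
]
print(average_relative(games))
```

With these made-up numbers, a plain framerate sum would favor the card that wins the high-fps title (148 vs. 142), while the averaged percentages weight the demanding title equally and hand the win to the other card (93.75% vs. about 95.8%), which is exactly the bias the weighting is meant to remove.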
 
Excellent review! You did a good job of sorting through all the minutiae and summarizing the basic differences and similarities of the cards tested.

Nvidia and ATI have so many screwball model numbers and price points that it's confusing as hell for the consumer!
 
I have a GTX 285 ($320 new), but I see a review comparing the GTX 275 and GTX 285: http://www.techpowerup.com/reviews/Powercolor/HD_4890/6.html In some games it seems the GTX 275 wins against the GTX 285! And it is at least $70-90 cheaper; for a start, a GTX 275 is $250! What a great card, single or in SLI; best card for the money. Because I prefer a single card, not 2 or 3, I bought the GTX 285 four days ago. But I know now that it is overpriced; better for the money is the GTX 280!
 
Well, I was always a bit of an nVidia fanboy 😛 I stopped gaming a few years ago because of college. That was in 2006; until then I owned many cards from both nVidia and ATI. I was always able to OC nVidia cards better than ATI's... I ordered a new PC last week, but I canceled the order, simply because of those two cards (275 and 4890).

I must say it was a hard decision, but I picked the GTX 275. My friend owns 2 PCs (one ATI based (4870) and one nVidia based (260 core 216)), and he said I should pick the GTX 275. He said his 4870 just isn't as good an OCer as the 260... I believe him... I will post screens of my OCed 275 :) I hope it will be at least +10% xD
 
I recently bought the Palit GTX 275, upgrading from a 9800GTX+ OC (BFG). What a card! On the 185.63 beta drivers I am running two monitors (1650x1080 and 1900x1600) with FSX on max, along with quite a few add-ons including REX. This gives me 30 fps, dropping to 20 fps around very busy airports (I have quite a lot of AI traffic on). Overall well chuffed, and the GTX 275 only cost me £200 :)
 
By the way, I've got mine overclocked to:

- Core Clock: 725 MHz
- Memory Clock: 1225 MHz
- Shader Clock: 1525 MHz

This passes 3DMark06 with my E8400 overclocked to 3.8 GHz, with a result of 18100 points.
 
"8800GT performance similar to the 8800GTX's, for a lower price."
Sorry, but just no. I had an 8800GT for about a day running Crysis on a 22" widescreen. After that day I took it back and got an 8800GTX, which simply blew it out of the water. The difference in performance made me swear never to buy a mid-range card again, even if it's at the higher end of mid-range.
 