GeForce GTS 250: Nvidia's G92 Strikes Again


curnel_D

Distinguished
Jun 5, 2007
741
0
18,990
[citation][nom]Pei-chen[/nom]Are you two idiots? GTS 250 is the same as 9800+ GTX. If Nvidia can sell retail 9800+ at 738MHz why would they need to cherry pick GTS 250 at the same clock? If every GTS 250 is running at 850MHz vs 738 on 9800+ you can say Nvidia is binning better chip for 250 but they are running at the same speed.[/citation]

I don't start the rumors, douchebag. But that doesn't even answer the question about legitimate and trusted hardware sites hinting at the rumor (and, in HardOCP's case, outright saying it). Oh, and perhaps you missed the part of the article, and every single article on the web, reviewing only overclocked 250s?

Or is it too apparent you ran your mouth without thinking?
 

Guest

Guest
I'll stick to the 4830/4850 and 4870 for my customer PC builds. I prefer ATI cards and Intel processors; they make great mid-range systems in this combo.

I really don't get Nvidia's naming policy, though. It seems a bit of a dupe if you ask me!
 

Guest

Guest
My 8800 GT will not SLI with the 9800 GT, an identical card. It will not SLI with the GTS 250, a very similar card, because Nvidia changes not only the name but also the BIOS and blocks SLI. I will never buy another Nvidia card or SLI motherboard. Buying one Nvidia card and hoping to buy another later for SLI is not a viable strategy. Go CrossFire.
 

romulus47plus1

Distinguished
Feb 25, 2008
872
0
18,990
[citation][nom]morpheas768[/nom]Can anyone please answer me this stream processor thing?[/citation]

Different technology. Nvidia's stream processors have more raw power.
 

trinix

Distinguished
Oct 11, 2007
197
0
18,680
[citation][nom]morpheas768[/nom]Can anyone please answer me this stream processor thing?[/citation]
They work differently. Nvidia uses general-purpose SPs, while ATI uses specialized SPs. That way, when a lot of different kinds of tasks need to be performed, the ATI card will go a bit faster, but when only a single kind of task needs to be performed, work has to wait for the SPs that can handle it to free up before it can be pushed in.

I'm not too knowledgeable about it, but with a bit of searching, you can find the exact answer you are looking for.
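
As a rough back-of-the-envelope illustration (these are the commonly quoted theoretical peak figures from the spec sheets, not anything measured here), the raw SP counts work out something like this:

# Theoretical peak shader throughput in GFLOPS, per the published specs
gts_250 = 128 * 3 * 1.836   # 128 scalar SPs, MAD + MUL = 3 FLOPs per clock, 1836 MHz shader clock
hd_4850 = 800 * 2 * 0.625   # 800 VLIW lanes (160 five-wide units), MAD = 2 FLOPs per clock, 625 MHz core
print(round(gts_250), round(hd_4850))   # roughly 705 vs. 1000 GFLOPS on paper

On paper ATI is ahead, but those 800 "SPs" are really 160 five-wide units, so how much of that peak you actually see depends on how well each unit's five slots get filled. That's why the raw counts don't map directly onto game performance.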
 

morpheas768

Distinguished
Mar 3, 2009
270
0
18,960
[citation][nom]romulus47plus1[/nom]Different technology. Nvidia's stream processors have more raw power.[/citation]
I'm sorry for being persistent, but could you be a little more specific about that? Maybe each SP can do more?
 

morpheas768

Distinguished
Mar 3, 2009
270
0
18,960
[citation][nom]trinix[/nom]They work differently. Nvidia uses general-purpose SPs, while ATI uses specialized SPs. That way, when a lot of different kinds of tasks need to be performed, the ATI card will go a bit faster, but when only a single kind of task needs to be performed, work has to wait for the SPs that can handle it to free up before it can be pushed in. I'm not too knowledgeable about it, but with a bit of searching, you can find the exact answer you are looking for.[/citation]
I got it. THANKS!
 

bdf2000

Distinguished
Feb 18, 2009
2
0
18,510
I'll be keeping my 8800GTS G92 for another generation or two. The cards aren't getting much better as time goes on, and neither are gaming requirements. Crysis is still the top dog out there and I can run that on high just fine at my relatively low res (1440 x 900).
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]Curnel_D[/nom]Chris, it's a decent article, but why in the world would you use 512mb models in everything aside from the 250 and 260. If you would have shown the 1gb 4870, along with a 1gb 9800+, it would have showed a clearer picture of how the 250 is identical to the 9800+/9800/8800GT. Meh. And there are MASSIVE rumours saying that Nvidia is hand-picking the review models sent to reviewers, even confirmed by HardOCP. Addressing that in this article would have been great.[/citation]

Why? Because that's what we have in the lab. However, for the sake of argument, let's take a look at Newegg and see what stepping up to 1GB models of everything else does:

Radeon HD 4850 1GB lowest price: $161
Radeon HD 4870 1GB lowest price: $214

Now, the comparison against the 512 MB 4850 and 4870 cards worked well because the GeForce GTS 250--the card being reviewed here--fell right in between those two boards in both performance and pricing.

The 512 MB Radeon HD 4870 was faster than the GTS 250, so it's fair to assume that the 1 GB card will be even faster than that. In fact, its price tag puts it at the same level, essentially, as BFG's overclocked GeForce GTX 260 C216.

At $161, the 1GB 4850 costs as much as a 512MB 4870. Given the differences in memory technology, I'd go with the 4870 and overclock it to the best of my ability before buying a *slower* card whose extra memory won't stop it from getting hung up that much sooner at high resolutions.

Personally, I think comparing the GTS 250 *directly* to the 9800 GTX+ makes it crystal clear that they're the same card.

And as far as the rumors are concerned, why buy into the sensationalism? I didn't overclock the card in the event that there were memory ICs onboard that weren't representative of a board you'd buy at Best Buy. But to be honest, crying foul about one graphics card launch where boards are "hand-picked" ignores the fact that *all* vendors hand-pick the best of the best to put on show. It's the reviewer's responsibility to take that into account when setting up testing, and to neutralize any advantage it gives in the results.

Incidentally, that's what makes our System Builder Marathon series so special. Where else can you find a trio of systems with hardware all bought off the shelf at retail and compared? :)

 

konjiki7

Distinguished
Jan 12, 2009
99
0
18,630
This seems like a marketing ad for Nvidia, seeing how its direct competition isn't included... models such as the 4870 1GB or the 4850 1GB.

Nothing says you appreciate your customers like offering old tech under a new naming scheme.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]nerrawg[/nom]Conclusion in this article finally gets to the point, after having compared OCed cards against vanilla. Good article - yet "The Real Story" might be missing out on a few more valid points. 1. All of the AMD 4800 cards can be easily overclocked, especially the cheap 4830, which often OCs over 700 MHz on its GPU clock. This will affect the value evaluation, because the 9800+/250 is gonna have to OC pretty well to match it bang for buck, and seeing as the tested cards are already OCed, well, I really wonder if it has that headroom? 2. 4850s and particularly 4870s come in much hotter versions than the vanilla flavors - ex. Sapphire Toxic etc. The prices of these models will be important to consider. 3. The G92 architecture is, from what I have seen, sketchy performance-wise in SLI compared to the 4800 series in Crossfire. I am not sure of this, but I would be cautious of using a G92 card if you were planning on using a multicard setup, at least from the tests I have seen. It would be interesting to see direct tests between a GTS 250 SLI and 4830/4850 CF setup. I'd put my money on the CF solution and I'd love to be proved wrong for Nvidia's sake.[/citation]

1) All of the Nvidia cards can be easily overclocked, too. As you can see from the benchmark results, rarely was the overclock OR the extra 512 MB of memory able to help the GTS 250 outpace the old 9800 GTX+.

2) Differentiated 4850s/4870s are going to cost more than what I cited. In other words, if you want to spend more on fancy cooling, it's going to affect the performance/$.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]trinix[/nom]Not everyone is a tech person, they'll ask friends and family about performance and the tech person in the family might prefer the nvidia or the ati card and recommend that one over the better one. Also at shops the knowledge isn't always better. I've seen people behind the counter, who don't know the difference between ddr1 and ddr2 memory and will just tell you they don't have it. Rebranding is evil, but if that's the way Nvidia can keep making money and stay alive, I'd rather have that than the solo reign of Ati.[/citation]

Um, so long as the ill-informed and too-lazy-to-research consumer isn't paying more for the re-branded product, why does it matter *at all* if a GTS 250 gets recommended instead of a 9800 GTX+?

If that recommendation is being made on the numbers, then at some point, someone would have had to look at the benchmarks in order to formulate an opinion on whether AMD or Nvidia was their vendor of choice. If it's instead based on brand loyalty, then there are other, less scientific factors in play.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]nerrawg[/nom]http://www.tomshardware.com/news/N [...] ,7150.html As stated above, found it at the bottom of the article. 512-bit is listed as the only difference between the 9800+ and 250 - but it's not in this article - I trust Chris on this one. The direct competition absolutely is included.[/citation]

Nerr, the correct vital stats are in the table on the second page of this piece. I'll let the news team know about the specs on their respective pieces--thanks for the heads-up!
 

bustapr

Distinguished
Jan 23, 2009
1,613
0
19,780
[citation][nom]cangelini[/nom]Erm, remember when they launched? ;-P[/citation]
Douchebag! Were you around a few days ago when, right on the home page of Tom's, there was a TigerDirect ad for a GTX 280 at $349?
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]sublifer[/nom]I do feel sorry for the average joe consumer... someone who has an 8800GTS and thought they'd be getting a huge performance boost by skipping a generation and got themselves a GT250.[/citation]

A fool and his money are soon parted. Feel sorry for that guy because he'll waste a lot more money in his lifetime than just $150 on a graphics card ;-)
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]alanna[/nom]The review has missed that the G250 is only 9” long and GTX 9800+ is 10.5” long.[/citation]

Both cards are exactly the same length. Finkle is Einhorn.
 

tuannguyen

Distinguished
Jul 22, 2008
488
0
18,780
[citation][nom]Pei-chen[/nom]Chris, you should really consider sending Kevin, Tuan and Jane to training. Kevin and Tuan can't keep facts straight and Jane is simply blogging. Using the GTS 250 as an example, Kevin and Tuan reported that the 250 is using a 512-bit memory bus. I almost went over to Anand to check the spec before clicking on this article.[/citation]

Hey Pei-chen -
My personal apologies for the error in the bit width for the GTS 250. But if you'll allow me for a second: truth be told, the reviews go through far more rigorous editing than the news, simply because the news department is so time-sensitive. Unfortunately, this means we'll sometimes have typos or, worse, factual errors. We do our best to make sure everything is accurate, so I apologize not only to you but to everyone reading this for typos and other errors. It's a consequence of writing news very quickly sometimes. My apologies.

But I realize everyone's pretty enthusiastic about what they read here and demands better and better quality. No problem. Keep sending in those criticisms.

I heard once that when people stop criticizing you, that's when you should be worried--because they've stopped giving a damn. Right? :)

My personal apologies again as I am responsible for the news operation here.

/ Tuan
 

tuannguyen

Distinguished
Jul 22, 2008
488
0
18,780
[citation][nom]Curnel_D[/nom]I actually prefer jane over kevin and tuan both. She might be blogging, but most of the time it's interesting, and never misleading or downright untrue. Both Kevin and Tuan are total morons IMO. Are they college kids doing a practicum or something? Because there's no way they actually have any journalism credibility. They're even the laughing-stock of other forums on a consistant basis.[/citation]

:\

That was pretty brutal of you to say.

No problem. If you'll allow me, I'll still say sorry for the error. So, my personal apologies, as I am responsible for the news department here. Some days are better than others. Really, though, we try to cover as many topics as we can in a day, at the speed they land on our desk. Unfortunately, that puts a great deal of pressure on us to get the news out very quickly, and consequently the articles don't go through as rigorous an editing chain, but we still do our best to remain as accurate as reasonably possible.

But criticism accepted, even what you said.

I take responsibility for errors you may see on news articles.

/ Tuan
 

Guest

Guest
Test in SLI! It's true that most shoppers in the sub-$160 range don't want to fork out for an X58, but that's no reason to avoid reviewing it, because AMD has an interest in things like the 4850 X2. Three reasons: Intel P55 SLI, a possible NV GTX 255 built from dual GTS 250s, and running dual 512 MB GTS 250s for a faster setup totaling only $260.
 

scryer_360

Distinguished
Jan 13, 2007
564
0
18,980
All I can say is this: the G92 has had a good run. I have an 8800 GT with 512 MB of GDDR3, and with some slight tweaking and OCing, it'll be just as fast as the GTS 250 (making up for the difference in extra memory).

However, nVidia is in somewhat of a quandary. For the longest time their "mainstream" chips were all 65nm; only recently have they gotten anything to 55nm. But the ATI (by AMD) cards have been on a 55nm process for a year now, and selling at much better prices (and presumably, much higher volumes). What does all that mean? ATI has been basking in higher-profit-margin chips that whole time. A 55nm chip is smaller and easier to manufacture than a 65nm one. It takes time (and therefore money) to engineer the chip, but in production the savings pile on.
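
Just as a rough sketch of the economics (ideal optical-shrink scaling only, which real chips never quite hit):

# Die area scales roughly with the square of the feature size under an ideal shrink
area_ratio = (55 / 65) ** 2     # ~0.72, i.e. roughly 28% less silicon for the same design
dice_ratio = 1 / area_ratio     # ~1.4x as many candidate dice per wafer, before edge and yield effects
print(round(area_ratio, 2), round(dice_ratio, 2))

More dice per wafer at a similar wafer cost is exactly where the per-chip savings come from.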

And now AMD is on the verge of a 40nm chip. This means that while nVidia is still getting its 55nm lineup out the door and, in this case, still trying to find a replacement for the G92, ATI will already have much cheaper cards rolling off its lines. The early reviews of the new 40nm ATI cards are coming in, and the results are said to be splendid. They do have to work with GDDR5 memory, which adds cost, but presumably not as much as larger memory interfaces.

nVidia had better have kept its R&D budget large and lush, because before you know it, it will be Christmas 2009 and ATI will be eating nVidia's lunch. Hopefully they have a new architecture in the works.
 