NVIDIA GTX 350

One thing though: people who've bought the G280s, I'm wondering if many will go for this new card? After all they've gone through (the overpricing, the EOL of their cards) will they be willing to jump? Too much toying from nVidia. I don't see this as a 7800GTX 256 to 7800GTX (which no one could find) to 7900GTX situation. People don't like getting burned. Those cards held their price; the G200s haven't.
 



I really wouldn't hold my breath on a gtx200x2 for the same reasons I stated before: huge power requirement, high heat production, forced AFR rendering - they would have to change the architecture considerably to get past all of this. Even if Nvidia can release a card to best the 4870x2, I think it is going to have a rather long list of issues.
 
The crazy ones that bought those cards at launch will probably be glad to see this performance so fast; I don't think they will be bitter when they see it.

 

But you see, this is my problem with this "very close" thing. The 4870 is on beta drivers. The 4870x2 is on very beta drivers, and we haven't seen full implementation of the 4870x2's inherent abilities. So how can anyone predict such a thing? Someone who knows one side well doesn't know the other at all.
 
The race for the top spot will be interesting. The 55nm switch for the G200 should bring them a nice boost, giving them an advantage over the ATI cards, but how they're going to take the crown will be very interesting.
 
That is true, you have to guess a bit.

The 4870x2 is some 25% behind 280 SLI @ 1080p in one of the titles it doesn't shine in. They are expected to make strides in that, but they might not either.

All I know is one single-slot card coming from the green corner 2 or 3 weeks behind ATI's dual card will perform at or above tri-SLI level in Crysis and a few others @ 2560x1600, which is incredible.

Put it this way: if one GTX 280 gets 60fps in COD4 @ 1080p, the new single-slot card will get over double that. ATI at the moment are only offering on average 50 to 80% over the 4870.

 



Source?
 
Of course, it's a new "solution" and each game has a card it prefers; nothing architecturally new, it's just a dual-PCB card. We have already seen from a few titles that the 4870x2 is a lot more than just Crossfire on a stick, and NV's is the same.

Prices are going to drop at a faster rate than over the last 3 weeks once the answers come, which will be soon.
 
I'm wondering if this will go around one more time. The 4870x2 could come in dirt cheap too, preventing buyers from jumping on this other card. If ATI played to their strength here, and I mean the cheaper process, then this x2 killer would be a moot point, wouldn't it? Put it in the position the G280 is currently in: the better single-chip card, but costing too much for the performance gains.
 


I have been told by a good source; he doesn't lie and I don't either.

I have been on this site a long time now and it's the first time I have ever said I have info, because it is the first time I have had info. Like I said, all I know is it's dual based and each core is a lot faster than the GTX 280. They have made some modifications for this, of course.

It's going to be very, very fast and well worth waiting 6 weeks for, which is what I am going to do.

 
But will it be that much better than the final 4870x2, which we still have no idea how it'll perform and which could sell for as low as $450? That's not what I'm hearing. I'm hearing it'll edge the 4870x2 out, but it won't be cheap.
 
Yeah, exactly. Having the performance crown is only worthwhile if there is a real-world benefit.

When we were comparing the 9800 GX2 and 3870x2, there was a real benefit in going up to the 9800GX2 because the 3870x2 was lacking. The difference in this case is that the 4870x2 is anything but lacking, so unless Nvidia plans to make an enormous performance jump without blowing past thermal and power limits, I don't really see the point of it.

Even with coming titles such as Fallout 3, Crysis: Warhead, Far Cry 2, and Stalker: Clear Sky - all graphically intensive titles, but no more graphically intensive than Crysis 1.21 - do you really need this kind of graphics power if there is no real-world image quality benefit between the competitors?

When you talk about these types of ultra high end configs, you are looking almost solely at AA performance. AMD has proven that the 4870x2 delivers incredible high quality AA performance which is beating/rivalling the GTX 280 - and that in a prototype stage. If Nvidia were to release a GTX 200 x2, it had better be blowing AMD's AA performance out of the water, or there is no real benefit to spending that kind of money :S
 


Not necessarily.
Do you take all the IP you have and then try to sell cards losing $50-100 per card, or do you rush out a refresh that brings you closer to breaking even or profiting?
If the losses were small and the price drop small, then the benefit of large production to spread costs makes a lot of sense, but if you're losing a lot of money per card, then stopping the bleeding now rather than letting it go on until 2009 makes a lot of sense. The whole thing will be to nail that 55nm refresh. Can't rush it and mess up yields in the process.

They are going to take such a massive hit in the bank with this whole thing. It would be better to just set the cards at X amount and ride the storm out; there is no way they are gonna beat ATI in the price-performance department this round.

That makes sense to a point, but if you can write off this round as a $50-100 million dollar hit, and then turn that R&D investment into a money maker and, as importantly, a market-share maintainer and buzz generator, then wouldn't you rush a replacement to market like ATi did with the HD3870, rather than sit back clutching yourself, rocking and moaning, hoping for it to pass while the competition simply drank more of your milkshake? 😗

I'm surprised that nV would monkey with the FP and a few other things. However, the design plans were likely on the wall in March to move in that direction with a G92-style refresh: keep the G200 as the industrial/professional/workstation part (all the FP bits, etc.), which can cost a ton per chip but will still sell at $1000-4000 per card, so fractional savings matter less. Then make the 55nm refresh a part aimed at those not initially tempted by the GTX280: a slightly less robust spec sheet, but a faster and better gamer card that costs less to make.

The strategy is very sound, and was likely being put into place long before they knew what the RV770 really was, let alone how well the G200 would match up against it. The only difference is that it was likely intended as the Xmas refresh, selling in late Oct/early Nov to make a ton of money; now it'll be the August/September part meant to keep nVidia in the game around back-to-school time.
Undoubtedly this will be countered with an ATi price drop, to make things competitive.

Overall the best way for nVidia to move forward is to simply move past the GTX280 and focus on the 55nm refresh as if the GTX280 had already been in the market 9-12 months and had reached end of life naturally. If ATi had stuck with the HD2900 the whole time, they would've been in far worse shape than they ended up. nVidia has to go through that same realisation.
 


Probably (a little bit faster), but in certain games it will be worth every penny to some. We are talking totally playable performance in every game @ 1080p; no video card can do that now, including the 4870x2.

 


It's a huge benefit. I don't know performance figures for any particular game, but if the GTX 280/4870 can only max out the above games at 25fps, the new card will more than double that easily.

It might not mean much to you, but you can benefit from the aggressive price dropping, which will carry on for the next few months. 4870 at $200.


 
GreatApe has it bang on: the slowest version of the 55nm part coming out in the next few weeks will be a good bit faster than the current GTX 280. That's very good news for those who didn't go out and get a card yet.
 
See, that's the problem with previews using ES cards and beta drivers. We have an idea of how good the 4870x2 is, but we really don't know exactly how good it's going to be. There's more performance coming from this card; exactly how and how much, no one knows, especially nVidia. So to say this card can't run every game at 12x10, I need a source.

And as for nVidia's card, I'd also need a source, though I think I know who it is. Like I said, BZ has good info, but he doesn't know ATI, and completely missed just how good the 4xxx series were going to be. Also, there were the benches on Assassin's Creed that weren't announced properly, nor the findings. So, while ATI has stated that there'd be something more with the x2 cards, and a certain level of improvement, we haven't seen it yet, nor do we know how it's being done. To say that one card is going to edge another card out, without even knowing how the other card will work and eventually perform, is stretching it a tad.
 
Isn't it about time we stopped looking at Crysis performance and just considered benchmarks in real games? Crysis is clearly so poorly coded and designed that it is like putting increasingly large and powerful engines in your car while continually filling the fuel tank with watered-down petrol. It might run, but it will run like sh!t whether there's a 5.0 V8 or an asthmatic 3-cylinder 650 cc.
 
Nvidia is so screwed. They released cards that were already obsolete. They keep pimping their DX10 cards while ATI's DX10.1 cards run more effectively with the 4XXX series. Their GTX 350, if real, has to compensate for the more streamlined DX10.1 cards from ATI. A 900 dollar card will not compete with a 400 dollar card. Why pay more for an obsolete arch? Nvidia's arrogance put them in this hole.
 
Point here is, Crysis is severely CPU bound as well. Run a quad at 6GHz and see what happens. It's choking a lot of GPUs for lack of CPU performance, so expecting the GPU to do it alone is just crazy. And yes, it's waaaay overhyped.
 