NVIDIA GTX 350


dos1986

It's coming late August to early September; when the 4870X2 is out, this will be at most two weeks behind it.

I have heard specs from a good source on OcUK, and the specs/design don't match what I have been told. 2GB is real, GDDR5 is not, and it's a dual-GPU card.

The rest is pretty much spot on.



 

dos1986



Why?

It will arrive shortly, just after the release of a competitor's dual solution.

They are saying it will be a bit faster than the competitor's.
 

dos1986



The price is unknown at the moment. It will be a lot more expensive than your 260, more than double.

The 260 is a hell of a card anyway; it's a beast when overclocked to the hills.
 

Just_An_Engineer



You should be, as it's not true. That same site reported earlier that the GTX350 would be 45nm, and it has been running a long series of BS rumors. If you look around the other forums, you will see that nobody is taking it seriously.
 

dos1986

It's coming, don't worry; once the 4870X2 is out, expect to hear all about it.

Doubt it all you want. I will bump this thread in September.
 
There may be something coming, as I've heard as well, but it won't be this. The power envelope simply won't allow something like this; you'd need special power hookups for it. Also, one look at the clock speeds on something that huge and you're talking a huge heat problem. The default core clock on the G280 is what, 607MHz? Gimme a break. Look at this and compare it to what's already been released.
 

dos1986



What happens in games that need the cores, like Crysis, Warhead, FC2, or Alan Wake?


 
738MHz with an OC, yes, and then look at the power draw after the OC. What I'm saying is, that's over a 30% core increase, which would more than take away any die shrink, even adding in the reduced power from GDDR5, plus the question of the card even being compatible with GDDR5, which no one's heard a thing about. We all knew ATI was using it well before their cards arrived, and I find it hard to think we wouldn't have heard about that as well. If all the speeds are ramped up, and you throw in the die shrink, you're still talking about a GTX 280 X2 power demand, which I just can't see happening.
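For what it's worth, here is a rough back-of-the-envelope check of the clock figures being thrown around in this thread (607MHz stock, 738MHz overclocked, and the ~800MHz core clock mentioned later on). These are just the numbers quoted by posters above, nothing official:

```python
# Rough percentage-increase check on the core clocks quoted in this thread.
# None of these figures are official; they are only the numbers discussed above.

def pct_increase(base_mhz: float, new_mhz: float) -> float:
    """Percentage increase going from base_mhz to new_mhz."""
    return (new_mhz - base_mhz) / base_mhz * 100

stock = 607        # quoted default GTX 280 core clock
overclocked = 738  # quoted overclocked core clock
rumored = 800      # rumored core clock for a 55nm part

print(f"607 -> 738 MHz: +{pct_increase(stock, overclocked):.1f}%")  # ~21.6%
print(f"607 -> 800 MHz: +{pct_increase(stock, rumored):.1f}%")      # ~31.8%
```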
 

Just_An_Engineer



Speculation is that the 55nm flavors of the GTX280 and GTX260 will be out earlier than initially expected, perhaps as soon as late August.
 

Slobogob


Actually, Nvidia joined the SOI consortium. I doubt they joined it just to get something developed; maybe they got a "special" manufacturing deal out of it too. That would be the only possible way to get a card even more monstrous than the GTX 280 working at 55nm within the same or an only slightly higher power envelope.

Until I read something official, I think Nvidia is just playing the marketing card to keep people from running off to buy 48xx cards.
 
I believe it's possible to have a 55nm G280 come out in very limited quantities by September, with the core clock ramped to 800MHz, and such a card could give the X2 problems, as we saw at the previews. However, I'm of the mind that ATI is sandbagging as well and is going to surprise yet again.
 
Remember, ATI stated that we'd see improvements of 15% with the X2. Looking at those previews, I didn't see that. And of course, none of the drivers for what we saw are even finalised yet, including the regular CF setups, so yes, there's much more performance to come.
 

onearmedscissorb



What happens to something like Crysis is that it continues being the same unoptimized piece of junk it always has been.

That game doesn't "need cores." It needs to be written to actually use them, which is precisely its problem. It does not scale well at all. When SLI/Crossfire setups of cards a generation ahead of what it was made for still cannot really get it to run right at its higher settings, I think it's pretty ridiculous to consider the hardware to be the problem.
 

ovaltineplease

This card would still be stuck with AFR rendering, which is going to make it inferior to the GTX 280.

They need to rethink the technology, like AMD did, before they start making more dual-GPU cards. =/
 