GT300 - Next-generation Nvidia 40nm chip yields are fine




Simply because the 5870X2 will cost $600+. If they can double a $380 card and sell it for less than $600, that's worth it. But other than that, if the GT300 is better, it will sell more, even if the 5870X2 is faster. And FYI, you seem to have a touch of red to all your comments... and a touch of green for envy as well.

As a final point, everyone can have a point of view, but when it comes down to it, you will hate yourself if one company wins out and dominates the market, even if it was your favored company. Without competition you get jacked-up prices that no one likes, like when the 8800GTX stayed at ~$400-500 for at least a year because nothing could compete with it. There has to be competition, or the consumer gets screwed, so learn to appreciate it, not bicker about it.
 



Erm...

Don't forget there is a 5850x2 this time :)



 
Simply because the 5870X2 will cost $600+. If they can double a $380 card and sell it for less than $600, that's worth it. But other than that, if the GT300 is better, it will sell more, even if the 5870X2 is faster. And FYI, you seem to have a touch of red to all your comments... and a touch of green for envy as well.

If the GT300 is better even if the 5870X2 is faster? It won't be better if the X2 is faster, and I can assure you there is no touch of green for envy, lol. Nvidia is tanking so hard it's almost pathetic. I can't even bring myself to hate them anymore, even when they are doing crappy stuff like bribing Eidos to disallow AA in Batman on ATI cards. It's almost sad the way they have capitulated.
 
Excuse my silly, impulsive question (not gonna read 3 pages of posts),
so if someone could answer me without feeling offended:
any chance of Nvidia's new series coming out in the next 2 months?

 
Reminds me of ATi/AMD's HD2900XT story.

They were a year behind the schedule.
They were promising the fastest graphics card ever thought of.
They had lots of different problems with their performance.
They were doing press releases and pr meetings etc.
They were late for the release of Windows Vista (A.K.A. DX10[.1])

Then, they went back to the design board and slowly recovered. They created the 3xxx series (a bug fix for the HD2xxx series), then the more successful 4xxx series (an improvement over the 3xxx), and now with the 5xxx series (more improvement with the help of a smaller node), they're back on top.

During this time, nVidia rebranded their G80 a couple of hundred times. They created the G200 but couldn't scale it down to the mid-low end. (That was the real reason behind the rebranding of the G80.) They went back to the design board and started from scratch with the G300. As with anything you do from scratch, well, the first "prototype" is far from perfect.

In short, nVidia this time flipped. Period.
They won't go bankrupt.
They will see their mistakes and fix them with the next iteration, then further refine and improve with the third one at the latest.

Anyway, be prepared for ATi domination for some time, let's say, 1 year to 18 months.

My 2 cents.
 
Doubt it. Specs and white papers have been out for a week now, and everything I've looked up looks quite promising. The only question is whether they will get the chip out the door by late December, or slip into January.
 
Nvidia expects a Fermi release in late November.


Not the best source but...

http://www.fudzilla.com/content/view/15812/1/



I'm getting sick of ATI knobrats ALWAYS saying 'Nvidia DOESN'T have a card... it will be 6 months, ETC...' when you have no basis for your argument, just as much as the next person saying it will be here by the end of November...

IN FACT... there's LESS evidence to say the card's gonna be later rather than sooner...

So please just STFU!!!
 
If it's much faster than the GT200 series, then it will beat the 5870 like a piece of cake.

Who will give a rat's rear when they are twice the price and in short supply, especially in this economy? Most are already assuming that they are going to run $600-750 for just the single-GPU cards, and one can forget for a while about a dual-GPU card unless Nvidia wants to extend its e-pen even further. After I am done with basic upgrades on my rigs I am going to buy up a house off one of the banks' books for only a few k, unlike the newts that paid 100-500k for the same shack (they are all shacks, but some are better than others). 😴 😴
 


The 5870 is new tech and the GTX 295 is an old-timer, yet it beats it... what a shame... Nvidia's new GT300 will destroy that 5870 like a tornado... they're just waiting for ATI to release that 5870X2 with new drivers, and then they will kill them forever.
 
The 5870 is new tech and the GTX 295 is an old-timer, yet it beats it... what a shame... Nvidia's new GT300 will destroy that 5870 like a tornado... they're just waiting for ATI to release that 5870X2 with new drivers, and then they will kill them forever.

Don't worry John_, We'll make sure you get the help you need.

Blind faith in anything is asinine... probably more so when we are talking about person X's favorite graphics card... Fact is, it is not out yet and there is absolutely no information as to how it will stack up against whatever is top of the line when it is out... Certainly we all hope it is soon... we all hope it kicks ass, and we all hope it is cheap... Why on earth would a fan of computer progress hope for anything else?

The ignorance of the public is nauseating... Can't we have a week where we all just talk about how great the tech is, how we can't wait to buy whatever is best, how the brand doesn't matter so long as the hardware rocks?... on this tech forum... and avoid the fundie nonsensical jargon.

We are all capable of making *** up... it is those who make the best of the info we have to formulate logical opinions whom I care to listen to, regardless of the side they may be on... I swear, bias will sink whatever is left of us before too long... it makes my head spin how all these folks skipped the logical-thought classes in school.

You know what, people... "I like this brand, therefore it is best and the others suck, so I am right before I even debate" is not evidence... it is not proof... it is as far from scientific as possible... it is barely even an opinion... Saying garbage like that puts you on about equal footing with the folks who claim evolution is a lie, the earth is flat, and creation is as young as the domestic dog...



Why is it not possible to be excited about the power the Fermi card might bring... while at the same time being pumped about what ATI has already brought to the table? I don't live months in the future... Am I going to have to deal with red-team fanbois lambasting the GTX300s constantly on the basis that "in 5 months our new card will be out"? Last I checked, there will always be new things coming... no reason not to be excited for them while at the same time being excited for what we already have...

The 295 is the fastest single card out there... on what planet does that make a card slightly slower (for now) but far cheaper, with more features, a bad piece of tech that doesn't warrant its own respect? ...sigh...

/end rant
 


What's the point of having a DX11 card now when you don't have DX11 games yet? Till then, the NV GT300 will wipe the floor with that ATI 58xx, end of story...
 
You don't even make sense... I have to ask which planet you are from... but I don't really care... It is fine that you want to live in your small little world, sheltered off from reality... but if you think you are making anyone with more brain power than a gnat respect your thoughts, you are powerfully mistaken...

I didn't mention DX11... I'm not sure what your point is... What will be the point of the GTX300 by that logic? It won't be until the next 'generation' that DX11 will have games released for it... Everyone knows that... People are buying these cards for the performance now, and right now the 5800 is the fastest single-GPU card... Tomorrow that may change... but if you have even a semblance of intellect you would realize that living in the future is pointless, as there will always be new tech coming to make whatever you want now pointless to buy...

I'm sure you get some sort of gratification when the team you cheer for wins... that is fine... but you are allowing yourself to fall victim to idiocy... it is sad to observe.
 
What's the point of having a DX11 card now when you don't have DX11 games yet? Till then, the NV GT300 will wipe the floor with that ATI 58xx, end of story...

What do you think game developers will be making DX11 games on? Will it be the 5870, or the non-existent G300?


...

........

...............


Yes John, game devs will be creating games on ATI's DX11 hardware right now. They will be taking advantage of all the *currently* available features of DX11.

Unlike you and the rest of the nvidiots, they aren't waiting on Fermi. You won't find a single software developer who is, only nvidiots who will be waiting and waiting... and waiting. Have fun waiting while the rest of us play to the max. :)
 
If you really have stopped with the hating, then perhaps you can now start with the investigating.

http://www.bit-tech.net/news/hardware/2009/10/03/nvidia-dismisses-amd-s-batman-accusations/1

ATi were invited to take part but declined; apparently they don't care about their user base as much as you would like them to.
 
Oh yeah, I totally believe that ***. They "implemented" AA? The way this guy said it is like they "INVENTED" it 😀. I will laugh my arse off :)

Let me just tell you that AA is natively supported in DX10, so you don't need to "implement" anything, just enable it.

Not to mention all the options in the config which are actually set to "FALSE" by default,

like UseAtiTextureCompressionOptimisations and stuff like that.

Do no harm, my arse.

Pathetic 😀
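
To picture the kind of vendor-specific config toggle being described, here is a rough sketch of what such an ini section might look like. Only `UseAtiTextureCompressionOptimisations` is an option actually named in this thread; the section name and the other key are hypothetical, invented purely for illustration, not taken from the real game's config.

```ini
; Hypothetical engine config sketch -- NOT the actual Batman: Arkham Asylum file.
; Only UseAtiTextureCompressionOptimisations is mentioned in the thread;
; the [SystemSettings] section name and AllowInGameAA are made-up examples
; of how a vendor-gated feature could ship disabled by default.
[SystemSettings]
UseAtiTextureCompressionOptimisations=False
AllowInGameAA=False
```

Flipping such a flag to `True` is the sort of "just enable it" change the poster is alluding to, assuming the engine actually honours the setting at runtime.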
 


Linking to a blog is fail, especially when the blog states very clearly:

"His postings are his own opinions and may not represent AMD’s positions, strategies or opinions."

And to be brutally honest, Mousemonkey, Nvidia are full of these little issues, aren't they? What's gonna be next, disabling PhysX when an ATI card is present? Oh wait...


So no, I don't believe it. Maybe if Nvidia didn't lie and cheat habitually at every opportunity, then it would be believable. Give me one good reason why I should believe this after they were caught cheating with Fermi this week, caught disabling PhysX on ATI cards this week, and caught disabling AA in Batman just last week. Now they are saying that it wasn't them? How do you know that Eidos haven't been pumped so full of Nvidia's cash that they are now willing to say anything Nvidia demands?
 
Oh, btw, I had stopped hating on Nvidia up until this week. But this week all Nvidia has done is prove how desperate they are, and how low they will stoop in order to keep what little they have left.

You people are smart enough to realise when you are being taken for fools.

And yes, Mousemonkey, I will 'investigate' that issue further. It isn't exactly easy to do that neutrally considering what Nvidia have done, though.
 
From the first page of that link, I immediately saw something that makes sense.

While they may have a valid point there, and everyone I have talked to at NVIDIA believes they are doing the right thing for the gamer, I would argue that most gamers disagree - as do I. By not letting the feature continue as an "unsupported configuration" (much like early overclocking registry mods) NVIDIA is not only defiling their reputation with the community but is keeping PhysX from a significant audience that might eventually have been convinced to buy more NVIDIA cards in the future.

Once again we have Nvidia saying something, yet the logic says something entirely different.

In reality, I think this claim from AMD is pretty much unfounded - NVIDIA has long been accused of doing things like this but AMD has similar relationships with developers - see games like Battle Forge, DiRT 2 and Tom Clancy's H.A.W.X. The truth is that both sides of the coin work as closely as possible with developers to make sure the latest titles work as well as possible on their own hardware. But without a doubt, NVIDIA's development efforts in this area are much more extensive. The developer relations team at NVIDIA is significantly larger, has a significantly larger budget and in general works with more developers than AMD's. In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.

What needs to be shown here is Eidos proving to us that AMD were given the chance. Eidos have come out and said AMD were given the chance, but they haven't shown any evidence of it.

Some emails or whatever would go a long way towards proving it. Surely those must exist? Emails from Eidos to AMD could easily be checked for their existence at various points, neutrally, in ISPs' server caches for example.

That is the first step. Assuming Eidos *did* invite AMD, then we would need to see the reason(s) why AMD declined. However, until Eidos can actually prove that AMD were invited to their studios, there is no reason whatsoever to believe it happened.
 
Occam's razor, anyone?

Which is more likely?

a) Eidos did what they said, and fanbois are just making conjecture and raging.
b) Eidos didn't do what they said, were influenced by Nvidia, and are now covering it up.

There is a very simple reason AMD declined: Nvidia has legions of engineers to send out to do that. AMD does the same, but on a less grand scale; it's why you have that Nvidia "The Way It's Meant to Be Played" *** slapped over a ton of games. It means Nvidia sent people to help along with that game's development.