GT300 / Fermi: expected launch late November



Yes and probably more than most.

There actually have been credible sources saying the new GT300 high-end card might be out by the end of this year, not just Fudzilla.

'Credible Sources', eh?

I don't trust nVidia's credible sources anymore. They were the same ones who said the yields weren't that bad and that there were tons of good parts, which we found out was wrong; who said nV wasn't EOLing the GTX 2xx series; and the chief of 'the credible sources' held a card aloft proclaiming it was the real deal, then later said, 'I didn't know.' They were also unable to produce the mythical card that was supposed to be running the demo in the presentation.

If the CEO doesn't know whom to trust for information about the new chip/card, how can there be any credible sources anymore? :heink:

Which is why I don't listen to 'credible sources' from nVidia anymore, because they are simply mouthpieces for nV PR.

So far the most credible source on current and future activities at nV has been Charlie, which is unfortunate since he's only got the dirt; there's no positive force of equal weight right now (except maybe Rys at Beyond3D, who gave an indication of what Fermi was hours before the launch). If they had been more honest about the good and the bad, maybe nV could have developed a trusted source for their PR; now no one believes anyone other than the one person who may well have an axe to grind with nV. Not a good situation for information shaping for nV, and that's what they need a lot of now and over the next few months.

Right now, until nV produces the engineering sample that was supposed to be running that Fermi demo, I have no interest in what their 'credible sources' say, as they are simply parroting the PR for strategic purposes.

IMO, believe the worst and hope for the best; then you won't be as disappointed. :??:
 



Theo Valich got the boot from Tom's Hardware because of his crappy reporting; that should tell you something about how bad he is.
 
Heh, that's the plan for me though.

The GT300 will probably beat the 5870, that's cool.

Do I EXPECT it to come out this year? Somewhat, not a full launch for sure, but quite possibly a small release of the high end cards, which would suit me fine because I buy high end hardware 😛. I'm hoping though, that I'm wrong and they do a full release sooner rather than later so I can save a little money on these new fangled cards.

And even if they don't come out with the card until the beginning of next year, hell if I care, no games really need it 'til then anyways, I can buy AvP3 and a couple GTX380's at the same time then 😛

The thing that irks me most, though, is people whining that nVidia has abandoned gamers. The card supports all the relevant graphics APIs, plus a bunch of programming extras that actually give gamers even MORE; they just ignore that fact. 10x faster application context switching means nothing to most people, but for graphics it DOES mean something, even if it is targeted more at HPC. It means your in-game physics can run faster and more seamlessly alongside rendering, but no one cares about that, of course. Nor do people seem to care about the impact of parallel kernel support (it's small, yes, but it still matters).
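To make that last bit less hand-wavy, here's a rough CUDA sketch of my own (the kernel names and sizes are made up, nothing from nVidia): two independent kernels issued on separate streams. On GPUs that only run one kernel at a time the launches serialize; on Fermi-class hardware with concurrent kernel execution the scheduler may overlap them if resources allow, so a small physics kernel doesn't stall everything else.

    // Illustrative only; buffers are left uninitialized because the point is
    // scheduling, not the math.
    #include <cuda_runtime.h>

    __global__ void integrate_particles(float* pos, const float* vel, int n, float dt) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) pos[i] += vel[i] * dt;     // toy physics step
    }

    __global__ void fade_pixels(float* pixels, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) pixels[i] *= 0.95f;        // toy per-frame graphics-side work
    }

    int main() {
        const int n = 1 << 20;
        float *pos, *vel, *pixels;
        cudaMalloc(&pos, n * sizeof(float));
        cudaMalloc(&vel, n * sizeof(float));
        cudaMalloc(&pixels, n * sizeof(float));

        cudaStream_t physics, graphics;
        cudaStreamCreate(&physics);
        cudaStreamCreate(&graphics);

        // Independent work on independent streams: eligible to run concurrently.
        integrate_particles<<<n / 256, 256, 0, physics>>>(pos, vel, n, 0.016f);
        fade_pixels<<<n / 256, 256, 0, graphics>>>(pixels, n);

        cudaStreamSynchronize(physics);
        cudaStreamSynchronize(graphics);

        cudaStreamDestroy(physics);
        cudaStreamDestroy(graphics);
        cudaFree(pos);
        cudaFree(vel);
        cudaFree(pixels);
        return 0;
    }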

Graphics doesn't improve by leaps and bounds these days, and people don't get that. It isn't DX8 to DX9 anymore; you have to appreciate the little changes that make your gameplay more immersive... otherwise, why are you a gamer? 😛
 
Interesting that you mention immersion, and you clearly have plenty of cash to burn on tech too.

Why don't you have 3x24" screens running Eyefinity right now? You would actually rather wait on this Fermi, without really knowing anything about how well it will perform, instead of doing the smart thing and enjoying three-screen gaming.

You want immersion? No, you want nvidia and are prepared to forgo gaming quality because of that.

http://hardocp.com/news/2009/10/13/eyefinity_update

Fermi will not be capable of 3-screen gaming like this.
 
Alright, fine, forget BSN and Fud for two seconds... if none of you are willing to look yourselves, I'll save you the effort. Is Anandtech good enough for you? They had a VERY frank discussion with a couple of members of nVidia; one is an engineer who is very... blunt about what it takes to make a GTX 😛

Here's the article from the start:
http://www.anandtech.com/video/showdoc.aspx?i=3651&p=1

Here they talk about pricing:
http://www.anandtech.com/video/showdoc.aspx?i=3651&p=7

Mass availability in 2010 is likely, meaning it is entirely plausible that limited high-end quantities could make a late November / early December release, which is what the "rumours" state.
 


Nothing says immersion like having black bars in the middle of your field of view. Eyefinity is a gimmick (and a retarded one, considering that when it first came out you couldn't even USE all the monitors without a special dongle on most monitors; gj AMD, have they remedied that yet?). I've never liked multi-monitor gaming and never will... I prefer one very LARGE monitor with great resolution and high definition. That is immersion to me.

As for quality, and what we know for a fact: Fermi will beat the 5870. By a landslide? Maybe, maybe not, but it will in fact be better for sure (unless the Fermi specs were pulled out of their arses).

nVidia is STILL the quality king, not the price king, that hasn't changed yet even if their PR and marketing skills are piss poor. So unless the Fermi comes out and is an overpriced piece of crap, I'm going to buy one and stick with quality.
 
So you like one large 2560x1600 screen, this is somehow better than 7680x1600?

Nah, you're full of it. We don't know anything 'for a fact' except that the 5870 is here now and blowing away nvidia's best, right now.
 


I'm having a very strange sensation of deja vu... it's creeping me out right now.

But anyways, how am I full of it? nVidia has released the whitepaper specs on the Fermi. It is what it is, and if those specs are true it WILL beat the 5870. That is a fact.

As for the screen... yes, one large 2560x1600 screen is better imo, I absolutely hate the bars splitting up the screen. I have dual monitors at work for AutoCAD and Solidworks and such, as it makes that easier, but a game... heck no. Seamless graphics or gtfo imo.

Now if I was Bill Gates rich, I could buy seamless fit LCD monitors and get a truly panoramic view for multi-screen gaming, and I would.

Unfortunately, there IS an actual limit to my cash, and that would be it.
 


Did you notice the complete lack of important specs in that whitepaper, i.e. core and shader clocks?
 


No it is not a fact... How short our memories are.. How good did the 2900xt look on paper again?

The thing may very well be the best scientific card in history... yet if it has to do DirectX functions in software we won't be screaming its praises, will we? Specs mean nothing until we can see it in action.. we can make an educated guess that it will be fast (I'm pretty sure it will be faster than the 5870.. but who knows how much more it will cost.. there is also no guarantee that it will even be up against the 5870 when it comes out).

It is far from 'fact'
 
I think you're underestimating the mainstream market ATI just created for ultra-thin-bezel monitors. Picking today's monitor selection for a case study is retarded; the makers would obviously go for a lower resolution, 1080p, to break into this segment. It's also stupid to think monitor makers won't adopt this strategy quickly, since it gets more of their products into households. It should be far more cost-effective for them to make these 1080p panels rather than super-high-res ones.
 
😵 ?

I never said there wasn't such a market at all; I know there is. I said I hate it myself; I can't speak for the rest of the population. I am just saying I'd rather have one huge monitor than a bunch of them put together with lines everywhere. Until there are 0" bezel monitors, I ain't going multi-screen for gaming.

And as for the specs, are you HONESTLY going to say to me that the GT300, from what you now know, isn't going to beat the 5870, even by a little bit? Are you going to tell me that?

I already said, we don't know if it'll destroy it by a landslide or barely nudge it out, I'm not arguing that. The point is, it will be better.

I ALSO said if it ISN'T a ton better than the 5870 and it's way more in price, even I might switch over to ATI, but I just don't see that happening. If it does I guess nVidia is going to have to rethink what it's doing soon so it doesn't get pulled under by the tide.
 
We simply don't know enough about the Fermi specs that count for gaming. So Fermi is double everything... guess what, the 5870 is double everything too, except clock speeds, which went up by about 15%.

What if Fermi's clocks stay the same? Or what if they have to go down because of the new arch?

doubled GT200 + same clock speeds
doubled RV770 + 15% clock speeds

What wins there? And even if Fermi does beat a single 5870, do you really believe it can possibly beat a 5870 X2?
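Back-of-the-envelope, treating raw throughput as roughly (execution units) x (clock) and normalizing each chip's predecessor to 1.0 (a big simplification, and the final clocks are exactly what we don't know yet):

doubled GT200 at the same clock:   2 x 1.00 = 2.0x its predecessor
doubled RV770 at ~15% more clock:  2 x 1.15 = 2.3x its predecessor

Each line only says how a chip scales against its own predecessor; who wins in absolute terms still depends on where the GTX 285 and HD 4870 stood against each other in the first place, and on the clocks Fermi actually ships at.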
 


You have no facts, you have speculation. That is not science.. that is pulling garbage out of your ass. When we have benchmarks and full detailed specs, we can make assumptions and assertions. For now all we can do is wave our hands and fall prey to the profound confirmation bias seemingly prevalent in everyone around here...

I currently do not know how much of the core is set aside for tasks almost entirely useless for gaming, yet profoundly awesome for Tesla applications.. I do not know if the core will support dx11 in hardware.. or how much. I do not know how slow software implementation of dx11 features through a general core will be. I have no idea how fast clock for clock this beast will be, or if it will just be another card that looks like a beast on paper but can't cut it in the real world.. If you attest that you know these things no one else does, then please let us know.. Otherwise your opinion (which it is just that) is no more valid than the folk who scream the gt300 will be slower than a 5870..

The only 'fact' we have so far is that we don't know... It is all well and good to assume that the card will be faster than a 5870.. it probably will be.. but there is nothing more to it than an assumption.
 


Oi, don't bring up X2 cards... because you can just make an X2 Fermi. The point is whether or not a single Fermi GPU will beat a single R870 GPU.

In any case, let's stop arguing and wait and see what happens :)
 
Another question, too, is about DX11-only games. I know they aren't out yet, but since the GT300 isn't out either and probably won't be until the games are, this is worth considering. Assuming you are correct and the GT300 beats the 5870 by 20% (very generous) in DX10 and below, what will happen in DX11 games? How big a hit will there be from emulating some of the features? Obviously, nVidia is hoping that game developers will drop those features like they did DX10.1 and make something like DX10.5 instead, but this time they are late to the party and a lot of people are sick of them dragging their feet. Depending on the number of features emulated and the complexity of those features, I could easily see it chopping the GT300's lead into a loss.

But, this is of course all pure speculation. The only FACT we do know is we haven't seen a real GT300 yet, not even a measly engineering sample leaked, so we won't know much for a month or two I'd think.
 
I find it odd that everyone is saying I have no facts, and then immediately saying the GT300 will probably beat the 5870..... well, at least we all agree on that.
 
Yep, at this point it definitely isn't worth fighting over. I think in another thread that just popped up the first fake benchmarks are out, which is a good first step. Soon/eventually slightly more credible benches will come out, then we can start some good and fun arguments.
 


Obviously you do not understand the difference between assumption, opinion, and fact.. I suppose you skipped over the science classes in college.. be that as it may..

Perhaps you should look up what probably means as well..

We can assume that Fermi will be faster.. it is likely that it will be unless nvidia really made a lemon.. but that is not a fact.. that is an opinion.. You would be best to understand the profound difference between the two. One can hold a rational or irrational opinion.. there are no irrational facts.. they simply are right or wrong. Opinions and rational assumptions are based on facts... but they do not have to be. I could legitimately say the 5870 will be faster than the Fermi based on the same information at hand without being wrong.. there are no facts pertaining to real world, or even perceived, performance.

You cannot go preaching that you know for sure.. or that you have facts.. when you have no such thing. You have an opinion that may very well be based entirely on your bias that nvidia will win anyway, regardless of the information at hand.. That I agree the new cards will probably be faster has nothing to do with my issue with your statements.
 


You seem like a very pretentious person, but I try not to judge, so maybe you're right and I am in fact being ridiculous; it is possible. But someone else pointed out something rather important:



As it was so kindly pointed out, 50% more transistors. That is a FACT: cold... hard... fact. I am not just making crap up. So it can be pretty safely assumed that, with 50% more transistors as a reference, the GT300 will be faster. Or as jennyh put it, "if it isn't faster then there is something badly wrong". And if something is badly wrong... I guess nVidia might follow the Dodo bird....



Those benchmarks were so hilariously fake it made me laugh for hours. nVidia rarely refers to the Fermi as the GT300....
 
I may very well be a pretentious person.. but in my line of work people get fired for bias and for mistaking opinion for fact.. at any rate.

Yes, it is a huge chip, since nvidia usually makes things properly one can assume it will be "fast." But there is no way to prove that yet, so we have to wait and see.. There are many many cards that have yet to be played. Transistor count does not translate into performance.. just what it "should" perform like.
 


Apology accepted...








Haha just kiddin.

But now that we're all on somewhat the same page, can someone please find whoever leaked the retardedly fake "benchmarks" for the GT300 and shoot them in the face with a 12 gauge.

I am looking for more Fermi info still and now all I can see is a thousand pages of "Fermi Game Performance Figures Leaked!". It's drowning out everything else. I used to be able to find Anandtech articles and such, now I can't even find articles on Enrico Fermi..... sob.
 


The gtx260 had 50% more transistors than the 4870.