GT300 Additional Info!



More like a Valentine's Day gift for your machine. Guess you didn't read the Anand article posted above:

"The price is a valid concern. Fermi is a 40nm GPU just like RV870 but it has a 40% higher transistor count. Both are built at TSMC, so you can expect that Fermi will cost NVIDIA more to make than ATI's Radeon HD 5870.

Then timing is just as valid, because while Fermi currently exists on paper, it's not a product yet. Fermi is late. Clock speeds, configurations and price points have yet to be finalized. NVIDIA just recently got working chips back and it's going to be at least two months before I see the first...
And to those already proclaiming Nvidia's domination: that article was 90% about compute abilities, not graphics. While it is very likely the card will be fast, probably faster than the 5800s, the question is how much faster. Today's articles left out too much to even guess, since real-world performance often differs from paper, and we didn't even get the full paper.
 
I just hope the competition is even fiercer than it was between the GT200 and the 4000 series. That can mean only one thing: faster, better technology, since competition drives innovation, and of course reasonable prices 😀

$380 to play games isn't all that reasonable 😉 (but the same applies to $600 for an iPhone 😀)
 
The fine line everyone had to catch is "we are sure it WILL be faster". So they don't KNOW for sure yet 😀. They hope, yeah, but they're not sure. I'd guess they don't even have drivers ready, since they only got the first working chips a few days ago. That's the main reason they didn't say how much faster, not the nonsense about "we don't want it to affect our current sales" - pffff

The only thing that will really hurt sales is failing to assure people it will be faster and worth waiting for. And believe me, if they were SURE, they would have said so.
 
The day Nvidia finally admitted defeat and retreated from the gaming market.

This will be lucky to beat the 5870. Both have doubled their transistors, but Nvidia has spent a lot more of them on non-gaming features. All I can think is that they intend to bribe the games industry with hundreds of millions of dollars to get this adopted. Otherwise, nobody would bother.
 
The fact that it runs C++ natively is quite interesting and nearly sells me on getting one when it comes out, just to mess with it. I'm sure Nvidia will set a decent price/performance point to compete with ATI; it's not as if ATI are slouches now, like when the 2000 series was released and Nvidia could do whatever they wanted with prices.

It is a bit concerning that they don't yet know how it stacks up against ATI's flagship in games; maybe they are just unsure about the drivers and would rather reserve judgment until later. Either way, it's turning out to be a very interesting time for Nvidia.
 
Yeah, gaming-wise that didn't look as impressive. Sure, there are a lot of improvements, but mainly on the compute side, which doesn't do anything for games.
But then again, just doubling pretty much everything gives you a lot of brute force to throw at games anyway...
 
It will be 'capable' of running games great, of that I have no doubt.

The problem is, it will take programming especially for it. No games dev is gonna be arsed with that... unless Nvidia starts throwing an awful lot of cash at them. It would take a lot of cash, though; perhaps this is Nvidia's last throw of the dice and they are desperate.

This cannot be allowed to fail for Nvidia; think what that means. However, ATI cannot allow it to gain widespread adoption at the expense of their stuff either. We really are at make-or-break time, I feel.
 


More like a Valentine's Day gift for your machine. Guess you didn't read the Anand article posted above:

"The price is a valid concern. Fermi is a 40nm GPU just like RV870 but it has a 40% higher transistor count. Both are built at TSMC, so you can expect that Fermi will cost NVIDIA more to make than ATI's Radeon HD 5870.

Then timing is just as valid, because while Fermi currently exists on paper, it's not a product yet. Fermi is late. Clock speeds, configurations and price points have yet to be finalized. NVIDIA just recently got working chips back and it's going to be at least two months before I see the first samples. Widespread availability won't be until at least Q1 2010.

I asked two people at NVIDIA why Fermi is late; NVIDIA's VP of Product Marketing, Ujesh Desai and NVIDIA's VP of GPU Engineering, Jonah Alben. Ujesh responded: because designing GPUs this big is "FAQ'ing hard"."



So, as we've been saying for a while, it looks like 2010 for anything real; and, as we said months ago, nV will keep doing paper launches until then to keep the fanbois waiting, just like the NV30/FX strategy and the R600 strategy.

You can't run games on paper, and you can't develop for them either, especially not with a design this complex; emulation would be pointless.

The strange takeaway from these early articles is that nV's tessellation will not be in hardware? That makes it interesting: it won't technically be DX11-capable hardware if it needs to emulate components in software, which blurs what it is to be 'compliant' versus 'capable'.
 
Exactly, GGA!

They made hardware that is very general, and I'm sure it will be capable of running DX11; however, if it's emulated by drivers, we don't know what kind of utilization it will end up with. Also, it will be MUCH easier for developers to just use ATI hardware for development, so now Nvidia is in the position of adjusting to the market.
 
I remember a conversation elsewhere about there being no fixed-function tessellator, and it didn't get far enough. The question was thought to have been answered by the compliance requirement, so maybe not, as the 5870 has one, but it can be done in compute shaders.
As for everything else, it's exactly what we've been hearing, and I think it's mostly been Charlie leading the way, despite it all, as he's been as consistent as those with the "insider facts". So maybe no CUDA for him, but certainly at least kudos.
 
Charlie Demerjian has been telling us for the past ten months that this was going to happen. Just about everything he has said has come true, and when it's 2010 and there's still no G300 available, he'll have been 100% correct.

You have to give some credit to Theo Valich too; strangely enough, he was the first to really push how much it would be a 'cGPU', and he also gave reasonably close specifics on the architecture.

Strange but true, these guys do know more about the truth than most do. At least this time around they did.
 
The strange takeaway from these early articles is that nV's tessellation will not be in hardware? That makes it interesting: it won't technically be DX11-capable hardware if it needs to emulate components in software, which blurs what it is to be 'compliant' versus 'capable'.

Is this in the Anand article? I came to this conclusion last night in a different thread about the G(T)300; I'd be surprised if I got this right.
 

Nvidia did say that they don't feel DX11 is all that important, and clearly they don't; after seeing the Battleforge results, they might have a point.
[attached chart: Battleforge benchmark results]
 
Ouch. Yeah, I will agree that DX11 hasn't proven its value yet, but we really need to move beyond DX9, so I'm hoping it will work in the end. Let's wait for a full DX11 game to come out before we write it off though. I really hope we don't have to wait for a DX11 console to come out before it takes hold.
 
Definitely true. As we saw with the DX10 fiasco, until a real game actually shows the benefits (visually or in FPS), a new DX version might as well not exist. The only thing I'd advise against right now is buying an expensive DX10 card. Anyone going high end might as well look at ATI's offerings or wait for Nvidia's DX11 line.