GF100 (Fermi) previews and discussion

I have not been trying to equate the two, and I apologise if you got that impression. I have been trying to get across that a 40nm GPU is not the same as a 55nm GPU, simply because of the node difference. Even if it is the same arch, or based on the same arch, it's different. But the impression being put forth is that they are one and the same with absolutely zero changes, and that is just plain wrong, especially coming from people who are supposed to be knowledgeable about these things and who should be clearing up the confusion rather than adding to it.
 
And that's where a doubt catches me off guard: are graphics engines (devs, actually) going to adopt DX11 faster than we think?

'Cause if devs move from DX9c to DX10/10.1 instead of working on DX11... well, nVidia might not be so mistaken in their approach. They'd rather screw us with well-polished old tech (lol) than with not-well-developed new tech. Let's see how Ati plays its cards with devs and DX11 (since M$ prolly doesn't give a cr*p about it, lol) and hope they get DX11 engines working well before the 6xxx. AvP is going to be a good teaser for this, I think; if that game shows a good level of development, it might prove nVidia wrong in its approach.

Cheers!

EDIT: Missing point, lol.
 
GF100 will regain the performance crown, but the victory won't last long if the rumors are true that ATI has a new architecture ready to battle GF100 and GF100b/Fermi 2, both coming before the end of the year. We should probably see the first HD 6870 (?) featuring a 28nm GPU in late Q4 2010...
 


Eh, the 6000s are probably going to be here either very late this year or at the beginning of next. It's a new arch, which means they could have as many problems as nVidia has had (well, not as many, 'cause TSMC has fixed their crap, but some of them). It'll be interesting to see how it goes.

nVidia might use some of that time to get the Gen 2 GF100s out sooner as well, so they can either match (I doubt it) or at least not be too far behind the release of the 6000s.

We'll have to wait and see.
 


True.
 


Actually, AMD did announce they'd release either a 5XXX refresh or the 6XXX series. For a source, there's a thread I made in the graphics section called "5xxx series refresh coming h2" or something like that; it has a link.
 


I'm not even slightly inclined to agree with the first statement, but we both have respectable opinions - that's for sure. I don't have much caffeine left, so I won't dwell much deeper into the pain that VGAs, sound cards and Linux desktop environments can prove to be.

Unless you want to use proprietary drivers from Nvidia / ATI - plus feel the implicit pain of their installation - or to enjoy only decent 2D operation, it's better to stick with any Intel GMA "killer" graphics. Open source video drivers have always been severely lacking compared to their closed counterparts - which are already miserable enough when compared to Windows' - and are never up to date with new hardware releases. ALSA plus Phonon can spell every kind of bizarre integration issue and dysfunctional audio symptom - if audio works out of the box at all. If GNOME or KDE ever crashes, pray that a reboot will fix it, or have a good, long time playing with your distro's documentation - using your neighbour's PC, that is. Or just give your favourite tech support or geek relative a call.

I've managed to go through all of this and then some, but there's no chance most desktop users could face half of that crap. That being said, I still enjoy dual-booting both worlds.
 
You seem to have had a rough time with Linux. I'm pretty new to it myself, yet I've managed to overcome every issue I've run into so far even though I've borked Xorg configs and done numerous other things while playing around with it. When I run into major issues in Windows I just have to reformat because I don't have the ability to fix them, since the error messages are meaningless and therefore the source of the problem can't be determined. Considering I've been using Windows since the late 90s and Linux since about a month ago, that's saying something.

EDIT: Well, if you like ads then WLM is great. I use Pidgin, even on Windows.
 
mousemonkey, looking at your posts, you seem extremely biased toward nVidia.

Not that it's wrong to like one of the teams, but denying benefits to users is just not very smart.

DX10.1 IS and HAS always been faster than DX10 in terms of AA.

Even if it's only 5%, it's still 5% in our benefit. One company denying it because it doesn't have it doesn't mean it's stupid or not worth it. IT'S WORTH IT. Period.

Hardware manufacturers and software developers are trying to squeeze out every bit of performance you can get, and you just disregard things like that. In that case, why don't we still use DX5.0, or the first OpenGL?

However, I am sure you will say the GTX 285 is WORTH it over the 4890 even if it's only 5% better, right? So there it matters, but on the DX side it doesn't?


Again - being a fanboy is not bad or wrong. I am an ATI fanboy myself, but I don't deny the mistakes or wrongs of my team. I know and acknowledge the problems the 5000 series has. I am not very happy with the current price hikes from ATI, but they do it anyway. I wouldn't be happy either if they said "we don't need DX11, so we won't implement it" - I won't be two-faced and say we don't need it or it's not worth it.

 

Well spotted, have a cookie. DX10.1 may have been a good idea, but seeing as it was implemented badly, more as an afterthought, alongside an OS that even MS have admitted was not the best thing they have ever done, and considering it has since been superseded by DX11 in a very short time indeed, I'll stick with my belief that overall it's a waste of time and not worth bothering with. And if my preference for Nvidia cards bothers you so much, why don't you feel the same way towards those who prefer ATi cards?
 


Yeah, they announced what is probably going to be a summer refresh to compete performance-wise with Fermi. At least that is my guess. I'm hoping Fermi knocks ATI down a bit off their pedestal, so that they're forced to make the refreshes a significant improvement.... that'd be awesome for my wallet.
 
I feel the same way towards all fanboys who bend facts or dismiss things that are right.

DX10.1 was not poorly adopted because it wasn't worth it, but because the MAJORITY of cards are still DX9/DX10 (and that is mostly because of nVidia, as they have the bigger market share yet don't support it).

So you can't say it sucks just because it wasn't adopted - it wasn't adopted more widely because of nVidia.


I am sure you know that it's a loop, the chicken-and-egg problem - same as 64-bit. Software waits for hardware, hardware waits for software. But someone has to make the first step and keep going.

I like every new thing/gadget/tool, because it gives an OPPORTUNITY for future use that will benefit us. How well it will be used, we don't know, but I can tell you that I KNOW how well it will be used if it's NOT THERE - IT WON'T BE USED AT ALL.

Blind fanboyism is not healthy, and I really hope Fermi is much faster than the 5000 series and nVidia drops a bomb there (although highly improbable), because then we'll see a war like the GTX2__/4000 series again, and that meant GREAT prices for all of us.

Go Fermi, I say, since $300 for a 5850 is not great price/performance, nor is $400 for a 5870.


 

Some Nvidia cards do support it, even though, as I maintain, it's pointless because DX11 is now here and, as far as I'm aware, it includes DX10.1. And how many games released between Vista and W7 were DX10? The vast majority have been DX9 only, as far as I'm aware.

That makes no sense to me, sorry.
 
I wouldn't blame nVidia for the poor adoption of DX10 and DX10.1...

Blame devs for seeking profit based on the consoles' HW support (DX9c-ish and OpenGL ES 2.0-ish). Which, I might add, is not their fault; nor is it MS's, nor Sony's, nor Nintendo's. If the PC market is not "good enough" for devs to look at, then in the long run graphics improvements will move at the speed of console HW development, which would be sad 8(

In fact, nVidia is trying to say "guys, there's more to GPUs than rendering 3D stuff" and give a little more value to the market. Ati is following that trend, but kinda slower IMO.

Anyway, I hope ATI and nVidia can keep up their R&D 😛

Cheers!
 
http://www.pcgameshardware.de/aid,705737/Geforce-GTX-4xx-Erste-Fermi-Grafikkarte-auf-der-Cebit-gesichtet/Grafikkarte/News/
We have also learned that the smaller version of the Geforce GTX 4xx is supposed to cost about 300 to 400 Euros and the top model about 500 to 600 Euros.
 
[Image attachment: 20100228152358_NVidiaGTX390-1.jpg]

The info was pulled because nVidia didn't want certain things out; that's what was pulled.
 

That's a Quadro or Tesla card, not a desktop part; note the lack of DVI/VGA ports.

[Image attachment: Tesla_c1060_3qtr_low.png]


Also worth noting in my pic are the cross-headed screws holding the end plate on, or what some refer to as 'wooden screws', and that's a working model that is available to buy and use, by the way.
 