NVidia and DX11

There are rumors starting to emerge that nVidia currently won't have any high-end DX11 solution.
This is all speculation, and we'll know how it turns out when they release their cards.
The rumor goes as follows:
Since they won't have a high-end DX11 card, they'll only be releasing low-to-mid-end solutions at very low price points.
This, in effect, will discourage devs from building DX11 paths into new games, as the low-end cards obviously won't help much, and even ATI's high end may not do a lot, as is often seen with new DX releases.
These rumors come from the East, but from nVidia people, not ATI people.
I hope these rumors are nothing more than that: just rumors.
I'm not trying to flame, start wars, or anything of the kind, but given how nVidia has acted in the past when their cards weren't ready (a la the original DX10), and more recently (the removal of DX10.1 from Assassin's Creed, and the general shunning of DX10.1, which has been out for quite some time), I find these rumors plausible. Thoughts?
 

darkvine

I highly doubt they will do this, for one major reason. They missed out on DX10.1, and while there wasn't a huge number of games for it (mostly due to the fact that DX11 was around the corner), they were still seen as 'behind the times' in some way. If they don't step up, they will lose the people who are undecided on who to go with, because ATI will look more current and future-proof.

Also, they won't discourage too many devs, simply because the all-powerful Microsoft is laying down the law for DX11; they need it to boost sales of Windows 7, and they need content for that. Even Nvidia doesn't have the power to sway the pull devs will feel from Windows, because devs know that if they don't come out with content, people will think it's a dud and not buy it later on.

ATI is catching up in sales and in card performance; Nvidia can't afford to give people another excuse to get ATI cards, and DX11 is a big one in the hardcore market. As we all know, people like us want the latest and greatest in our cases.
 
Problem with that is, the all-powerful M$ relented originally on DX10, hence the need for DX10.1 to begin with, as nVidia's cards weren't capable. And still aren't. They pulled back some of the original DX10 spec because of nVidia's lack of hardware ability.
And we all saw how M$ touted DX10 and Vista together.
Also, there's never been such a slow uptake of a new DX model before this happened, and it went hand in hand with them removing the best performance-enhancing ability of DX10: the now-DX10.1 one-less-pass option when using 4xAA (rough sketch of what I mean below).
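For anyone wondering what that "one less pass" actually refers to: the usual explanation is that DX10.1 lets you bind a multisampled depth buffer as a shader resource and read its samples directly, instead of laying depth down again in a separate pass just for the AA/post-processing work. Here's a rough, hypothetical sketch of that setup in Direct3D 10.1 terms (my own illustration, not anything from the links; error handling omitted, and `device` is assumed to be a valid ID3D10Device1):

// Sketch: a 4xMSAA depth buffer that can also be read in shaders,
// which is the D3D10.1-level capability usually credited with saving a pass.
// Assumes `device` is a valid ID3D10Device1*; error handling omitted.
#include <d3d10_1.h>

void CreateReadableMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                             ID3D10Texture2D** tex,
                             ID3D10DepthStencilView** dsv,
                             ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R24G8_TYPELESS;  // typeless so it can be viewed two ways
    desc.SampleDesc.Count = 4;                            // 4xAA
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    device->CreateTexture2D(&desc, nullptr, tex);

    // Depth-stencil view used while rendering the scene with MSAA.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
    dsvDesc.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvDesc.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    device->CreateDepthStencilView(*tex, &dsvDesc, dsv);

    // Shader resource view so later passes can read the same depth samples
    // directly, rather than re-rendering depth in an extra pass.
    D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    device->CreateShaderResourceView(*tex, &srvDesc, srv);
}

As I understand it, 10.0-level hardware can't create that combination of bind flags with MSAA enabled, which is why the fallback was an extra depth pass; take the sketch as an illustration of the idea, not a definitive implementation.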
 

darkvine



Yeah, but since then ATI has had a jump start. If Nvidia says no now, Microsoft will simply say "to hell with you, these other guys are already on board and making them." They have too much riding on DX11 not to get hardware support from someone, even if it means relying on ATI to deliver an amazing lineup.
 
http://bbs.chiphell.com/viewthread.php?tid=51645&extra=page%3D1
http://forum.beyond3d.com/showthread.php?p=1320832#post1320832
http://forum.beyond3d.com/showthread.php?p=1320818#post1320818
http://www.anandtech.com/showdoc.aspx?i=3320&p=8
http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimized-for-vista-does-not-mean-dx10.aspx
Above are thoughts, decisions made, actions taken, and other evidence and ideas of how things have been, and possibly where they're heading. I hope not.
But each link shows things that point to manipulation one way or another, which I'm sure is feeding this current rumor.
 
In Taylor's blog, it was clear that the ATI shaders were better at doing DX10 (as it stood at the time, before they removed the DX10.1 part) than nVidia's, so they ended up taking the lowest path, thus hurting ATI.
It was plain to see: without AA, the 2900 kept up pretty well with the G80; once you applied AA, however, the perf nosedived, since the one-less-pass option wasn't included.
So, yes, you could say M$ this time is going to tell nVidia it's either sink or swim, but what about the game devs?
Look at my link regarding Batman: Arkham Asylum. There, they've convinced the devs not to include AA at all for ATI cards, not to mention the recent elimination of PhysX running on anything other than an nVidia card, which also includes the Ageia cards.

There have been tons of rumors about the G300 simply not being able to hit the DX11/W7 release timing, and possibly not even making the end of the year. How are the game devs going to model their games with only mid-class compliant HW from nVidia? Is it M$ all over again?
How much attention or word has come from nVidia regarding DX11?
I can give the SIGGRAPH paper links, where everything from GPGPU to DX11 and LRB was discussed, with everything pointing towards DX11, even on consoles, but again, where is nVidia's push?
http://developer.nvidia.com/object/siggraph-2009.html
Here are nVidia's SIGGRAPH papers. I've looked on that page and couldn't find DX11 mentioned on it at all.
 

Annisman

Just wondering, is there no AA at all for ATI cards in Batman: Arkham Asylum? I saw in the demo that only Nvidia cards could do it, but will this be a final release decision? That would be too low.
 
I'm sure ATI will have them put it in; if not, I agree.
But the point is, why wasn't it there to begin with? Has anyone ever complained about ATI-sponsored games where nVidia cards can't use full functions? No? Then why here?

My links show how ATI typically advises game devs: they suggest DX10 compliance for games, then later may suggest DX10.1 abilities as well.
nVidia and the Batman: Arkham devs use only the nVidia path, which can even be worked around by renaming an ATI card so it reports itself under the nVidia name.
To me at least, even if it does get "fixed", they've already sunk that low, both nVidia and the game devs.
I'm sorry, I am coming down hard on nVidia here, but to be totally honest, I'm sick of this backwards cr@p, and I actually mean it when I say I'm hoping for an early G300 surprise, so maybe both ATI and nVidia can work in tandem getting DX11 off the ground.
The earlier we see it, the better our games will be, and PC gaming needs the boost. Even though you may be in business just for the money, and you may be in heavy competition, keeping the larger picture in focus should matter to everyone, including ATI and nVidia.
 
And how exactly was DX10 transformed? Did the original have exactly the same coding? Or was the coding changed because of the later adoption? If you have links, I'd love to see them; maybe I'm wrong, but it seems feasible to me.
Also, you have to take timing into account. What HW changes were made by ATI, scrambling with the release of the 2900, which ran hot on TSMC's 80nm and couldn't reach its projected clocks, while their HW resolve for the one-less-pass approach was changed and only minor tweaks could be adopted, which in the end may have eliminated the one-less-pass option as well?
Lots to consider here, especially taking into account all the scenarios.
 
"So now let’s discuss how our DX10 plan has evolved.



In Aug and Sept, as we lacked DX10 hardware at all (much less production quality hw) we realized we couldn’t simul-ship with Vista in Jan 2007 since we believed we needed at least 6 months with production quality hw. Getting early hw in Oct, and production hw at the G80 launch in early Nov meant at a minimum May 2007for release of DX10 support; given a 6 month schedule and a perfect world.



However, as FSX Launch occurred Oct 17 and we began to get feedback from the community we realized we needed a service pack to address performance issues and a few other issues. So we started a dialog with the community and made a commitment to delivering a service release. The work on SP1 and DX10 is being performed by the same team of people (the Graphics and Terrain team) and thus delivering SP1 has delayed DX10.



Given the state of the NV drivers for the G80 and that ATI hasn’t released their hw yet; it’s hard to see how this is really a bad plan. We really want to see final ATI hw and production quality NV and ATI drivers before we ship our DX10 support. Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."
http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimized-for-vista-does-not-mean-dx10.aspx
So, according to the DX10 devs at M$, the 2900 did AA better than nVidia; what happened?



PS: This is also the belief of our missing fellow-at-large, TGGA.
 

Annisman

Btw, I would like to point out (although a bit off-topic) that games such as Batman are incredibly affected by Nvidia and the 'The Way It's Meant To Be Played' slogan. Not only is Nvidia PhysX touted, there are literally things missing if you don't turn PhysX on (i.e., don't own an Nvidia card), some of which have nothing to do with physics at all. For example, with PhysX off there are no hanging banners in some rooms at all, not even static ones. With PhysX on, there are banners hanging up and 'moving realistically'.

PhysX is great and all for those who can use it, but why downgrade the overall game for those who don't own Nvidia?

This was just one example; there are many others in Batman and other TWIMTBP games out there.
 
Without AA, the 2900 held its own.
And read what I posted: the same team was working on SP1, which had to be changed because the HW wasn't ready. So they took the lesser path, which worked on nVidia cards but hampered ATI cards; in the interim, the 2900 was already too hard-baked, running behind on a crappy, leaky node, and had its original design shifted right out from under it by this change in DX. Adding it later wasn't the same thing, and/or AMD was caught off guard too late in the process to make further changes. That's why I asked for links, as this is the common belief as to what happened, and I'm surprised you didn't know this.
Look up the latest 2900 vs. G80 tests with no AA; you'll see what I mean.
 
"Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."
What else? The shader model was delayed as to how it was to be done, and later they couldn't recover this "more performant" behavior after SP1 arrived. They had it before the change.
 
Sorry SS, can't find it. Anyway, that's what the folks think; TGGA believes it, and so do I.
But back on topic: if this comes to pass, it's a kind of leveraging I'm just no longer capable of being OK with, at all.
I hate fanboyism, and yes, I've always admitted to preferring the red over the green, but I've bought both and enjoyed both. Still, if this does happen, I'd hate to become something I hate; of course, soon there's always Intel, heheh.
 


Actually, DX10 uptake has been surprisingly fast when you consider both the XP factor and how long it took to break DX9 in...
 


In the Batman case, I haven't played the demo, but I would assume those banners (even static ones) are affected in some way by the PhysX API. And again, nothing is preventing ATI from adopting PhysX. And once you consider how PhysX has already been ported to all the consoles, only ATI is preventing a unified physics API from being a reality.

I find it odd: Now PhysX is actually being used properly, and people are complaining? Go figure; that API can't seem to win...
 

rangers

This must have something to do with TSMC's leaky process. They must be having a hard time getting such a big chip out the door on such a leaky process. Well, this seems like good news for ATI, and they need it more than Nvidia.
 

darkvine





They had PhysX and it was shut down. Then the ability to use a PhysX card alongside an ATI card was shut down. ATI users already adopted it, but it's Nvidia that is not allowing it.