Actually, the R3xx and R4xx series support geometry instancing alongside their VS2.0b support. I use(d) it in Oblivion and Far Cry quite effectively on the MRX700.
I thought someone might mention that 😉
Since the 9700/9500 launched, ATI has supported GI; unfortunately they do not do so in the DX path. It's one of those features that has to be specifically enabled outside of DX (see the sketch below).
It's kind of like 3Dc. It's there... but will people support it? Some yes, some no.
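For the curious, that out-of-API enablement is (as I understand it) a FOURCC handshake with the driver. Here's a minimal D3D9 sketch, assuming the widely discussed 'INST' FOURCC; device creation is omitted and EnableAtiInstancing is my name, not anything official:

```cpp
// Sketch: enabling ATI's geometry instancing on SM2.0 hardware in D3D9,
// based on the publicly discussed FOURCC handshake. Treat as illustrative.
#include <d3d9.h>

bool EnableAtiInstancing(IDirect3D9* pD3D, IDirect3DDevice9* pDevice)
{
    // The driver advertises the hack as a bogus "surface format" check.
    HRESULT hr = pD3D->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        0, D3DRTYPE_SURFACE, (D3DFORMAT)MAKEFOURCC('I','N','S','T'));
    if (FAILED(hr))
        return false; // driver doesn't expose the back door

    // Writing the same FOURCC into an unrelated render state flips it on.
    pDevice->SetRenderState(D3DRS_POINTSIZE, MAKEFOURCC('I','N','S','T'));

    // From here the standard D3D9 instancing API works as usual, e.g.:
    // pDevice->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    // pDevice->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1);
    return true;
}
```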
Anyhow, you got me
😉
But we all know that nV is absolutely horrible at dynamic branching, so it doesn't matter much; like I mentioned, the performance gain is negated by the poor implementation.
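To make the branching point concrete, here's the kind of ps_3_0 early-out that's supposed to be the big SM3.0 win; the shader names and shadow-mask setup are made up for illustration. On hardware that only branches coherently across big pixel blocks, the "skipped" work mostly isn't skipped:

```cpp
// Sketch: a ps_3_0 dynamic branch that skips expensive lighting for pixels
// a shadow mask already rules out. Hypothetical names throughout.
#include <d3dx9.h>
#include <cstring>

static const char* g_psSource =
    "sampler ShadowMask : register(s0);                             \n"
    "sampler Diffuse    : register(s1);                             \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                     \n"
    "{                                                              \n"
    "    // Fetch before the branch to keep gradients out of        \n"
    "    // flow control.                                           \n"
    "    float4 albedo = tex2D(Diffuse, uv);                        \n"
    "    float  lit    = tex2D(ShadowMask, uv).r;                   \n"
    "    [branch]                                                   \n"
    "    if (lit < 0.01)                                            \n"
    "        return float4(0, 0, 0, 1); // skip the expensive path  \n"
    "    // ...expensive multi-light math would go here...          \n"
    "    return albedo * lit;                                       \n"
    "}                                                              \n";

bool CompileBranchyShader(LPD3DXBUFFER* ppByteCode)
{
    LPD3DXBUFFER pErrors = NULL;
    HRESULT hr = D3DXCompileShader(g_psSource, (UINT)strlen(g_psSource),
                                   NULL, NULL, "main", "ps_3_0",
                                   0, ppByteCode, &pErrors, NULL);
    if (pErrors) pErrors->Release();
    return SUCCEEDED(hr);
}
```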
Yeah, I agree. The only "useful" part of SM3.0 may come down the line, when a game requires it just to work (à la BF2 and PS1.4). A Radeon 8500 does not play BF2 well, but it does play it. It all depends on how long you have to keep your hardware.
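That kind of hard requirement is just a caps check at startup (a BF2-style gate would test D3DPS_VERSION(1,4) instead); something like this, with a hypothetical helper name:

```cpp
// Sketch: how a game hard-requires a shader model at startup in D3D9.
#include <d3d9.h>

bool MeetsShaderModel3(IDirect3D9* pD3D)
{
    D3DCAPS9 caps;
    if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0)
        && caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```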
That's the plan, but what about nV's "hybrid" G80? It'll be interesting to see if that's the first to do so or the last not to.
My guess is that NV's first DX10 GPU will be like all their other GPUs:
Heavy on checkboxes & light on performance.
My guess is the "hybrid" reference is to the Geometry Shaders and Vertex Shaders being unified, but with discrete Pixel Shaders.
Personally I see this as a stinky situation. NV's market tactic (which, btw, works... do well in current/older games, with "checkboxes" for the future) would indicate that they are gonna paralyze DX10 just like they did DX9. By not supporting FP24 and having dreadful DX9 SM2.0 performance, they prevented quick uptake of DX9. There is no reason we should have had to wait until 2006(!) for a DX9-baseline game.
I see the same thing happening with DX10. They are going to keep the dedicated shaders, and thus their peak Geometry Shader & Vertex Shader performance will be WAAAAAY behind ATI's (did I mention not even in the same ballpark?).
So devs have a choice: make software that runs poorly on NV hardware and alienate 50% of users (not likely), or work within the constraints of NV's architecture as usual. This would put ATI, theoretically, at a disadvantage because the flexibility of their hardware would be ignored (along with all those really neat vertex-heavy designs and radical uses of Geometry Shaders) and they would be shoehorned into designs best suited to dedicated shaders instead of universal unified shaders.
In a perfect world everyone would buy the best hardware (who knows, that could be G80, but past NV hardware makes me cautious)... but the fact is NV has a strong dev-rel department with TWIWMTBP. Getting official patches for ATI-specific features has been hard.
Perfect example: Oblivion. ATI was able to implement HDR+MSAA. Their hardware supports it; so where did Bethesda get the idea that their design was INCOMPATIBLE with the hardware?
Easy: Bethesda is a TWIWMTBP partner, and it looks really crappy for NV if their competitor gets a number of KEY IQ improvements (HQ AF, HDR+MSAA).
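And "incompatible" is the kind of claim the runtime can settle directly; D3D9 will tell you whether MSAA works on an FP16 target, which is exactly the HDR+MSAA combination in question. A sketch, hypothetical helper name:

```cpp
// Sketch: asking the D3D9 runtime whether 4x MSAA is supported on an
// FP16 (A16B16G16R16F) render target, i.e. the HDR+MSAA combo.
#include <d3d9.h>

bool SupportsHdrMsaa(IDirect3D9* pD3D)
{
    DWORD qualityLevels = 0;
    HRESULT hr = pD3D->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,         // FP16 HDR render-target format
        FALSE,                        // full-screen
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    return SUCCEEDED(hr);             // R5xx: yes; NV4x/G7x: no
}
```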
And I don't expect that to change much. The one "ace" ATI has now is the unified-shader part in the 360, and with XNA coming online they could see some advantage there. But then again, devs seem pretty slow getting a handle on the 360 as well...