Nvidia: AMD's Lead in DirectX 11 is "Insignificant"

I feel the only way NVIDIA will pull a rabbit out of their hat is if they pull a good old ATI stunt (Radeon 9700 days): their new GPU would have to be twice as fast as the latest AMD GPU. Then it would be worth the wait. I do agree that three months is a long time to wait in the tech market, and I also agree that people tend to rush out and buy the latest and greatest cards. Not to mention most people are still running native 1680x1050, which almost all recent cards can handle. I'm not a fanboy of either company. I currently run an 8800 GTX, which serves me well, but I'm eyeing a 5850 right now for my new i7 920 build.
This is also freedom of speech, so let there be speech, and let's lay off ragging on individuals here, fanboy or not.
 
[citation][nom]Drag0nR1der[/nom]Well, I'd say they should be 'doubly' not worried by AMD given the complete lack of availability of AMD's high end dx 11 cards, the 5850 and 5870[/citation]
Check Newegg before opening mouth and inserting foot.

Both the 5850 and 5870 can be purchased from several brands; it's been that way for a couple of weeks now.
 
60 days late? If they release their cards in March, it will be more like 120 days. Articles like this are already downplaying how late they really are. Nvidia is a terribly shameless company; they need to take a hard look at how they are handling themselves. If I were part of Nvidia, I would be ashamed. Having followed their corporate strategy over the years, I can honestly say they have handled their PR like an egocentric teenager. It's pretty lame. Grow up, Nvidia. I could go on, page after page, about why their strategies are bad and won't win them anything in the long run.

Ion and Tegra are complete flops, their processor endeavors have yielded nothing, and they are behind on the money maker: GPUs. Not to mention their long-run design philosophy has them making less money on GPU sales, trying to dominate the market by building a pathetic monopoly around cheesy PhysX. Their 3D stuff is cheesy too. For DX11 I'm going ATI all the way. Sorry Nvidia, your hardware engineers have been behind for a while now, and now your software engineers are behind too. Obviously they can't admit how far down they are, or people wouldn't buy the stock, and the remaining loyalists would drift away if they were honest with the public. But time will tell, oh yes, Nvidia, time will tell, and eventually fluff and lies won't sell your video hardware.
 
Well, I'm sick of Nvidia. I sold my entire system and got a new one based on an AMD chipset, an AMD CPU, and an ATI card. And guess what, guys? If Nvidia ends up the only graphics chipset company left in the world, I will quit gaming! And I will quit my graphics studies. When your enemy says "insignificant," he means the opposite. F.E.A.R. is knocking on Nvidia's doorstep :)
 
[citation][nom]blacksins[/nom]Well, I'm sick of Nvidia. I sold my entire system and got a new one based on an AMD chipset, an AMD CPU, and an ATI card. And guess what, guys? If Nvidia ends up the only graphics chipset company left in the world, I will quit gaming! And I will quit my graphics studies. When your enemy says "insignificant," he means the opposite. F.E.A.R. is knocking on Nvidia's doorstep[/citation]

I call BS. Just sayin.

Truth is, Nvidia is needed; ATI will jack up their prices without competition. Same thing with AMD and Intel. AMD has the inferior processors at the top end, but I still support them because competition is needed. Right now I'm very happy with my ATI card, and I don't mind supporting their company; however, I don't want to see them as the only GPU manufacturer in the industry.
 
Bold words for a company that doesn't seem to be winning the pricing and performance wars within the GPU market.

I do have confidence, though, that the 300 series will be quite good. But nVidia tends to price cards a little too high.
 
[citation][nom]deathblooms2k1[/nom]I call BS. Just sayin. Truth is, Nvidia is needed; ATI will jack up their prices without competition. Same thing with AMD and Intel. AMD has the inferior processors at the top end, but I still support them because competition is needed. Right now I'm very happy with my ATI card, and I don't mind supporting their company; however, I don't want to see them as the only GPU manufacturer in the industry.[/citation]
Right, but I'm talking specifically about Nvidia's recent actions. Jacking up prices in the absence of competition is a whole other subject, and we both know we won't see a company like Nvidia run into the ground :)
 
[citation][nom]backin5[/nom]I guess Nvidia decided to stop manufacturing GPUs and chipsets so they can focus all their efforts on negative propaganda against the competition.[/citation]
This is my thought exactly. I always just went with whoever had the best technology, but Nvidia's arrogant attitude and total dismissiveness toward a legitimate and very solid product have shown me that they're not humble enough to just take one on the chin and move on like they should. For that, I have no problem supporting ATI from here on out... unless they start acting like Nvidia, that is. I have a 5850 and am loving it, btw.
 
[citation][nom]ripperko[/nom]Nvidia sounds really confident with what they got.[/citation]

I have high hopes for Fermi as well, but in case you haven't noticed, Jen-Hsun Huang would (will?) sound confident with a gun to his head. It's like an unwritten law of physics, predetermined at the moment of the universe's creation.
 
This is like the Red Sox saying they have better pitchers than the Yankees when, in all honesty, the Yankees won the division and the World Series in 2009. Next year the Yankees will buy a young gunner and kick ass in the playoffs while Boston stays home revamping its "line-ups". Bottom line: you're only as good as your latest achievement. Don't sit on it. Walk the walk, Intel and Sony (Nintendo still has a better track record than you in the gaming industry).
 
Wait, I forgot to mention I bought them a couple of 5770s each ... 😀 ... Nvidia had better make the world see the light in Fermi, otherwise it's doom time for the green team!
 
I can't stop chuckling at the Nvidia hatred in these threads. Half of these posts are make-believe confrontations, with people posting like Nvidia hangs on every teenager's forum opinion.

Don't forget, the PC gaming market sucks right now. Hardly anyone is making a big push to write your beloved DirectX titles while a single chipset company is making chips to utilize the hardware. For DirectX 11 to have any possibility of taking off, the biggest player has to be in the game. Until then, enjoy Dirt 2 at 40 fps.

Sorry kids, I'm not a 'fanboi'. I've migrated between brands as one tops the other a few times; it really comes down to who has the better card when it's time to upgrade. If AMD has a better CPU, I'll grab it; if Nvidia has a better card, sure, I'll grab that. I'm not one to go the budget route, so meh, my last build falls into the AMD/ATI fanboi rage category.
 
[citation][nom]cyberkuberiah[/nom]Wait, I forgot to mention I bought them a couple of 5770s each ... Nvidia had better make the world see the light in Fermi, otherwise it's doom time for the green team![/citation]

Yeah, they are totally going 'down'. They only single-handedly own the OEM market. If the world worked the way you kids think, AMD would have been a dead duck way before the X2.
 
Let's see if we can make an educated guess about Fermi's performance (for gamers) based on known facts.
1. Shaders: Fermi has 512 shader cores, giving it just a bit more than twice the shading power of the previous generation. In practice it should be in league with the GTX 295.
4. DX11: The original DX10 specification released to hardware manufacturers for implementation looked much more like the DX11 spec. However, NVIDIA stubbornly didn't support quite a few features in that set, whereas ATI implemented them all. With the largest maker of GPUs not supporting some features, MS stripped them out of DX10. So NVIDIA came out first with an amazing piece of engineering (the 8800 GTX), and ATI came late with the 2900 XT, a larger die supporting then-unusable features like tessellation. When DX10.1 tried to restore some of the original DX10, ATI was very quick to enable support for it, whereas NVIDIA is still having driver problems with DX10.1 cards.
So ATI has years of working experience with many new DX11 features, while NVIDIA is building them for the first time, alongside a compute-focused GPU. Hence DX11 implementation and driver support could be a bit patchy (maybe not; this is just speculation) on NVIDIA's side.
So overall the world gets the fastest single GPU, but not by a significant margin, and with higher power consumption. Then again, a dual-GPU version may not be feasible at all, as it would surely surpass the 300 W mark (PCIe slot 75 W + 6-pin 75 W + 8-pin 150 W), unless they implement it as a scaled-down version within the power limits; in that case it might land at the same level as the 5970, and the late arrival will be deadly for NV.
5. Time: This is going to be Fermi's biggest disadvantage. It comes late to the party, but without significantly higher single-GPU performance and with about the same dual-GPU performance. And not only is it late, it will launch closer to the Radeon 6000 series than to the Radeon 5000 series. Given the resources spent on Fermi and the much slower development cycle compared to ATI, NV might see the GPU market share reversed.
While the 9800 Pro beat the hell out of the FX 5000 series, NV caught up over the next two generations (X850 XT vs. 6800 Ultra, and X1950 XTX vs. 7900 Ultra, though the 1950 XTX was a tad faster). Then with the 8000 series NV came out way ahead, thanks to the fiasco that was the 2900 XT (which looked better on paper). ATI then changed its strategy (focusing on mainstream GPUs with smaller dies and more value for money). Though it looked like ATI had lost the game with the Radeon 3000 series, it came back with a bang in the 4000 series. Now, with the HD 5000 series, ATI's strategy looks to be the more sensible one. This may lead to another reversal in the GPU arena for another one to two years.
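
To put rough numbers on point 1 and on that 300 W ceiling, here's a quick back-of-envelope sketch. The Fermi 1500 MHz hot clock is a pure assumption (NVIDIA hasn't announced final clocks); the GT200 figures use the shipping shader clocks:

[code]
# Back-of-envelope peak single-precision FLOPS (point 1).
# Fermi's 1500 MHz hot clock is an ASSUMPTION; GT200 clocks are the
# shipping shader clocks. Fermi does FMA (2 FLOPs/clock/core), while
# GT200 was rated for MAD+MUL (3 FLOPs/clock/core).

def peak_gflops(cores, hot_clock_mhz, flops_per_clock):
    return cores * hot_clock_mhz * flops_per_clock / 1000.0

print(f"GTX 285: {peak_gflops(240, 1476, 3):6.0f} GFLOPS")  # ~1063
print(f"GTX 295: {peak_gflops(480, 1242, 3):6.0f} GFLOPS")  # ~1788
print(f"Fermi?:  {peak_gflops(512, 1500, 2):6.0f} GFLOPS")  # ~1536

# The dual-GPU power ceiling mentioned above:
print(f"Max board power: {75 + 75 + 150} W")  # slot + 6-pin + 8-pin
[/code]

On those assumed clocks a single Fermi lands in GTX 295 territory on paper, which is all this guess is meant to show.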
 
Man, what happened to my previous post? Where are points 2 and 3 that I posted? Well, here they are:
2. Memory bus: The 384-bit GDDR5 interface will give it a bandwidth advantage of about 50% over Cypress. However, as we know from overclocking the 5870 (it's been done here and also at Guru3D), these high-end graphics cards are not bandwidth-starved; shading power is more important. So in single-precision calculations (games and many HPC problems) the higher bandwidth is not much of an advantage. In DP arithmetic, however, it will come in really handy.
3. Die size: With a 3-billion-transistor budget and TSMC's 40 nm process, we are looking at a die size >485 mm2 (the HD 5870 is 334 mm2). With this transistor budget Fermi is almost an RV870 + RV770, but in reality we are likely seeing a real-world performance gain of only ~12-15% over RV870, because much of the budget is spent implementing high-performance DP and ECC. Though these features are really exciting from a GPGPU standpoint, hardcore gamers couldn't care less about features with no tangible benefit in games. So in terms of die size, NVIDIA is at the same huge disadvantage as in the previous generation: to maintain competitive pricing against ATI, it will make much less money from its bread-and-butter GPUs.
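
The ~50% in point 2 falls straight out of the bus widths if you assume both boards run GDDR5 at the same effective data rate. Reusing the HD 5870's 4.8 Gbps for Fermi is an assumption, since Fermi's memory clock hasn't been announced:

[code]
# Peak memory bandwidth (point 2). The 4.8 Gbps effective GDDR5 rate
# is the HD 5870's; applying it to Fermi is an ASSUMPTION.

def bandwidth_gbs(bus_bits, data_rate_gbps):
    # bus width in bits / 8 bits-per-byte * per-pin data rate
    return bus_bits / 8 * data_rate_gbps

cypress = bandwidth_gbs(256, 4.8)   # HD 5870 -> 153.6 GB/s
fermi   = bandwidth_gbs(384, 4.8)   # Fermi   -> 230.4 GB/s
print(f"Advantage: {fermi / cypress - 1:.0%}")  # -> 50%
[/code]

Change the assumed data rate and the absolute numbers move, but the 50% ratio is fixed by the 384-bit vs. 256-bit bus widths.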
 
DX10 was insignificant as well. The only thing I fear is that Fermi will be far too expensive, and perhaps obsolete by the time AMD/ATI release their new 28 nm R900 in November 2010. Relying on brand loyalty is not a smart move by Nvidia when the end user's experience is the same regardless of which vendor they buy from.
 
@geekrick - Interesting technological points. How do you see it on the business side of things? NV is fabless and doesn't have a CPU part. With things migrating toward on-die GPUs, NV could be locked out in the future (although Larrabee falling apart is a new wrinkle). It would seem NV can't position its parts as solely high-end gaming parts (an industry that's in the tank). I can easily see NV's urge to position its parts as computational tools with a gaming element rather than the other way around. Going toe to toe with AMD on specs that appeal mostly to gamers won't help them in the long run.

What say you?
 
They also didn't support DX10.1. I wonder if nVidia just thinks they're above the API...

A GeForce 3XX card better show up in the next month, otherwise I'll be joining the ATI camp as well.
 
BTW, is it just me, or is this website getting cluttered with more and more trashy ads lately? Oh wait... yes it is. And more broken code too. And articles that don't appear... wow.

Anyway, Nvidia's Fermi will also be a buttload more expensive. Guarantee it.
 
First of all, to everyone agreeing with Nvidia that 60 days is insignificant: Nvidia is exaggerating how quickly they will have their cards out. The 5000 series came out in September. Nvidia won't release their cards until the end of January at the earliest, and maybe not until March or April. That means ATI has a four- to seven-month lead.

Second of all, they are right that DX11 by itself isn't that big a deal, but they, along with the Nvidia fanboys, are acting like the cards are nothing but a DX11 gimmick. Even with DX11 completely out of the picture, ATI has every Nvidia card beat right now. They're using record-breaking technology that Nvidia can't compete with. Nvidia's GTX 295 flagship is rated at 1.7 teraflops, while ATI's flagship has hit nearly 5. The card is $100 more but roughly three times faster, and you call that a DX11 gimmick with a straight face? Give me a break.
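
A quick sanity check on those numbers, with assumed street prices (about $500 for the GTX 295 and $600 for the HD 5970, per the "$100 more" above) and vendor-rated single-precision peaks:

[code]
# Peak-GFLOPS-per-dollar comparison. Prices are ASSUMED street prices;
# TFLOPS are the vendor-rated single-precision peaks (HD 5970: 4.64,
# which the "nearly 5 teraflops" above rounds up).

cards = {"GTX 295": (1.79, 500), "HD 5970": (4.64, 600)}

for name, (tflops, price) in cards.items():
    print(f"{name}: {tflops:.2f} TFLOPS at ${price} "
          f"-> {tflops * 1000 / price:.1f} GFLOPS/$")
[/code]

Even on these rough numbers the ATI card delivers more than twice the rated GFLOPS per dollar, which is the point being made, even if "three times faster" is a stretch on paper.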

Let's also not forget that ATI isn't going to stand still and wait for Nvidia to catch up. AMD/ATI has a lot planned for the coming year, and by the time Nvidia releases the GT300s or whatever, AMD/ATI will have new technology ready, putting Nvidia a whole generation behind. Delaying your cards also hurts your fanbase. If they think people who like to stay on the bleeding edge will wait six months for their cards, or throw away a $300+ ATI card to upgrade to their stuff, they're dead wrong. I've been looking forward to Nvidia's cards and was hesitant to buy ATI, but now I own a 5850 and I'm never buying Nvidia again if this is how they do business.
 