NVIDIA possibly abandoning mid- and high-end market


randomizer

Champion
Moderator
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/

According to Charlie, the GTX 260, 275 and 285 are all EOL within weeks, if not already. Take it with a grain of salt, obviously. Also, expect NVIDIA PR to deny the whole thing, and for said PR response to show up on Fuddo.

IMO Charlie is milking the "success" of his previous article and using the opportunity to send NVIDIA's public image through the floor.

EDIT: Nice contradictory article right before it too: http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/ :lol:

I love Charlie, he certainly knows how to take the "professional" out of the journalist profession.
 


This.
 
Frankly, I'm going to say it: if anyone thinks NVIDIA doesn't have a working DX11 part, or that the G300 won't have DX11 support, then personally, I think they're insane. Heck, the only ones who keep pushing the rumor about dropping the high-end GPU market, or lacking DX11 support, are the people over at Fud...

Everything that I'm seeing points to a G300 launch in mid-December/early January, with DX11 support and ~5870 performance.

As for DX10.1 cards, considering DX11 is a superset of 10.1, I'm not shocked NVIDIA is going back and adding 10.1 to its low-end refreshes at this point. It's to be expected (as long as you assume they've already worked out DX11).
 



I'm insane.


Right now, I don't think they have a working DX11 part.


G300 will have DX11 support, though some of it will be done in software, and that will take time to develop; time they haven't had, since they only got working samples back very recently.



As for DX10.1 cards, considering DX11 is a superset of 10.1, I'm not shocked NVIDIA is going back and adding 10.1 to its low-end refreshes at this point. It's to be expected (as long as you assume they've already worked out DX11).

Why bring out 10.1 now? Why not introduce DX11 across the board? Develop a chip for a planned shelf life of... 3 months?
 
Yep, I agree.

What constitutes DX11 support? Doing in software what is supposed to be done in hardware? Then wouldn't a Parhelia card be 'DX11', just a very slow DX11 card? And if it needs to be sussed out in drivers (à la Intel GMA DX10 support), then at what point does it become DX11 even then? Not at launch, unless it does everything fully and properly at launch.
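
(Aside, to put "at what point does it become DX11" in concrete terms: here's a minimal sketch, using the public D3D11 device-creation API, of how an application asks the runtime for the highest feature level the driver will actually grant. The scaffolding and trimmed error handling are mine, but the call and the feature-level constants are the standard ones; a part whose driver never grants 11_0 is, as far as the API is concerned, not a DX11 card.)

```cpp
// Minimal sketch: ask the D3D11 runtime for the highest feature level the
// adapter/driver pair will commit to, listed from most to least capable.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11: tessellation stages, SM5, compute
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    granted;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                          // default adapter
        D3D_DRIVER_TYPE_HARDWARE,         // real hardware only, no WARP/reference
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &granted, &context);

    if (SUCCEEDED(hr) && granted == D3D_FEATURE_LEVEL_11_0) {
        // Genuine DX11 part: hull/domain shaders and SM5 are hardware-backed.
    }
    // Anything less, and the "DX11" on the box is marketing.

    if (context) context->Release();
    if (device)  device->Release();
    return 0;
}
```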

And that's exactly it. Why bother bringing out another DX10/DX10.1 part that cost another boatload to develop, instead of even doing DX11 'wrong'? A poor DX11 part that still does DX10/10.1 right brings more to the table than showing up to the DX10.1 party 3 years late, when everyone has already moved on to the after-party and the DX11 party. And it's not even their low end, it's their mid-range; even when Fermi does solidify into something more than vapour, it will still be too big a chip for the true low end.

They're stuck making the G80/92 in one form or another to stay in the game, not lead it; just trying not to get mercy-ruled by taking on DX10.1-ish support in the extra die space of a process shrink (apparently they couldn't think of anything better to do with it, since DX10.1 was supposedly so useless before).

OOooh, what a technical achievement; that makes me confident of their DX11 abilities, especially in light of their playing down DX11 and the rumours of no tessellator, after everyone, including G3K, criticized the HD2-4K tessellator. :heink:

Heck, even S3 beat them to DX10.1 almost 2 years ago. :pfff:


 
I really think nVidia should do some market studies to find out what people want from the company. Some marketing geniuses started giving the orders to engineers, and it all went downhill from there. Come on... supercomputers... that is 5% of their total market. They are so far off from reality that it will take some time for them to recover. They should have learned that small chips targeting the mainstream (like the G92) are the future: performance at an affordable price. I understand that there are some people who can afford to spend $500 on a graphics card every year, but the majority are not like that, and the revenue comes from the majority. A good-performing card should not be a premium product these days; that is why I went to ATI. If nVidia abandons the mainstream and gamers, then ATI will have no reason not to price its mid-level cards at $400, and then everybody loses.
 
If the G300 does not do tessellation on-chip, I can see the 5870 walking all over it in games that use heavy tessellation; it's one of the best things about DX11. So from my point of view the G300 won't be a DX11 card, more like a DX10.1 card with some extra bits added.
 
It does indeed seem like the story is true. I was going to buy a new system a few days ago, based around the GTX280. It turns out the biggest shop in this country (Portugal) where I usually shop did not have it, or the GTX275. I spent the last 2 days looking for either of those cards in every shop I could find, and everywhere I called I was told the same thing: nVidia was pulling all the mid- to high-range cards, and none of the shops carried those products anymore. I was rather stunned by that, because nVidia doesn't have anything to replace them with. I ended up going with an ATI-based system, which are now selling like hotcakes. Not a very bright move by nVidia.
Don't know how it's going in other countries, but over here ATI now has a monopoly.
 


At least this new monopoly will deliver a worthwhile product, so I am selling off a lot of my spare parts and a plasma so I can get a new ATI card that will actually hold its own.
 
Please, people: NVIDIA will have a part out by January.

As for 10.1, adding full DX11 requires a tessellator, which for low-end cards does nothing but drive up the price. Hence why the lower-end cards use 10.1 instead of 11. Makes sense from a budget point of view.
 



From a logical perspective, that is quite possibly the most f**ked up post I have ever seen here.


DX11 adds more transistors, so Nvidia has chosen not to do it for cost reasons, as they apparently believe they will see more monetary returns from an outdated part that will sell poorly than from an up-to-date part that consumers actually want.



Note that this is the same Nvidia that was perfectly happy to build Fermi, a GPU nearly twice the size of Cypress.



:lol:
 
This looks like it may be the R600 and the NV30 rolled together: one was late and underperforming, while the other was late, underperforming, and had to do some things in SW.
I ranted months ago about this and didn't want to see it happen, as it puts strains on the PC gaming scenario, and I believe it won't automatically hand the commanding lead over to ATI, or to game dev.
This again is another setback for PC gaming, as it's just another reason for devs to wait it out till the consoles refresh.
I don't care if LRB will do everything in SW with little or no fixed HW for most of this, and it seems nVidia wants to show it can do it too, which again leads to my problems with PC gaming and its dev.
I understand this is what the devs want to an extent, but moving too fast all at once only plays into the hands of slow dev, not better or faster dev, and possibly fractures any coherency we've had.
Once you wander away from a leading shared source, it causes these things, as we saw with the slow dev of DX10, and even going backwards on DX10.1.

If nVidia wants to be in the games, then this is the way this game is meant to be played, IMHLO.
 
Please, people: NVIDIA will have a part out by January.

They hope. JHH was seen holding a chip, not a card. Assuming the chip they taped out in late Sep doesn't need any changes, then January is possible. If anything needs to be changed, it will be later.

As for 10.1, adding full DX11 requires a tessellator, which for low-end cards does nothing but drive up the price. Hence why the lower-end cards use 10.1 instead of 11. Makes sense from a budget point of view.

Only if your chip needs to go on a diet. As mentioned, there is more to DX11 than just the Tess. There are also new shaders to program for, and SM5. AMD will have fully working DX11 parts in the low end, so why can't Nvidia? Why have your high-end parts "support" DX11 while limiting your mid/low end to DX10.1? That doesn't make sense.
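
(Again as an aside: a small illustrative helper in the same vein, assuming a device created as in the earlier sketch. The cap struct and the CheckFeatureSupport call are the public D3D11 API; the helper name is mine. The point is that on downlevel hardware even headline DX11 features like compute are optional caps, while at feature level 11_0 compute (CS 5.0) and the tessellation stages are mandatory, so a vendor can't quietly skip them and still claim 11_0.)

```cpp
#include <d3d11.h>

// Given an already-created device (see the earlier sketch), report whether
// the optional downlevel compute-shader cap is exposed. On a feature level
// 10.x device this may be true or false; at 11_0, CS 5.0 is guaranteed.
bool HasDownlevelCompute(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &opts, sizeof(opts))))
        return false;
    return opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != FALSE;
}
```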

Also keep in mind that because they only have chips right now, and not cards, we don't know how the drivers will be. IF they can launch a card in Jan, I wouldn't be surprised to see poor performance while they work out the driver issues. If they are going to handle DX11 in software, the drivers will be VERY critical. I don't see Nvidia solving their problems; it's the G200 all over again: a large, expensive chip that, while faster than AMD's offerings, won't be as good a value.
 
Well, here's their DX10.1 model coming soon:
[image]


It looks OK, but it's only DX10.1.
 
I mentioned a while back that I felt Nv were heading off in a different direction, and I have seen nothing in the last couple of months to do anything but reinforce that feeling. It might be a little too early to congratulate ATi on complete dominance of the gaming GPU market just yet, but if it happens soon, I wonder how long it will take for the bittersweetness of the victory to sink in?
 
I just can't believe how Nvidia has capitulated. Their new 40nm stuff is embarrassing; the 9800 GTX die-shrunk to 40nm is getting owned by the 55nm 4850? What the hell is going on there?

Something really, really bad is going on at Nv HQ. AMD needs the money more badly than Nvidia does; that's the only reason I'm celebrating. But if this continues, we really might end up with just ATI doing gaming graphics, which is something nobody wants.
 
The new GT220/GT210 are die-shrunk and cut-down versions of the G200, not the 9800 GTX.

And yes, they are more than embarrassing; these cards don't make any sense, and I don't know why they were released at all. They are going to lose money on them for sure if they want to sell them. Better to save the 40nm capacity for more Fermi chips than this crap.

I really hope they have some jokers up their sleeve, or I smell real trouble for them.
 
I wouldn't congratulate AMD yet; I still think they are far too well suited to FAQ this up more than anyone else. I would say it's optimistic at best that AMD will pass nV in market share, and that's against what is essentially a disjointed nV at the moment, with little attractive to bring to the table. AMD has been unable to execute in so many situations that seemed like no-brainers (Get in the Game, pricing, Linux support after saying they were committed). Of those, only Linux seems to have improved dramatically. And while GITG is talking about a handful of DX11 titles, given how much easier the transition to DX11 is compared with DX10 last time, I'm not as impressed, nor convinced that it has as much to do with AMD as with the devs themselves, who were already pointing in that direction for their own DX11 checkbox.

Where are the tech demos, too? WTF, nV had a ton at the G80 launch, and the best AMD did for DX10.1 was the ping-pong-ball-filled room and then the Goblins right before the launch of the HD5K, but nada for DX11 itself?

Their chance to carry this momentum into a risky new design next year (all new designs are risky) is something they should capitalize upon, but they don't seem prepared to take advantage of an R9700-style opportunity again; they seem very slow to react in the marketing department, where AMD does terribly.

nVidia is definitely not rolling over. Their G212 and similar parts will keep newer and cheaper parts in the market; they may be jokes to us, but the kiddies will buy them with mom & dad, and nV will market the crap out of them (almost literally), making them seem less sh1tty to the Holiday shopper.

Right now ATi should be able to recapture a lot of the market they lost to the long and successful run of the G80, but I doubt they will recoup even half of it, and I also doubt this means a split market either. If it were simply about having the best parts in a lot of the market segments, ATi should have pulled even with or surpassed nV last generation, but they didn't, not even close. They made gains, but still lost the battle. And it's even worse in the workstation market, where ATi finally offered incredible parts with massive perf/$ numbers, and it's still a slaughter there, because they do so little to promote them and people just go with what's comfortable (marketing over economics).

The only question to me is who makes how much profit during this period: are parts selling at enough of a premium for both that equal market share would equate to equal revenue/profitability/health, or does it cost AMD or nV significantly more to achieve parity?
 
The new GT220/GT210 are die-shrunk and cut-down versions of the G200, not the 9800 GTX.

Yes, I know; I meant the GT240.

According to Charlie (I know 😛) :-

http://translate.google.com/translate?prev=hp&hl=en&js=y&u=http%3A%2F%2Fvga.it168.com%2Fa2009%2F1009%2F754%2F000000754803.shtml&sl=zh-CN&tl=en&history_state0=

Full review on SA :- http://www.semiaccurate.com/2009/10/11/nvida-launches-uncompetitive-g210-and-gt220/