NVIDIA possibly abandoning mid and high end market


randomizer

Champion
Moderator
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/

According to Charlie, the GTX 260, 275 and 285 are all EOL within weeks, if not already. Take it with a grain of salt, obviously. Also, expect NVIDIA PR to deny the whole thing, and for said PR response to show up on Fuddo.

IMO Charlie is milking the "success" of his previous article and using the opportunity to send NVIDIA's public image through the floor.

EDIT: Nice contradictory article right before it too: http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/ :lol:

I love Charlie, he certainly knows how to take the "professional" out of the journalist profession.
 



You know what reprise means, right?
 
Do you mean the cards or the company?

This is the problem any company faces if it tries to leave its core business. NVIDIA started losing its chipset business when AMD merged with ATI. Intel told them well before the i7 that they wouldn't be able to make chipsets for its new CPUs. They also knew Intel would start making Larrabee, so selling video cards would get harder as well. With no chipsets to sell, no CPU to sell, and a new major player in the GPU market, it's no wonder they are pushing GPGPU so much.

The problem I see is that they need the "GP" part to perform at least as well as the i5, and the GPU part to be able to keep up with what "ATI" has out. They then need to hope that either Larrabee fails, or that they can do what LBee can but cheaper.
 

And following that statement is this one
"Nvidia's launch of the Fermi architecture was a surprise not because of what it offered, but because of what was excluded: PC graphics. "
which is somewhat condemning, right?
 
http://www.fudzilla.com/content/view/15882/1/

Fud reckons the 5870s are selling; wonder how close that is to the truth.

Charlie also had a go at AMD for 'sucking up to Microsoft', as somebody had fed him a line that the 5x series was going to be named the 7x series after Windows 7.

It's not really about who he likes so much as who he dislikes. Nvidia is clearly at the top of the list, with Microsoft next. I believe he doesn't much like HP either. He isn't an AMD fanboy; he likes Intel too much to be one (and I've seen him have a go at Intel recently too).
 
Did Nvidia ever sell more than a handful of chipsets anyway, at least outside of laptops? The few people I know who bought PCs using their chipsets had so many problems that they'd decided they'd never buy one again.
 
One shouldn't judge the credibility of an article from the preferences of its author; one should judge the objectivity of the author from the logic behind the article.
And what Charlie says is nonsense.
We've known ever since the 4xxx series was released that NVIDIA had to lower the prices of their GTX 2xx series and sold them at very minimal profit margins, and as soon as 4xxx prices started to drop, NVIDIA was losing money on every chip they sold. This has been the case for about half a year now, and NVIDIA knew exactly what position they were in. Right now the 5xxx is out and sure, it puts more pressure on NVIDIA, but of course NVIDIA was expecting this: they were expecting the 5xxx in Sep '09, they were expecting it to perform around its current level, and of course they knew it would not cost much. So now, all of a sudden, NVIDIA is surprised by the 5xxx and the position they're in?
Given that Fermi is 2 to 4 months away, there's little reason why they can't hold on with the current chips. EOLing them makes sense from a cost/benefit point of view, but from a marketing and PR point of view it's impossible for them to let AMD be the only available option. Put yourself in the shoes of a casual buyer who knows very little about GPUs and is about to buy one, only to find that only AMD is available, and who later finds out that his Radeon satisfies him very well (as with most GPUs right now). Think of how much PR NVIDIA will need later to overcome this and you'll understand that there's no way they're doing this. And no, those cumulative losses won't kill them; my guess is that they're big enough to survive another 4 months of losses, but they can't go on like this for long, and it would be silly to think NVIDIA isn't aware of all this.
 
What I find interesting is all the comments a while ago about TSMC's 40nm process and who'd be out of the gate first.
People left and right were saying nVidia would be on it right away, and they'd have DX11 and be killing ATI.
What happened?
If not for the G92, nVidia would be about belly up at this point; that's fact, not guesswork.
Their lag in improvements, and an arch that can't scale properly, have left them with only the G92 as a money maker, and I'd point out that the G92 is now mid and low end only, while their high end suffers on profits.
So we hear news that nVidia is only now getting their 40nm parts out the door, parts which aren't even DX11 and are still G92 based, while their new arch is only a hollow piece of plastic that supposedly scales down, has 3 billion transistors and, according to them, will be cheap and affordable. But where are the cards?
Then we hear that all cards, including ATI's, are in short supply, and that supposedly nVidia and ATI are working together to engineer a shortage and prop up card prices? When, as Ape points out, it could just as well be the sellers themselves, and both nVidia and ATI are in transition here, which is fact on ATI's side and rumour and speculation on nVidia's.

Reading more into this than what's already known isn't wise at this point; it's best to consider what's best for nVidia and ATI, think about what they'd do at this point, and draw conclusions from there.

The high end is what's NOT at stake for either company here, but nVidia's insistence on harping on its GPGPU livelihood while excluding DX11 doesn't help either.

As for the mids? nVidia has yet to show us a product, even a piece of hollow plastic, or even a hint at the follow-up of a complete gfx card lineup, which is severely needed. All those who go on and on about the TWIMTBP program need to hope nVidia is at least as successful at reaching its fans and letting them know that yes, mid cards will be available, and DX11 will be their high point, along with CUDA, PhysX etc. Because the G92 is dying, and without a scalable G200 situation for the mid end of the market, nVidia has shown us nothing, nothing at all. And it isn't business as usual, since there was NO mid-end card from the G200 arch, unlike in the past.
 
Zen911, you failed to mention that Nvidia doesn't seem to have changed its strategy with the GT300 in terms of die size etc.
Of course Nvidia is aware of this... that's why they need to dump the GT200 and wait it out for the GT300, where they may actually make a marginal profit once again rather than a huge loss.

Trouble is, by the sounds of it, it's more of a jack of all trades than a dedicated gaming card, so I personally doubt it will be faster than a 5890, and it will cost a hell of a lot more. Hence the two choices Nvidia has are poor sales or poor margins; either way it spells another poor financial round for Nvidia, which can't go on much longer before they run into SERIOUS issues.

 


It's almost embarrassing though, MM. How can Nvidia be so far behind now? It's incredible to think how far they have fallen.
 

AMD are bringing out an MCM after years of preaching "monolithicism." It's not unusual for a company to go back on their previous ideas.
 
It wasn't too bad technically to rebadge the 8 series to the 9 series. They both supported the same things: same DX level, same SM level, etc. The problem Nvidia will have this time is that people want to move on to DX11, which Nvidia's old G92/G200 doesn't support. Perhaps they can tweak it to support DX10.1 (and yes, I do find that funny), but that's NOT DX11. From the company that bragged about supporting DX9 across the entire FX series, and about supporting SM3 across the entire 6 series, they now all of a sudden can't put out midrange DX11 cards? I'm sorry, but even if the G300 turns out OK, if they can't put out mid/low end DX11 cards, I'm of the opinion that these are days as dark for Nvidia as the FX/5xxx days.
 


While true, it's not really the same, is it?

12-core CPUs are required to bring AMD back to parity with Intel, even if only for a short time.

DX10.1 is like... really, who cares, and what does it achieve?

I mean, Nvidia didn't care about DX10.1 when it was actually worthwhile coding for, so why should they care now that DX11 has totally swallowed it up? The answer is that Nvidia can't go to DX11 until they have working DX10.1 on 40nm. That's what it looks like to me, and if true it's a disaster for Nvidia; it probably puts them a year behind.
 
Well, yes, I suppose you are correct in that regard. Nvidia downtalked DX10.1, AMD downtalked MCM.

Thing is, AMD are doing it because they're forced to, and it should bring them back to parity. Nvidia's DX10.1 is about as late as it can be, nobody cares about it (due to their own apathy towards it), and even with it they're still going to be what appears to be light years behind.

I can't think of any good reason why they are so slow, or why they are doing it now. If anyone has the reason, please share; it's got me baffled.
 
I believe the "smaller but fast enough" approach used by ATI wasn't seen by nVidia and came out of the blue.
I really don't think nVidia believed the R600 could eventually become the R700, or the R800 for that matter, so no, nVidia didn't see this coming.
When you start an arch, it begins 3 years out, not only 1, and that's how long the G600 arch has been around. Since then, nVidia has sat on the G80/92 and then planned to come in with the knockout blow, the G200, thinking they had a big enough lead to let them spend a lot of compute power on their GPGPU approach, which is needed anyway, so they had to take it. But that played right into the hands of ATI's strategy, unfortunately for nVidia, who, again, couldn't have possibly seen this coming. The G300 only furthers this scenario, which may backfire further on nVidia, and if it does, and there's no scaling, and LRB wins the GPGPU wars and takes a chunk of the discrete market, which it will to some extent even if it's a dog, nVidia could be in serious, serious trouble.
That's why I said I'm hoping for a surprise from them as well, though again, not too big a surprise. And remember, all fanboys: whatever inroads nVidia makes on GPGPU usage/abilities, ATI won't be far behind.
 


From an end-user and development perspective, how is it the same at all?

MCM is a design choice, not something that must be coded for and hampers software implementation or development the way nV's DX10.1 actions did.
 



Hmmm....

Yes, I see what you're saying.

However, they are making substantial modifications to the links between the modules to prevent the horrible memory bottlenecking that was evident in the Kentsfield/Yorkfield/Bloomfield/Clovertown/Harpertown series of quad-cores.



Intel's implementation of the MCM was pretty woeful from a memory architecture perspective.
 
