NVIDIA possibly abandoning mid and high end market


randomizer

Champion
Moderator
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/

According to Charlie, the GTX 260, 275 and 285 are all EOL within weeks, if not already. Take it with a grain of salt, obviously. Also, expect NVIDIA PR to deny the whole thing, and for said PR response to show up on Fuddo.

IMO Charlie is milking the "success" of his previous article and using the opportunity to send NVIDIA's public image through the floor.

EDIT: Nice contradictory article right before it too: http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/ :lol:

I love Charlie, he certainly knows how to take the "professional" out of the journalist profession.
 
Seeing how woeful the performance of these cards is, btw... does anybody really expect Fermi to beat the 5870? That's assuming it even shows up within six months, which doesn't look likely to me.

5 months after ATI released the 4770 at 40nm, Nvidia cannot even make something half as good. That is *really* bad news for them.
 
I dunno why they would leave it, mouse. Yes, it's an ever-shrinking market, but there will always be room for discrete graphics.

I know TSMC's 40nm process was bad, but that was supposed to be fixed. Why are Nvidia's chips still so bad? Can you really see Nvidia going from these embarrassing GT 210-240s to Fermi within 2-3 months? I don't see how it's possible.
 
Come on guys, the G210/220 were known about for months; it was just really bad timing on release. It's not like they were dreamed up 3 months ago as a distraction.

NVIDIA isn't going anywhere. And I still see a January release of the G300 series. And yes, I expect better than 5870 performance, based mostly on the few white papers that have been released, as well as the more or less accepted specs.

Besides, which makes more sense: a release in the middle of October, or an early December release followed by a multi-million-dollar ad blitz? Also, last I checked, AMD still can't turn a profit; market share means nothing if they can't turn a profit at some point.

NVIDIA's GPGPU push makes sense, though, once you remember they lack a CPU. With Intel moving into the GPU market and AMD pushing for the GPU on the CPU, you quickly understand that NVIDIA's approach is one to cover its own butt should it suddenly need to broaden its horizons beyond GPUs.

Now, if NVIDIA doesn't release by mid-January, then you can start screaming "the sky is falling". But everyone knew ATI would be the first to release, and frankly, they missed our performance expectations. No major news here yet, as far as I'm concerned.
 
ATI missed on performance expectations? The 5870 is as fast as a GTX 295, a dual-GPU card released nine months ago. The 5850 is 20% faster than the GTX 285, a card released nine months ago. And this is on beta drivers.

Nvidia's best 40nm GPU to date cannot even beat ATI's 4770, launched months ago. What's more, the 4770 is a tiny GPU.

And you expect Fermi to turn all this around? I don't see how that is going to happen. Nvidia making this huge scratch-built GPU on 40nm looks more and more like a pipe dream.
 


Actually, the 5870 is much better than I thought it was going to be, and not just in terms of performance, which is why I went with the red team this time.


 

Unlike some Nvidia users, I have spent the last few months contemplating a PC world without Nvidia. Regardless of all the corporate infighting and the forum fans hating or loving them, it still comes down to the humans at the reins of the company and how they view things. It seems as though someone at the top feels that trying to compete with two rivals that can make both x86 and x64 CPUs, while Nvidia itself is denied a licence at every turn, is just too hard a fight to take on IF general-purpose GPU computing is indeed the way forward. So rather than fight a war on two fronts, better to concede the current war over gaming graphics to ATI and concentrate on the coming conflict that pitches the all-powerful GPU against that tired but constantly updated old dog called the CPU, which according to the Nvidia mantra of old has "well and truly had its day".

Thing is, if Nvidia walks away from the gaming scene by ceasing to make gaming cards, will ATI step up to fill in and work with the game developers, with all the "cost" that entails? Or will the console makers circle like vultures, picking off the developers willing to sign a "lock-in" contract? Gran Turismo on the PC, for instance: anyone got that?

CPUs have been gaining cores faster than software has been written to use them. Someone saw that a while back and thought, "How can I use that to my advantage, considering I'm never going to get my own CPU?" Now he may feel they've cracked it for him, and the company can be steered slowly away from its current course and onto a new heading, into fresher, uncharted waters as it were.
 

It's a gamble, but better to gamble on one front than two at this stage. The PR department would have us believe they have already been making forays into such fields, which, if successful, may save actual lives instead of just ending the virtual ones of some spawned AI.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/24145-gpu-technology-conference-nvidias-new-focus-changing-market.html
 
I have to say I agree with TGGA on this; I don't think Nvidia are going anywhere fast. A few generations ago ATI didn't have a decent high-end part, but they are still here and doing quite nicely at the moment.
The fact that there isn't a decent tech demo or game ready to show DX11 off to the fullest is something that bothers me too. It also seems, as TGGA said, that they are dropping the ball on a lot of opportunities/possibilities of late.
What will do it for me, and help me decide what they are playing at and what their intentions are regarding the idea that they would push decent-spec chips at the mid-range sector at a good price, will be where the 5770 and 5750 land on price/performance.
I fully understand that they will probably mark this generation up a bit, as there is no direct competition as far as DX11 performance parts go. But if they are expecting me to pay around £100 for a card that is roughly the equivalent of a 4870 but on a 128-bit bus, then they can think again; I would see that as just paying £100 for DX11.
I know it's all about personal circumstance, and people with 3000-series cards would probably jump at the chance, but I'm coming from a 4770, so I'd want 4890 performance before it was worth me upgrading. Even then, I'm still rather pissed that it's a 128-bit bus and not 192-bit as reported in some rumours.

Mactronix
 
Well, obviously, the middle ground has moved as the newer cards appear, and it seems nVidia has a top solution and a low-end solution. Even though the low end appears woeful, as has been said, marketing will move those units.
Word is that, due to nVidia's MRP, it guarantees its pricing to its partners, so selling off all current stock is important before they can drop pricing and maintain current profits; later shipments will have a lower MRP and will also be priced lower.
As for ATI's marketing, I couldn't agree more: they're slowly screwing the pooch here as each day goes by and aren't playing on this window of opportunity.

I don't think nVidia is going anywhere, but they are doing some major shuffling, as each company has two iterations they're attempting to sell, as witnessed by the 57xx series on down and these new nVidia cards; nVidia may even have three, to compete in those shifted mid-range levels, with a G200 shrink.
To abandon the market too abruptly doesn't make any sense at all, and they'll be here for sure; it's how they play within the market that may change, but time will tell.

As for the GPGPU territory, I think the G300 will surprise, and as Anand said, it'll be interesting to see how well nVidia does here, as they already claim good sales for it, and it's extremely lucrative, with many inroads into newer markets.
Time will tell here, so we need to wait some more, I guess, and hope ATI takes advantage, nVidia doesn't come in too late, and the GPGPU thing does well.
 

Have you learned nothing from John? The GTX295 is clearly better. :pt1cable:
 
If NVIDIA can get DX11 straight, I could see them simply excluding all the elements that make the chip good at scientific operations for the sake of the gaming market; sorta like trimming down a 12-foot Christmas tree that has one too many ornaments.
 
Imagine nVidia as a car dealership.

Customer: I'm looking for a new car

nVidia: Hey, check out this "puppy", it's Fermi.

Customer: That looks nice. When is it available?

nVidia: Oh, it's not, but when it is, you'll want it!

Customer: There's no engine in it, there's not even a steering wheel. Why is the door screwed on? Are you sure this is a car?

nVidia: Yes, positive. The real one is back behind that closed door that says "Super Secret", but it's [strike]not ready[/strike] being worked on.

Customer: Well, the guy down the street has the brand new 2010 models out now.

nVidia: You don't want them. That stuff they have isn't important; this is 2009. We do have these other brand-new "[strike]2007[/strike], I MEAN 2009" model cars available. I'm not sure how 2007 got there.

Customer: Ahhh..I don't think I'm done shopping around yet.

Maybe they should stick to GPUs
 
So it is basically a battle between AMD's and Intel's on-CPU GPU ambitions and Nvidia's desire for a dedicated graphics card? And if the GPU is on the CPU, that would also mean that every time you wanted a better GPU you would have to buy a new processor, even if what is traditionally known as the CPU isn't the bottleneck? I don't know how much I like the sound of that. Wouldn't that make CPUs a LOT bigger?

Do you think tech advancement is starting to slow down? I mean, how many more transistors can they fit on one of those devices? What is going to follow the transistor? Quantum computers? DNA/organic and synthetic hybrids?
 
Not really. Discrete graphics cards will always be far more powerful than fusion CPU/GPUs.

I believe Nvidia got bluffed into their current path by Intel, and abandoned what they knew best in favour of some unknown quantity in cGPU.

Both Intel and Nvidia are trying to break into areas where they suck. AMD are just doing what they do best, which is making CPUs and GPUs.

At least one of these companies is doomed to fail soon I think.
 
General-purpose processing on video cards, while it may not be Nvidia's strong suit, seems like an incredibly useful technology. Someone told me that the latest version of Nero utilizes it and took 1/16th of the time to transcode a video compared to doing it without; if you're curious what it normally would have taken, he said 8 hours. I don't know if that's true, as I haven't verified it myself, but being able to use the graphics processor in other applications definitely seems useful. Couldn't Nvidia just make "normal" graphics cards and give developers a way to write code that uses them, without going out of their way to make special cards?
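For what it's worth, that "way for developers to write code" already exists on ordinary GeForce cards in the form of CUDA. Here's a minimal sketch of the kind of per-pixel, data-parallel work a transcoder hands off to the GPU; the scale_pixels kernel and the brightness-scaling filter are made up purely for illustration and have nothing to do with Nero's actual code:

// Assumes the CUDA toolkit and any CUDA-capable GeForce (8-series or later).
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// Hypothetical per-pixel filter: scale 8-bit pixel values by a gain factor.
__global__ void scale_pixels(unsigned char* px, int n, float gain) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    if (i < n) {
        float v = px[i] * gain;
        px[i] = v > 255.0f ? 255 : (unsigned char)v; // clamp to 8 bits
    }
}

int main() {
    const int n = 1920 * 1080;                       // one frame of 8-bit luma
    std::vector<unsigned char> frame(n, 100);        // dummy frame data

    unsigned char* d_px = nullptr;
    cudaMalloc((void**)&d_px, n);
    cudaMemcpy(d_px, frame.data(), n, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_pixels<<<blocks, threads>>>(d_px, n, 1.2f); // millions of pixels in parallel

    cudaMemcpy(frame.data(), d_px, n, cudaMemcpyDeviceToHost);
    cudaFree(d_px);

    printf("first pixel after GPU pass: %d\n", frame[0]);
    return 0;
}

The point is just that no special hardware is needed: the same shader cores that render games chew through this sort of embarrassingly parallel loop, which is why transcoders can see such large speedups.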
 
8 hours? I used to use my single-core 3500+ with 1GB of RAM to transcode a ~3hr .avi file to DVD. It would take about 3 hours to do this, including burning it to DVD at 4x speed; actual transcode time was around 2.5 hours. My "new" setup is an E6600 with 4GB of RAM. It takes about an hour and 15 minutes to do the same thing; if you figure about 30 minutes for the burn, the new setup transcodes in about 45 minutes. Either he is transcoding something much larger than a 3hr .avi, his setup is worse than my old 3500+ with 1GB of RAM, or you are REALLY mistaken on the times. The best thing about getting the faster CPU is that ALL things are faster, not just my transcode times. This will be one problem Nvidia faces if it can only make some things faster.

At least one of these companies is doomed to fail soon I think.

If you look at how much money each of these companies makes, it's not Intel or Nvidia who's about to fail.