Nvidia Halts Chipset Development

Honestly, this is going to push me away from Intel CPUs. Here at work I've been buying AMD, because those boards come with integrated graphics that handles DVI and lets me run dual monitors (the second connection is VGA, though). I honestly think this may hurt Intel more than it helps.
 
I had a couple of NVIDIA chipsets for my AMD processors and they seemed to work great. Even though they now say they're not developing chipsets for AMD processors, it looks like they hadn't put much effort into AMD chipsets for a while anyway.
 
[citation][nom]mowston[/nom]With the new Lucid chips, I guess Intel said "We don't need no stinking SLI". And Nvidia can't afford to disable multi-GPU on Intel chipsets. I wonder if Nvidia will license SLI for use on AMD boards now though, or if AMD will use Lucid.[/citation]
The problem is, I believe Nvidia has licensed SLI to Intel, so forbidding Intel from making SLI-compatible mobos in the future would mean breaking some licensing contracts before term, which would mean even more losses on Nvidia's side... oh well... like nelson_nel said above, business is an unfair realm. The funny thing is that Intel seeks to destroy the very companies that make their CPUs worth something in an end-game PC: Nvidia and ATI (AMD). Wonder what would happen if Nvidia / ATI somehow pulled the plug on offering GPUs for Intel platforms.
 
This is bad news for the consumer. Never mind Intel, nVidia or AMD's business tactics; the result of all this will be that new mid-range laptops with Core processors will be lumbered with Intel integrated graphics instead of the far better nVidia options such as the 9400M.

This will also set back Core-powered Apple laptops by six to nine months across the line.
 
Honestly *shrugs* I'm no fan of Intel's practices, but really, it's nVidia's own fault they're now the proverbial man without a country: for years, nVidia has used their reputation and clout as THE Graphics Performance Leader to muscle, bully and strangle-grip the market. They got a little too big for their britches, though, burned a few strategically advantageous bridges and now, to quote the Floyd, it's a little "too late to lose the weight [they] used to need to throw around"...

Years of nVid smack-talk (remember all that "death of the CPU" hype?) has pricked Intel enough to attempt some serious direct competition. And you know what? It doesn't even matter if Larrabee is crap this gen, because now that Intel more or less has the Green team in their sights, nVidia is seein' MENE MENE TEKEL UPHARSIN scrawled in a large, aggressive-lookin' font on the wall: they recognize that, without an x86 and their own production arm capable of putting out a decent CPU, they are effectively OVER in this market. Which is why they're putting SO MUCH EFFORT, now, into promoting GPGPU chic--trying SO HARD to make us realize how "relevant" the multi-use potential of their cards is; to FORCE us to see how useful their cards are in the General Processing arena...
 
I read a story yesterday that said nvidia is dropping the 260, 275 and 285 this month. The GTX 295 will soon be dropped as well, according to the article. It looks like nvidia is going the way of 3Dfx.

http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/
 
chaohsiangchen: I think building an all-AMD HTPC was a far better choice anyways... LGA1156 + IGP just doesn't make a lot of sense, since there isn't (supposed to be) a northbridge there...
 
They are getting ready for Larrabee, and they've just invested quite a bit into their own integrated graphics. Next quarter, Intel integrated graphics will even be built on-chip next to their CPUs. I'd say Intel knows exactly what they're doing: they are eliminating their competition on the graphics front.
 
Anyone with common sense understands that competition is the best consumer advocate. Once Intel kills its competition, there will be no incentive to provide attractive pricing or to keep innovating.

Rick
 
You better pray Nvidia stays around, unless you just WANT to pay over $2000 for any video card, since AMD/ATI would obviously own the market.
 
Reply to ohim's quote: "Wonder what would happen if nvidia / ati would pull the plug in offering GPUs for intel platforms somehow." Nvidia might be able to pull this off, but I believe AMD has a cross-licensing agreement with Intel which grants Intel permission to use any of AMD's technology licenses, and the same with Intel's technology. I'm also surprised Nvidia won't make AMD chipsets anymore; it seems like Nvidia would want all the revenue it could get in this tough economy.
 
This is a potentially unfortunate development, depending on whether Nvidia is able to work out some sort of licensing agreement for the current and future Intel/AMD CPU architectures. Love them or hate them, Nvidia does benefit the consumer, both directly and indirectly, not only with their products, but more importantly by providing strong competition in multiple market segments.
I've been a PC tech for a number of years, and have used Intel/AMD/Nvidia chipset solutions in different builds all the way from enterprise server level to entry-level desktop hardware configs. I truly believe that without Nvidia as a major competitor in the market, my customers would be paying more for inferior products. Competition drives prices down, certainly, but historically it has also been an arms race of sorts in driving new technology.
From a personal consumer standpoint, market aside, I feel somewhat indifferent towards this news. My engineers that I support have had many issues with Nvidia chips (integrated and discrete solutions), as have I. However, I'm currently very happy with my 9800 in my desktop at home.
Pure speculation: I use a MacBook daily as my main personal computer (last thing I want to do when I get home is support a PC 🙂 ) and wonder what effect this could have on future Apple product development. Apple traditionally uses an IGP chipset, and moving forward, could they eventually switch away from Intel/Nvidia and adopt a pure AMD solution? I would imagine this could drive the Apple tax slightly southward...maybe!
 
Nvidia is just getting all kinds of crap recently... first someone saying they're dumping the industry (which I don't believe), then the GT3xx series coming after the holiday season in Q1 or Q2... and the debacle with TSMC and its low yields... and it all started back with the 86xx series crapping out in cases with crappy airflow... This seems to be the result of either REALLY BAD LUCK or just some poor decisions...
 
[citation][nom]XZaapryca[/nom]I read a story yesterday that said nvidia is dropping the 260, 275 and 285 this month. The GTX 295 will soon be dropped as well, according to the article. It looks like nvidia is going the way of 3Dfx.http://www.semiaccurate.com/2009/1 [...] nd-market/[/citation]


Nvidia has responded to that article... head over to HardOCP to get the scoop on that.
 
Most of the folks complaining about Nvidia chipsets don't buy the right boards for their project or simply don't have a clue how to set them up. Say you purchase EVGA's 3-way SLI Classified board... I hope you know what you're doing. That board is for overclocking, and you need to be familiar with adjusting BIOS settings. If you're not going to overclock or run SLI, then get some other board.

I found this out the hard way a while back and thought something was wrong with the older chipsets until I educated myself. Now that I'm familiar with them, I've been building nothing but Nvidia / Intel setups ever since. Just built a Core i7 Extreme / Quad SLI combo rig a week ago and it's the best PC I've built yet. So stop blaming the product and "learn yo'sef" something.

As for PhysX, I don't see what the problem is with it not being compatible with ATI / AMD hardware. Nvidia owns the rights and manufactures accordingly. Complaining about that is like sending Microsoft a nasty letter demanding they support PS3 games on the 360... that's just silly and outright laughable.

Intel is stepping on its own d*** here. An Intel / Nvidia team-up would make more business sense. The only reason I can fathom for Intel's lawsuit is if they intend to make their own graphics cards and compete directly with Nvidia... but let's face it, Intel's graphics tech is a joke compared to Nvidia's and would likely never be able to compete on a level playing field. Hence Intel's lawsuit. Intel will almost certainly lose on the legal field, but the damage to Nvidia is already being done, and that seems to be Intel's goal. A stupid, self-centered, and self-destructive goal at that.
 
I like my Intel products and all, and how well they run... but after reading about their business tactics, I dunno; a product is a product to me, but it is good to have competition. It's also odd how Microsoft strong-arms Apple so much, when if they put Apple out of business the government would probably do something to kill off Microsoft, because it'd be a true monopoly at that point.
 