Nvidia Graphics Chip Market Share Nosedives

Status
Not open for further replies.

pelov

Distinguished
Jan 6, 2011
423
0
18,810
[citation][nom]NuclearShadow[/nom]Which they no doubt left for a reason and must be a good one at that. Go ahead and sell as I buy.[/citation]

They don't make a lot off of Tegra. The chips are cheap to produce and have to remain cheap to be sold in such quantity, never mind the insane competition in ARM nowadays. And their GPUs' price/performance is coming in second best to AMD's.

Project Denver and the new line of GPUs had better be absolutely mind-blowing; otherwise this trend will continue.
 

Tomtompiper

Distinguished
Jan 20, 2010
382
0
18,780
[citation][nom]NuclearShadow[/nom]Which they no doubt left for a reason and must be a good one at that. Go ahead and sell as I buy.[/citation]

It was a very good reason: they could not compete. And in the tablet space they face stiff competition from PowerVR, whose tile-based architecture is faster and more power-efficient than the traditional z-buffered approach preferred by Nvidia and AMD. At least they have an ARM license to fall back on, so they have somewhere to retreat to when Bulldozer arrives.
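The tile-based versus z-buffered distinction can be sketched in a toy model. Everything here is illustrative (axis-aligned boxes standing in for triangles, a made-up tile size, write counting as a proxy for memory bandwidth), not how PowerVR or Nvidia hardware actually works; the point is just that binning geometry into tiles and resolving each tile once tends to produce fewer external-memory writes than depth-testing every primitive directly against the framebuffer.

```python
TILE = 16  # tile-based GPUs resolve small screen tiles in fast on-chip memory

# "Triangles" simplified to axis-aligned boxes: (x0, y0, x1, y1, depth)
tris = [(0, 0, 40, 40, 0.5), (10, 10, 50, 50, 0.3), (20, 20, 63, 63, 0.7)]

def immediate_mode(tris):
    """Each primitive depth-tests and writes pixels straight to the framebuffer."""
    zbuf = {}
    writes = 0
    for (x0, y0, x1, y1, z) in tris:
        for y in range(y0, y1 + 1):
            for x in range(x0, x1 + 1):
                if z < zbuf.get((x, y), 1.0):  # depth test against external memory
                    zbuf[(x, y)] = z
                    writes += 1                # counts as external-memory traffic
    return writes

def tile_based(tris):
    """Bin primitives per tile first; each tile is resolved on-chip and every
    covered pixel is written out to external memory exactly once."""
    bins = {}
    for t in tris:
        x0, y0, x1, y1, _ = t
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(t)
    writes = 0
    for (tx, ty), bucket in bins.items():
        covered = set()  # shading and depth resolution happen in on-chip memory
        for (x0, y0, x1, y1, _) in bucket:
            for y in range(max(y0, ty * TILE), min(y1, ty * TILE + TILE - 1) + 1):
                for x in range(max(x0, tx * TILE), min(x1, tx * TILE + TILE - 1) + 1):
                    covered.add((x, y))
        writes += len(covered)  # one final write per covered pixel in the tile
    return writes

print(immediate_mode(tris), tile_based(tris))
```

With these overlapping boxes the tile-based count comes out lower, since overdrawn pixels are resolved on-chip instead of being written (and later overwritten) in external memory, which is where the power-efficiency argument comes from.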
 

Tomtompiper

Distinguished
Jan 20, 2010
382
0
18,780



Onboard graphics on most low-end PCs, netbooks, and motherboards. 50% by unit, not by value.
 

milktea

Distinguished
Dec 16, 2009
599
0
18,980
Never owned any Nvidia products. Their name doesn't come up often in the mainstream consumer segments. Even with the introduction of smartphones, PowerVR is heard of more than Nvidia. It's sad.
 

dgingeri

Distinguished
I find this odd, because I have bought more video cards this year than in any year previously, and most of them have been Nvidia cards. I've purchased two GTX 470s, a GT 430, and a G210 for myself alone, and six more for family members. I've previously bought more ATI cards, but their recent driver quality and the problems I've had hooking them to HD TVs have discouraged me from buying more of them.
 
G

Guest

Guest
memadmax 05/04/2011 11:33 PM

The last time I had a Nvidia card, the driver was so bad that it was almost unusable... since then, AMD baby...

You are such a fanboy! Everyone knows AMD drivers are the worst and Nvidia's are the best. As far as the cards go, both are good, but on drivers Nvidia wins hands down!
 

dgingeri

Distinguished
[citation][nom]tomtompiper[/nom]Onboard graphics on most low end PC's netbooks and motherboards. 50% by unit, not by value.[/citation]

This is mostly because most consumers, including many IT managers who make the buying decisions, are, quite simply, idiots. They don't realize just how many problems they'll have with driver issues and program compatibility. They just think cheap="good enough". So many people just don't realize that you get what you pay for, and paying a little more to get good quality is well worth the investment.
 

Trialsking

Distinguished
Mar 2, 2007
733
0
19,010
[citation][nom]dgingeri[/nom]I find this odd because I have bought more video cards this year than any year previously, and most of them have been Nvidia cards. I've purchased 2 GTX470's, a GT430, and a G210 for myself alone, and 6 more for family members. I've previously bought more ATI cards, but their recent driver quality and the problems I've had hooking them to HD TVs has discouraged me from buying more of them.[/citation]

You just need to buy like 1,000,000 more for everyone in your state to help Nvidia's market share.
 

kingnoobe

Distinguished
Aug 20, 2008
774
0
18,980
Maybe you're the fanboy. I've never had issues with AMD/ATI drivers; I've only ever had issues with Nvidia, go figure. Not to mention the general consensus is that AMD is getting better with drivers.

Not to mention the fact that I've never had an AMD/ATI card fail on me, yet both Nvidia cards I've owned have. You can blame the manufacturer, but if Nvidia allows its name on it, at the end of the day it's their fault.

With all that said, I will admit my bias: I will never buy another Nvidia, even though I know both AMD and Nvidia make good and bad cards.
 

dgingeri

Distinguished
[citation][nom]kingnoobe[/nom]Maybe you're the fanboy. I've never had issues with AMD/ATI drivers; I've only ever had issues with Nvidia, go figure. Not to mention the general consensus is that AMD is getting better with drivers. Not to mention the fact that I've never had an AMD/ATI card fail on me, yet both Nvidia cards I've owned have. You can blame the manufacturer, but if Nvidia allows its name on it, at the end of the day it's their fault. With all that said, I will admit my bias: I will never buy another Nvidia, even though I know both AMD and Nvidia make good and bad cards.[/citation]
I've had both sides. When trouble starts coming up, I switch to the other vendor until they start showing trouble. My ATI issues started with the Radeon 9700, so I switched to a Geforce 6800. I went with Nvidia until I started having trouble with the 7800 drivers, so I bought a Radeon 4870X2. That worked until the 10.1 drivers came out, so I bought two GTX470s. Those gave me issues with the 268 drivers, but I had worse problems getting the Radeons to work on family computers, so I stuck with Nvidia cards for my server and my HTPC. (The Radeons left this huge black border around my screen, and I couldn't get it to go away. The G210 and GT430 are great with my TV. Yes, my server is hooked to my TV, but only so that I can directly work with it if I have to.)

However, my dad's HTPC and TV worked better with a Radeon, so the one I left in his HTPC is a 4650. It has issues with Bluray playback, but he doesn't use it for that much. He uses it for Netflix and Skype calls to see his youngest grandson.
 

sykozis

Distinguished
Dec 17, 2008
1,759
5
19,865
[citation][nom]dgingeri[/nom]This is mostly because most consumers, including many IT managers who make the buying decisions, are, quite simply, idiots. They don't realize just how many problems they'll have with driver issues and program compatibility. They just think cheap="good enough". So many people just don't realize that you get what you pay for, and paying a little more to get good quality is well worth the investment.[/citation]
Most companies don't need dedicated graphics cards, which is exactly why they buy systems with Intel's integrated graphics. If all your company does is work in MS Office or Corel Office, it's impossible to justify the increased cost of a dedicated graphics card. The only time the cost of a dedicated graphics card can be justified in a business environment is if the company works with graphics.

[citation][nom]YouSuckBigTime[/nom]memadmax 05/04/2011 11:33 PM: The last time I had a Nvidia card, the driver was so bad that it was almost unusable... since then, AMD baby... You are such a fanboy! Everyone knows AMD drivers are the worst and Nvidia are the best. As far as cards go...both are good but the drivers Nvidia wins hands down![/citation]
Sounds more like you're the fanboy. Considering the countless complaints of stuttering with Fermi cards that come and go with different drivers, you might want to look at the actual facts. Also, nVidia's overall support has been going downhill.
 

AppleBlowsDonkeyBalls

Distinguished
Sep 30, 2010
117
0
18,680
Looks like this is gonna be a great year for AMD. Fusion seems like the best bet they've made in years. If Bulldozer and Llano can be competitive with Sandy Bridge at the high end, mainstream, and low end, this is gonna be one of their best years ever.
 
G

Guest

Guest
It was a huge design win for AMD GPUs to be included in all of Apple's newest 15" and 17" MacBook Pros and the just-released Sandy Bridge iMacs. While many may discount Apple's products, their market share IS increasing, and with AMD GPUs rather than Nvidia GPUs in those systems, it's not surprising that Nvidia's market share has dropped significantly.
 

bin1127

Distinguished
Dec 5, 2008
736
0
18,980
Are they counting the Sandy Bridge chipsets that need an IGP? They really should split the categories into discrete and integrated.
 

campb292

Distinguished
Mar 18, 2010
50
0
18,630
This sounds great for consumers. If I am reading the stats right, NVIDIA and AMD each have about a 25% market share, with Intel at 50%. Again, that covers all segments, which I find only modestly interesting.

The big stat I notice is discrete graphics, which I assume most of us really care about - unless you love onboard graphics and would never be caught building a new unit with those "big clunky video card add-ons". Here, NVIDIA still whoops some booty with a 60% share to AMD's 40%.

All this competition is good, guys. All that matters is that some R&D guy sees the other company with a new product at x dollars and y performance and figures out how to leap past it with something that will sell.
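Those two sets of numbers can be squared against each other with a quick back-of-the-envelope check. Under the (hypothetical) simplifying assumption that essentially all of NVIDIA's unit volume at this point is discrete cards, a 25% overall share and a 60% discrete share together imply what fraction of all GPUs shipped are discrete:

```python
# Sanity check on the shares quoted above. The "NVIDIA ships only discrete"
# assumption is a simplification for illustration, not a reported figure.
nvidia_overall = 0.25    # share of the whole GPU market (all segments)
nvidia_discrete = 0.60   # share of the discrete segment only

# overall = discrete_share * discrete_fraction  =>  solve for the fraction
implied_discrete_fraction = nvidia_overall / nvidia_discrete
print(f"discrete cards ≈ {implied_discrete_fraction:.0%} of all GPUs shipped")
```

That works out to roughly two-fifths of the market being discrete cards, which is consistent with integrated graphics (mostly Intel's) dominating by unit volume.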
 
[citation][nom]dgingeri[/nom]This is mostly because most consumers, including many IT managers who make the buying decisions, are, quite simply, idiots. They don't realize just how many problems they'll have with driver issues and program compatibility. They just think cheap="good enough". So many people just don't realize that you get what you pay for, and paying a little more to get good quality is well worth the investment.[/citation]

From my own personal experience, you are absolutely correct.
 
The only reasons I have yet to move over to ATI are the usual CUDA and PhysX, besides the common driver issues. As for driver issues with Nvidia, I have only had problems with one game; everything else has been rock solid. I only update when it is required, and for my existing ATI cards I use older, more stable drivers that I know I can rely on. Both are modded and better suited to my demanding use than normal cards.

As for current-generation Fermi cards, it is not the drivers but the very poor manufacturing quality of most cards, leaving hot spots that make them less stable and shorten their lifespan. I expect 15,000 to 20,000 hours out of a card before it is completely retired, unless a required upgrade interrupts that cycle, and 10,000 to 15,000 hours for mechanical hard drives. A typical GTX 460 or non-reference GTX 560 will have a lifespan of less than 10,000 hours due to the lack of proper cooling of the whole card. The same goes for many ATI cards if the thermal pads are not properly applied to the VRM phases for either core or VRAM power.
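For context, those hour figures translate into calendar time as follows. The daily usage rates here are illustrative, not taken from the post:

```python
# Convert the expected-lifespan figures above into calendar years at a few
# (hypothetical) daily usage rates.
EXPECTED_HOURS = 20_000  # upper end of the poster's expectation for a GPU

def years_of_service(total_hours, hours_per_day):
    """Calendar years a part lasts at a given daily usage rate."""
    return total_hours / (hours_per_day * 365)

for rate in (4, 8, 24):
    print(f"{rate:>2} h/day -> {years_of_service(EXPECTED_HOURS, rate):.1f} years")
```

So 20,000 hours is a little over two years of running 24/7, but more than a decade at a few hours of use per day, which explains why expectations for card lifespan vary so much between users.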
 

alextheblue

Distinguished
[citation][nom]YouSuckBigTime[/nom]memadmax 05/04/2011 11:33 PM: The last time I had a Nvidia card, the driver was so bad that it was almost unusable... since then, AMD baby... You are such a fanboy! Everyone knows AMD drivers are the worst and Nvidia are the best. As far as cards go...both are good but the drivers Nvidia wins hands down![/citation]Nvidia has had driver issues at various points in its history as well. I've got both Nvidia and AMD hardware, and AMD's drivers are just as good these days.
 

sinfulpotato

Distinguished
Dec 4, 2008
204
0
18,690
With graphics on the same chip as the CPU, Nvidia will lose more share. Their chipsets are simply unneeded, and the low end is where the money is.

[citation][nom]nforce4max[/nom]The only reason why I have yet to move over to ati is the usual cuda and physx besides the common drivers issues.[/citation]

Professional software that took advantage of CUDA now uses the parallel-processing capabilities of ATI cards as well. CUDA is a non-issue.
 

spectrewind

Distinguished
Mar 25, 2009
446
0
18,790
[citation][nom]dgingeri[/nom]This is mostly because most consumers, including many IT managers who make the buying decisions, are, quite simply, idiots. They don't realize just how many problems they'll have with driver issues and program compatibility. They just think cheap="good enough". So many people just don't realize that you get what you pay for, and paying a little more to get good quality is well worth the investment.[/citation]

Not idiots. They're financially minded and do not always listen to their technicians. In my experience, 3D-capable discrete adapters are for compositing, rendering, CAD, and entertainment purposes. Motion video acceleration, perhaps?
Most business software I have ever seen is 2D: financial, graphical, whatever. Pushing deeper into the "video not needed" segment, I support headless computers in deployment. Those desktops are administered over VNC/DameWare (which is incompatible with 3D acceleration).
 