Linux Creator Drops F-Bomb on Nvidia, "Worst Company"

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.


Well, you are kind of mixing apples and oranges here.

Android now enjoys a 60% smartphone market share, beating the iPhone by a comfortable margin. So yes, it's a lot of users. But this is a market traditionally dominated by ARM-based CPU manufacturers, and Nvidia is only now starting to make serious entries into this arena.

I had trouble finding statistics on college and university usage, but I think your statement is a bit sweeping. Some universities use Linux, but they also use Windows, OS X, and so on. Someone taking C# classes or trying to run the latest version of Photoshop is going to have a rough time on a Linux box...

It's true that most Internet servers these days run Linux, followed by Unix, Windows, and a tiny percentage of OS X-based servers. But this is mostly irrelevant, as most of these servers are running Apache/Tomcat, Java, etc., where 3D graphics support is not required. Heck, most of these setups are command-line only; you don't even need 2D acceleration.

Now... when it comes to actual home desktop use, Linux is only now starting to come into its own. I'd say 4% market share vs. Apple's 5-6%, and that's all the different distributions combined. To further complicate things, each distro has its own way of doing things, so hardware that works on one distro does not necessarily work on another. Considering all this, I think Nvidia's driver support on Linux is amazingly good.
 


A Finnish redneck is easy to find: in France, during summer (sensitive skin...). All joking aside, the guy isn't a CEO or a shareholder, and he's as far removed from a PR role as possible - he's an engineer. He makes practical decisions that work. If something works, he leaves it alone. If it doesn't, he fixes it. If he considers a design brain-dead, he flames the author openly. He actually shouted at the authors of Nouveau (the reverse-engineered Nvidia driver project) a while ago. The result? Nouveau got much better.

So, it seems to work.

Linux (the kernel) can power everything from wrist watches to supercomputers, with everything in-between: laptops, servers, set-top boxes, rendering farms, coffee pots, cars.

It isn't an innovative design. It isn't fancy. It doesn't come with a flashy GUI or an Apple logo, and it doesn't chime when you log in. It doesn't splash a huge trademark at boot (at best, you get a penguin). When it craps out, it tells you so: "Oops: this went wrong. Fix it." It doesn't reboot instantly, trying to make you believe nothing happened, and it doesn't tell you to visit the Genius Bar because you're too stupid to fix your own computer. It does what it's supposed to do, no more and no less, and it does it well without a word. And when it can't, it tells you things as they stand.
 
The issue isn't with Linux-based mobile devices, though; it's with the PC market. So yes, Linux is only at around a 2% market share, and you know darn well that not all of those people use Nvidia graphics solutions. So who's really getting the short end of the stick here? It's the people running a Linux-based OS who also use an Nvidia graphics solution. What kind of market does that leave for Nvidia to pour millions of dollars into further developing their drivers for those OSes? They're a business, plain and simple. If you saw the amount of money they would have to spend to do what Linus requests, I'm sure you'd agree.
 
@blankie: due to that very reason, Nvidia lost a (close to) $1 billion market to AMD - a Chinese state corporation wanted to buy 270 million chips (or something like that) on the condition that the hardware maker provide an open-source driver, and they approached Nvidia first. Nvidia said no to the driver requirement - and was shown the door. Then AMD got it.
Developing a proper driver for several generations of chips and supporting it for years, optimizing it along the way while doing reverse engineering, costs around $15 million (I can't remember where I read that). However, paying a couple of (good) developers for 3 years costs, what, $100,000 x 2 x 3 = $600,000? Considering they already have the specs for the hardware? You may add $1-2 million in legal fees to ensure you're not opening anything protected by third-party patents (yes, legal costs more than actual development), and you'd still be under $3 million in investment - doesn't that justify itself against an $866 million contract?
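The back-of-envelope math above can be written out explicitly. All figures here are the rough estimates from the comment (developer salary, legal fees, contract value), not real Nvidia numbers:

```python
# Commenter's rough estimates - illustrative only, not actual Nvidia figures.
dev_salary_per_year = 100_000    # one good developer, per year
num_devs = 2
years = 3
dev_cost = dev_salary_per_year * num_devs * years   # $600,000

legal_cost = 2_000_000           # upper end of the $1-2 million legal estimate
total_cost = dev_cost + legal_cost                  # $2,600,000 - under $3 million

contract_value = 866_000_000     # the contract cited in the comment

print(f"Development: ${dev_cost:,}")
print(f"Total outlay: ${total_cost:,}")
print(f"Contract is ~{contract_value // total_cost}x the investment")
```

Even with the pessimistic legal estimate, the investment comes out to a small fraction of one percent of the contract the comment says was lost.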
 