Nvidia Issues Diplomatic Response to Linus Torvalds' F-Bomb

this guy"Linus Torvalds" needs to grow up...if he wants nVidia to get more involved with Linux OS then he needs to pay nVidia...it's their code and can do whatever they want with it.
 
[citation][nom]metathias[/nom]Nvidia seemed to enable 30 bit color in linux in a recent linux release. I found that an impressive contribution, we windows people dont get that without workstation cards.[/citation]
Not true... all Windows-based cards from Nvidia support 32-bit color.
 
[citation][nom]lp231[/nom]So the SLI bridge isn't from Nvidia?[/citation]

Right, Nvidia manufactures every little SLI bridge for every motherboard manufacturer's boards that support SLI?

Moron, they only hand out the schematics for the SLI bridge and let the motherboard manufacturers figure out how to put it in their products.
 
[citation][nom]nebun[/nom]not true...all windows based cards from nvidia support 32 bit color[/citation]

No, they don't. It's only 24-bit color; "32-bit color" is a marketing term.
 
[citation][nom]lp231[/nom]So the SLI bridge isn't from Nvidia?[/citation]

The SLI bridge isn't being paid for; the motherboard is being paid for, and the bridge just happens to come with the board. Furthermore, can you prove that he/she bought boards that came with SLI bridges anyway?
 
[citation][nom]nebun[/nom]not true...all windows based cards from nvidia support 32 bit color[/citation]

Actually, it's 24-bit color. If I remember correctly, 32-bit pixels are when there are an additional 8 bits per pixel for alpha; that's still just 24-bit color. Some systems can do 30-bit color, but I don't think anything above that is common in consumer usage, if it exists in the consumer market at all.
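Just to put numbers on the pixel-vs-color distinction above, here's a rough sketch; the packing helper is made up for illustration and assumes a common ARGB8888 layout:
[code]
#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit red, green, blue, and alpha into one 32-bit pixel.
 * Only 24 of the 32 bits carry color; the extra 8 bits are alpha. */
static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

int main(void)
{
    uint32_t pixel = pack_argb(255, 200, 100, 50); /* opaque orange-ish */
    printf("pixel = 0x%08X\n", (unsigned)pixel);   /* prints 0xFFC86432 */
    return 0;
}
[/code]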
 
Indeed, Blazorthon and others. Your monitor only supports 24-bit color: RGB, 8 bits per channel, 256 levels per color. There are some newer screens that also coordinate gamma information, but that's a derivative of the same thing and not a linear-lighting workflow or anything like that.

Consider: for a monitor to display an image of, say, a sunset with all the brightness information you would see with your eyes, it would have to be as bright as the sun. While this is possible, it would really hurt your eyes! And it would require an enormous amount of energy.

File types such as .jpg and .bmp are often only 24-bit, though the .bmp format can carry an extra 8-bit channel usually used for alpha masks. These formats are rather dated for graphics work; the .exr format handles 32 bits per color channel OR MORE and can carry as many channels as you want. Even so, an .exr file can only be DISPLAYED as a 24-bit image, just like every image you've ever seen on your screen.
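To make the "can only be DISPLAYED as 24-bit" point concrete, here's a minimal sketch, not any real library's API, of how a 32-bit-float HDR value might get tone-mapped and quantized down to the 8 bits per channel a typical monitor receives; the Reinhard curve and 2.2 gamma are just illustrative choices:
[code]
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Map a linear HDR value (which can exceed 1.0) to an 8-bit display value.
 * Simple Reinhard tone mapping plus a 2.2 gamma encode; real pipelines are
 * fancier, this only shows the final quantization to 256 levels. */
static uint8_t hdr_to_display(float linear)
{
    float mapped  = linear / (1.0f + linear);   /* [0, inf) -> [0, 1) */
    float encoded = powf(mapped, 1.0f / 2.2f);  /* gamma encode for the display */
    return (uint8_t)(encoded * 255.0f + 0.5f);  /* round to an 8-bit level */
}

int main(void)
{
    printf("%d %d %d\n",
           hdr_to_display(0.18f),   /* mid grey */
           hdr_to_display(1.0f),    /* "paper" white */
           hdr_to_display(16.0f));  /* bright highlight, still fits in 8 bits */
    return 0;
}
[/code]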

To say the Nvidia cards are only capable of 24-bit output is silly, to say the least. These cards can handle whatever you want 'em to. Now let's develop some monitors which use fission detonations to power their brightness so we can all burn our eyeballs out!
 
==--
Look guys, OpenGL was invented so you could write to the interface and didn't HAVE to write drivers for every new chip.

Nvidia supports OpenGL real good, so what are y'all bitching about? Do you want to go back to the medieval DOS days when every game software wrote directly to the video registers and memory, and programmed the GPU in machine-specific assembly language?

Seriously, what's your problem, guys?

-faye
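A minimal sketch of what "writing to the interface instead of the hardware" looks like in practice: the same calls run on any vendor's GPU because the installed driver does the translation. Assumes classic GLUT (or freeglut) is available for the window and context, and uses legacy immediate-mode calls purely to keep it short:
[code]
#include <GL/glut.h>   /* assumes freeglut or classic GLUT is installed */

/* Draw one frame through the OpenGL interface; no vendor- or chip-specific
 * register programming anywhere, the driver handles that underneath. */
static void draw_frame(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_TRIANGLES);   /* legacy immediate mode, kept short on purpose */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("any GPU with an OpenGL driver can run this");
    glutDisplayFunc(draw_frame);
    glutMainLoop();
    return 0;
}
[/code]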
 
[citation][nom]Faye_Kane_girl_brain[/nom]==--Look guys, OpenGL was invented so you could write to the interface and didn't HAVE to write drivers for every new chip.Nvidia supports OpenGL real good, so what are y'all bitching about? Do you want to go back to the medieval DOS days when every game software wrote directly to the video registers and memory, and programmed the GPU in machine-specific assembly language?Seriously, what's your problem, guys?-faye[/citation]

OpenGL has inferior performance (at least when programs aren't specifically coded for it and have to have DirectX emulated/translated into OpenGL) and does not work with everything right now. It's an arguably superior API, but it's not in prevalent enough use. Besides, it would still need drivers anyway, so your point doesn't work.
 
[citation][nom]Faye_Kane_girl_brain[/nom]==--Look guys, OpenGL was invented so you could write to the interface and didn't HAVE to write drivers for every new chip.Nvidia supports OpenGL real good, so what are y'all bitching about? Do you want to go back to the medieval DOS days when every game software wrote directly to the video registers and memory, and programmed the GPU in machine-specific assembly language?Seriously, what's your problem, guys?-faye[/citation]

How is what you said related to what the people are "bitching" about? This is not an offensive rhetorical question. I'm really just wondering.
 
Why don't they publish the internals for Optimus instead of giving Linux users the finger? Rock on, Linus!!
 
So what if the drivers aren't open source? Does Nvidia not provide the best 3D support in Linux by far? Every experience I've had with ATI in Linux has been a complete nightmare, especially after finding out that the latest beta Catalyst drivers dropped support for cards older than two years. And I don't even want to go into Intel GPU support. Sorry, Linus Torvalds, sometimes you just can't have your cake and eat it too. Be happy there is at least one GPU company with good Linux support, and that company is Nvidia.
 
[citation][nom]behelit123[/nom]So what if the drivers aren't open source? Does Nvidia not provide the best 3d support in Linux by far? Every experience I've had with ATI in Linux has been a complete nightmare, especially after finding out that the latest beta Catalyst drivers dropped support for cards older then 2 years. And, I don't even want to go into Intel GPU support. Sorry Linus Torvalds, sometimes you just can't have your cake and eat it too. Be happy there is at least 1 GPU company that has good Linux support, and that company is Nvidia.[/citation]

And thus, with the crybaby LinT bird, another great AMD fanboy FANTASY of lies is born...

For years we will have to listen to AMD fanboys shriek about Nvidia and its demonic mistreatment of Linux... even as AMD drools into its pus bucket and dies.
 
[citation][nom]behelit123[/nom]So what if the drivers aren't open source? Does Nvidia not provide the best 3d support in Linux by far? Every experience I've had with ATI in Linux has been a complete nightmare, especially after finding out that the latest beta Catalyst drivers dropped support for cards older then 2 years. And, I don't even want to go into Intel GPU support. Sorry Linus Torvalds, sometimes you just can't have your cake and eat it too. Be happy there is at least 1 GPU company that has good Linux support, and that company is Nvidia.[/citation]

AMD's Linux drivers are better than Nvidia's right now. Even before Catalyst 12.7, the 7970 beat the 680 in most Linux-workable games, such as World of Warcraft and others, despite some of these games generally favoring Nvidia on Windows.
 
How exactly are they better? You gave one example, with one card, with one game. Furthermore, you are comparing two brand new cards... Give it a few months; they will probably trade places a few times. Anyone who has used Linux extensively knows that Nvidia has a much better track record overall. I'm not an Nvidia fanboy or anything, but that's just the way it is. ATI has an extremely short attention span when it comes to driver support.

Edit: ATI wins out occasionally with their flagship cards, but Nvidia is the bread-and-butter 3D card to use in Linux.
 
[citation][nom]behelit123[/nom]How exactly are they better? You gave one example, with one card, with one game. Furthermore, you are comparing two brand new cards... Give it a few months, they will probably trade places a few times. Anyone who has used Linux extensively knows that Nvidia has a much better track record overall. I'm not an Nvidia fanboy or anything, but that's just the way it is. ATI has an extremely short attention span when it comes to driver support.[/citation]

That used to be true, but in the last year or two, the AMD cards have had better drivers. I've been using Linux on and off for years, and I can attest to how poor AMD driver support used to be and how much better it is now. ATI didn't care about Linux, but AMD cards now have much better support. Nvidia did have a better track record for the longest time, but they haven't stayed ahead: the other side improved while Nvidia didn't.
 


This is what pisses me off about AMD:

http://support.amd.com/us/kbarticles/Pages/AMDCatalyst126beta.aspx

Scroll down to the bottom and look at the "Hardware:" section. That's right, they dropped support for anything older than the HD 5000 series. That means anyone trying to use the latest drivers with a card made before 2009 is just out of luck. Nvidia, on the other hand, openly supports cards made over a decade ago. Forgive me, but I don't really see that as much of an improvement on AMD's part. More like... "Let's just make drivers as long as they are profitable; screw anyone with legacy hardware."
 
[citation][nom]behelit123[/nom]Forgive me, but I don't really see that as much of an improvement on AMD's part. More like... "Lets just make drivers as long as they are profitable, screw anyone with legacy hardware."[/citation]
Well, if that's the case, maybe they're having a change of heart now. If pure profit were their only goal, they wouldn't support Linux on PCs at all, given how small its share on that platform is right now.
As for the lack of driver support for older cards, they may or may not be working on that; they might have only recently gotten serious about Linux support and prioritized newer cards first, hence the better performance. I doubt you can use the same drivers for every single GPU architecture under their brand. They may also need a push from users to know there is a substantial base who would benefit from Linux-supported drivers for older cards.
 


Haha, it's not just Linux... They are dropping support for older cards in Windows AND Linux. This was really the final straw for me.

I was putting Fedora 17 on a laptop for a nephew, and to my horror, the HD 3200 (seen on laptops sold as late as 2011) is no longer supported by the latest AMD drivers. No big deal, just use an older driver, right? Too bad the latest xorg doesn't support the older ATI drivers... And I've never had AMD/ATI drivers "just work" in Linux; it's always been a royal pain in the ass. Now, all that being said, ATI is in the business of making money. Linux is not a big money maker for them. I get that.

But here is the deal... After getting shafted several times by ATI/AMD's shady business practices, I'm much more inclined to give my business to a company who stands by their products, even when that product is no longer found on the marketplace. Heck, who knows? Maybe I DON'T want to throw away my two-year-old video card and replace it with a brand new one. Maybe I want to use that older card in a media center or something. ATI/AMD just doesn't give me the warm and fuzzies in this regard.
 


AMD's not dropping support, they're extending it. AMD is simply prioritizing their driver releases for modern cards, because there's not much more that can be done to optimize the older cards after several years of optimization. Saying that AMD is dropping support for them is like saying that Intel is dropping support for P4 CPUs just because they're not making new drivers for them either. I guarantee I could load up Windows 8 on a P4 computer, and even though its drivers probably haven't been updated since 2006 or 2007, the older drivers wouldn't cause problems.

Besides, people with 2009-and-older cards are already out of luck overall, because their cards are getting very outdated, unless they are lower-end gamers or unless we're talking about Radeon 4870X2s or similarly performing DX10-and-older setups (of which there are only a few, and none of them are very common). There's a huge difference between dropping support and prioritizing the driver update schedule. Heck, my laptop's Radeon 1270M is using a much older driver, yet it still does its job just fine in both Linux and Windows.

AMD reorganizing their driver update priorities so they can optimize the newer drivers for the newer cards is simply AMD progressing, not wasting the potential of their newer cards for the sake of constantly updating drivers for older cards that don't really need updates right now. Anyone's Radeon 4850 or whatever will keep working fine, regardless of how long it runs on Catalyst 12.4 and the occasional revision over the years, so long as it doesn't physically break. You're acting as if AMD is going to make the older cards unusable, and that is simply not going to happen any time soon. If anyone makes older cards unusable, it will be Microsoft, when a new version of Windows requires GPUs that support the graphics APIs it uses.
 


Yeah, well, I guess you missed the post about how the latest drivers don't support the HD 3200 (or any HD card older than the 5000 series)... And the latest drivers are required if you want to use the latest xorg stuff in Linux. Funny example, because the HD 3200 was on new laptops as late as 2011. So, for someone trying to use the latest version of Fedora, you might see two different scenarios...

ATI/AMD user: "Hmm... When I try to install older drivers, it bricks the OS..." Two hours and some forum searching later... "I need the latest drivers to get 3D support in xorg, let's install them... WTF, it bricked my OS again?!" Two hours and some forum searching later... "CRAP, the latest drivers don't support my card at all!" *throws card out the window, user is very angry*

Nvidia user: "Wow, that was easy..." *user is very happy*



 

Could you elaborate more on that issue with incompatible languages? 🙂
 