Linux Needs GC Lingua Franca(s) to Win



This is faulty logic. A third-party app that isn't multiuser-friendly would behave the same on an NT system as on a Unix system. Apps like that aren't in use anymore; not even the archaic financial / banking systems work that way.

"Management" is patching / security contexts / user accounts / access control to resources and system policies. AD makes life in that regard much easier. You can even deploy ADFS and your productivity portals would be tied into your domain credentials, this enables single login. You then have your users login using either the older UN/PW combo or some PKI-based token / smart-card system. When your managing thousands of client systems and a few hundred servers this becomes very important, else you spend 100% of your time fixing things and become reactive vs proactive. Centralized management is a requirement, not an option for enterprise grade systems.

We once tried to get our Unix / Linux systems to authenticate to the domain as some harebrained scheme for universal access control. It didn't work too well and was very unpredictable. The default instructions that are posted all over the intarwebz don't work on a DISA STIG'd enclave. The Unix / Linux system simply doesn't know how to negotiate with the AD servers, and of course the developers doing Linux / Unix stuff have absolutely ~zero~ desire to add this functionality. We eventually got the Solaris 10 systems to work pretty well; Sun (Oracle now *barf*) has pretty good documentation on how to make it work in an enterprise, including how to configure the Windows Services for UNIX part and which group IDs to map. The Linux stuff was not so good.
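For anyone attempting this today, one sanity check worth doing before the full Kerberos / SSSD dance is a plain authenticated LDAP bind from a script: if that fails, the fancier integration won't fare better. A minimal sketch in Python using the third-party ldap3 package; the domain controller name and account below are hypothetical placeholders, not anything from our setup:

[code]
# Minimal AD-bind sanity check. Requires `pip install ldap3`.
# Hostname, domain, and account are hypothetical placeholders.
from ldap3 import Server, Connection, NTLM

server = Server("dc1.example.corp", use_ssl=True)    # domain controller
conn = Connection(server,
                  user="EXAMPLE\\svc_linux",         # DOMAIN\account
                  password="changeme",
                  authentication=NTLM)

if conn.bind():
    print("Bind OK - this box can at least negotiate with AD")
    conn.unbind()
else:
    print("Bind failed:", conn.result)               # server-side error detail
[/code]

At minimum, conn.result tells you which side of the negotiation gave up, which is more than the stock HOWTOs ever told us.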
 
It's clear that games are a major reason for users to choose Windows over Linux. But I feel that gamers are only one section of the market. Linux has improved so much in usability and multimedia that I feel more comfortable with it now.
But there is one major area in which I feel Linux absolutely sucks, and that is the office suite. True, there are OpenOffice.org, LibreOffice, IBM Lotus Symphony etc., but they are all based on the same core, and even if they claim to be advancing, they simply suck in performance: opening medium-sized MS Office files (say over 5-6 MB) still takes ages, and formatting isn't retained.
The end user needs office software, games, multimedia tools and a simple, secure OS to work on. Most people and organizations hold on to Windows simply because of the quality of MS Office, for which they are prepared to pay a price (as there is no option).
Linux beats Windows in multimedia, speed and security. I can't speak to games, as I'm not too much into them. But its office software sucks, and with the Oracle / Document Foundation debacle, the single tool that would allow Linux to enter offices en masse instead of just the server rooms is see-sawing. All the hard work that made OpenOffice a known brand has to be done again.
Kernels and distros should not matter so much if we look at the software, since it runs on all of them; and that is what is holding Linux back: no matter what distro you choose, you get the same office suite that is not cross-compatible with MS Office!
 
Although I by and large agree with the author about the benefits of interpreted languages (also known as scripting languages), my main gripe is that he chose to use the obscure term "GC" (garbage-collected). What he means are scripting languages like Python, Ruby, Java, Matlab, or R.

The author takes the computer-vision package OpenCV as an example and blames its (very) low rate of reuse on the fact that it is written in C++, with STL and roll-your-own matrix classes, instead of a garbage-collected scripting language; the C++ code really is a thicket.

What I agree with is that C++ is a very poor choice for code that is complex, experimental, and meant to be adaptable and reusable, because C++ requires you to pay really close attention to its (complex) underlying semantics. Ignore the fiddly, time-wasting language subtleties and you pay with a crash by null pointer or a general screw-up by memory corruption. And as soon as people use e.g. template programming, you're basically looking at opaque, undebuggable code.

This is the main reason why e.g. the IMSL library sells for thousands of dollars and why Fortran is still prevalent in engineering code: you can take most lines of Fortran code (or IMSL calls) at face value, but not C++ code. I also agree that C++ is best used for code that is mature and stable and just needs to be optimised for speed, size, portability, interoperability etc.

What I also agree with is that it's highly counter-productive to roll your own matrix classes. You're more likely than not to mess up the numerical aspects, you're extremely unlikely to beat the performance of BLAS (or ATLAS) calls, and what you want in matrix classes is basically something with the efficiency, power and brevity of Matlab. A decent wrapper around the BLAS and LAPACK routines is enough (that's a large part of what Matlab and R offer anyway). Incidentally, Fortran 90 comes pretty close in that respect.
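To make that concrete in Python (one of the languages the author could have named), a minimal sketch assuming only NumPy is installed: np.linalg.solve hands the work straight to LAPACK's gesv routine, so there is nothing to hand-roll and nothing to get numerically wrong.

[code]
# Solving A x = b through a thin wrapper over LAPACK -- no hand-rolled
# Gaussian elimination, no home-made matrix class.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))
b = rng.standard_normal(500)

x = np.linalg.solve(A, b)       # delegates to LAPACK's *gesv under the hood
print(np.allclose(A @ x, b))    # True: residual at machine precision
[/code]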

We see something along these lines in the use of Matlab (or one of its open-source clones such as Octave or Scilab) and R. The scripting language has powerful primitives (e.g. for matrix manipulation) and lets you express algorithms in a very terse, readable way, while the numerical heavy lifting is all done by C++ or Fortran subroutine libraries that the scripting language calls.
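As a hedged illustration of that division of labour, here is power iteration for the dominant eigenvector in a few lines of NumPy. The script reads like the algorithm; every @ underneath is a BLAS matrix-vector product.

[code]
# Power iteration: terse script on top, BLAS doing the flops underneath.
import numpy as np

def dominant_eigenvector(A, iters=200):
    """Approximate the eigenvector of A with the largest |eigenvalue|."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        w = A @ v                     # BLAS matrix-vector product (gemv)
        v = w / np.linalg.norm(w)     # renormalise each step
    return v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v = dominant_eigenvector(A)
print(v, (A @ v) / v)                 # each ratio ~ the dominant eigenvalue
[/code]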

The result is *far* higher programmer productivity, *far* fewer errors, and a *far* wider community of experts who are very capable of formulating interesting algorithms but don't have the time or the expertise to code them in C++ (because they're scientists and engineers instead of code monkeys).

What I don't see is why this means one should use a language like Python, Ruby, Java, OCaml, R, or Matlab throughout. There has to be a balance: specifically, the computationally intensive primitives should not be written in any scripting language. Those should be optimised, numerically sound, native code.
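A sketch of why, again in Python: the same matrix product as a pure-interpreter triple loop and as a single BLAS-backed call. On any ordinary machine the gap is orders of magnitude; exact figures will of course vary.

[code]
# Same computation, two homes for the inner loop: the interpreter vs BLAS.
import time
import numpy as np

n = 200
A = np.random.rand(n, n)
B = np.random.rand(n, n)

def matmul_pure_python(A, B):
    """Naive triple loop: every multiply-add runs in the interpreter."""
    A, B = A.tolist(), B.tolist()        # plain Python lists, no NumPy inside
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

t0 = time.perf_counter(); matmul_pure_python(A, B); t1 = time.perf_counter()
t2 = time.perf_counter(); _ = A @ B;                t3 = time.perf_counter()
print(f"pure Python: {t1 - t0:.3f} s, BLAS: {t3 - t2:.6f} s")
[/code]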

Oh, and did I mention that, because of MS Windows' overwhelming desktop market share, any serious application had better run under MS Windows as well? In that respect interpreted languages offer an added benefit: they're much less tied to a specific operating system than compiled code.
 
Just wrote a long reply that seems to have gone into /dev/null.

Anyway, Mr. Curtis is all over the place. For one thing, I don't really see what this has to do with Linux or the kernel. He acknowledges that he's focused on userspace, and a lot of that code isn't even very OS-specific. And that dig at the baby-boomer generation (of which I'm not a member) makes me wonder whether it's even worth taking this seriously. But because he touches on a problem I've been considering for years, I will.

I agree that integrating disparate libraries into rich applications is more work than it should be. IMO, the solution lies not in converting OpenCV and other low-level libraries (and he should know by now that FOSS developers generally don't take well to Centralized Planning), but in providing a higher-level interface than the C/C++ ABI. The standard UNIX response to this deficiency is to write command-line wrappers and create virtual file interfaces (e.g. /proc/), but that approach has its own serious deficiencies. IMO, what's needed is probably something akin to COM, though perhaps built on more modern technologies, e.g. defining the data structures as XML DOM trees; see the sketch below.
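For concreteness, a minimal sketch of the idea using Python's stdlib XML-RPC; the function name and payload are hypothetical, and this is a stand-in for a COM-like layer rather than a proposal for one. The point is that a library capability gets exposed over a language-neutral, XML-serialised interface instead of the C ABI.

[code]
# Exposing a library routine over a language-neutral XML interface,
# instead of linking against its C/C++ ABI. Stdlib only; names hypothetical.
from xmlrpc.server import SimpleXMLRPCServer

def histogram(values, bins=8):
    """Toy stand-in for a 'library' function any client language can call."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    return counts                     # marshalled to XML on the wire

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(histogram)
server.serve_forever()                # any XML-RPC client can now call it
[/code]

Any language with an XML-RPC client (which is nearly all of them) can then call it without knowing the server happens to be Python, e.g. from Python itself: xmlrpc.client.ServerProxy("http://localhost:8000").histogram([1, 2, 2, 9]).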

But, getting back to my original point, he needs to decide whether he wants open-source applications everywhere or the Linux kernel everywhere. If it's the latter, then a decoupling layer like this could actually be counter-productive, since it would make apps more portable (which is one of the benefits needed to get app developers to use it in the first place). I actually don't care whether Linux is everywhere. I use it at my job, I run it on machines at home, and it is important to me to have an open-source option for my OS. I care much more about my data, and about vendor lock-in to things like office suites, media creation/distribution/playback solutions, and database software. And the kernel only matters there insofar as vendors like Microsoft, Apple, and Oracle use it to push their applications on us.
 