Linux Needs GC Lingua Franca(s) to Win

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.

dbranko

Distinguished
Jan 13, 2010
2
0
18,510
[citation][nom]bernardv[/nom]So, Mr. Curtis thinks we need to reinvent the JVM? E.g. OpenJDK is available for all major distributions. Or am I just stupid and I don't get the depth of this article?[/citation]
I think what he's trying to say is that Linux application developers should start using OpenJDK/Mono/whatever, which gives a productivity boost over C/C++ for the majority of user applications, instead of sticking to C(++) most of the time. C++ wouldn't be such a problem if it had more de facto or otherwise standardised libraries.
 

bernardv

Distinguished
Jan 12, 2009
38
7
18,535
@dbranko
I'm fine with "whatever", as long as it has nothing to do with Microsoft. Mono will always be just good enough for people to get a taste of things. If you want the real thing (latest version, all features 100% ...), you are quietly invited to switch back to good old Windows. I don't think this helps Linux one bit.
 
palladin9479

As others have said, the problem with "Linux" is that there's no common program management method. Every distro likes to do its own thing, and this inflicts a level of clumsiness that is insurmountable to the average PC user. The fact that you have two completely different package installation camps is utter madness; there should be absolutely no reason for .rpm and .deb to exist separately from each other. Then you have the "just compile it" method, and asking an average user to compile a piece of software is an absolute nightmare.

Next is all the god damn sub-versioning going on. It's like every single required library needs to be updated to the latest for one program to work, but then another program needs an older version of that same library. Now the user must make a choice: they can only use one of those two programs, because they aren't savvy enough to tinker with installation dependencies and library installation paths. Installation of "drivers" is utterly painful. Windows just requires you to download and run an .exe setup file; Linux requires you to recode sh!t.

There ~needs~ to be a central authority / release control for common libraries and dependency trees. At least in MS Windows an application can often choose to install the libraries it needs into its local folder and use them from there, without needing to put them in the system folder. This gives every application its own set and solves the compatibility issue.

And the lack of GUIs for everything is unacceptable. It may feel really cool and geeky for techs to do everything on a command line, but average users ~NEVER~ want to see a command line. The entire reason MS Windows became popular is that it did everything it possibly could to hide the command line from the user. And what little interaction with the command line the user had to do was kept to an absolute minimum with simple, easy-to-follow commands. Unix in general practically requires people to learn regexes and programming to utilize the command line.

I'm a Solaris 10 administrator; I practically live in bash. I don't mind it because it's what I do, but I would never expect my parents or my gf to have to do what I do every day.
 

bernardv

Distinguished
Jan 12, 2009
38
7
18,535
[citation][nom]palladin9479[/nom]As others have said, the problem with "Linux" is that there's no common program management method. Every distro likes to do its own thing, and this inflicts a level of clumsiness that is insurmountable to the average PC user. The fact that you have two completely different package installation camps is utter madness; there should be absolutely no reason for .rpm and .deb to exist separately from each other. Then you have the "just compile it" method, and asking an average user to compile a piece of software is an absolute nightmare ...[/citation]

When was the last time you checked a Linux distro? You have GUIs for package management everywhere; no need to care about rpm etc. I'm using RHEL 6 and I can't think of any missing GUIs in general. What do you think should be covered additionally, any examples?

As for compiling: why would an average user need to compile anything? All regular stuff is available in binary form.
 
palladin9479

Stop thinking "distro"; last I checked MS Windows doesn't have a "distro". If I want to download / install an MS Windows program, I go out and download the .exe or .msi and *bam*, installed. If I want to download something for CentOS, I get it in either .rpm (if I'm lucky) or source form, and I'm forced to compile / install it.

~Every~ distro's "software management" requires the support of that distro, and if the distro doesn't have its own version of the software I want, then I'm stuck doing the compile / install dance. There is no universal installation / package management method; if I write a program, I'm forced to choose a distro in order to distribute it in binary form.
 
G

Guest

Guest
On my openSUSE, installation of any of thousands of pieces of software is a few clicks away: Main Menu / System / Configuration / Install-Remove Software. On Ubuntu and derivatives it's even fewer clicks. Casual "granny-level" users can find whatever their heart desires, from productivity to Internet, science, development, games, you name it! Chances are a newcomer to Linux will be overwhelmed by the variety available. And you know what? Everything in the repositories is guaranteed to work, no matter its dependencies, because it is the maintainers' responsibility to make it so. That's why I always recommend that people use well-established distributions with good package management. Really, software installation is a solved problem for Linux these days.

The advantage (and huge vulnerability) of Windows is that you can grab a .exe from the Internet and just run it, so a potentially infinite number of programs / applications is out there, waiting to be downloaded, installed and run (and to ruin your OS or delete or steal your personal data, I might add ;)

Granted, it is often more difficult to do the same in Linux, although many pieces of software are available as RPM and / or DEB packages to download and install (all the major browsers -- sans IE, of course -- VoIP programs, etc.). It is more the mentality of free software that dominates Linux that largely causes this difficulty: open-source apps are often distributed in source form, or RPM / DEB packages do not include their own copies of required libraries. This means it is very probable that a user will need to compile something at some point. But the "big" distros have a very broad range of software readily available, which should cover most -- if not all -- of a user's needs. It is just that Windows has made everyone so lazy!

For me, Linux's major problem is hardware compatibility: new laptops coming out with new hardware that doesn't work correctly, like card readers, flaky suspend-resume, cheap wireless chips, etc. These are things that can quickly ruin the experience for a newcomer to Linux.
 

bernardv

Distinguished
Jan 12, 2009
38
7
18,535
@palladin
Having a choice is sometimes more complicated than not having one. But I guess that is the cost of freedom, and as such it cannot be a bad thing.

By the way, if you, let's say, choose RH or some derivative, you are certainly NOT stuck with the software that is available within RH. For example, RPMForge is an open repository of RPMs. You can add it to your list of sources and then install from it with the same GUI (a YUM front end) as you did from the start. No command line necessary, let alone compiling.
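For the curious, "adding a repository" on a YUM-based system amounts to dropping one small .repo file into /etc/yum.repos.d/. A minimal sketch (the baseurl and package name here are illustrative, not the exact RPMForge details):

```shell
# Sketch: what a YUM repository definition looks like.
# The baseurl is a placeholder, not a real RPMForge mirror.
cat > rpmforge.repo <<'EOF'
[rpmforge]
name=RPMforge third-party packages
baseurl=http://example.org/rpmforge/el6/x86_64
enabled=1
gpgcheck=1
EOF
# On a real system you would then copy it into place and install normally:
#   sudo cp rpmforge.repo /etc/yum.repos.d/
#   sudo yum install somepackage
grep -q '^\[rpmforge\]' rpmforge.repo && echo "repo definition written"
```

After that, the package shows up in the same GUI / yum search as the distro's own packages, which is the point bernardv is making.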

I have to add that I see the RPM system with YUM as fundamentally far superior to DLL hell on Windows.
 

Vladislaus

Distinguished
Jul 29, 2010
1,290
0
19,280
[citation][nom]razor512[/nom]Another annoyance is that many command line processes, such as installing a program, will require you to do the same things; this is how a tutorial can tell you step by step how to install a tar.gz or other format -- you can even copy and paste the commands in and just change the names. If they want more people to use Linux then it needs to get rid of the useless work. For example, if a user finds an awesome new app and it is not a .deb, when the user double clicks on the tar file or the rpm file or whatever other annoying format is used, it will not provide them with the option to install or run the program. What the OS needs to do is pop up something like: I see that you are trying to use this tar.gz file, would you like me to install it for you?[/citation]
The reason you need to use the CLI for tar.gz files is that what you've downloaded is the source, and it needs to be compiled. If you download an rpm/deb file, all that is required is basically a double-click on it.
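The source-install dance being contrasted here can be sketched end to end. The app-1.0 tarball below is a stand-in built on the spot so the commands actually run; real projects replace the build.sh step with the usual ./configure && make && sudo make install:

```shell
# Build a stand-in source tarball (app-1.0 is hypothetical) so the
# "what a user actually does" steps below are runnable anywhere:
mkdir -p app-1.0 && printf '#!/bin/sh\necho "app built"\n' > app-1.0/build.sh
tar czf app-1.0.tar.gz app-1.0 && rm -rf app-1.0

# What a user does with a downloaded source tarball:
tar xzf app-1.0.tar.gz   # unpack
cd app-1.0
# Real autotools projects: ./configure && make && sudo make install
sh build.sh              # prints "app built"

# A binary package, by contrast, is a single step (or a double-click):
#   sudo rpm -i app-1.0.rpm      # or: sudo dpkg -i app_1.0.deb
```

The asymmetry is the whole argument: several manual steps for source versus one for a binary package.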
 

bounty

Distinguished
Mar 23, 2006
389
0
18,780
"We need to get these paranoid programmers, hunched in the shadows, scribbled secrets clutched in their fists, working together, for any of them to succeed."

BS. Proprietary programmers succeed just fine; it's the OSS guys who can't get the market share.



"it is the proprietary licensing model that has infected computing, and science."

What's your problem? Boo hoo, somebody made a program and wouldn't give you the source. I suppose you would have us get rid of copyright and patents altogether? That way we could all suffer Linux with you. No.


Linux needs to get rid of these paranoid programmers, hunched in the shadows, blaming other licensing models for their lack of coordination. Sorry, but I'm sick of zealots trying to preach software religion to users. Make a compelling product and let it speak for itself; leave the religion at home.
 

dertechie

Distinguished
Jan 23, 2010
123
0
18,690
It would be really nice if you actually told us what GC stands for. I shouldn't have to read the comments to find out you mean Garbage Collection. A potentially interesting article is reduced to gibberish without that information.
 

Tjik

Distinguished
May 17, 2006
28
0
18,530
[citation][nom]palladin9479[/nom]As others have said the problem with "Linux" is no common program management methods...
Next is all the god damn subversioning going on...
There ~needs~ to be a central authority...
And the lack of GUI's for everything...
[/citation]
You know that all mainstream distributions include a package manager GUI, and hence a user doesn't have to bother with package formats. In the rare cases where you depend on a direct download, rpm and deb are established standards, and most software vendors offer both. The benefits of repository software outweigh the rare inconveniences.

A well maintained Linux system is leaner thanks to less duplication of libraries. Maintained software gets updated upstream in accordance with new versions of libraries. This is the sane and secure way of handling software.

Something that applies to both these subjects is the existence of the Linux Standard Base (LSB) and the Filesystem Hierarchy Standard (FHS). If you're interested in the subject -- and in a sense you should be if you administer Solaris -- you probably know about the recent discussions about a /run directory for better support of systemd. In those discussions you see distributions collaborating to achieve a recognized standard. These standards make it possible to migrate configurations between different Linux distributions with ease.

I'm surprised you think a program can only run against system-wide libraries. Nothing prevents a program from running with its own set of custom libraries; some game developers do this at times. You don't even need to install such software, and can let it reside in your /home directory. This isn't in accordance with established standards, but if you want to break them -- which in another context you don't seem to support -- it's easily done. You say that "every application getting its own set ... solves the compatibility issue", but in consequence that also adds complexity, less security, bulk, and potentially issues that are very difficult to solve. I've dealt with this several times as a Windows admin maintaining up-to-date software. There's a reason why Unix and Unix-like systems follow stricter standards.
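The bundled-libraries pattern Tjik describes is typically just a launcher script that puts a private lib/ directory on the loader path. A minimal sketch (the mygame name and layout are made up for illustration):

```shell
# Sketch: an application shipping its own libraries in ./lib, found via
# LD_LIBRARY_PATH -- the pattern some Linux games use.
mkdir -p mygame/lib
cat > mygame/run.sh <<'EOF'
#!/bin/sh
HERE="$(cd "$(dirname "$0")" && pwd)"
# The dynamic linker will search the bundled libs before system ones:
LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH
# A real launcher would now exec the game binary; we just show the path:
echo "launching with LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
EOF
chmod +x mygame/run.sh
./mygame/run.sh
```

This is exactly why such software can live anywhere, including /home, at the cost of duplicated, separately-patched libraries.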

The holy grail of "super easy" -- or, in most cases, not easy but an established, complicated sequence of mouse clicks -- depends on costly support. There's a similarity between the dark ages and the Windows generation of average users: we pay the price in the form of taxes covering up the mess, and some pay extra money for bad support. There's no ideal solution here that makes the super-easy option also the technically best one; instead there are choices between different directions, all with different sets of demands concerning money, knowledge, freedom, personal security and so on.
 

AGPC

Distinguished
Nov 27, 2010
258
0
18,790
It would take YEARS for Linux to catch up and get as big as Mac or Windows, and as long as games are made for Windows I will always be using it.
 

bernardv

Distinguished
Jan 12, 2009
38
7
18,535
[citation][nom]AGPC[/nom]It would take YEARS for Linux to catch up and get as big as Mac or Windows, and as long as games are made for Windows I will always be using it.[/citation]

Games are made for consoles; PC gaming is dying. The only thing keeping Windows alive is MS Word, and I don't think that can last indefinitely.
 
G

Guest

Guest
AGPC

Word is slowly dying. The reason is twofold.

First, Word is not compatible with Word. If you have an older edition, or sometimes even just a different version of Windows, then Word cannot open Word documents. If you have MS Office on a Mac, you are used to this scenario, but it exists on Windows too.

Second: OpenOffice, LibreOffice, Gnome Office, WordPerfect Office, IBM Symphony, etc. People are discovering that Word is not the world. Some are even realizing that the compatibility problems they have opening documents from anything but the latest Microsoft product are really problems in Word. The reality of the Web, with its expanded document possibilities, is driving this point home. Microsoft's refusal to allow anything else to open Word documents without problems is coming back to haunt them. Many businesses are discovering that the 'safe' choice made 5 to 10 years ago, which was supposed to keep their business documents always available, is not so safe, as they can no longer open those documents. Even large corporations are looking at alternatives. Any of the alternatives can do nearly anything that Word can do; some do things that Word cannot. Excel is still an anchor for some, as it has been used for many things that aren't appropriate for a spreadsheet, but that is being realized too.

Microsoft Office isn't dead or even dying, but it is in decline.
 
G

Guest

Guest
I think the point of this article is being missed by most of the commenters here, which is unfortunate, because it is a good one. What matters most is not how the software is brought to consumers. Right now, Linux is as neck-deep in cruft as any product from a lumbering corporation. The development of Linux has always been hampered by a large group of its developers seeing it as nothing more than a way to avoid ever learning anything new. They want to stay with their bash and emacs for the rest of their lives, and damn anyone else for thinking of something as crazy as a relational filesystem or the like.

A GC Lingua Franca (by the way, at no point in the article did you explain what GC stood for... you might want to add that for readers who don't know, or are unsure. The wiki link to "Lingua Franca" is surely less useful than one to "Garbage Collection" would be) would be a boon to Linux. It could, actually, be C#. If Mono or an equivalent were to become truly robust and fast, I think most Linux developers would find C# a very easy language to transition to. It's like C++, but designed by someone who was actually thinking about how manageable the code would be. Originally it looked like a Java clone, but today it stands out greatly with things like LINQ and all the functional-language enhancements they have been adding. C# is an ECMA-standard language, and it could do great things on Linux if people could get the idea that it's "Microsoft's language" out of their heads.

Another approach might be to look for a better paradigm in coding. It would certainly be more risky, and given the Linux community's reluctance to embrace new things (why wasn't there a Linux relational filesystem released a few months after Microsoft started talking about WinFS? That could have been an EPIC score for Linux, even if it didn't work as well as people imagined it might), I don't expect it to happen. But how about something like 'coding in layers'? Similar to using multiple layers in Photoshop or other such applications. The code itself wouldn't change that much, except you would have multiple layers of code that get overlaid on one another by the compiler, providing logical abstraction to the developers and extra flexibility to the compilers. For instance, you might have an "error handling" layer. When viewing the primary "functionality" layer, you would not see a scrap of the error handling code. A simple hotkey could enmesh the two and let you see how they line up. They would share scopes, and be able to reference code in other layers easily. Tools could easily operate on a single layer. Sure, getting the interface right would take some experimentation, but the benefits could be huge. Layers would be for separating the things that currently can't be broken out into objects, or even separate functions, because they truly belong together even though they are conceptually separate to anyone reading or writing the code. What do you think?
 
palladin9479

It's amazing to see the Linux fanboys defending what is commonly known as a serious flaw in Linux. You keep pointing me to this "remote downloading" software installation feature, but that ~ONLY~ works if the distro maker has that software in their repository build, and it is the supported version you want. You know, for people who are supposedly all about freedom of choice and openness, you guys seem to be espousing a developer-controlled style of application control. And please stop trying to treat me like some lost-in-the-woods kiddy; I'm neither. I live and breathe inside Solaris, one of the most archaic UNIX OSs in existence. AIX and HP-UX were breezes compared to this monster. At the house I use CentOS as my server OS of choice, primarily because it's just RHEL rebranded. Anything designed to work with RHEL will work with CentOS without modification.

And this is the exact type of compatibility I'm talking about. The entire Linux industry needs to get behind one software installation standard, one location for configuration files, one unified system service method. I shouldn't have to be googling each distro's method for handling system services if I want to fix / customize something. Get one method for configuring and initializing network adapters, because I swear each distro seems to want to put its net config files in different locations and in different formats.

For all its evils, the one thing MS did right, the one thing that set them up for success, is that they set standards for everything. And while there was wiggle room in implementations, especially by the HW driver manufacturers, generally if you designed your program for Windows it was guaranteed to work on 90%+ of the installed base. That "DLL hell" people talk about is what made Windows the biggest desktop OS in the world, because it's not just "granny" that is using this. It's her sister, her father, her mother, brother, cousin, friend, and a stranger. It's the pizza delivery guy, the car mechanic, the hospital nurse, the doctor, the lawyer, the judge, the school teacher, and yes, even the lowly computer geek. You don't design your philosophy around the marginalized, not-profitable geeky crowd; you design it for the largest demographic possible. The reason RHEL got to be the "go-to" Linux OS in the enterprise world is that they sat down and standardized everything, provided materials for that standard, and created whole suites of tools to make the enterprise admin's job easier. Ubuntu has made impressive headway toward doing this for the "desktop" world, but it's still lacking in many areas. This is due to every damn Linux guy wanting to go off and do things their own way, causing all sorts of headaches for the average user.

If you can't wrap your head around those concepts then you deserve to be stuck in the dark.
 

Tjik

Distinguished
May 17, 2006
28
0
18,530
[citation][nom]palladin9479[/nom]Its amazing to see the Linux fan-boys defending...[/citation]
I'm sorry but I draw the line between purposeful discussions and meaningless arguments here.
 


palladin9479

Except that I'm a Unix guy through and through. I employ Linux extensively both at home and at work. I'm just forced to use Windows 7 x64 for my heavy gaming box due to everything needing DirectX nowadays. I'm also the crazy bastard who was able to compile GL-Quake, GL-Doom, eDuke32 and DOSBox 0.74 on my SunBlade 2000 under Solaris 10.

SunBlade 2000:
2x UltraSPARC IIIi 1.13 GHz
8GB SDRAM (Sun's proprietary implementation)
66MHz PCI-X XVR-1200 graphics card (basically two Wildcat IV GPUs on a single card)
2x 146GB 10K FC-AL drives for local storage

I was actually able to do 3D FPS gaming using OpenGL. It was a weird experience, as the graphics card was designed for CAD / CAM type work and doesn't particularly do video games well, but it still worked with playable frame rates. I was also able to get Windows 3.11 installed and operational inside a DOSBox running on this system, so, technically, Windows on SPARC. The next project is to get Bochs working right so that I can install Windows 98, and possibly even 2000, onto the system.

So please dispense with the hand-waving nonsense. There are obvious fanboys in this thread who refuse to acknowledge one of the biggest downsides to Linux currently. You can't fix a problem until you've acknowledged its existence first. Linux is suffering from too many distros utilizing their own implementation methods. Installing software that is not specifically vendor provided (aka from the distro's repository) is a nightmare. I experienced this first hand when I was trying to get a Linux OS for my SunBlade 2000. Debian was a complete waste of time, as their philosophy completely prevents their distro from running on any semi-recent Sun system. OBP doesn't allow cards to load firmware prior to boot; instead the OS must supply the firmware in binary format. The onboard FC-AL controller has freely obtainable binary firmware; they just don't supply the source due to industry secrets. The XVR-1200 graphics card has drivers freely available along with its binary firmware (it's basically a super 3DLabs Wildcat IV), but because of its closed-source licensing model most Linux distros won't supply a driver for it. Debian won't properly install, and after modifying the install disk (forcibly including the firmware binary in the DVD ISO and forcing it to load) the OS wouldn't load X11, as it didn't have a framebuffer driver available for the Wildcat IV. Installing it was insanity incarnate, and it still didn't do OpenGL acceleration. Eventually I scratched the whole project, just installed Solaris 10 onto the system and went from there.

Which reminds me: something Sun did extremely well with Solaris was package / software management. Not only is there a universal GUI for managing software (though I still prefer the CLI tools), but this GUI can be used for any software, not only packages from Sun's own distribution site. There are so many pkgs available for everything that compiling from source is only needed if you're doing something really crazy. I could have gotten game engines for the above system in pkg format, but I wanted them optimized for SPARC V9 and UltraSPARC IIIi CPUs, with better SDL support for my GPU.

This is something all the Linux guys need to seriously consider doing. Use Solaris 10 for a while and you realize how stable and standardized the OS is. Everything from RAID array management (mpxio / fmadm / ZFS) to network link aggregation (dladm / nettr) can be done easily. Even using Sun's built-in virtualization (zones) is simple, although learning to shift your brain from thinking of a "VM" as emulated hardware to thinking of it as environmental compartmentalization is a bit tricky. Take what Solaris does well, marry it with all the good things that Linux does, and you'd have the ultimate consumer OS. RHEL has taken great strides in doing this for the enterprise sector; now just do the same for the desktop market.
 

Vladislaus

Distinguished
Jul 29, 2010
1,290
0
19,280
[citation][nom]palladin9479[/nom]Its amazing to see the Linux fan-boys defending what is commonly known as a serious flaw in Linux. You keep pointing me to this "remote downloading" software installation feature, but that ~ONLY~ works if the distro maker has that software inside their repository rebuild and it is the supported version you want. You know for a people who are supposedly all about freedom of choice and openness you guys see to be espousing developer controlled style application control.[/citation]
A distro comes with a set of repositories by default, but these can be changed at any time. Also, many software houses now have repositories of their own for people to add to their distros, making installation and updates easy.
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
[citation][nom]Keith Curtis[/nom]I find widespread fear of having ideas stolen in the software industry, and proprietary licenses encourage this. We need to get these paranoid programmers, hunched in the shadows, scribbled secrets clutched in their fists, working together, for any of them to succeed.[/citation]


So in order to avoid ideas being stolen, the author of the idea needs to hand it over to a bunch of other people who can then claim ownership. Yes, that makes sense. In the same way that not showing everyone how your software works encourages people to steal it -- according to you.

As for the programming part of this: I have no idea what GC is (guessing some programming language and not Global Catalog), but I understand that standardizing on a language helps development. It doesn't help innovation, though. Any innovative technology is limited by the platform it is developed on. That's why all PC games are 'alike' (all written with the same limitations imposed by DirectX and resource limits). That's why network technology doesn't evolve (limited by IEEE standards that can't keep up).
Innovation stops when you start herding all cattle the same way. From a money-making perspective that is good, I assume, and it does save resources. It also makes theft easier (it goes hand in hand with porting being easier, really), and it makes everything more predictable and in turn less interesting.

But it makes sense nonetheless. Microsoft's desktop got big by being relatively stable, boring and very much predictable. Almost all software behaves pretty much the same way (using the same controls etc.). But that doesn't make it innovative, as everything's limited by said controls and mechanisms.

I believe if Linux wants in on the desktop market, it's not good enough to just try to copy Microsoft. It does take a joint effort, but I don't believe simply be
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
Bah, stupid laptop just had me posting half a reply! Stupid mousepad that can't be deactivated.

Anyway, to finish up real quick :

I'm convinced a Linux desktop won't be viable until they do something better than Microsoft, not merely just as good.
That means perhaps building on the DirectX idea of adding features in new files without breaking the old ones (Linux library people could learn something here), and on top of that enabling coders to write their own features if those present aren't sufficient. Linux needs to innovate, not just be. That's not good enough if you aren't already the king of the hill.
 


palladin9479

This idea is where people keep getting things wrong. Distro-supported repositories work for OS-level updates and common library updates, maybe even updates for included applications. A distro's homepage should not be the default go-to site for software; people should be going to the producer's site themselves. This is the equivalent of MS trying to host installations for Quicken, TurboTax, games and whatnot. It just wouldn't work well.

The only reason distros took to offering large repositories is that manually installing software was so painful on any UNIX-based system. UNIX was never designed with simplicity as a priority; it was designed to be highly customizable and efficient and to offer very fine-grained control of the system. It was developed for mainframes and supercomputers, not desktops / laptops. The sole reason Windows has been so successful is its ease of use and compatibility. If you want to play around you can do something on Linux; if you want actual productivity, you use MS Windows + Office. This isn't some MS fanaticism, it's the direction the market has gone in, the path of least resistance. From an administration point of view, MS Windows is about 100 to 1000 times easier to manage en masse (think thousands of clients) than any UNIX system I've ever seen. Microsoft has had some form of easy-to-set-up central account management for how long now? Unix has the mess that is NIS / NIS+, or some home-developed, thrown-together mess of LDAP + Kerberos.

To be accepted as a desktop OS, for both home and enterprise use, Linux needs to create a standards baseline for everything at the system level and include built-in tools for all administration tasks. I shouldn't ever have to drop to a command prompt to do a common task. The CLI is fine for deep analysis or troubleshooting, but day-to-day things should be done via easily accessible GUIs. If it had all this, I'm positive most of my organization would change over to a Linux / Unix software platform. And we're not a small organization by any definition of the word.
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
what organization is that?

ps. some parts of the MS world are considerably harder to manage than Linux -- for instance default user profiles, or 3rd party software written without multiuser systems in mind.
 