MATE accepted by Ubuntu ??


nss000

Distinguished
Apr 18, 2008
673
0
19,010
Gents:

Just been playing on my Xeon system running U_12.04.x and .... discovered, as if automagic, the entire new (1.6?) MATE ecosystem ready for download. Since when ... & is this Canonical support for MATE well known to everyone except me?

SPL sort-of hinted at this in a previous post, but I did not believe him.
 
Haha yeah I know that page well - could probably recite it from memory. I spent a lot of time messing around with this stuff for Debian 6 when I learnt how restrictive it is for wireless (I had been using it on wired internet previously with no issues). Not to worry - I'll just stick with Windows 7 and Xubuntu.
 

randomizer

Champion
Moderator


I don't think they feel privileged. They make a distro in the way that suits them best (usually), and if this also works for others, great, but it's not the end goal. Debian and Scientific Linux are probably fairly similar with regards to how much they can be "trusted" by a casual (or any) user. Debian has an enormous userbase, and SL is more or less a clone of RHEL, which also has a large userbase and commercial backing (this commercial backing will only indirectly benefit SL, of course). They are also both similarly difficult to use. Neither is built specifically with home desktop users in mind like, say, Ubuntu.



The drivers aren't crippled, they just aren't included by default. They are usually pretty easy to get via the non-free repository though, which is why the Free Software Foundation won't bestow its blessing upon the distro.
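For the record, pulling those drivers in on a Debian box of that era is roughly a one-line change to the apt sources plus an install. A minimal sketch (the mirror, release name, and firmware package shown are examples; check what your release actually ships):

```shell
# /etc/apt/sources.list -- add "contrib non-free" to the existing deb line,
# e.g. for Debian 6 "squeeze" (mirror and release are examples):
#   deb http://ftp.debian.org/debian/ squeeze main contrib non-free

# then, as root, refresh and install the firmware blobs:
apt-get update
apt-get install firmware-linux-nonfree   # common wireless/NIC firmware lives here
```

Once the firmware package is in place, reloading the relevant kernel module (or rebooting) usually brings the device up.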
 

nss000

Distinguished
Apr 18, 2008
BigR:

Rewording your comments I might conclude(?) that Debian & SL are **neutral** toward usrland. Neutral, rather than hostile makes each worth-a-try. Modest abilities aside, all system responsibility is mine. When the digital chickens nest, I alone gather the eggs and make quiche ... er, or suchlike ...






 

randomizer

Champion
Moderator
Neutral is probably a good word to describe it. Neither distribution's developers have a vested interest in whether you use it or not. In fact this is fairly true for many (most?) distributions, which are made to suit the vision of the developer(s) first and foremost. The same can't be said about Canonical, which does plan to eventually make some money from Ubuntu, but thus far has had no luck.
 
When people start developing things for the sake of monetary value instead of pride, quality starts to degrade in some important areas such as security, stability, and code management. I'm sure there will be a horde of people who jump on me for calling Ubuntu crap, but I don't care.
 
Ubuntu puts customers/users first. That is a good business strategy, but a poor development strategy. Customers/users do not care how well the software is programmed. They just want results. So if you start hacking things in just to get a product out, then build quality sucks, even if it works well.
 

nss000

Distinguished
Apr 18, 2008
They don't care? You need to split a few hairs. The RedHat_pimps running GNOME **care very much** (their own words) about establishing the REDHAT ... er, GNOME brand as a specified set of non-modifiable on-screen behaviors and **forcing** those specified UI behaviors on all GNOME lusrs. As many have, I read the dev-posts to precisely that effect; they were well publicised. Go look them up!!

And are you saying that Rich Stalin ... er Stallman and the monk-like bearded-onzzz at Debian/ARCH/GENTOO do not fantasize conjuring (infecting?) their own "byteboy" behavior on as many *nix lusrs as possible? Anyone close, actually ... You sir are irredeemably innocent to believe such virginal twaddle.




 

amdfangirl

Expert
Ambassador
It's no secret Red Hat wants to move to GNOME 3. It's well documented they intend to move to GNOME 3. They have people paid to contribute to GNOME. Cynics say that GNOME is a Red Hat company.

But we still have 7 years of support (and 10 years if you're a Red Hat customer) for GNOME 2.

By then LXDE might actually be a stable desktop xD.

7 years is a long time. I dare you to go and boot up Fedora Core 5 from 2006.

10 years is a longer time still. Keep in mind GNOME 2 was first released 11 years ago.

Time is still on our side. Especially now that there are more developers more connected than ever.

Don't worry, Scientific Linux keeps RHEL straight. They add the packages that make RHEL usable.

For example: They have an IceWM option on their official repositories so that if the GNOME 3 situation gets so bad you have a fallback of sorts. They might even decide on MATE support.
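If you ever need that fallback, getting IceWM running on an RHEL-family box is a short job. A sketch, assuming the package is simply named `icewm` in the repos (verify with `yum search icewm` first):

```shell
# as root: pull IceWM from the distribution repositories
su -c "yum install icewm"

# per user: have startx launch it instead of GNOME
# (or just pick the IceWM entry from the display manager's
# session menu, if the package installs one)
echo "exec icewm-session" > ~/.xinitrc
```

`icewm-session` is IceWM's wrapper that also starts its taskbar; plain `icewm` works too if you want only the window manager.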

Remember whose interests Scientific Linux serves: laboratories (CERN and Fermilab)

Two things are likely to happen with Scientific: either they continue supporting IceWM as an alternative window manager, or they adopt MATE for version 7. Retraining is expensive, so there is a good chance that they will include/contribute to MATE rather than spend resources retraining staff. Remember, these people want to run particle accelerators and that kind of stuff. Any eye candy is a waste of resources.

If worst comes to worst you have at least 7 years before the updates run out.
 


Bad code leads to bugs and poor performance. At the very least it makes the code more difficult/unpleasant for the next dev to work on, leading to bugs and poor performance. I'd say the quality of the software impacts users in a very direct and noticeable way, and I'd point out that of more than thirty distributions I've tried, Ubuntu was by far the most stable and least glitchy of the lot (along with Kubuntu and Xubuntu). And I have tried Debian (spent quite a bit of time with it, in fact). So whatever Canonical are doing, I'd say they're doing it right.
 
Ubuntu is okay for users who use it like Windows. For users who like to mod the system, it sucks; it can easily become unstable, and that's not even modding the kernel or the OS itself. Say I want to install an app that requires libraries that either aren't provided by apt or aren't the correct versions, and I compile them all myself: they usually work fine. But sometimes, when that compilation fails and I start debugging things to fix it and finally get the app working, on the next boot I've lost my DM, or Ubuntu doesn't even get to my login screen.
 
Ah well I'd definitely say I use it/them like Windows. I'll screw around on the command line in Arch and any other distro that requires it, but only when it's required. What kind of modifications are you talking about? And what distribution would you recommend that would be better suited for making those modifications?
 

randomizer

Champion
Moderator
Not a fan of the command line eh? :) I actually use it even when not necessary, because I usually find it faster to do what I want than most of the available GUI tools, but I'm by no means a hardcore CLI purist.

I'm not a great fan of Ubuntu. I don't really like Unity (die top panel, die!), and I don't care for the social fadware that comes preinstalled. I normally go for Linux Mint if I want an Ubuntu derivative. I like their development model: they have their own vision of the desktop but it is actually influenced by users, unlike Mark Shuttleworth's One True Vision. My only gripe is their minor customisation of Firefox, but I do understand why they did it, and it's relatively easy to "fix".
 
Any stable distribution like Alpine Linux, Debian, Slackware, Fedora, Gentoo. Ubuntu is relatively stable, but because their customers demand all the new and fancy software, which isn't tested to the point of destruction the way the core distros test theirs, they can have corner cases that cause things to break. FreeBSD is a really good alternative, but it is not a Linux distribution.

I've broken many distros before, but I can usually fix them by undoing my changes. Ubuntu, openSUSE, CentOS, and Arch are the ones that gave me a really hard time when I messed something up. For some reason, just undoing my changes doesn't fix it. I'm not sure what other files reference them, but I guess I killed those too. I usually run these fixes off a LiveCD and then try booting into it to test the fix.

System modifications like rearranging partitions (I run /usr, /etc, /bin, etc. in separate partitions), modifying /etc/alternatives symlinks, compiling and installing new kernels to test them, compiling and installing new versions of ffmpeg, and other stuff. Too long to list. Just whatever I fancy, really. I love trying out new things and keeping things relatively up to date. But when I do that, sometimes configure, make, or make install fails, and I go and debug: figure out what dependencies are missing, install those, etc. Things like that don't always play nice with Ubuntu, Arch, CentOS, and openSUSE. Ubuntu and Arch especially. CentOS and openSUSE are actually not as bad, but support for them just sucks. I can't Google anything for them.
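For anyone unfamiliar with those /etc/alternatives symlinks: the generic name (say `/usr/bin/editor`) is a symlink into /etc/alternatives, which in turn points at the real binary, so swapping one link retargets every caller. A self-contained sketch of the mechanism in a temp dir (paths and the `editor` name here are illustrative; on a real system you'd use `update-alternatives` rather than editing links by hand):

```shell
# build a miniature /usr/bin + /etc/alternatives in a scratch dir
root=$(mktemp -d)
mkdir -p "$root/usr/bin" "$root/etc/alternatives"

# two fake "editors" that just announce themselves
printf '#!/bin/sh\necho vim\n'  > "$root/usr/bin/vim.real"
printf '#!/bin/sh\necho nano\n' > "$root/usr/bin/nano.real"
chmod +x "$root/usr/bin/vim.real" "$root/usr/bin/nano.real"

# generic name -> alternatives link -> concrete binary
ln -s "$root/usr/bin/vim.real" "$root/etc/alternatives/editor"
ln -s "$root/etc/alternatives/editor" "$root/usr/bin/editor"
"$root/usr/bin/editor"     # prints "vim"

# retarget only the alternatives link; the generic name is untouched
ln -sfn "$root/usr/bin/nano.real" "$root/etc/alternatives/editor"
"$root/usr/bin/editor"     # prints "nano"
```

The point of the indirection is exactly what makes it fragile to hand-editing: one stale link in /etc/alternatives silently changes what every generic name runs.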

As for CLI, I live off CLI -- tmux, vim, everything. I go very simple with awesomewm. No top/bottom menu bar.
 

amdfangirl

Expert
Ambassador
Very true - My advice about RHEL version x and derivatives: "If the software version you want isn't in repositories specifically for RHEL/CentOS/SciLinux version x, don't bother*"

*Unless your game is office suites or something like Flash; then that's fine.

But that's beside the point. If you've read any of nss000's threads you'll know he favours stability over everything else.

The main audience for a distribution like CentOS or RHEL is people/companies/laboratories that want things to be working smoothly, as reliable as possible. Ideally security updates and bug fixes are prioritised over new features.

Likewise the majority of CentOS users won't do what you do and install software outside the repositories. That's why there aren't many threads or support articles.

I'd definitely say I use Scientific Linux like Windows. I avoid CLI where possible (with the notable exception of installing software: it is so much faster to type
Code:
su -c "yum update -y"
than going through the update thingy).
 

nss000

Distinguished
Apr 18, 2008
Even fly-tiers know if the fish-don't-byte he's failed! There's a nasty (unrepeated) name for **display flys** better placed in a museum-case than a trout-stream. Frankly sir, there exists **no** other (real ... like a true Welshman) measure of computer code-quality than how the resulting program runs vis' **the customer**. Efficient/elegant code == transparent, stressless ongoing user use. Period! No **PLATONIC HEAVEN** exists for the computer coder, like those Elysian Fields of the pure mathematicians. Mae West assures us that none care about her beautiful liver! Gauss is **not** the programmer's hero, nor Euler, but rather some 1950s un-named COBOL-jockey who got company paychecks out on time, enabling his CEO to follow the robot process while the chief designer sketches yet-another-toothbrush on a cocktail napkin.

"Coding" water-wheels starts in the middle-ages, cotton-spinners go back to the 19th century; punchcard census analysis to 1900; Church & Turing were un-needed, mischievous maths.envious freaks. In what make-believe XANADU or OZ do your entirely self.absorbed, self.directed, self.gratifying, self.congratulating coders exist?




 

nss000

Distinguished
Apr 18, 2008
You disclose an important issue (name?) of close-to-the-surface code weakness in some usrland Linux distros. Even pure usrland types like myself find that important. Please note that my work-day tasks (what pays my rent etc.) depend 95% on a smoothly working Linux system. I hear you arguing for "multi-level" system stability which -- though rarely used -- may be lacking in say UBUNTU or FEDORA and someday unexpectedly rear up and bite-yo-butt!

This sub-surface OS strength must be balanced against the usr's day-to-day tasks, talent and time. If I must spend 60 hours enabling a Linux variant to function, instead of running 60 hours of lab experiments on a solid legacy WinXP datalogging system, then I must justify that loss of experimental output! But your point is that there (may) always be a gotcha that kills you for taking the quick way, thus justifying all the initial extra effort. Serious issue.



 
@nss000: yes, there are trade-offs. That was the point that I was trying to make. Ubuntu is great for customers and those who use it like Windows; it works great out of the box. But for those who like to dig deeper into the system and make modifications, Ubuntu won't handle it as well as other more stable distributions. I understand that very few people actually mod their systems. The majority of Windows users have no idea what's in their system, let alone make changes to it. The majority of OS X users are the same.
 

nss000

Distinguished
Apr 18, 2008
Yes, you raise the devil's-own conundrum ... beyond your preference for challenging experiment and usrland's desire for blunt-headed stable production. If I understand correctly, you imply the latter **always** fails, and thus usrland should buy insurance by sampling the experimental computer waters.

I will soon see some of that software novelty when hardware upgrades are finished on my Xeon system. Appreciate you raising crucial points.



 

nss000

Distinguished
Apr 18, 2008
Ms AMDgurl:

I certainly do follow your advice to keep U_12.04.x's gropy-paws away from my legacy production box, like it was a cross between Lecter and Gacy!!! It took me two (2) years from my initial installation of U_10.04LTS to get stable sound behavior. Two years of garbage! Every OS update hammered/vaporized/contorted the speakers like John Henry beating on steel rails ... I kid you not. Sure I'm risk-averse. Finally, toward the end of 2011 the sound system stabilized, and that's no-small-thing as I have a classic legacy Altec Lansing (2003) sound system that reincarnates the Dead every time I stream TERRAPIN STATION.



 

nss000

Distinguished
Apr 18, 2008
**vis'** Single-minded attention to a static, stable computer system leads ... to an unstable computer system. The only certain stability is dynamic (= experimenting) stability, and the most secure OS enables this dynamic most comfortably. If I catch your drift ... Ozymandias after-all ends up face down eating sand.