Best Linux distros for reviving an old PC

My main laptop runs Manjaro Linux, but on really old gear I prefer MX Linux. Linux Mint is a really good starting point for newcomers coming from Windows. I've also rediscovered Fedora (41) and openSUSE. Not a fan of GNOME myself; I prefer Xfce. But that's the beauty of Linux, and I've been using it since the mid-90s.
 
PSA:
There is no Linux distro that will make your old hardware better than it is. If your goal is to watch modern YouTube videos or any other media from the web, forget it.
A decade-old PC, like the article talks about, will be doing decade-old things.

The only reason to prefer Linux over Windows is safety when going online, but as said above, there is very little reason to go online, since you can't do much useful beyond downloading files.

Old PCs are good for watching DVDs and videos of that level from your HDD, and for playing old games that have issues running on modern OSes.

All of the above applies to low-end, decade-old systems. If you had the super-duper GPU/CPU from 10 years ago, you'll be fine for YouTube and heavier stuff, but then you'd also have no issue running Windows on it.

Rant, continued:
Also, if you have a decently modern PC and just don't want to use Windows because MS is getting pretty hardcore about forcing all sorts of things on you, then try Bazzite, which is basically SteamOS, just not from Valve.

Also, since "retro" is pretty hip these days: Batocera, Recalbox, and Lakka are the best OSes/frontends for RetroArch, from heaviest to lightest. Lakka only loads up the PS crossbar and is pretty barebones.

Lastly, just for funsies, PiMiga supports the Pi as well as x86 CPUs, is a full Amiga installation, and has a full Linux OS underneath.
(You have to provide your own Amiga Kickstarts (BIOS).)
 
I'd generally recommend against using hardware too much older than a decade, due to "code rot". Basically, the stuff that works best in Linux is the stuff lots of people are (still) using. Once you go old enough, you can end up with a somewhat esoteric hardware configuration where many of the other people running Linux on it aren't on a very recent kernel. So, you might hit bugs that've crept into a device driver or kernel codepath somewhat specific to your system.

It depends on just how popular the machine was, however. So, for AMD machines, I might not even want to run something much older than Zen, whereas I might still feel comfortable running an Intel machine as old as Nehalem. At this point, I'd definitely avoid using 32-bit machines.

With that said, there's an old server we have at my job, with a Core 2-era Xeon CPU, and it's been running openSUSE Leap-15.6 just fine. We basically just use it to serve up an iSCSI volume over NFS, though. So, that's a fairly narrow bit of functionality we're exercising.

BTW, I once installed Linux on an old Pentium M laptop, back before it was too old. It was a slight pain, due to the fact that those CPUs didn't support a feature called PAE (which enabled 36-bit addressing on old 32-bit CPUs), while most distro kernels assumed it. Well, I believe PAE support recently got dropped from the kernel. So, that's one thing about using a leading-edge kernel on old hardware that would now go much more smoothly.
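If you're sizing up an old 32-bit box for this, you can check whether the CPU even advertises PAE before picking a kernel; this is just reading the feature flags in /proc/cpuinfo:

```shell
# Print "pae" if the CPU advertises PAE in its feature flags, nothing otherwise
grep -o -m1 pae /proc/cpuinfo
```

On any 64-bit machine this always matches, since long mode implies PAE; it's only interesting on old 32-bit chips like that Pentium M.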
 
PSA:
There is no Linux distro that will make your old hardware better than it is. If your goal is to watch modern YouTube videos or any other media from the web, forget it.
A decade-old PC, like the article talks about, will be doing decade-old things.
Web browsing on a Sandy Bridge-era machine (now 14 years old!) is still very viable, especially if you increase your privacy settings (which cuts down on ads) or run an ad blocker. You might be right about video decode acceleration, unless the GPU in it is a bit more recent.

Speaking of GPUs on old Linux PCs, you can apparently even play Indiana Jones (which requires ray tracing) on old AMD GPUs, like the RX 480.

Old PCs are good for watching DVDs and videos of that level from your HDD, and for playing old games that have issues running on modern OSes.
I once had a Pentium 4 that struggled to decode 1080p H.264 video files. But, a software decoder on a Core 2-era PC could play them with ease. And Core 2 is almost 20 years old!

DVD playback is something a Pentium II-class PC could handle.
 
A decade-old PC, like the article talks about, will be doing decade-old things.
Decade old PC = Skylake, which is fast enough to do plenty of things. Once you hit quad-core Skylake, you're basically golden for anything that isn't a newer CPU-intensive game or something that "needs" Threadripper levels of performance. This level of performance is sold to this day as quad-core Alder Lake-N (e.g. Intel N100).

SteamOS may get its "official" desktop release within a year or two (perhaps timed with Windows 10 support ending), and that will make these lists from then on.
 
Decade old PC = Skylake, which is fast enough to do plenty of things. Once you hit quad-core Skylake, you're basically golden for anything that isn't a newer CPU-intensive game or something that "needs" Threadripper levels of performance. This level of performance is sold to this day as quad-core Alder Lake-N (e.g. Intel N100).
Yeah, so why do you need to "revive" such a system?!
It's still alive (going strong) and can even run Windows 11.

That's why I talked about the high end of the time. Try doing anything with a Skylake Celeron... Or, since you brought up the Atom replacements, try doing anything with a 10-year-old Atom CPU.
 
Incidentally, these are also some of the best distros for someone on a brand-new computer who doesn't want advertisements pushed to their Start menu. Revival isn't the only reason to go down the Linux path.

Why should anybody have to do a W11 install and then shut this off, shut that off, tweak this, tweak that, go through this and that other thing, buy this program for $3, buy that other program for $8... all to restore a Windows 10 or 7 look and feel? All of that is just too much compared to a nice out-of-the-box experience at minute 1. (And, BTW, during the W11 install you're forced to go in and touch the terminal just to get a local account. That's so 1980s, typing in a bunch of commands.)
 
That's why I talked about the high end of the time. Try doing anything with a Skylake Celeron...
I have a Skylake i3 laptop that's dual-core 2.3 GHz (plus hyperthreading), running Ubuntu 24.04. I used it as my streaming box, hooked up to my TV, until last fall, when I switched over to using a PS5. Not because the laptop was too slow, but just because one streaming service refused to work on Linux (tried both Chromium and Firefox) and some of the streaming apps for PS5 had features you don't get in their browser equivalents, like HDR and Dolby Atmos support.

Or, since you brought up the Atom replacements, try doing anything with a 10-year-old Atom CPU.
Well, if you want to take a platform that was barely adequate when new, and then age it 12 years, obviously that's going to be painful. I say "12 years", because that's when Silvermont was launched. Apollo Lake didn't come until 2016, but then it had the same generation of iGPU (and video decoder) as Skylake. Skylake's iGPU is still pretty well-supported (since basically the same one got used up through Comet Lake) and should offer video decode acceleration.
 
This article came at the perfect time for me. I’m looking to put Linux on my old 6700K for older Steam games, emulators, and some YouTube when Windows 10 runs out of support (Win 11 would probably work too, until MS breaks it and goes "well, yeah, we never said we’d support that CPU").

I hope Nvidia has some decent drivers for Linux so the 980 Ti will work (insofar as they have decent drivers for anything these days).

I’m leaning towards either Linux Mint or Ubuntu after reading this, as these seem easiest to get going for someone with very little Linux experience. Do the more knowledgeable people here see any reason to pick something else?
 
I hope Nvidia has some decent drivers for Linux so the 980 Ti will work (insofar as they have decent drivers for anything these days).
I have one generation newer (GTX 1050 Ti) and still get driver updates. The drivers are in maintenance mode, which means they don't get any new features. However, it might still have all the features you need. TBH, I've never dabbled in Linux gaming.

I’m leaning towards either Linux Mint or Ubuntu after reading this, as these seem easiest to get going for someone with very little Linux experience. Do the more knowledgeable people here see any reason to pick something else?
I have no experience with Mint, but Ubuntu makes it easy to get the proprietary Nvidia drivers. I think Ubuntu is also more popular, and therefore it's easier to find fixes and guides.

FWIW, I actually run a sub-project called Kubuntu, which is just Ubuntu with the KDE desktop environment. You can also just install regular Ubuntu and switch desktop environments, if you find the default isn't to your liking.
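For what it's worth, switching on an existing Ubuntu install is just a metapackage away; a quick sketch, assuming a stock apt-based Ubuntu (the session picker on the login screen then lets you choose):

```shell
# Add KDE Plasma (the Kubuntu experience) alongside the default desktop
sudo apt install kubuntu-desktop
# Or the lighter Xfce stack instead
sudo apt install xubuntu-desktop
```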
 
I have one generation newer (GTX 1050 Ti) and still get driver updates. The drivers are in maintenance mode, which means they don't get any new features. However, it might still have all the features you need. TBH, I've never dabbled in Linux gaming.


I have no experience with Mint, but Ubuntu makes it easy to get the proprietary Nvidia drivers. I think Ubuntu is also more popular, and therefore it's easier to find fixes and guides.
Thanks for taking the time to reply; I’ll give Ubuntu a try, then.
 
DVD playback is something a Pentium II-class PC could handle.

Most people don't realize that a large percentage of the tasks they do every day qualify as "decade-old things", in the sense that they require less processing power than you'd expect.
You guys are missing the point: one step above DVD quality is 720p video, and that will be encoded in H.264 or even H.265 and will have a really hard time playing on anything that old unless you have a GPU that can take over.
 
You guys are missing the point: one step above DVD quality is 720p video, and that will be encoded in H.264 or even H.265 and will have a really hard time playing on anything that old unless you have a GPU that can take over.
My Pentium 4 could decode H.264 @ 720p no problem. It's only 1080p where it had trouble keeping up. Even then, I bought a commercial H.264 decoder from a company called CoreCodec, and it was just fast enough that I could watch 1080p on my CPU. Probably, the open-source decoders are now optimized to at least that level.

I didn't have a sense how strenuous H.265 is, so I just tried it. I picked the largest and smallest 1080p clips from here:

On a Sandy Bridge i7, mplayer was able to decode at 3.574x and 1.154x of realtime on the lowest- and highest-bitrate files, respectively. However, I then noticed it was basically using just one thread. So, I tried it again with 8 threads (command-line option: -lavdopts threads=8), and the high-bitrate file ran at 2.864x of realtime!

Using ffmpeg, I could achieve even faster decode times of up to 3.06x of realtime, again using pure CPU decoding. I'm not sure what accounts for the difference.
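If anyone wants to repeat this on their own hardware without hunting for sample clips, here's roughly how I'd do it with ffmpeg alone. This is a sketch that assumes your ffmpeg build includes the libx265 encoder; the /tmp path is just an example:

```shell
# Synthesize a short 1080p HEVC clip from ffmpeg's built-in test pattern
ffmpeg -v error -y -f lavfi -i testsrc2=duration=3:size=1920x1080:rate=30 \
       -c:v libx265 -preset ultrafast -crf 30 /tmp/hevc_test.mkv
# Decode it with the CPU only; the final "speed=" figure is the realtime multiple
ffmpeg -benchmark -i /tmp/hevc_test.mkv -f null - 2>&1 | grep -o 'speed=[^ ]*' | tail -n1
```

A synthetic test pattern compresses much more easily than real footage, so treat the number as a rough upper bound rather than a movie-night guarantee.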

At higher resolutions, or maybe with HDR content, you're probably going to reach a point where it can no longer keep up. That's when having a newer GPU, either dGPU or iGPU, is going to come into play.

As for YouTube, it auto-negotiates the format down to one your PC can handle. You might not get the best quality, but it's usable.
 
You guys are missing the point: one step above DVD quality is 720p video, and that will be encoded in H.264 or even H.265 and will have a really hard time playing on anything that old unless you have a GPU that can take over.
Just to add to Bituser's comment,

Built-in decoder chips have been pretty much mainstream going back to the AMD Kaveri days, and that's over a decade ago. And they were common even before that.

YouTube can stream older codecs; it doesn't all have to be H.265 while still being 720p.

A computer lacking a decoder chip is probably 15 or more years old.
 
Built-in decoder chips have been pretty much mainstream going back to the AMD Kaveri days, and that's over a decade ago. And they were common even before that.
Well, yeah... but stuff like web browsers on Linux have a rather poor track record of using them.

Here are some tips.
  • In Firefox, look at this page: about:support#media
  • In Chromium, the best way seems to be loading a page that's playing the media in question, then opening the Developer Tools and switching to the Media tab. You'll see a list of the media streams currently playing; select one and it'll show the details, including whether each stream is using hardware acceleration.
  • When using an Intel GPU, you can run sudo intel_gpu_top to see if anything is using the video-codec portion of your GPU.
  • The roughly equivalent thing for Nvidia is nvidia-smi -l 1, if you have their proprietary drivers installed.
  • Someone wrote an amdgpu_top, but I think it's not maintained by AMD and I've never tried it.
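One more check that's worth knowing: what the driver itself claims to support, as opposed to what the browser actually uses. On Intel and AMD, vainfo does this; a quick sketch, assuming the libva-utils package is installed:

```shell
# List the VA-API profiles the driver exposes for the common web codecs;
# prints a fallback message if vainfo is missing or no driver loads
vainfo 2>/dev/null | grep -iE 'VAProfile(H264|HEVC|AV1)' || echo "no VA-API decode profiles found"
```

If a codec doesn't show up here, no amount of browser flag-flipping will get it hardware-decoded.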

On my newest machine (Alder Lake with Xe iGPU) with up-to-date Ubuntu 24.04, Chromium is still decoding in software. From what I'm reading, you must still explicitly enable hardware decoding in it, on Linux. Seems like that's the case for all GPUs. In contrast, Firefox is indeed benefiting from HW acceleration, out of the box.

In media players, the way to access hardware acceleration varies. mplayer only really seems to support Nvidia (same for VLC), but there's a fork of mplayer called mpv, which I've gotten to accelerate H.265 on an Intel Xe iGPU, though not AV1 (which the hardware is also supposed to support).

A computer lacking a decoder chip is probably 15 or more years old.
Well, yeah. But, the matter of how well your driver supports that hardware and how well your media player supports the driver are significant variables. In general, AMD GPUs have excellent support on Linux, even going back to the pre-GCN days. However, I'm not sure the same can be said of their video decode acceleration. Intel GPUs likewise have excellent support, but Intel has changed around its Media SDK several times and I'm not sure how far back their VAAPI support will go.

We're kind of getting into the weeds, here. This is really a topic people are going to have to research for themselves, depending on their specific hardware setup. There's a lot of info on the web - too much to cover, here.
 
In general, AMD GPUs have excellent support on Linux, even going back to the pre-GCN days. However, I'm not sure the same can be said of their video decode acceleration.
You bring up a lot of good points throughout.

This one did stand out, though, since (it reminded me that) AMD GPUs only got whitelisted a month ago, with the release of Firefox 136, even though Firefox has supported decode on Linux as far back (AFAIK) as Firefox 78.

I had been seeing it working for such a long time I didn't even realize it was on the blacklist. The distros must have been enabling it on their own.

https://9to5linux.com/firefox-136-p...-decoding-for-amd-gpus-on-linux-vertical-tabs
 
I start high and go low: Ubuntu default, see if everything is compatible, then run a command that installs xubuntu-desktop. This is because my low-power Celeron computer actually has better compatibility with things like the gdm3 display manager than with the lighter alternatives that come with lighter distros. As it has a 1280px screen instead of 1360px, I always have to rotate the screen (weird, I know). My computer is a bit old (not really old) but is very low-powered, and I like to turn down compositing, which I find is better than going to Ubuntu with Xorg.
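The two steps above, spelled out as commands; a minimal sketch where the output name eDP-1 is a placeholder (run plain xrandr first to see what yours is called):

```shell
# Layer the Xfce session on top of stock Ubuntu
sudo apt install xubuntu-desktop
# Rotate a display; replace eDP-1 with the output name xrandr reports
xrandr --output eDP-1 --rotate left
```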
 
I once had a Pentium 4 that struggled to decode 1080p H.264 video files. But, a software decoder on a Core 2-era PC could play them with ease. And Core 2 is almost 20 years old!

DVD playback is something a Pentium II-class PC could handle.
While technically this is true (and a 486 can play CBR MP3s near 100% CPU as well; VBR, not so much), back in the day everyone got a REALmagic Hollywood Plus or Creative DXR2 hardware H.262/MPEG-2 decoder card just so the CPU wouldn't noisily run near 100% doing this, plus the CSS decryption, in software. That, along with PlexTools or another utility to keep the DVD drive from occasionally ramping to max speed while watching the movie, made for a much better viewing experience.

It was a pretty big deal when Adobe Flash first allowed hardware acceleration of streaming H.264 from YouTube, as most older GPUs with fixed-function decode hardware had only been able to accelerate playing local files.

I've tried MX Linux on Pentium 4 and A64 machines, and while everything works, it's pretty unpleasantly slow, and some of those Atoms are no better. So Core 2-level performance is about the minimum for any of these, and anything slower should probably drop to more niche distros like Puppy. Nehalem was pretty much the same cores, only without the FSB, so the big jump in performance didn't come until Sandy Bridge (a similarly large jump occurred recently between 11th and 12th gen, when Intel finally got off 14nm++++++++++).