Booting Ubuntu on an 8-bit Chip: the Lowest-end Linux PC

@iamvortigaunt

To add to that, over the last 10 years [November 2001 to November 2011] the top supercomputer on the Top500 list went from 7.2 TFLOPS to 10,510 TFLOPS. That is a 107% gain, on average, every year. If there were no purpose for a 1000x+ performance gain in a decade, why spend hundreds of millions, or billions, to achieve it?

Over the 8 years before that [November 1993 to November 2001] the average increase was only 67% per year, so it looks like they are improving these uselessly powerful supercomputers at even faster rates...
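If anyone wants to sanity-check those growth figures, the annualized rate is just (end/start)^(1/years) - 1. Here's a back-of-the-envelope sketch in C; note the 1993 starting figure (~0.124 TFLOPS) is my assumption, since only the 2001 and 2011 numbers are given above:

[code]
/* Annualized growth rate: (end/start)^(1/years) - 1.
 * Figures are Top500 #1 Rmax values in TFLOPS; the 1993
 * value is an assumption, the others are from the post. */
#include <math.h>
#include <stdio.h>

static double annual_gain(double start, double end, double years)
{
    return pow(end / start, 1.0 / years) - 1.0;
}

int main(void)
{
    /* Nov 2001 -> Nov 2011: 7.2 -> 10510 TFLOPS over 10 years */
    printf("2001-2011: %.0f%% per year\n",
           100.0 * annual_gain(7.2, 10510.0, 10.0));
    /* Nov 1993 -> Nov 2001: ~0.124 -> 7.2 TFLOPS over 8 years */
    printf("1993-2001: %.0f%% per year\n",
           100.0 * annual_gain(0.124, 7.2, 8.0));
    return 0;  /* prints roughly 107% and 66%, matching the above */
}
[/code]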



 
Well, 20 MHz to 24 MHz is overclocking by 20% -- like taking a 3 GHz machine up to 3.6 GHz. I'd guess that without any special cooling [just a heatsink and fan], that's probably as far as you want to go without serious risk of overheating and/or CPU damage...
 
@rosen380
Are you suggesting a heatsink and fan were common on 20 MHz CPUs? They weren't even common on 386s. I didn't see those become common until the 486 came out, and even then it was very often just a heatsink. Of course, you could have meant that the heatsink and fan *were* the special cooling, and you'd be right; that's damn near exotic for an 8-bit.
 
No, what I am saying is that on a modern CPU, without an upgrade from the stock cooling, you probably aren't overclocking by much more than 20% either.
 
Perfect for a ballistic nuclear missile launch with a 10-hour countdown. It gives the president time to recall the launch command in case of an "oops" decision.
 
[citation][nom]iamvortigaunt[/nom]You seem like you know what you're talking about and clearly have a better sense for the history, but I can't help but think what you're saying is nearly ridiculous. You seriously think computers did the same thing 30-40 years ago that they do today? Algorithm structure may be similar, but the speed, power efficiency, and small size of modern CPUs is so ridiculously beyond the old CPUs that many, many, many things are now possible that were not back in the day. Think about scientific modeling, 3d medical imaging, and engineering that are possible with modern computers. Think about modern automobiles that have microprocessors doing millions of calculations of various handling variables and such. Could old processors have technically executed all these instructions? Maybe. But the time it would take to do so and the power consumed in the process would make these things that are now so commonplace rare and sparsely used. Have programmers gotten less efficient? Perhaps. But to say that 30 year old processors can do anything modern ones can seems fairly absurd.[/citation]

I can see where you're coming from, but you really need to think this through. What's in the space shuttle? Hmmmm, maybe 8086s? You know we made it to the moon, right? Well, how old do you think the processors in that were? If you write code efficiently, you can do a lot more than you'd think. I mean, really think about 20 million instructions per second. Per second. Every second (on average). Think about how much work can be done with that. It's just that so much of it is wasted.

And yes, you can get prettier screens, and higher resolutions, and stuff like that. But, fundamentally, nothing has really changed much. You had microprocessors, and even full processors running a lot of mechanical devices, and even things that got us into space and the moon. The details are different, but the major point is pretty much the same.
 
>> ... the developer noted that the system is "somewhat usable". Typed commands deliver replies within a minute,

I guess his definition of "usable" is significantly different from most people's, including mine.
 
[citation][nom]TA152H[/nom]I can see where you're coming from, but you really need to think this through. What's in the space shuttle? Hmmmm, maybe 8086s? You know we made it to the moon, right? Well, how old do you think the processors in that were? If you write code efficiently, you can do a lot more than you'd think. I mean, really think about 20 million instructions per second. Per second. Every second (on average). Think about how much work can be done with that. It's just that so much of it is wasted. And yes, you can get prettier screens, and higher resolutions, and stuff like that. But, fundamentally, nothing has really changed much. You had microprocessors, and even full processors running a lot of mechanical devices, and even things that got us into space and the moon. The details are different, but the major point is pretty much the same.[/citation]
There are still efficient coders out there. Embedded solutions often deal with 32K (or less) EEPROM and they get the job done.
 
"There are still efficient coders out there. Embedded solutions often deal with 32K (or less) EEPROM and they get the job done."

But in those cases, aren't we talking about single-purpose machines, rather than the multi-purpose computers that our 3+ GHz CPUs are in?

The TI-85 and TI-86 graphing calculators only had 6 MHz CPUs, and that works because the scope of their function is pretty limited. Short of being reprogrammed to do other things, there would be limited benefit in putting a 3 GHz CPU in there.

 
Yes, but don't bother with the video. The constantly moving camera, with either the video output or the clock cut out of frame, makes for an extremely irritating watch. I can't understand why the camera couldn't have been positioned correctly before recording began, deciding once and for all what could be left out of frame.
 
[citation][nom]iamvortigaunt[/nom](you posted this to someone else) You seriously think computers did the same thing 30-40 years ago that they do today?~~ But to say that 30 year old processors can do anything modern ones can seems fairly absurd.[/citation]
True on the absurd part... hence, today's toaster ovens were yesterday's mainframes.

But yes, in general, what I did on my computer in the late 1980s is the same as what I do today, minus all the programming and know-how it took to actually make things work back then. Word processing, playing games, playing music, painting... Things are fancier, but it's still dots on a screen that we click on to move around.
 
1 MIPS = 1 VAX-11/780. Back in the day (i.e. 1978 onward) that was the HIGH-end computer to run UNIX on -- UNIX had been developed on 16-bit machines with far less power and 64K or less of RAM. We had Berkeley BSD (3BSD) running very nicely on one in 1981.

Early Linux ran on the Intel 386 -- it wouldn't run on anything less -- and even those chips needed several clock cycles per instruction.

8-bit? Not possible natively. However, even early "8-bit" processors were really 8/16 -- i.e. 16-bit addressing -- and with some extra logic you could access more than 64K, which I guess would be enough to run an early Linux. But there's no virtual memory (not really supported until the 386, or the 68010 with an external MMU), and certainly no running a current Ubuntu distribution.
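For anyone who hasn't seen that "extra logic" trick: it's classic bank switching, where an external latch selects which chunk of a larger RAM appears in part of the CPU's fixed 64K view. A minimal sketch in C, with illustrative sizes and names rather than any specific machine's layout:

[code]
/* Bank switching sketch: a CPU with 16-bit addresses sees a fixed
 * 48K region plus a 16K window; an external bank register chooses
 * which of 32 banks (512K total) appears in that window. All sizes
 * and names here are illustrative assumptions. */
#include <stdint.h>

#define WINDOW_BASE 0xC000u               /* banked window: 0xC000-0xFFFF */
#define BANK_SIZE   0x4000u               /* 16K per bank */
#define NUM_BANKS   32u                   /* 32 x 16K = 512K of real RAM */

static uint8_t fixed_ram[WINDOW_BASE];         /* always-visible low 48K */
static uint8_t banked[NUM_BANKS][BANK_SIZE];   /* the extra memory */
static uint8_t bank_reg;                       /* the "extra logic" latch */

void set_bank(uint8_t b) { bank_reg = b % NUM_BANKS; }

/* What the CPU sees at a given 16-bit address. */
uint8_t mem_read(uint16_t addr)
{
    if (addr < WINDOW_BASE)
        return fixed_ram[addr];                   /* fixed region */
    return banked[bank_reg][addr - WINDOW_BASE];  /* banked region */
}
[/code]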

However ... emulation ... if it can be made to address enough memory, Turing says any Turing-complete machine can emulate any other...
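And that emulation point is exactly what this hack does: the 8-bit AVR interprets 32-bit ARM code one instruction at a time. The core of any such emulator is a fetch-decode-execute loop; here's a minimal sketch with a made-up toy instruction set (the opcodes and encoding are invented for illustration, not taken from the actual AVR/ARM emulator):

[code]
/* Fetch-decode-execute loop of a toy CPU emulator. The instruction
 * format here (top byte = opcode) is invented for illustration. */
#include <stdint.h>

enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2 };   /* toy opcodes */

typedef struct {
    uint32_t pc;        /* guest program counter */
    uint32_t reg[16];   /* guest registers */
    uint8_t *mem;       /* guest memory (external RAM on the real hack) */
} guest_cpu;

/* Fetch one little-endian 32-bit instruction word. */
static uint32_t fetch32(guest_cpu *c)
{
    uint32_t w = (uint32_t)c->mem[c->pc]           |
                 (uint32_t)c->mem[c->pc + 1] << 8  |
                 (uint32_t)c->mem[c->pc + 2] << 16 |
                 (uint32_t)c->mem[c->pc + 3] << 24;
    c->pc += 4;
    return w;
}

void run(guest_cpu *c)
{
    for (;;) {
        uint32_t insn = fetch32(c);              /* fetch */
        uint8_t  op   = (uint8_t)(insn >> 24);   /* decode */
        switch (op) {                            /* execute */
        case OP_LOAD:  /* rD = 20-bit immediate */
            c->reg[(insn >> 20) & 0xF] = insn & 0xFFFFFu;
            break;
        case OP_ADD:   /* rD += rS */
            c->reg[(insn >> 20) & 0xF] += c->reg[(insn >> 16) & 0xF];
            break;
        case OP_HALT:
        default:
            return;
        }
    }
}
[/code]

Every guest instruction costs the host many of its own, which is why the real thing takes hours to boot and about a minute to answer a typed command.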
 