Discussion: When Did It Begin? 386 to i9-13900KS

I kept my Atari 1040STE as my main home PC until my first 386. By that time it was equipped with extended RAM, 2 x 20 MB HDDs, an extra floppy drive and a data CD-ROM, as well as printers and assorted peripherals. I also installed a CPU adapter with a BIOS chip from Apple (I forget which one) and a software DOS emulator which ran DOS faster than 286 PCs. Its OS, Atari TOS (Tramiel Operating System), was far more advanced than Windows 1 and 2.
I hung on to it for a couple of years after the 386, using it for games and homework with a 3D CAD package which was compatible with AutoCAD but much faster.
 

SunMaster

Commendable
Apr 19, 2022
I don't recall the size of the hard drive or if it even had one. It might not have... I know I booted the system off Workbench floppies... :ROFLMAO:

The A500 came with an integrated floppy drive - getting 880 KB out of a double-sided 3.5" disk (which PCs formatted to 720 KB). A hard drive option wasn't available until a fair bit later in the A500's life.
That system was amazing for the day. If only Commodore had done better in marketing...
The Atari ST was in some ways even better, running at 8 MHz. The synthesizer was good too. But it lacked software.

I also had an ST - a 520+ (in practice identical to the 1040). While the ST was great in many ways, it was crushed by the Amiga in almost every way. The custom chips Agnus, Paula and Denise, in addition to a multitasking OS, made the Amiga quite formidable. The ST had a good blitter though - and a better synth chip than most home computers.
 

punkncat

Polypheme
Ambassador
Having lived through those years, I don't miss it at all.


Me neither.

I enjoy the simplicity and power of a modern system, and much of what has become possible with one today.

My first system took minutes just to be ready to try to get "online", which was nothing like today's internet. Cradle modems, (cassette tape) drives, cartridges....
 

Exploding PSU

Honorable
Jul 17, 2018
Love this retro stuff.

1984

View: https://youtu.be/Mu8zyHh4vTg



1987

View: https://youtu.be/VsE0BwQ3l8U

I actually didn’t get my first Windows PC till the Pentium 75 in 1997. I started with the Amiga 500 in 1988… after using an Apple IIe in grade school.

Ah, Computer Chronicles. The show is way before my time, but I recall finding it in high school, when it was still freshly uploaded to YouTube, and watching the heck out of it. I'm sure I've watched every episode uploaded by that channel, multiple times. For a while, Cheifet's voice was my lullaby before sleep.
 

Deleted member 2838871

Guest
Ah, Computer Chronicles. The show is way before my time, but I recall finding it in high school, when it was still freshly uploaded to YouTube, and watching the heck out of it. I'm sure I've watched every episode uploaded by that channel, multiple times. For a while, Cheifet's voice was my lullaby before sleep.

It's an amazing trip down memory lane. Gary Kildall was a genius... I didn't realize how important he was to the computer industry as a whole.

I haven't watched every episode yet... but I'm getting there. One of my favorite episodes is this one from 2001.

View: https://www.youtube.com/watch?v=26DPpaG5haE&t=348s


It's a favorite because I ran that Chameleon demo over and over when I built my AMD Athlon XP 1800+ system with GeForce 3... the card they are demonstrating.

My new AMD system is my first since that one. I've been running Intel for 20 years, but now it was time to swap. I still have that Chameleon demo on this PC... along with the Wolfman demo that came out with the GeForce 4 series. (y)
 
Programs which could run in 1 MB of RAM on a 386 in the '90s now require many gigabytes of RAM.
Which obviously isn't true.

One random example I found looking at a magazine from back then: Samna Ami, apparently the first Windows-based word processor. Can't work on multiple documents at the same time. Can't show the bottom of one page at the same time as the top of the next. No tables. No footnotes. No thesaurus. Price equivalent to over $450 today, by the way.

Name me one such program that could run in 1 MB of RAM in the '90s that now requires many GB and has no extra features.
 

jnjnilson6

Distinguished
Which obviously isn't true.

One random example I found looking at a magazine from back then: Samna Ami, apparently the first Windows-based word processor. Can't work on multiple documents at the same time. Can't show the bottom of one page at the same time as the top of the next. No tables. No footnotes. No thesaurus. Price equivalent to over $450 today, by the way.

Name me one such program that could run in 1 MB of RAM in the '90s that now requires many GB and has no extra features.
There was a lot of software back in the day, like Borland C++, and furthermore architectural software, financial and financial-management software, data and knowledge bases, compilers, and software doing mathematical calculations which could only be solved with the power of a computer back then. Today there's software, surely graphically enhanced, which does the exact same things and requires thousands of times the resources. Software could be written tens of thousands of times shorter if there were a beacon of smart light over the minds of current programmers, and very complex tasks could get done with minimalistic resources. Big, badly written structures, dragging along unneeded code, can use 16 GB of RAM for software which could do the same things in 6 MB if written by somebody who has a great knowledge of programming, keeps to a clean, smart and stable structure void of anything unnecessary, and uses logic for purposes which otherwise leak out from the edge of the cup.
 
So no example then. Just vague erroneous claims that "software now does the exact same things as back then".

Big, badly written structures, dragging along unneeded code, can use 16 GB of RAM for software which could do the same things in 6 MB if written ...
Ignoring the irony of all these claims being made by using twenty words where two would do, this is also obviously not true.

LibreOffice has minimum system requirements of 0.25 GB, recommends 0.5 GB, and will usually use more. Its code is open source and free for anyone to read, suggest changes to, or even fork, and yet still nobody has reduced its memory requirement by the factor of 100 that you're convinced is so possible.
 

jnjnilson6

Distinguished
So no example then. Just vague erroneous claims that "software now does the exact same things as back then".


Ignoring the irony of all these claims being made by using twenty words where two would do, this is also obviously not true.

LibreOffice has minimum system requirements of 0.25 GB, recommends 0.5 GB, and will usually use more. Its code is open source and free for anyone to read, suggest changes to, or even fork, and yet still nobody has reduced its memory requirement by the factor of 100 that you're convinced is so possible.
A wise man once said,

I may not agree with you, but I will defend to the death your right to make an -- of yourself.
 

jnjnilson6

Distinguished
So no example then. Just vague erroneous claims that "software now does the exact same things as back then".


Ignoring the irony of all these claims being made by using twenty words where two would do, this is also obviously not true.

LibreOffice has minimum system requirements of 0.25 GB, recommends 0.5 GB, and will usually use more. Its code is open source and free for anyone to read, suggest changes to, or even fork, and yet still nobody has reduced its memory requirement by the factor of 100 that you're convinced is so possible.
Let's be reasonable here... I think we are both grown men, and, perhaps, we have both acted out of spite initially, yet sometimes, in some spheres, there are points which you may not argue on indefinitely. So it's better to just switch the topic.

I hope, one day, you'll be sitting on a wonderful, flower-strewn terrace, watching the fields roll underneath a misty sky, cup in hand, reminiscing about the ghostly melody of mellow days and the fallen beauty of redolent, unchronicled hours... I hope you unravel good revelations in the day, stemming back to the nuances of character most potent within languorous, indolent reveries... And that this conversation appears diamondlike through your dreams and that with the negativity of nothingness is appended the beauty of having known a conversation worthwhile, like within the Philosophies of Kant and the softer words of Classical authors...

Reasoning in flowery hours
The Processors of expiated days
And swiftly addressing the sours
A conversation, swiftly, moodily, empowers
A sway of prose in words benign, unheard
And with the day again we'd fathom out in newer words
A handshake for undoubtful reveries
Now past... Now past... And gone, unfortunate distress
To brood upon, alas...

The poetry and prose are by myself. Have a good day!
 
May 19, 2023
When did things change? Well, being a programmer myself, I have been analyzing this for years. I think there are multiple places and mindsets that have enabled this terrible realm of computer programming. I have programmed on a 14 MHz 8086 processor before. I had a massive 192 KB of RAM, a huge 1.2 MB of drive space, and I was limited to 64 KB file sizes. I created a clone of one of my favorite video games from the 8-bit monochrome Game Boy. I did it just for fun. I had to write it in C, and I had to learn how to write efficient code to do it.



In college, in the early 2000s, they were just teaching the kids to copy and paste routines and make minor modifications to the copies. This is probably one of the mindsets that played a role in creating bloated code. I tried that method in my game and quickly ran out of space and RAM. So bad methodology in the schools is one of the issues.



Then there was object-oriented programming. It really seems like a novel idea, and I think it works well with multithreading, but not with a non-multithreaded routine. When I programmed in C, you just created the variables you needed and freed them from memory when you were done with them. With objects, you load the classes down with endless variables and routines you may or may not use. Then you initialize an object, which loads a ton of worthless code and variables into RAM that you never use. So how much RAM is used? Maybe a few MB per object.



So imagine an FPS game where each laser is an object. If you need to create 1,000 lasers, both on and off the screen, you now have 1,000 laser objects which take a good few GB of RAM. In the old days, you would just have a 2D array (matrix) which took maybe 5 to 10 KB of RAM to keep track of all the lasers, and you would loop through that array to draw them all. Today, you have to loop through 1,000 laser objects and process a few GB of data. See the inefficiency?
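In rough C terms, the old-school approach is just a flat pool of structs - a minimal sketch, with the field names and pool size made up for illustration:

Code:
#include <stdbool.h>

#define MAX_LASERS 1000

/* Illustrative laser record: a handful of plain fields, no
   per-object heap allocation, no hidden class machinery. */
typedef struct {
    float x, y;   /* position on screen */
    float dx, dy; /* velocity per frame */
    bool  active; /* is this slot in use? */
} Laser;

/* The whole pool is one contiguous, statically allocated block:
   1000 lasers at roughly 20 bytes each is about 20 KB total. */
static Laser lasers[MAX_LASERS];

/* Advance every live laser: one tight loop over flat memory. */
void update_lasers(void) {
    for (int i = 0; i < MAX_LASERS; i++) {
        if (!lasers[i].active)
            continue;
        lasers[i].x += lasers[i].dx;
        lasers[i].y += lasers[i].dy;
    }
}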



Schools claim that object-oriented programming (OOP) languages are more secure than non-OOP ones. However, Linux, Unix, Windows, and macOS are all built on non-OOP languages, usually C and assembly. The most secure OS kernel in the world is written in C and assembly. So the colleges are lying to people. This mindset is another reason why things have gone downhill.



.NET is another disaster in the programming world. It has MASSIVE overhead. All of the .NET languages are required to communicate with one another and share variables and routines, and this requires a large overhead. Windows XP was relatively efficient: it took 59 MB of RAM to run the OS on a fresh install. I could get by with 128 MB of RAM at a minimum for XP and my programs. When I had 1 GB of RAM, I could only reach 800 MB of usage when I was doing video editing; otherwise, I could not go above 800 MB.



Now look at Windows Vista and 7. They required close to 1 GB of RAM just to run. Why is this? Because Microsoft rebuilt much of the XP-era code around .NET when creating Longhorn (which shipped as Vista), and Windows 7 was then built on the Vista codebase. This shows the slowdown the choice of language brings. Windows XP was made from various languages, like C, C++, VB, ASM, and some others I don't remember.



When I tried to port my 8086 game from C to C# .NET, it required 10,000 times the processing power and 10,000 times the RAM to run at the same speed…



Back to the schools: they keep using the same old "industry standards" for teaching people how to code. They don't teach the kids to find a better way to build an algorithm; they say "this is the way it's done, so always code it this way." For example, for collision detection of sprites, they would teach a kid to test each side of the sprite to see if something entered into that side. However, this requires something like 9 checks (it's been a while since I looked at it). I used this in my 8086 game, and my game ran slow. So I did the reverse: I checked whether the sprite did NOT collide with anything, which took something like 4 checks. My game sped up enormously due to this.
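The "reverse" trick described above matches the classic bounding-box test: instead of checking all the ways two sprites can touch, check the few ways they can't and negate. A minimal C sketch (the Rect type and field names are made up for illustration):

Code:
/* Illustrative sprite bounding box. */
typedef struct {
    int x, y; /* top-left corner */
    int w, h; /* width and height */
} Rect;

/* Returns 1 if a and b overlap, 0 if not. Four early-out
   comparisons cover every way the sprites can miss each other:
   a entirely left of, right of, above, or below b. */
int rects_overlap(const Rect *a, const Rect *b)
{
    if (a->x + a->w <= b->x) return 0; /* a is left of b  */
    if (b->x + b->w <= a->x) return 0; /* a is right of b */
    if (a->y + a->h <= b->y) return 0; /* a is above b    */
    if (b->y + b->h <= a->y) return 0; /* a is below b    */
    return 1; /* no separating gap, so they collide */
}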



They don't teach you to count CPU cycles anymore. Each check and computation costs at least one CPU cycle, and there is a limit to how many CPU cycles there are per second. Just a minor change in a loop can make the difference between a stock market trader making a million and losing one. They don't teach this stuff.
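A generic, textbook illustration of the kind of minor loop change meant here (not from the post itself) is hoisting invariant work out of a loop:

Code:
/* Before: the divide runs on every iteration. */
void scale_slow(double *out, const double *in, int n, double pct)
{
    for (int i = 0; i < n; i++)
        out[i] = in[i] * pct / 100.0;
}

/* After: the loop-invariant divide is hoisted out, leaving a
   single multiply per iteration. */
void scale_fast(double *out, const double *in, int n, double pct)
{
    double k = pct / 100.0;
    for (int i = 0; i < n; i++)
        out[i] = in[i] * k;
}

A modern optimizing compiler will often do this particular hoist by itself, but the habit of thinking about per-iteration cost is the point.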



Schools teach people to use someone else's code: look it up on the internet and copy it! So you have tons of people who just look up code, copy it into their program, and make minor adjustments for their needs. You lose your ability to think critically and create efficient code, and the kids are never taught to write efficient code.



To further extend this, there are add-on modules for Visual Studio which come with hundreds or thousands of pre-coded functions. All you have to do is drag and drop these functions into your program and it will work. When I worked at Microsoft, I knew a guy who did this. He didn't know anything about programming; he just dragged and dropped the modules he needed and would pop out programs for his clients within a few days.



Now for the further complication: when you just reuse code over and over, it becomes really inefficient, especially in .NET. .NET is object-oriented, so tons of objects are being called, and objects call other objects, which call other objects, and so on. Assume each object requires about 10 MB of RAM. A programmer initializes a single object, and that object is linked to about 9 others, so 100 MB of RAM is used to call a single object - and most of that is wasted code that will never run. If you need another copy of that object, you initialize a new one, and that's another 100 MB. This is how a lot of programs operate these days. It may not be an exaggerated 100 MB of RAM per object, but all of these little things add up.
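In C terms, that cascade is the difference between eager and lazy construction - a minimal sketch, with the subsystem names and sizes made up for illustration (error checks omitted for brevity):

Code:
#include <stdlib.h>

/* Stand-ins for heavyweight subsystems (sizes are invented). */
typedef struct { char buf[1 << 20]; } Audio;
typedef struct { char buf[1 << 20]; } Network;

typedef struct {
    Audio   *audio;
    Network *net;
} App;

/* Eager style: constructing one App immediately drags in every
   linked subsystem, used or not. */
App *app_create_eager(void)
{
    App *a = malloc(sizeof *a);
    a->audio = malloc(sizeof *a->audio);
    a->net   = malloc(sizeof *a->net);
    return a;
}

/* Lazy style: start empty; each subsystem is allocated only on
   first use, so unused dependencies cost nothing. */
App *app_create_lazy(void)
{
    return calloc(1, sizeof(App));
}

Audio *app_audio(App *a)
{
    if (!a->audio)
        a->audio = malloc(sizeof *a->audio);
    return a->audio;
}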



Then schools teach kids to create objects instead of global variables, so you end up with the worthless overhead of objects instead of single variables.



The industry is a major factor. They tell you to get your program out as fast as possible. It is all about money and greed. They do not want you to spend time optimizing; just throw it together using some framework so they can sell it.



Another is the mindset of not caring about optimization. Some people have the mindset that it is just better to buy faster, more powerful hardware to make things run better. There are arguments over whether it costs more to buy new computers than to make the program more efficient. Money is a really large factor in inefficient programs.



Frameworks are an issue too. They can be really nice to have, and they can help you complete your work really fast. But frameworks eventually become bloated, because they try to do everything. The general public keeps asking for features, and the creators of the frameworks add them. After a few years, the frameworks are polished, look nice, run smooth - and are bloated and inefficient.



I can remember looking for a JavaScript calendar for my website. I eventually ran into jQuery. All the web pages told me to download jQuery (1 MB+ in size) and use it, along with their 100+ lines of code, to make it work. It didn't give me exactly what I wanted, so I went to work making my own using dropdowns. It took me 10 lines of code, using 28 bytes of space. But people wanted me to use a framework with 5,000 times the code and 35,000 times the size. Yeah… no. And now, jQuery is on its way out.



We have reached the point where it is better to reinvent the wheel (programming-wise) than to use someone else's Frankenstein alien morph of X people's different attempts at improving the previous person's code.
 