Discussion: When Did It Begin? 386 to i9-13900KS

Status
Not open for further replies.

jnjnilson6

Distinguished
When do you think that moment first began to take shape, inadvertently at first, later massively and boldly, setting hardware and software development onto their exponential cadence?

Say, in the early 1990s there was a lot of software doing much the same things today's software does. Yet you could not play back good-quality video, run demanding games, or create complex, lifelike animations on the average home computer.

Around 2003-2005, I suppose, came the time when hardware began accelerating massively, with some personal computers carrying up to 1 GB of RAM, plenty of storage, and processors exceeding 3 GHz in frequency. I believe that was the point at which hardware began climbing the ladder of ever-faster development and started surpassing the requirements of software. Only a decade before that it was precisely the other way around.

Now, what we are facing today, at the beginning of the 2020s, is that hardware is extremely powerful, more powerful than it has ever been. There are home computers harboring 16,000 cores of video processing power and CPUs with 32 cores running at incredible speeds for the average user. A new phenomenon has appeared, however. Software has been getting greedy, incredibly greedy, with this increase in performance. Programs which could run on a 386 with 1 MB of RAM in the '90s now require many gigabytes of RAM. That is because, instead of writing software smartly and cleanly, programmers write impossibly big structures which are fully understood by only a few people, structures which end up producing innumerable errors if any one segment of them is touched. Now, the old-timers, the good old programmers, know that this comes down to impractical thinking and programming; the same software could easily be written better and maintained more easily, with the core not being touched at all by the peripheral changes which, in the previous sentence, proved so injurious. This is all because the generations have shifted and the new programmers haven't the capacity and capabilities of those born in the '70s and before.

So we've gone from having software run on endlessly slow hardware to having endlessly fast hardware and badly written software taking up hundreds of times more resources than it should. And where does this leave us? What's bound to happen in the future? Going from 386 CPUs up to the Core i9-13900KS took us quite some time; so did going from practically written software to software written in a way which proves we've forgotten the very basics, the root from which knowledge springs.

Do write up your thoughts.

Thank you!
 

DSzymborski

Curmudgeon Pursuivant
Moderator
The Industrial Revolution.

Untitled.png
 
When do you think that moment first began to take shape, inadvertently at first, later massively and boldly, setting hardware and software development onto their exponential cadence?

Say, in the early 1990s there was a lot of software doing much the same things today's software does. Yet you could not play back good-quality video, run demanding games, or create complex, lifelike animations on the average home computer.

Around 2003-2005, I suppose, came the time when hardware began accelerating massively, with some personal computers carrying up to 1 GB of RAM, plenty of storage, and processors exceeding 3 GHz in frequency. I believe that was the point at which hardware began climbing the ladder of ever-faster development and started surpassing the requirements of software. Only a decade before that it was precisely the other way around.

Now, what we are facing today, at the beginning of the 2020s, is that hardware is extremely powerful, more powerful than it has ever been. There are home computers harboring 16,000 cores of video processing power and CPUs with 32 cores running at incredible speeds for the average user. A new phenomenon has appeared, however. Software has been getting greedy, incredibly greedy, with this increase in performance. Programs which could run on a 386 with 1 MB of RAM in the '90s now require many gigabytes of RAM. That is because, instead of writing software smartly and cleanly, programmers write impossibly big structures which are fully understood by only a few people, structures which end up producing innumerable errors if any one segment of them is touched. Now, the old-timers, the good old programmers, know that this comes down to impractical thinking and programming; the same software could easily be written better and maintained more easily, with the core not being touched at all by the peripheral changes which, in the previous sentence, proved so injurious. This is all because the generations have shifted and the new programmers haven't the capacity and capabilities of those born in the '70s and before.

So we've gone from having software run on endlessly slow hardware to having endlessly fast hardware and badly written software taking up hundreds of times more resources than it should. And where does this leave us? What's bound to happen in the future? Going from 386 CPUs up to the Core i9-13900KS took us quite some time; so did going from practically written software to software written in a way which proves we've forgotten the very basics, the root from which knowledge springs.

Do write up your thoughts.

Thank you!
Modern PC life didn't start with the 386 but with the 8088 and 8086, while they were still competing with Motorola and Zilog processors. It was IBM that decided the outcome by choosing the upstart Intel, which standardized the platform for software. IBM, a "serious business company" with its own mainframe computers, stood opposite the "toy" computers, and its machines helped develop business software. That's the base everything is built on even nowadays; even the newest processors are still "x86," in reference to the 8x86 architecture. It's business: once the potential was realized, hardware and software development had to go hand in hand to advance at speed.
 
This is all because the generations have shifted and the new programmers haven't the capacity and capabilities of those born in the '70s and before.
I doubt that very much. Back in the '70s, everybody who used a PC had to be an expert at PCs; any computer would come with manuals as thick as phone books (do phone books even exist anymore?!). Back then, half of what a program did was down to the end user typing in huge commands or data sets; now most people who use a PC barely know how to click an icon, and that's the most they know, so software has to compensate for a lot of missing user knowledge.

Sure, there is also software that wastes a lot of space and resources just to be flashy, but that is also part of what I'm saying: it's not targeting pros anymore, it's targeting the randoms who think that flashy must equal well made.

If you look at today's software that is made for expert PC users, it's just as efficient, and just as difficult to use, as software from the '70s.
There is a long list of software, much of it freeware, that I bet most of us here use and keep installed on our systems, that is very small in size and still does a lot of work.
 
I doubt that very much. Back in the '70s, everybody who used a PC had to be an expert at PCs; any computer would come with manuals as thick as phone books (do phone books even exist anymore?!). Back then, half of what a program did was down to the end user typing in huge commands or data sets; now most people who use a PC barely know how to click an icon, and that's the most they know, so software has to compensate for a lot of missing user knowledge.

Sure, there is also software that wastes a lot of space and resources just to be flashy, but that is also part of what I'm saying: it's not targeting pros anymore, it's targeting the randoms who think that flashy must equal well made.

If you look at today's software that is made for expert PC users, it's just as efficient, and just as difficult to use, as software from the '70s.
There is a long list of software, much of it freeware, that I bet most of us here use and keep installed on our systems, that is very small in size and still does a lot of work.
Software should take advantage of hardware advancements and vice versa. Hardware, OS, and software are one whole; with any part missing you end up with a brick.
 

Deleted member 2838871

Guest
Having lived through those years, I don't miss it at all.

Hahah… yeah, when I said I love it I was talking about the trip down memory lane. 😂😂

Can’t say I miss the hardware all that much compared to today obviously… but man… the Amiga was such a great machine for the time period.

One of my earliest computing memories was doing cartwheels when I upgraded from a 2400 baud to a 9600 baud modem because it made BBSing so much faster!
 

jnjnilson6

Distinguished
Modern PC life didn't start with the 386 but with the 8088 and 8086, while they were still competing with Motorola and Zilog processors. It was IBM that decided the outcome by choosing the upstart Intel, which standardized the platform for software. IBM, a "serious business company" with its own mainframe computers, stood opposite the "toy" computers, and its machines helped develop business software. That's the base everything is built on even nowadays; even the newest processors are still "x86," in reference to the 8x86 architecture. It's business: once the potential was realized, hardware and software development had to go hand in hand to advance at speed.
That's very true; even 8-bit machines could perform computations completely impossible with pen and paper, and do great things in the right hands.

The 286/386 era, maybe leaning a little toward the 386 side, was when the above-mentioned CPUs came to seem laughably slow and inconvenient. The 386s were monsters in comparison, widening the horizons almost indefinitely; it was a huge leap ahead.
 
It seems really strange that right after the date you believe hardware started massively accelerating is when the gains per generation actually dropped to single-digit percentage increases after Core 2, where before they had approximately doubled. Not that it mattered, since software stagnated so badly.

Yes, you used to need a massive boost in hardware to install the very next generation of Windows, but then the stated Windows system requirements stayed exactly the same from Windows 7 all the way through Windows 10, and we got used to that. Now look at all the gnashing of teeth over Windows 11 finally requiring newer hardware.

Few people nowadays realize a 386 couldn't even play back an .mp3 file (while a 486 could, at 100% CPU load), so they were more comparable to an Arduino than to a Core 2, which can run Windows 10 (and even 11) perfectly adequately. So hardware used to improve much more quickly back then than it does now; it's just that software quit improving even more.
 

USAFRet

Titan
Moderator
Few people nowadays realize a 386 couldn't even play back an .mp3 file (while a 486 could, at 100% CPU load), so they were more comparable to an Arduino than to a Core 2, which can run Windows 10 (and even 11) perfectly adequately. So hardware used to improve much more quickly back then than it does now; it's just that software quit improving even more.
Ray tracing:

486: a single frame @ 640x480 = "Come back in the morning"

Today: 60 frames per second @ 1080p
 

jnjnilson6

Distinguished
It seems really strange that right after the date you believe hardware started massively accelerating is when the gains per generation actually dropped to single-digit percentage increases after Core 2, where before they had approximately doubled. Not that it mattered, since software stagnated so badly.

Yes, you used to need a massive boost in hardware to install the very next generation of Windows, but then the stated Windows system requirements stayed exactly the same from Windows 7 all the way through Windows 10, and we got used to that. Now look at all the gnashing of teeth over Windows 11 finally requiring newer hardware.

Few people nowadays realize a 386 couldn't even play back an .mp3 file (while a 486 could, at 100% CPU load), so they were more comparable to an Arduino than to a Core 2, which can run Windows 10 (and even 11) perfectly adequately. So hardware used to improve much more quickly back then than it does now; it's just that software quit improving even more.
Below is a picture of the performance of one of the fastest mobile CPUs of 2003 (Pentium M 1.6 GHz) next to that of one of the fastest mobile CPUs of 2023 (Core i7-13700H).

Screenshot-2023-05-28-203622.png


If we take the Avg. Multi Core Speed, the difference between the CPUs is x141.13. That's the difference 20 years makes.

Now let's go back in time 21 years further, to 1982 (because Intel released no CPUs in 1983, so the 286 from 1982 was the fastest for both years), and take a look at the 286. The models released that year ran at 12, 10, and 6 MHz. Let's take the 12 MHz one.

Now, let's divide the Pentium M's 1600 MHz by 141.13. We get 11.33 MHz. Surely the 11.33 MHz of a Pentium M would be much faster than the 12 MHz of the 286 from 1982, because of the newer computational technologies embedded, yet it does seem that we have been advancing at roughly the same pace over the last 20 years as over the 21 years before that, at least in terms of CPUs.
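For anyone who wants to check the arithmetic, here is a minimal sketch of it in Python; the x141.13 ratio and the clock figures are simply the numbers quoted above, and the compound yearly rate at the end is just another way of reading the same ratio.

```python
# Rough arithmetic behind the comparison above (all figures are the ones quoted in the post).
multi_core_ratio = 141.13   # i7-13700H vs. Pentium M 1.6 GHz, avg. multi-core score
pentium_m_mhz = 1600        # Pentium M clock in MHz
i286_mhz = 12               # fastest 286 model released in 1982

# Scale the Pentium M's clock down by the 20-year performance ratio.
equivalent_mhz = pentium_m_mhz / multi_core_ratio
print(f"Pentium M scaled down by the ratio: {equivalent_mhz:.2f} MHz (vs. the 286's {i286_mhz} MHz)")

# The same x141.13 expressed as a compound yearly improvement over 20 years.
yearly_growth = multi_core_ratio ** (1 / 20)
print(f"Implied average yearly speed-up: x{yearly_growth:.2f}")  # roughly x1.28, i.e. ~28% per year
```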
 

jnjnilson6

Distinguished
It seems really strange that right after the date you believe hardware started massively accelerating is when the gains per generation actually dropped to single-digit percentage increases after Core 2, where before they had approximately doubled. Not that it mattered, since software stagnated so badly.

Yes, you used to need a massive boost in hardware to install the very next generation of Windows, but then the stated Windows system requirements stayed exactly the same from Windows 7 all the way through Windows 10, and we got used to that. Now look at all the gnashing of teeth over Windows 11 finally requiring newer hardware.

Few people nowadays realize a 386 couldn't even play back an .mp3 file (while a 486 could, at 100% CPU load), so they were more comparable to an Arduino than to a Core 2, which can run Windows 10 (and even 11) perfectly adequately. So hardware used to improve much more quickly back then than it does now; it's just that software quit improving even more.
Thank you for the enlightening comment. :) It surely provides a good foundation of erudition about the previous years.
 
it does seem that we have been advancing at roughly the same pace over the last 20 years as over the 21 years before that, at least in terms of CPUs.
By time, sure, I can agree with this, as you can see it in the Moore's Law line. In the olden days we just didn't get a new generation every year, so the generational changes could be massive, for example the jump from P4 to Core 2. After that, the per-generation improvement was smaller, as generations arrived much more frequently than before.

BTW, your Pentium M example was pretty much a Pentium III core with extra L2, married to the Pentium 4 FSB. NetBurst turned out not to scale easily to 10 GHz as expected, so Intel went back to the past and dusted off the PIII to ride again in low-power ultraportable laptops, as it was 24 W maximum. In comparison, your i7-13700H example can draw 115 W for up to 28 seconds at max Turbo, nearly 5x as much. So the Pentium M was more akin to today's Raptor Lake-U, since at the time all the large performance laptops had up to 89 W desktop Pentium 4s in them.
 

iTRiP

Honorable
When did it begin? I don't really know, and that part doesn't bother me too much. What I do care about is that this whole thing, occurrence, or otherwise business got me into something that can only be described as never-ending, and gave me purpose to venture out, explore, and master any part of it as I saw fit. With much hope, may it never become too expensive, fade away into something boring, become something the world we live in doesn't have the power for, or the thing we don't have the time or energy to further improve upon, for the end result of maximum pleasure and accomplishment.
 

jnjnilson6

Distinguished
By time, sure, I can agree with this, as you can see it in the Moore's Law line. In the olden days we just didn't get a new generation every year, so the generational changes could be massive, for example the jump from P4 to Core 2. After that, the per-generation improvement was smaller, as generations arrived much more frequently than before.

BTW, your Pentium M example was pretty much a Pentium III core with extra L2, married to the Pentium 4 FSB. NetBurst turned out not to scale easily to 10 GHz as expected, so Intel went back to the past and dusted off the PIII to ride again in low-power ultraportable laptops, as it was 24 W maximum. In comparison, your i7-13700H example can draw 115 W for up to 28 seconds at max Turbo, nearly 5x as much. So the Pentium M was more akin to today's Raptor Lake-U, since at the time all the large performance laptops had up to 89 W desktop Pentium 4s in them.
That's very true. I had a Celeron Tualatin at 1.3 GHz (overclockable to 1.5 GHz) myself.

However, back in 2003 there was an HP Compaq nx7000 with a Pentium M 1.6 GHz at home. It was extremely expensive, and as stated in reviews it brought the absolute highest enthusiast performance possible; the only downside making it inaccessible to most people was the exorbitant price, though for that price you got to ride the crest of technology with the best of the best.

'The nx7000 is the best notebook computer I have seen in a long time, but as is usually the case in instances like this, all this quality comes at a price and this machine is far from cheap. With a retail price just under £2,000 you need to have pretty deep pockets to even consider the nx7000. That said, if you’ve got the money and need a fully featured mobile computer with a high-resolution display, you’d be hard pushed to find a better one than this.'

The fastest Pentium M in 2003 ran at 1.7 GHz, and I do think those CPUs, for their year, corresponded to the highest-end mobile chips of today.
 

jnjnilson6

Distinguished
By time, sure, I can agree with this as you can see it in the Moore's Law line. In the olden days we just didn't get a new generation every year, so the generational changes could be massive, for example the jump from P4 to Core 2. After that the per-generation improvement was smaller as it became much more frequent than before.

BTW your Pentium M example was pretty much a Pentium III core with extra L2 ,married to the Pentium 4 FSB. Netburst turned out to not easily scale to 10GHz as expected, so they went back to the past and dusted off PIII to ride again for low-power ultraportable laptops as it was 24w maximum. In comparison your i7-13700H example can draw 115w for up to 28 seconds at max Turbo, nearly 5x as much. So Pentium-M was more akin to today's Raptor-Lake-U since at the time, all the large performance laptops had up to 89w desktop Pentium 4 in them.
Found this little tidbit about the Pentium M and the Pentium 4-M.

'Running with very low average power consumption and much lower heat output than desktop processors, the Pentium M runs at a lower clock speed than the laptop version of the Pentium 4 (The Pentium 4-Mobile, or P4-M), but with similar performance - a 1.6 GHz Pentium M can typically attain or even surpass the performance of a 2.4 GHz Pentium 4-M.'
 
You could buy a Pentium 4 (not 4-M) laptop as late as 2004, when the last 3.4 GHz Northwood came out. It's still way better than the Pentium M in some tasks today, such as YouTube (in low resolution, as VP9 and AV1 are pretty demanding without hardware acceleration, and the Pentium M just doesn't have enough clock speed).

Shortly after that, laptops started coming with Prescotts, which were rated at 115 W (like the i7-13700H!), but that was a lie and they drew much more than that. I got one and swapped it for a Northwood right away, as it was insanely hot.
 

jnjnilson6

Distinguished
You could buy a Pentium 4 (not 4-M) laptop as late as 2004, when the last 3.4 GHz Northwood came out. It's still way better than the Pentium M in some tasks today, such as YouTube (in low resolution, as VP9 and AV1 are pretty demanding without hardware acceleration, and the Pentium M just doesn't have enough clock speed).

Shortly after that, laptops started coming with Prescotts, which were rated at 115 W (like the i7-13700H!), but that was a lie and they drew much more than that. I got one and swapped it for a Northwood right away, as it was insanely hot.
This too may be a little interesting.

3277.png


Doing the math on the above, the Pentium M would perform like a P4 @ 2.97 GHz in this game.
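Purely to illustrate where a figure like 2.97 GHz comes from, here is a minimal sketch of the clock-equivalence arithmetic; the frame rates and baseline clock below are hypothetical placeholders (the real numbers are in the screenshot, which isn't reproduced here), and the calculation assumes the game scales roughly linearly with P4 clock speed.

```python
# Hypothetical placeholder values -- substitute the real figures from the benchmark screenshot.
pentium_m_fps = 44.7   # Pentium M 1.6 GHz result in the game (placeholder)
p4_fps = 40.0          # Pentium 4 result at its rated clock (placeholder)
p4_clock_ghz = 2.66    # clock of the Pentium 4 used as the baseline (placeholder)

# If performance scales roughly linearly with P4 clock speed, the Pentium M
# performs like a Pentium 4 running at this clock:
equivalent_p4_ghz = p4_clock_ghz * (pentium_m_fps / p4_fps)
print(f"Pentium M 1.6 GHz ~ Pentium 4 @ {equivalent_p4_ghz:.2f} GHz in this game")
```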

I had a P4 @ 2.66 GHz machine and many P4 520s and a P4 520J. The P4 520s and 520J froze on a very expensive and expandable ASUS motherboard, then on an MSI motherboard, and I think on a Gigabyte motherboard too (though I'm not sure I remember that correctly); finally, on an ASRock motherboard there was no freezing and they ran goldenly. I had the P4 520J in 2005 and had an issue in which it would freeze up after 10-20 minutes of use on the ASUS motherboard. Then I bought a number of P4 520s from eBay in 2013 and tested them on the aforesaid motherboards, with the same problem, until I got an ASRock mobo. On it the P4 520 CPUs performed continually well and provided wonderful performance.
 

Deleted member 2838871

Guest
Now let's go back in time

I vaguely remember my Amiga 500 system specs... Dad got it for me in 1988... 14th birthday.

Ummm... 3 MB RAM... Motorola 68000 @ 7.16 MHz... I was using a 1084S monitor... I don't recall the size of the hard drive or if it even had one. It might not have... I know I booted the system off Workbench floppies... :ROFLMAO:
 
Computer programming is becoming the art of the esoteric once again. There are tribal-knowledge secrets for game engines that only a few people know and share amongst themselves in places like Discord. This is especially true for Unreal Engine 5.

Code bases have simply grown way too large for any one person anymore. Thus you are using toolsets to build up a new toolset or app. But those base toolsets are not optimized, simply because you don't have the time to optimize them.

WCF, for example, while complicated, could be highly optimized. Address, Binding, Contract: you got to choose how every aspect of it worked.

The web, meanwhile, was well established, but it's a complex tangle of interoperability problems: JavaScript quirks, toolsets, incompatible CSS rendering. There's a large amount of inefficiency there too.

But the tools are there and everyone uses them... JavaScript is too easy to hang yourself with... well, we got another tool for that, TypeScript, which works on top of JavaScript. Then there's Bootstrap, React, and Blazor, and so many other standards. But all of it is a bloated mess.
 
I vaguely remember my Amiga 500 system specs... Dad got it for me in 1988... 14th birthday.

Ummm... 3 MB RAM... Motorola 68000 @ 7.16 MHz... I was using a 1084S monitor... I don't recall the size of the hard drive or if it even had one. It might not have... I know I booted the system off Workbench floppies... :ROFLMAO:
That system was amazing for the day. If only Commodore had done better in marketing...
The Atari ST was in some ways even better, running at 8 MHz. Its synthesizer was good too. But it lacked software.
 

Deleted member 2838871

Guest
That system was amazing for the day. If only Commodore had done better in marketing...
The Atari ST was in some ways even better, running at 8 MHz. Its synthesizer was good too. But it lacked software.

Indeed.

I had a blast playing games on the Amiga back in the day... it truly was revolutionary for its time, even though it never took off like it probably should have due to the failings of Commodore.

https://www.youtube.com/watch?v=zjuVBnvA1EI


I wore this game out... LOL
 