When did things change? Being a programmer myself, I have been analyzing this for years, and I think there are multiple places and mindsets that have enabled this terrible state of programming. I have programmed on a 14 MHz 8086 processor. I had a massive 192 KB of RAM, a huge 1.2 MB of drive space, and I was limited to 64 KB file sizes. I created a clone of one of my favorite video games from the 8-bit black-and-white Game Boy, just for fun. I had to write it in C, and I had to learn how to write efficient code to do it.
In college in the early 2000s, they were just teaching the kids to copy and paste routines and make minor modifications to the copies. This is probably one of the mindsets that played a role in creating bloated code. I tried that method in my game and quickly ran out of space and RAM. So bad methodology in the schools is one of the issues.
Then there was object-oriented programming. It really seems like a novel idea, and I think it works well with multithreading, but not with non-multithreaded routines. When I programmed in C, you just created the variables you needed and erased them from memory when you were done with them. With objects, you load the classes down with endless variables and routines you may or may not use. Then you initialize that object, which loads a ton of worthless code and variables into RAM that you don’t use. So how much RAM is used? Maybe a few MB per object.
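To show what I mean by the C style, here is a minimal sketch (the function and sizes are just illustrative): you allocate exactly the bytes you need, and the moment you are done with them, they are gone.

```c
#include <stdlib.h>
#include <string.h>

/* Old-school C: allocate exactly what you need, free it when done.
 * Nothing extra rides along -- no base classes, no unused methods. */
void process_scores(int count)
{
    int *scores = malloc(count * sizeof(int)); /* exactly count ints */
    if (scores == NULL)
        return;

    memset(scores, 0, count * sizeof(int));
    /* ... use the array ... */

    free(scores); /* erased from memory the moment we are done */
}
```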
So imagine an FPS game where each laser is an object. If you need to create 1,000 lasers, both on and off the screen, you now have 1,000 laser objects, which at a few MB each takes a good few GB of RAM. In the old days, you would just have a 2D array (a matrix) that took something like 5 to 10 KB of RAM to keep track of all the lasers, and you would loop through that array to draw them all. Today, you have to loop through 1,000 laser objects and process a few GB of data. See the inefficiency?
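Here is roughly what the old-school version looks like in C (the field names are just illustrative). One flat array of small structs holds every laser; at 10 bytes per laser, 1,000 of them fit in about 10 KB:

```c
#include <stdint.h>

#define MAX_LASERS 1000

/* One small struct per laser, holding only what a laser needs.
 * 5 fields x 2 bytes = 10 bytes, so 1000 lasers = about 10 KB. */
typedef struct {
    int16_t x, y;    /* position */
    int16_t dx, dy;  /* velocity */
    int16_t alive;   /* on/off screen flag */
} Laser;

static Laser lasers[MAX_LASERS]; /* one flat block of memory */

void update_lasers(void)
{
    for (int i = 0; i < MAX_LASERS; i++) {
        if (!lasers[i].alive)
            continue; /* skip inactive slots cheaply */
        lasers[i].x += lasers[i].dx;
        lasers[i].y += lasers[i].dy;
    }
}
```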
Schools claim that object-oriented programming (OOP) languages are more secure than non-OOP languages. However, Linux, Unix, Windows, and Mac OS are all made from non-OOP languages, usually C and ASM. The most secure OS kernels in the world are made from C and ASM. So the colleges are lying to people. This mindset is another reason why things have gone downhill.
.NET is another disaster in the programming world. It has MASSIVE overhead. All of the .NET languages are required to communicate with one another and share variables and routines, and that requires a large overhead. Windows XP was efficient-ish: it took 59 MB of RAM to run the OS on a fresh install, and 128 MB of RAM was a workable minimum for XP plus my programs. When I had 1 GB of RAM, usage only reached 800 MB when I was doing video editing; otherwise it never went above that.
Now look at Windows Vista and 7: they required close to 1 GB of RAM just to run. Why? Because Microsoft loaded Windows down with managed .NET code when it rebuilt XP into Longhorn (Vista), and Windows 7 was built on the Vista codebase. That shows the difference the language makes in speed. Windows XP was made from various languages, like C, C++, VB, ASM, and some others I don’t remember.
When I tried to port my 8086 game from C to C# .NET, it required roughly 10,000 times the processing power and 10,000 times the RAM to run at the same speed…
Back to the schools: they keep using the same old “industry standards” for teaching people how to code. They don’t teach the kids to find a better way to make an algorithm; they say “this is the way it’s done, so always code it this way.” For example, for collision detection of sprites, they would teach a kid to test each side of the sprite to see if something had entered that side. That required something like nine checks (it’s been a while since I looked at it). I used this in my 8086 game, and my game ran slow. So I did the reverse: I checked whether the sprite did NOT collide with anything, which took about four checks. My game sped up enormously.
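Here is a sketch of that reversed check, assuming axis-aligned bounding boxes (I don’t remember my exact code, but the idea was the same): instead of testing every way two sprites CAN overlap, you test the four ways they CANNOT.

```c
/* Sprite as an axis-aligned bounding box: top-left corner, width, height. */
typedef struct {
    int x, y, w, h;
} Sprite;

/* Reverse logic: four "cannot possibly collide" checks instead of
 * testing every side for an intrusion. Any one of them proves a miss. */
int collides(const Sprite *a, const Sprite *b)
{
    if (a->x + a->w < b->x) return 0; /* a entirely left of b  */
    if (b->x + b->w < a->x) return 0; /* a entirely right of b */
    if (a->y + a->h < b->y) return 0; /* a entirely above b    */
    if (b->y + b->h < a->y) return 0; /* a entirely below b    */
    return 1;                         /* otherwise they overlap */
}
```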
They don’t teach you to count CPU cycles anymore. Each check and computation costs at least one CPU cycle, and there is a limit to how many CPU cycles you get per second. A minor change in a loop can make the difference between a stock-market trader making a million and losing a million. They don’t teach this stuff.
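Here is the kind of minor loop change I am talking about (an illustrative example, not real trading code; a modern optimizing compiler may do this for you, but the habit of counting the work inside a loop is the point):

```c
/* Slow: the fee factor is recomputed on every pass through the loop. */
double total_slow(const double *prices, int n, double fee_bps)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += prices[i] * (1.0 - fee_bps / 10000.0); /* n divisions */
    return sum;
}

/* Fast: one line moved out of the loop, so the division runs once. */
double total_fast(const double *prices, int n, double fee_bps)
{
    double factor = 1.0 - fee_bps / 10000.0; /* hoisted: computed once */
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += prices[i] * factor;
    return sum;
}
```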
Schools teach people to use someone else’s code: look it up on the internet and copy it! So you have tons of people who just look up code, copy it into their programs, and make minor adjustments for their needs. They have lost the ability to think critically and create efficient code, and the kids are never taught to write efficient code.
To further extend this, there are add-on modules for Visual Studio that come with hundreds or thousands of pre-coded functions. All you have to do is drag and drop these functions into your program and it will work. When I worked at Microsoft, I knew a guy who did this. He didn’t know anything about programming; he just dragged and dropped whatever modules he needed and would pop out programs for his clients within a few days.
Now for the further complications: when you just reuse code over and over, it becomes really inefficient, especially in .NET. .NET is object-oriented, so tons of objects are being called, and objects call other objects, which call other objects, and so on. Assume each object requires about 10 MB of RAM. A programmer initializes a single object, and that object is linked to about nine other objects, so 100 MB of RAM is used to call a single object, most of it wasted code that will never be used. If you need another copy of that object, you initialize a new one, and that is another 100 MB. This is how a lot of programs operate these days. It may not be an exaggerated 100 MB per object, but all of these little things add up.
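Here is a sketch of that chain (the sizes are made-up illustrations, just like the 10 MB figure above): constructing one object eagerly constructs everything it is linked to, whether or not you ever touch it.

```c
#include <stdlib.h>

#define DEP_COUNT 9
#define DEP_SIZE (10 * 1024 * 1024) /* pretend each dependency is 10 MB */

typedef struct {
    void *deps[DEP_COUNT]; /* the nine objects this one is linked to */
} BigObject;

/* One innocent-looking "new" eagerly pulls in all nine linked objects,
 * used or not -- roughly 100 MB for a single initialization. */
BigObject *big_object_new(void)
{
    BigObject *obj = malloc(sizeof(BigObject));
    if (obj == NULL)
        return NULL;
    for (int i = 0; i < DEP_COUNT; i++)
        obj->deps[i] = malloc(DEP_SIZE);
    return obj;
}
```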
Then schools teach kids to create objects instead of global variables, so you end up with the worthless overhead of objects instead of single variables.
The industry is a major factor. They tell you to get your program out as fast as possible. It is all about money and greed. They do not want you to spend time optimizing anything; just throw it together using some framework so they can sell it.
Another factor is the mindset of not caring about optimization. Some people figure it is better to buy faster, more powerful hardware to make things run better, and people argue about whether it costs more to buy new computers or to make the program more efficient. Money is a really large factor in inefficient programs.
Frameworks are a problem too. They can be really nice to have, and they can help you finish your work really fast. But frameworks eventually become bloated, because they try to do everything: the general public keeps asking for features, and the creators keep adding them. After a few years, the framework is polished, looks nice, runs smoothly, and is bloated and inefficient.
I can remember looking for a JavaScript calendar for my website. I eventually ran into jQuery. All the web pages told me to download jQuery (over 1 MB in size) and use it, along with their 100+ lines of code, to make it work. It didn’t give me exactly what I wanted, so I went to work making my own using dropdowns. It took me 10 lines of code and 28 bytes of space, but people wanted me to use a framework with 5,000 times as much code and 35,000 times the size. Yeah… no. And now jQuery is on its way out.
We have reached the point where it is better to reinvent the wheel (programming-wise) than to use someone else’s Frankenstein alien morph of X people’s different attempts at improving the previous person’s code.