[citation][nom]Houndsteeth[/nom]Yes, starting off, code written in the 50s and 60s was unbelievably more efficient than code written today since it was written directly to the hardware. Once you got away from programming in machine code, you then had to introduce an entity that would control access to the hardware and interpret hardware calls (the operating system) and coding languages that were easier for the programmer to understand (assembler, COBOL, FORTRAN, C) that had to be compiled to machine code so the hardware could understand it. Each of these innovations added inefficiency into the code that finally ran on the hardware. With each new level of hardware, OS, coding language, IDE, etc., you added layers of compatibility and new features, yet more inefficiency. With Java, Python, PERL, HTML, Flash, PHP and many others, we are looking at code never intended to run on actual hardware, but rather through a virtual machine, appliance or framework, which brings yet another layer that has to run before you can run the code. Yes, code is a lot less efficient, but it has to be. There would not be enough developers in the world who could keep up with the demand if everything was still written to the metal as it was in the beginning of the computer age. So, we accept a certain amount of inefficiency in our code so we can get the code done in a timely manner with the features that we have all come to know and love.[/citation]
Actually, people were NOT writing most applications in machine language in the 60s. FORTRAN and COBOL were the dominant languages of that era, and they were not machine code. C++, the dominant language today, is likewise compiled to machine code.
The problem is sloppy, bloated coding. Back in the early 90s, I wouldn't waste a single instruction, and I was very conscious of it. I remember someone asking me why I went through an array of strings backwards instead of forwards. The answer was obvious - my pointer was already there, and it saved an instruction by not resetting it.
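To give a rough idea of what I mean (a made-up sketch in C, not the actual code from back then): after a forward pass leaves the pointer sitting at the end of the array, walking it backwards reuses that pointer instead of resetting it to the start.

[code]
#include <stdio.h>

int main(void)
{
    const char *names[] = { "alpha", "beta", "gamma" };
    const char **p = names;
    const char **end = names + sizeof(names) / sizeof(names[0]);

    /* forward pass: p ends up sitting at 'end' */
    while (p < end)
        printf("forward:  %s\n", *p++);

    /* backward pass: p is already where it needs to be, no reset required */
    while (p > names)
        printf("backward: %s\n", *--p);

    return 0;
}
[/code]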
C is a very efficient language. Obviously, object-oriented languages are bloated, but not by that much. They aren't strictly necessary anyway, but they do make bad programmers less bad. Structured programming is better for better programmers.
But, anyway, the point is, software today does nothing that it didn't do 15 years ago, but it takes a lot more processing power to do it. Hell, go back 30 years: you had spreadsheets, bulletin boards, games, bookkeeping, etc. Nothing really changes. Games get more resolution, but do spreadsheets or word processors do anything now that you needed and couldn't do 15 years ago? What does Windows 7 do that OS/2 couldn't, except require 2 GB of RAM to run right instead of 16 MB? Oh, and run a lot slower. Why do applications like Flash keep taking more memory the longer you use them, if they're just doing the same thing? After eight hours, it uses roughly 3 times as much as after 15 minutes. Why does it keep growing? Bad programming and memory leaks.
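A toy example of the kind of leak I'm talking about (hypothetical names, nothing to do with Flash's actual code): allocate a little memory per event, never free it, and a process that's doing "the same thing" all day slowly balloons.

[code]
#include <stdlib.h>
#include <string.h>

/* duplicate an event name on the heap */
static char *copy_event_name(const char *name)
{
    char *copy = malloc(strlen(name) + 1);
    if (copy != NULL)
        strcpy(copy, name);
    return copy;
}

static void handle_event(const char *name)
{
    char *label = copy_event_name(name);
    /* ... do something with label ... */
    (void)label;   /* bug: never freed, so every event leaks a few bytes */
}

int main(void)
{
    /* run "the same thing" for hours and the working set keeps growing */
    for (long i = 0; i < 1000000; i++)
        handle_event("mouse_move");
    return 0;
}
[/code]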
That's where the opportunity is. Hardware has carried bad programmers for a long time. These layers on top of layers are designed for bad programmers. Games will not keep using garbage interfaces like DX11, or whatever comes after it, and will have to start writing more directly to the hardware. If they do, you'll see big improvements. If they keep stacking a game engine on top of a 3D engine, on top of the operating system, on top of a device driver, games will keep losing a lot of performance to MS bloatware.
If Crysis were running on DOS and writing to the hardware, no one would ask that annoying and infamous question. Not that I'm suggesting that will or should come back, but something in between almost has to for performance to improve dramatically.