The fact is, the industry really has no choice. Apple is absolutely crushing it on performance per watt, and their M1 / M1 Pro / M1 Max chips make for far more compelling laptops than ANYTHING Intel-based. Intel doesn't have an answer for this. They can match performance, but only at huge energy cost. Nobody wants a laptop that has to be plugged in to run properly and has fans like a turbojet.
That's because Apple bought out TSMC's entire 5nm capacity. The M1 series is being produced at 173 million transistors per mm^2 (MT/mm^2). AMD is still using TSMC's 7nm, which is about 114 MT/mm^2. Intel's 10nm process is about 101 MT/mm^2.
Intel's fate lies in how quickly they can get their 7nm process (since renamed Intel 4) up and running. It's supposed to be around 200 MT/mm^2. But AMD is supposed to switch over to TSMC's 5nm next year. And TSMC is sampling 3nm (numbers are vague, but the percentage improvements they're citing suggest somewhere around 225 MT/mm^2), which Apple is sure to buy priority access to. (Intel's other option is to swallow their pride and contract to have their CPUs manufactured by TSMC. Intel is one of the few companies with a more obscene profit margin than Apple, so they could potentially outbid Apple.)
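For a rough sense of the gaps, here's the arithmetic with the densities quoted above. Keep in mind the Intel 4 and 3nm figures are estimates, not official numbers, and real chips vary by cell library and die composition:

```python
# Back-of-the-envelope comparison of the quoted transistor densities
# (MT/mm^2 = million transistors per square millimeter). The Intel 4
# and TSMC 3nm entries are the estimates discussed above, not official.
densities = {
    "Intel 10nm":          101,
    "TSMC 7nm (AMD)":      114,
    "TSMC 5nm (Apple)":    173,
    "Intel 4 (estimate)":  200,
    "TSMC 3nm (estimate)": 225,
}

baseline = densities["TSMC 5nm (Apple)"]
for process, density in densities.items():
    print(f"{process:20} {density:3d} MT/mm^2  "
          f"({density / baseline:.2f}x Apple's current node)")
```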
There hasn't been a major breakthrough in processor architecture in decades; all the low-hanging fruit has been picked. The biggest recent change was nearly 20 years ago, when, unable to push clock speeds higher or squeeze more work out of each clock cycle, chipmakers resorted to adding more cores to improve multithreaded performance.
So it's likely that all these different architectures - Intel, AMD, Apple, ARM - perform similarly. The primary difference is the manufacturing process. That determines performance per watt, and consequently top clock speed and raw performance.
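The reason process dominates is dynamic power: it scales roughly as P = C * V^2 * f, and a denser node lowers both the effective capacitance C and the voltage V needed to hit a given clock. A toy sketch, with all numbers invented for illustration:

```python
# Toy model of why the process node dominates performance per watt.
# Dynamic CPU power scales roughly as P = C * V^2 * f (effective
# capacitance, voltage squared, clock frequency). A denser node lowers
# C and lets the chip run at a lower V, so the same power budget buys
# a higher clock. All numbers below are invented for illustration.
def dynamic_power(c_eff, voltage, freq_ghz):
    return c_eff * voltage**2 * freq_ghz

old_node = dynamic_power(c_eff=1.0, voltage=1.00, freq_ghz=4.0)
new_node = dynamic_power(c_eff=0.7, voltage=0.85, freq_ghz=4.0)

print(f"same 4.0 GHz clock: {new_node / old_node:.0%} of the old node's power")
print(f"same power budget:  {old_node / (0.7 * 0.85**2):.1f} GHz on the new node")
```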
Nvidia ran into the same problem with its botched Maxwell launch. Kepler was manufactured on TSMC's 28nm. Nvidia was expecting to manufacture Maxwell on the next node, 20nm, but Apple bought up TSMC's entire 20nm capacity. That forced Nvidia to build Maxwell on 28nm, where it ran too hot. The entire desktop 800-series was canceled. They had to redesign their Maxwell GPUs for 28nm, and those were released as the 900-series. The only 800-series Maxwell GPUs that made it to market were a few lower-power mobile parts.
The problem with Linux isn't its customization. It's the way apps are primarily distributed. Every distribution's package manager runs on the principle that no app is stand-alone. If an app depends on something, the package manager has to pull in that dependency, which is a problem for two reasons (a toy sketch follows the list):
- Does that dependency even exist?
- Is the version of that dependency compatible with the app you want to run?
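Here's that toy sketch: a minimal model of the shared-dependency problem, with invented package names and versions. Real package managers like apt or dnf solve a much hairier constraint problem, but both failure modes above show up even in this stripped-down version:

```python
# Toy model of the shared-dependency problem. Each app names the library
# version it needs; the package manager must find one version of each
# shared library that satisfies everybody. All names/versions invented.
repo = {"libmedia": "2.0"}  # the one version the distro ships

apps = {
    "video-player": {"libmedia": "2.0"},  # happy with what's packaged
    "photo-editor": {"libmedia": "3.1"},  # needs a newer libmedia
    "audio-mixer":  {"libsound": "1.4"},  # not packaged at all
}

for app, deps in apps.items():
    for lib, wanted in deps.items():
        shipped = repo.get(lib)
        if shipped is None:
            print(f"{app}: {lib} doesn't exist in the repos")
        elif shipped != wanted:
            print(f"{app}: wants {lib} {wanted}, repo ships {shipped} -> conflict")
        else:
            print(f"{app}: dependencies satisfied")
```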
Linux's problem (which also causes the dependency issue you point out) is pretty simple: there is no effective feedback mechanism for users to impress their wants and needs on developers. So you have the developers in charge of projects like GNOME going nuts doing whatever they think is cool, or following their preconceived notion of how a UI should work - oblivious to how much their users hate it, ignoring features users want, removing features users like. Linux thus ends up being an OS by developers, for developers. And it never makes a dent in the desktop market.
With commercial software, that feedback mechanism is provided by money. Users are willing to pay more for software they like. That helps guide developers to implement features users want, and to waste less time doing things they may find cool but users find useless or counter-productive.
Linux on the desktop won't happen until open source figures out a way to implement a user-to-developer feedback mechanism that doesn't rely on money (since they're trying to remain free). Right now, if you're a user trying to get a much-needed feature added, your choices are:
- Grovel before the project managers/developers and shower them with praise. Stroke their ego and maybe they'll implement the feature you want.
- Pay to hire a programmer to implement the feature and add it to the project. But doing that usually costs more than buying commercial software with the feature: in the open source case, you're footing the entire bill for the feature's implementation, while in commercial software the cost is amortized over all users (see the back-of-the-envelope after this list).
- Learn to code and implement it yourself. But the whole point of a modern economy is that people specialize in different fields, allowing them to become more efficient in their field than a generalist. They then trade their specialty goods or services for goods and services in fields where they aren't specialized. Telling people to learn to code and implement the feature themselves is tantamount to telling us to roll the economy back to the stone age.
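On the second option, the amortization argument is just division. A back-of-the-envelope with invented figures:

```python
# Why funding a feature solo costs more than buying commercial software
# that already has it. All figures are invented for illustration.
feature_dev_cost = 20_000   # hypothetical contractor quote for the feature
license_price = 100         # hypothetical per-seat commercial price
paying_users = 50_000       # hypothetical user base sharing development costs

per_user_share = feature_dev_cost / paying_users
print(f"open source, funding it alone: ${feature_dev_cost:,} out of pocket")
print(f"commercial, amortized:         ${per_user_share:.2f} per user, "
      f"folded into the ${license_price} license")
```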
Seriously, the developer-user relationship in open source right now is like noble and peasant. The nobles do whatever they want and simply don't care about the peasants' needs or desires. The example I like to cite is VLC. It's an excellent project and my preferred video player, but the lead developer believed that the mouse wheel should control volume. Users wanted to use the mouse wheel for seeking (FF/RW) through the video. No amount of begging, pleading, or constructive criticism would get him to change his mind. He was so set in his opinion that he refused to even allow an option to change what the mouse wheel did. If you used VLC, the mouse wheel controlled volume, and only volume. He held out for 7 years before finally giving in and allowing the mouse wheel to be remapped to other functions (it still defaults to volume to protect his ego).