News: Apple Silicon Broadens Arm Assault on Intel and AMD's x86

So Apple is positioning itself to eventually compete directly with Intel and AMD in terms of performance? I hadn't heard before that this was their ambition. I'm not sure the ARM architecture was designed to scale (massively) in that way, but who knows?
 

Giroro

Splendid
Tim Cook, nostalgic for an era when Macintosh computers couldn't run software, looked at the overwhelming success of Microsoft's Surface Go and thought, "yeah, we can totally do that, but without the touch screen".

And thus, the era where desktop software developers had to pay most of their profit to the iTunes app store was born.

But what they don't tell you, is that right-click mouse functionality has been removed from the box, and will require a $60 dongle. Because that has been scientifically proven to end global warming, and totally isn't an offensive corporate marketing lie to hide a blatant scheme to rip off their customers.
 
So Apple is positioning itself to eventually compete directly with Intel and AMD in terms of performance? I hadn't heard before that this was their ambition. I'm not sure the ARM architecture was designed to scale (massively) in that way, but who knows?
This article was clearly written as a press release by Apple, masquerading as a "news" article on this site. No numbers, just vague claims of performance and superior efficiency.

Sure, ARM is efficient, but it's also clocked low, and much more comparable to old mobile x86 chips in pure performance. ARM processors don't come close to the raw compute horsepower of x86; the only time they come close is when the x86 chip is running at LOW frequencies (sub-2GHz). The idea that Apple was able to both clock ARM up to 4 or 5GHz AND match the pure compute performance of x86 is sorta silly, considering last year's top-of-the-line 7nm 8-core ARM chip running at 2GHz was only slightly better than a 2013 AMD Jaguar 8-core CPU running at 1.7GHz on 28nm.

It's not even in the same ballpark as a modern x86 chip. No one runs a desktop on a 5W CPU.
 
Apple, known for their "our way or the highway" mentality, won't survive an ARM vs x86 war.

ARM has limitations, and as an Apple chip... they're greater:

Non-expandable (internal) storage and RAM.

A locked-down store.

And the largest issue, IMHO... servers and gaming favor x86. Stuff is made to work for Windows; Mac and Linux are afterthoughts.

Even Linux and Mac are barely getting support, decades late.
 
All of this forgets two large facts:
  • AMD and Intel both have ARM licenses,
  • and ARM has a potentially huge competitor coming, due to restrictions on ARM design distribution.
But, more uncertain, is the elephant in the room...
Yeah, RISC-V made in China. The same country that took a couple of years to catch up on 15 years of US R&D and manufacturing, and an open CPU design that has neither the licensing encumbrance of x86 or ARM nor the design limitations of either.
 

TEAMSWITCHER

Distinguished
Aug 7, 2008
206
5
18,685
So Apple is positioning itself to eventually compete directly with Intel and AMD in terms of performance? I hadn't heard before that this was their ambition. I'm not sure the ARM architecture was designed to scale (massively) in that way, but who knows?

Power efficiency is the key to more performance. There is only so much power you can draw through an outlet on the wall. Also, there is only so much power you can generate from a lithium polymer battery. And regardless, both must also dissipate the waste heat that consuming all this power generates. Once power and thermal limits are reached, performance gains are impossible. You either cannot get more power, or you cannot dissipate the generated heat of using it.

This is where ARM shines. It has better power efficiency, which means that more work can be done in any given power or thermal envelope. Apple's new M1 chip does this (and so much more) so well that I seriously doubt x86 will survive the decade.

You heard it from me first: by 2030, x86 will only be emulated.
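To put a rough number on the envelope argument, here is a trivial back-of-the-envelope model in Python (all figures are made up for illustration, not measurements of any real chip):

```python
# Toy model: within a fixed power budget, achievable throughput is just
# efficiency (work per joule) times the power you're allowed to draw.
# All numbers below are illustrative, not measurements of any real CPU.

def sustained_throughput(power_budget_w: float, efficiency_ops_per_j: float) -> float:
    # watts * (ops per joule) = ops per second the envelope allows
    return power_budget_w * efficiency_ops_per_j

laptop_budget_w = 15.0   # assumed sustained power budget for a thin laptop
chip_a = 2.0e9           # assumed ops per joule for a less efficient design
chip_b = 3.0e9           # assumed ops per joule for a more efficient design

print(sustained_throughput(laptop_budget_w, chip_a))  # 3.0e10 ops/s
print(sustained_throughput(laptop_budget_w, chip_b))  # 4.5e10 ops/s
# Same envelope, ~50% more work for the more efficient chip - the point above.
```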
 

spongiemaster

Honorable
Dec 12, 2019
2,364
1,350
13,560
You heard it from me first: by 2030, x86 will only be emulated.
Will that also be the year of the Linux desktop? Because I've been hearing that one for over 20 years now. You're underestimating the lack of interest software developers have for moving off of x86. Developers don't care about CPU power efficiency. Without monetary incentives to move to another architecture, which aren't coming, developers aren't leaving x86.
 
This is where ARM shines. It has better power efficiency, which means that more work can be done in any given power or thermal envelope. Apple's new M1 chip does this (and so much more) so well that I seriously doubt x86 will survive the decade.

You heard it from me first: by 2030, x86 will only be emulated.
ARM only has better power efficiency because it lacks all of the additional instructions that x86 has, which means it is only more power efficient if you can get by with using only the ARM-specific instructions.

Emulating x86, or even outright incorporating all of the x86 instructions, would drop its efficiency to the same point as x86, if not much worse (simply because AMD/Intel have so many years of a head start).
And it goes both ways, of course: AMD or Intel could easily rip out any instructions that are not broadly used and get a much more power-efficient CPU... with all the same problems, namely that those dropped instructions would have to be executed by stringing together as many of the remaining instructions as it takes to get the result, lowering performance and increasing power requirements.
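As a loose software analogy for that "stringing together" point (Python standing in for hardware, with hypothetical function names and no relation to either vendor's actual instruction sets): the same result can come from one dedicated operation or from a chain of simpler steps, and the chain is the slower, costlier path.

```python
# Loose software analogy (not real ISA behavior): a "complex" operation done
# in one dedicated step vs. the same result strung together from simpler steps.

def popcount_dedicated(x: int) -> int:
    # One "rich" operation, analogous to a CPU with a dedicated instruction.
    return bin(x).count("1")

def popcount_decomposed(x: int) -> int:
    # The same result built from shifts, masks, and adds, analogous to a CPU
    # that lacks the instruction and must string simpler ones together.
    count = 0
    while x:
        count += x & 1   # test the lowest bit
        x >>= 1          # shift right by one
    return count

if __name__ == "__main__":
    n = 0b1011_0110
    assert popcount_dedicated(n) == popcount_decomposed(n) == 5
    # Both give the same answer; the decomposed path just takes more steps,
    # which is the post's point about performance and power cost.
```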

Apple, known for their "our way or the highway" mentality, won't survive an ARM vs x86 war.

ARM has limitations, and as an Apple chip... they're greater:

Non-expandable (internal) storage and RAM.

A locked-down store.

And the largest issue, IMHO... servers and gaming favor x86. Stuff is made to work for Windows; Mac and Linux are afterthoughts.

Even Linux and Mac are barely getting support, decades late.
Apple doesn't sell products because they have good performance; they sell them because people see Apple products as low-maintenance for the tech-illiterate.
And they provide all their own support for their products.
They have already changed their CPUs and OS twice, which would have killed off any other company.
 
This is where ARM shines. It has better power efficiency, which means that more work can be done in any given power or thermal envelope. Apple's new M1 chip does this (and so much more) so well that I seriously doubt x86 will survive the decade.

You heard it from me first: by 2030, x86 will only be emulated.
All of this has happened before. (All of this will probably happen again.) ARM is a RISC instruction set. x86 (and x64) are CISC. For the 40 years I've been involved in computing, about once a decade people get all excited about RISC and how its lower power and cheaper manufacturing (smaller CPUs) are advantageous. It gains steam, then people start seriously comparing it side-by-side with CISC, and switch back to CISC. Every Single Time.

Will it be different this time? I dunno. But history says probably not. ARM has been buoyed thus far by the rapid shift towards mobile devices. CISC processors had been optimized for four decades for high-performance tasks. They mostly ignored the low-power end, which created an opening for ARM to capture the mobile market (where low power consumption was more important than raw performance). But we're rapidly approaching the point where both RISC and CISC processors are so power-thrifty that they're "good enough" even if they aren't the best at low-power computing. The same thing happened at the high end, where the bulk of the market no longer clamors for i7 CPUs because the i5 and i3 have become "good enough" for everyday tasks. Once you pass about 10 hours of battery life on a laptop, there's really not much advantage to lowering power consumption any more. Even reducing the battery size has diminishing returns. Going from a 2 lb battery to a 1 lb battery is a huge deal. Going from 1 lb to 0.5 lb, not so much. And 0.5 lb to 0.25 lb will barely be noticeable.

Also, Apple has already given up the cheaper-manufacturing advantage of RISC. The A14 processor is 11.8 billion transistors. AMD's Ryzen 7 (8 core) is estimated to be only 9.8 billion transistors. What's going on is that RISC is slower than CISC in raw like-for-like instruction execution speed, so to buoy its performance Apple has added lots of hardware acceleration for specific operations, which essentially makes it more like CISC than RISC. At that point, the only real advantage here is that they're manufacturing at 5 nm while everyone else is still on 7 nm. That might be good enough - Intel was able to milk the advantage of being 1-2 processes ahead of everyone else for three decades. But TSMC only gives Apple priority access to 5 nm because of their large orders. If anyone else were to offer TSMC an equivalent-size order for more money (e.g. Intel swallows its pride and uses TSMC to fab its processors), that could quickly flip, and Apple could find itself the one that is a process behind the competition. It's totally different from when Intel's process advantage was in-house and proprietary.
 

PapaCrazy

Distinguished
Dec 28, 2011
311
95
18,890
With Intel being stagnant for decades and the mobile segment exploding during the same time frame, we are in new waters. It surprises me people are making predictions based on old trends that have already come undone.

All the naysaying about developers embracing new platforms is at odds with the influx of new development on iOS and Android. Companies don't develop for platforms or languages, they develop for profitable markets. If one opens up, it will be as simple as that.

In all my years observing technology, the one thing I learned is never say never. Recent lesson: AMD. Things change, Kundun.
 
While technically a CISC architecture, x86 processors themselves have been RISC-like internally for quite a while (ever since the 486, I believe), because the decode units essentially convert the complex instructions that make up x86 into simpler internal instructions.
Though it is now becoming possible to consider migrating from one CPU architecture to another, for one huge reason: the last time an architecture threatened x86 was back when Itanium came out, and it was a failure because it wasn't much more efficient than i686 and emulating x86 on it was slow.
Since then, a huge change has appeared in development: most software shipped today is made for a software platform, not a hardware one.
Case in point: C#, Java and web apps.
Today, you can take an app made for ARM and run it on x86 with little trouble, and vice versa.
What does that leave behind? Specific high-performance apps that need every last transistor for performance, and even those are starting to make use of run-time compilation - like, say, game shaders. Stuff like YASM is getting old, compiling an app's performance kernel barely takes a minute on install, and most systems are starting to integrate ways to share precompiled binaries for a given platform...
It is already possible to ship a cross-platform binary today and have it recompile parts of itself at run-time. So yeah, while x86 still has some time ahead of it, Wintel's domination is at an end.
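A minimal sketch of that "target a software platform, not a hardware one" idea, in Python (the function names and the optimized/fallback split are hypothetical, chosen only to illustrate run-time dispatch):

```python
# The same script runs unchanged on x86 or ARM, and can pick a tuned code
# path at run time if one exists for the machine it finds itself on.
import platform

def transform_generic(data):
    # Portable fallback: plain Python, works on any CPU architecture.
    return [x * 2 for x in data]

def transform_optimized(data):
    # Stand-in for an architecture-tuned path (e.g. a JIT-compiled kernel
    # selected for this machine); here it just does the same work.
    return [x << 1 for x in data]

def transform(data):
    arch = platform.machine().lower()   # e.g. 'x86_64', 'amd64', 'arm64', 'aarch64'
    if arch in ("arm64", "aarch64", "x86_64", "amd64"):
        return transform_optimized(data)
    return transform_generic(data)

if __name__ == "__main__":
    print(platform.machine(), transform([1, 2, 3]))  # same result on any CPU
```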
 
The move to Arm marks Apple's biggest shift since it moved from PowerPC to Intel's x86 processors fifteen long years ago and threatens to unseat x86's decades-long dominance – dealing both Intel and AMD a stunning blow in the process.
A "stunning blow"? How can it be a stunning blow when we don't even know how they perform yet? Most likely not that well, if as the article points out "Apple still offers Intel-powered versions of its PCs as the upsell to its fresh roster of Arm-powered Macs." This sounds like Apple is at least initially just using ARM to make low-end computers out of upscaled versions of their phone processors, that they will undoubtedly sell at premium prices even if performance isn't consistently better.

Mac developers comprise roughly 30% of the overall pool of software developers, while 45% toil away on Windows and 25% dedicate their time to writing Linux software.
Where did these numbers come from? I find it rather unlikely that Macs would have two-thirds the number of developers as Windows, and more than Linux, especially considering Windows users outnumber Mac users by at least 10 to 1 worldwide and Linux owns the server space.

A quick search shows that these numbers appear to come from a 2019 survey of visitors to Stack Overflow, a site where people ask for help with programming questions, asking them what "primary operating system" they use. They only surveyed a fraction of a percent of the estimated ~25 million programmers worldwide, and I imagine regular visitors to that site are probably not all that representative of programmers as a whole. We're talking about an opt-in web survey that took respondents on average over 23 minutes to complete, was active for just three weeks in early 2019, and whose respondents were 96% people who visit the site at least weekly. It seems a bit of a stretch to interpret the results as representative of programmers as a whole, and the question asked what primary operating system they use, not necessarily the primary platform they develop for.
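For a rough sense of what "a fraction of a percent" means here, a back-of-the-envelope calculation (the respondent count below is an assumption on my part, not a figure from the article or this thread):

```python
# Rough sense of scale for the survey critique above.
respondents = 90_000                 # assumed size of the 2019 Stack Overflow survey
developers_worldwide = 25_000_000    # estimate cited in the post

sample_fraction = respondents / developers_worldwide
print(f"{sample_fraction:.2%} of developers surveyed")   # ~0.36%
# And it's an opt-in sample of one site's visitors, so even that fraction
# isn't a random draw from the developer population.
```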
 

PCWarrior

Distinguished
May 20, 2013
216
101
18,770
With Intel going big.LITTLE (big.bigger, as Intel calls it), that won't be happening. AMD will follow suit once the big hurdle of making this concept work is ironed out by Intel. In any case it will go like this. First you have the utilisation of the little cores for background tasks, for efficiency purposes. It does require the OS to dynamically do that switching, but that's the future. Then you have raw performance. For example, Alder Lake will be 8 cores with hyperthreading plus 8 small cores. That is basically like making an 8-core/24-thread CPU. Hyperthreading adds about 30-35% extra performance, so in other words "hyperthreaded logical cores" have 30-35% of the performance of physical cores. The little cores will be more or less the same: the little cores (e.g. the Gracemont cores) will have around 35% of the performance of the big cores (e.g. the Golden Cove cores), due to lower IPC and lower frequency. So it is like adding another set of "logical hyperthreaded big cores".
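Putting that arithmetic in one place, a rough throughput model that uses only the percentages claimed above (illustrative, not a benchmark):

```python
# Rough throughput model using only the percentages claimed in the post above:
# 8 big cores, each big core's hyperthread adds ~0.30 of a core, and each
# little core is worth ~0.35 of a big core.
big_cores = 8
ht_gain_per_core = 0.30      # extra throughput a hyperthread adds
little_cores = 8
little_vs_big = 0.35         # little-core performance relative to a big core

equivalent_big_cores = (big_cores
                        + big_cores * ht_gain_per_core
                        + little_cores * little_vs_big)
print(equivalent_big_cores)  # ~13.2 "big-core equivalents" (8 + 2.4 + 2.8)
```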
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Facts about the new Apple M1 chip...

1- Very limited I/O.
2- Very limited memory (they CAN'T GO ABOVE 16GB in this GEN).
3- NO PCI Express 4.0 lanes AT ALL, and no Gen 3.0 either (but it is doable in ARM).
4- FOUR big cores only (the four little cores are not important).
5- And no hyperthreading! It is just 4 threads on the big cores.
6- No information about any turbo mode.

And yet you are saying it is an assault on Intel and AMD?

Not in a million years. And I don't know how AnandTech and Tom's Hardware could write such misleading articles.
 