News Intel Fires Back at Apple's M1 Processors With Benchmarks


tomachas

Distinguished
Oct 12, 2014
94
12
18,535
This "Intel claims that the M1 in the MacBook Pro it tested failed eight out of 25 tests it uses, including "Switch to Calendar'' in Outlook". This is the sole reason Apple will go back to Intel chips :ROFLMAO:
 
Last edited:
  • Like
Reactions: ottonis

ottonis

Reputable
Jun 10, 2020
224
193
4,760
M1 is the fastest CPU that Apple has produced.
That's not what he said. The post you were replying to said "the slowest Desktop CPU Apple will ever release".
The M1 is just a starting point. The rumor mill and leaks suggest a beefier M1X version later this year, with a 50% higher core count, that will power the upcoming, more advanced MacBook Pros.
 

ottonis

Reputable
Jun 10, 2020
224
193
4,760
Intel tried to pull us out of the x86 molasses, but AMD went and screwed it all up with x86-64, dooming us to decades more of duct tape and band-aids on x86.

With all due respect, I couldn't disagree more on that statement.
What did Intel try that was sabotaged by AMD???
The failed Itanium? How is AMD responsible for a failed CPU design choice by Intel?
That doesn't make any sense.

AMD did what they did best. Period.
Intel, on the other hand, was completely free to develop better new designs, and they had all the resources, the manpower, and the money to make a new design that would be more modern, more efficient, and that would pave the way for future computing prowess. They just decided to stick with x86.
 

korekan

Commendable
Jan 15, 2021
86
8
1,535
What?

My new Lambo is the same as your Civic?
My Lambo in eco mode uses the same amount of energy as your Civic?

Unbelievable.
I'm a gamer too, but if it's a $2,000 Intel laptop vs. a $1,000 Apple laptop for work, it's a very easy decision.

For something thin, fast, energy efficient, and cheaper, why would I choose Intel? For Calendar?
 
With all due respect, I couldn't disagree more on that statement.
What did Intel try that was sabotaged by AMD???
The failed Itanium? How is AMD responsible for a failed CPU design choice by Intel?
That doesn't make any sense.
Itanium ran from 2001 to 2019... 18 years is hardly a failure. AMD's Athlon was produced for far less time, and so was FX.
It had a small adoption rate, but if that's enough to call it a failure, then anything from AMD would be called a failure too.
 
40 years of x86 development is not an advantage, it is an almost overwhelming handicap as opposed to starting from a 30+ year newer base like Apple did. Every time Intel and AMD start on a "new" architecture, this is basically what they have to start with:

The PC's biggest advantage of almost endless backwards compatibility is also its biggest obstacle to forward progress. Intel tried to pull us out of the x86 molasses, but AMD went and screwed it all up with x86-64, dooming us to decades more of duct tape and band-aids on x86.
The only reason x86 appears to have so much baggage isn't that the baggage is part of x86. It's that it's part of the IBM PC standard, a standard which, for reasons I can't fathom, neither company abandons or updates to remove things that are required but not really in any serious use.

Case in point: when homebrew hackers wanted to get Linux on a PS4, they couldn't use a bog-standard x86 Linux. They had to modify it, because x86 versions of Linux were trying to initialize components that weren't in the PS4. There's a specific PS4 version of Linux now, even though once you get it running, said Linux can run x86 software just like any other PC.

Another thing to look at is the A20 line issue. At least Intel finally dropped support for its workarounds as of Haswell.
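For anyone who hasn't run into the A20 story: real-mode x86 forms a physical address as segment * 16 + offset, which can overflow past 1 MB; the 8086 had no 21st address bit, so those addresses silently wrapped to zero, and the A20 gate existed so later chips could emulate that wrap. A minimal C sketch of the arithmetic (the function and flag names here are mine, just for illustration):

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode x86 forms a physical address as segment * 16 + offset.
 * FFFF:0010 and above reach past 1 MB, needing a 21st address bit (A20).
 * The 8086 had no A20 line, so such addresses wrapped around to zero;
 * the A20 gate on later machines emulated that wrap for compatibility. */
static uint32_t phys_addr(uint16_t seg, uint16_t off, int a20_enabled)
{
    uint32_t addr = (uint32_t)seg * 16u + off;  /* up to 0x10FFEF */
    if (!a20_enabled)
        addr &= 0xFFFFFu;                       /* mask bit 20: 8086-style wrap */
    return addr;
}

int main(void)
{
    /* FFFF:0010 is the first address past the 1 MB boundary. */
    printf("A20 on:  0x%06x\n", (unsigned)phys_addr(0xFFFF, 0x0010, 1)); /* 0x100000 */
    printf("A20 off: 0x%06x\n", (unsigned)phys_addr(0xFFFF, 0x0010, 0)); /* 0x000000 */
    return 0;
}
```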
 

archfrog

Honorable
Sep 7, 2015
6
1
10,515
x64 could be rejuvenated by removing all the legacy stuff and making the chip a pure 64-bit chip. There's no need for real mode, 286 protected mode, 32-bit mode, and the rest of the heaps of backwards-compatibility features in modern x64 chips. Says a guy who has suffered much trauma from having to write x86 bootstrap code for 386+ processors. This would probably eliminate 25 percent of the chip's internals, making everything quite a bit easier for everybody while preparing for a potential 128-bit successor. Please note that I personally loathe the x86/x64 instruction set, but as someone was saying: the customers want it. And I don't see ARM competing on equal footing with x64 anytime soon, not even with the M1. While at it, Intel and AMD could eliminate the sad segmentation from x64 too - as far as I know, it mostly survives as unwanted tape from a time past when somebody, somewhere thought that segmented architectures were cool.
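On the segmentation point: in 64-bit mode the CS/DS/ES/SS segment bases are already forced to zero, and about the only mainstream survivor is the FS/GS base, which compilers use for thread-local storage. A minimal sketch of that surviving use, assuming GCC/Clang's `__thread` extension on x86-64 Linux (build with -pthread); accesses to the variable below typically compile to FS-relative loads like `mov %fs:offset, %eax`:

```c
#include <stdio.h>
#include <pthread.h>

/* Each thread gets its own copy of `counter`, located via that thread's
 * FS segment base - the one place flat-mode x86-64 still uses segmentation. */
static __thread int counter = 0;

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 5; i++)
        counter++;                       /* touches this thread's copy only */
    printf("worker: counter = %d\n", counter);
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("main:   counter = %d\n", counter);  /* main's copy is untouched: 0 */
    return 0;
}
```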
 
x64 could be rejuvenated by removing all the legacy stuff and making the chip a pure 64-bit chip. There's no need for real mode, 286 protected mode, 32-bit mode, and the rest of the heaps of backwards-compatibility features in modern x64 chips. Says a guy who has suffered much trauma from having to write x86 bootstrap code for 386+ processors. This would probably eliminate 25 percent of the chip's internals, making everything quite a bit easier for everybody while preparing for a potential 128-bit successor. Please note that I personally loathe the x86/x64 instruction set, but as someone was saying: the customers want it. And I don't see ARM competing on equal footing with x64 anytime soon, not even with the M1. While at it, Intel and AMD could eliminate the sad segmentation from x64 too - as far as I know, it mostly survives as unwanted tape from a time past when somebody, somewhere thought that segmented architectures were cool.
All (or at least most) x86-64 processors still have support for Real Mode.

 
I'll back up what Ogotai posted. The M1's basic ISA chip architecture has been known in CPU circles for decades, and every decade or two someone dusts it off, updates it, and claims it will overthrow x86. The problem ISA processors have is that they are optimized for floating-point operations, while they struggle with basic instructions. Since x86 is optimized for standard math, all software is written for standard math; about the only thing floating point is great at is graphics (which is just a lot of calculus).

Apple is playing a game with their ISA benches (just like Intel is, only in the opposite direction): they're focusing on benchmarks that work in their favor, Apple on graphical ones (which play to the ISA strengths), Intel on processing and real-world applications (which are designed for and work on x86 processors).

Now, as to a point that the Apple fans have been making, "these chips are great in desktop applications too!" Actually, they're right. See, desktop applications don't need much processing power at all. I have a 5-year-old Atom CPU laptop with an SSD, and it is amazing in desktop apps like Office: indistinguishable from my high-end gaming desktop. If your goal is nothing special, those M1 CPUs are fine, but then so is pretty much everything on the market; the requirements to run Word are almost non-existent.

Just don't let Apple's marketing fool you into thinking a processor essentially at the level of the last-gen Xbox's Jaguar-core CPU is some sort of computing revolution.
 
The problem ISA processors have is that they are optimized for floating-point operations, while they struggle with basic instructions. Since x86 is optimized for standard math, all software is written for standard math; about the only thing floating point is great at is graphics (which is just a lot of calculus).
x86 and ARM aren't "optimized" for any sort of math. They're instruction sets. It's like saying English is optimized for grade school math while Russian is optimized for calculus.
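One quick way to see this: the same trivial C compiles to native instructions on either ISA. Compilers typically emit `imul` and `mulsd` for the functions below on x86-64, and `mul` and `fmul` on AArch64; neither ISA has to fake one kind of math with the other. A minimal example to compile for both targets and compare the assembly:

```c
#include <stdio.h>

/* Both x86-64 and AArch64 have native integer and floating-point units.
 * Neither instruction set is "optimized" for one kind of math. */
static long scale_int(long x, long y)      { return x * y; }  /* imul / mul  */
static double scale_fp(double x, double y) { return x * y; }  /* mulsd / fmul */

int main(void)
{
    printf("%ld\n", scale_int(6, 7));     /* integer multiply */
    printf("%f\n",  scale_fp(2.5, 4.0));  /* floating-point multiply */
    return 0;
}
```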
 

syadnom

Distinguished
Oct 16, 2010
25
15
18,535
Yeah, what hotaru.hino said. ingtar33's entire post is just craziness; it makes no sense and is outright false. And quite frankly, the M1 destroys Jaguar CPUs in every single 'cpu' way, completely. The Xbox has a discrete GPU which is maybe in the realm of the M1's for gaming, but is substantially inferior in most any productivity context like media encoding or decoding.

Apple's implementation of the ARM ISA has shown itself to be absolutely superior to every other ARM chip out there. That's why it's making this impact. Just like with their mobile CPUs, Apple has set a bar that other vendors haven't been able to catch up to; they are all 1-3 generations behind Apple's silicon in IPC.

That's not really the issue with the article, though. Intel's i7-1185G7 is a higher-clocked, higher-wattage CPU that goes in higher-priced systems than Apple's M1. Yeah, as in it costs more than systems with the 'Apple tax', and Intel's benchmarks simply do not line up with the hundreds of benchmarks real-world people have shown.

The 'magic' in the M1, if you want to fanboi Apple here, is that it can perform very well for 15+ hours in a MacBook without making a sound. The i7-1185G7 is a 28W CPU. It's not even in the same class as the M1, and it's being presented with demonstrably bad benchmarks.

The M1 is not the fastest CPU on the market (no one seriously thinks it is), but when Intel comes out with bad benchmarks and conveniently skips battery life during those benchmarks, that makes Intel look really bad.
 

spongiemaster

Honorable
Dec 12, 2019
2,364
1,350
13,560
Simply put, most companies didn't want to have to either get new software or rewrite their existing software to be able to run on Itanium. Yes, Itanium had an x86 emulation mode, but it was slower than x86 software running on an x86 CPU.

Exactly. Developers don't want to have to rewrite all their code for a new platform. No real surprise there, which is why you have to force them to switch by not giving them other alternatives. This is at least Apple's third dump-everything-and-start-from-scratch architecture switch (Motorola -> IBM Power -> Intel x86 -> ARM). Developers are still writing software for Apple because they don't have a choice if they want to keep selling software on Apple systems. AMD's x86-64 band-aid gave developers the out they needed to not have to write new software for IA-64, which killed any chance it had of replacing x86. I wouldn't endorse the wipe-and-start-over every 10 years that Apple does, but 25-30 years is a good enough run, and it's time to be looking at what legacy support can be ended and what clean slate we should be starting with again.

And quite frankly, M1 destroys jaguar CPUs in every single 'cpu' way, completely. The xbox has a discreet GPU which is maybe in the realm of the M1's for gaming but is substantially inferior in most any productivity context like media encoding or decoding.

The Xbox One does not have a discrete GPU. It's an almost 8-year-old 28nm APU that you're admitting is probably equivalent to Apple's brand new 5nm APU, outside of encoding/decoding hardware for codecs that weren't used 8 years ago. Awesome work, Apple.
 

watzupken

Reputable
Mar 16, 2020
1,181
663
6,070
You're still not getting it. Not going to keep beating a dead horse here. Also, Apple has been making ARM-based CPUs for years. Stop acting like they just dropped this out of nowhere with no previous knowledge. Being the only CPU on the most advanced node currently available, one that another company developed, certainly gives the M1 an advantage against the competition as well.
This is partially true. In my opinion, the reason for Apple's success is their tight hardware and software integration, which almost no other company gets to enjoy. Looking at the software sales on Apple's App Store, it gives software developers a big incentive to also optimize their software for Apple's hardware.

I don't think Intel's 10nm is that far away from TSMC's 5nm, since node naming is misleading.
 

watzupken

Reputable
Mar 16, 2020
1,181
663
6,070
Exactly. Developers don't want to have to rewrite all their code for a new platform. No real surprise there, which is why you have to force them to switch by not giving them other alternatives. This is at least Apple's third dump-everything-and-start-from-scratch architecture switch (Motorola -> IBM Power -> Intel x86 -> ARM). Developers are still writing software for Apple because they don't have a choice if they want to keep selling software on Apple systems. AMD's x86-64 band-aid gave developers the out they needed to not have to write new software for IA-64, which killed any chance it had of replacing x86. I wouldn't endorse the wipe-and-start-over every 10 years that Apple does, but 25-30 years is a good enough run, and it's time to be looking at what legacy support can be ended and what clean slate we should be starting with again.
In my opinion, I won't say the software developers don't have a choice here. I believe they sell relatively well in Apple's ecosystem, which gives them the incentive to make those changes. Also, if you consider that Apple's hardware is not as fragmented as what we find on other OSes, it actually should somewhat streamline their porting effort. Not great, but still easier.
 

Ogotai

Reputable
Feb 2, 2021
411
254
5,060
Itanium ran from 2001 to 2019... 18 years is hardly a failure. AMD's Athlon was produced for far less time, and so was FX.
It had a small adoption rate, but if that's enough to call it a failure, then anything from AMD would be called a failure too.
from https://en.wikipedia.org/wiki/Itanium#Market_share :
" By 2006, HP manufactured at least 80% of all Itanium systems, and sold 7,200 in the first quarter of 2006.The bulk of systems sold were enterprise servers and machines for large-scale technical computing, with an average selling price per system in excess of US$200,000. A typical system uses eight or more Itanium processors.
By 2012, only a few manufacturers offered Itanium systems, including HP, Bull, NEC, Inspur and Huawei. In addition, Intel offered a chassis that could be used by system integrators to build Itanium systems.
By 2015, only HP supplied Itanium-based systems. With HP split in late 2015, Itanium systems (branded as Integrity) are handled by Hewlett-Packard Enterprise (HPE), with recent major update in 2017 (Integrity i6, and HP-UX 11i v3 Update 16). HPE also supports a few other operating systems, including Windows up to Server 2008 R2, Linux, OpenVMS and NonStop
. "

Considering HP was pretty much the only major supplier of Itanium-based servers for most of its life, it indeed could be labeled a failure, as it wasn't adopted by anyone other than HP. But HP kind of had to push it, as they were also part of its development.

https://www.extremetech.com/computi...anium-9700-series-cpu-finally-officially-dies
" Of course, technical reasons aren’t the only reason why Itanium failed. The chips were expensive, difficult to manufacture, and years behind schedule. Intel made a high-profile declaration that Itanium was the future of computing and represented its only 64-bit platform. Then AMD announced its own AMD64 instruction set, which extended 64-bit computing to the x86 architecture. Intel didn’t change course immediately, but it eventually cross-licensed AMD64. This was a tacit admission that Itanium would never come to the desktop. Then, a few years ago, a high-profile court case between Oracle and HP put a nail in Itanium’s coffin. Today’s chips are the very definition of a contract fulfillment, with no improvements, cache tweaks, or other architectural boosts. "
While you say Itanium was made until 2019, as the ExtremeTech article states, "Today's chips are the very definition of a contract fulfillment, with no improvements, cache tweaks, or other architectural boosts." So Intel HAD to keep making Itanium-based CPUs, and they did, while barely updating them (the 9740 is literally the same chip as the 9540, with the exact same clock speed). So out of 18 years, the last few years of production happened only because of contracts.

AMD's Athlon was produced for far less time, and so was FX.
The SAME thing could be said about the Pentium, Pentium Pro, Pentium II, and Pentium III. What's your point?
And Athlon and FX were brand names, NOT architectures.

AMD's x86-64 band-aid gave developers the out they needed
And there, in essence, is the jab at AMD that I was expecting. I don't think I have seen you say anything positive about AMD yet. While Intel was trying to start from a clean slate with IA-64, it SHOULD have realized the pushback it would get, and tried to at least get x86 to run better than it did on Merced; in essence, do what AMD64 did: run 32-bit code well enough that it would have given companies time to keep old software while they migrated to IA-64. AMD saw this and took advantage of it. It works with the Apple ecosystem probably only because it is such a closed ecosystem, and, as watzupken said, that is probably part of the reason as well.
 
  • Like
Reactions: Krotow
It would be ludicrous to think Apple's new M1 architecture could fully compete across a wide spectrum of use cases with Intel/AMD, whose architectures (and total software library) have had years to mature.
Regardless, the more players competing, the better... it drives innovation and keeps pricing in better(!) check. That is what we all want...?

ARM has been making CPU designs for 30 years. Apple has spent about 10 years making ARM-based chips, as their first chip was sold in 2013 and it takes about 3 years to design a CPU. The ARM ISA uses fixed-length instructions, so the decoder is going to be a simpler design than Intel's. The low-power chips are where ARM will easily shine; it's the high-power desktop/server parts that are going to be a real battle.
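To illustrate the decoder point with a toy sketch (the encodings below are made up, not real ARM or x86): with fixed-length instructions, every boundary is known up front and N instructions can be decoded in parallel, while with variable-length instructions each boundary depends on at least partly decoding the bytes before it.

```c
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Fixed-length ISA: every instruction is 4 bytes, so the boundaries of all
 * instructions are known immediately, with no decoding required first. */
static void boundaries_fixed(size_t code_len)
{
    for (size_t pc = 0; pc < code_len; pc += 4)
        printf("insn at %zu\n", pc);
}

/* Variable-length ISA: the length depends on the bytes themselves (made-up
 * rule here: the first byte encodes the length). Each boundary is known only
 * after inspecting the previous instruction, which serializes the work and
 * costs extra hardware to do speculatively. */
static void boundaries_variable(const uint8_t *code, size_t code_len)
{
    size_t pc = 0;
    while (pc < code_len) {
        size_t len = code[pc];          /* must decode to find the length */
        printf("insn at %zu, %zu bytes\n", pc, len);
        pc += len;
    }
}

int main(void)
{
    uint8_t var_code[] = {2, 0, 5, 0, 0, 0, 0, 1, 3, 0, 0};
    boundaries_fixed(12);
    boundaries_variable(var_code, sizeof var_code);
    return 0;
}
```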
 

waltc3

Honorable
Aug 4, 2019
454
252
11,060
It's hard to know which company is the bigger BS'er--Intel or Apple...;) I used to think it was Apple, hands down. But ever since AMD leapt ahead of Intel like Intel was sitting still (which is not far from the truth), Intel has gone hog wild with the BS--laying it on thick everywhere. So Intel is definitely right on Apple's heels as the industry's biggest BS'er...;)

Why Intel is even wasting its breath on Mac users is beyond me--and the "productivity" benchmarks have everyone laughing uproariously. I'm sorry, but dragging a bunch of old farts out of retirement--old farts who were high up in the company when Intel had no peers--is not going to solve Intel's problems. Neither will spurious benchmarks. Itanium is long dead--good riddance.

Apple is not a CPU company--will never be a CPU company, ever. Poor Intel, falling so far behind AMD has really confused the company--what a sad showing.
 
  • Like
Reactions: Conahl
Feb 7, 2021
12
1
15
TBH, I doubt someone in the Apple ecosystem would ever be able to shift back to Windows. Windows is like an expensive Android that will eventually start lagging after 1-2 years, no matter how costly it is.

I shifted to the cheapest MacBook Air in 2015. The whole interface and performance are far better than today's $1,500-range Windows laptops.

Yes, Apple is way too costly, but it's worth every penny. The best part: my 5-year-old system still gives me around 8-10 hours of battery life even after exhaustive usage. I even use it for 1080p editing in Premiere Pro. Not buttery smooth, but it still works better than any other Intel PC in the same price range.

Thus, I highly doubt the Intel benchmarks.
 
Oct 2, 2021
1
0
10
TBH, I doubt someone in the Apple ecosystem would ever be able to shift back to Windows. Windows is like an expensive Android that will eventually start lagging after 1-2 years, no matter how costly it is.

I shifted to the cheapest MacBook Air in 2015. The whole interface and performance are far better than today's $1,500-range Windows laptops.

I disagree. I am both an iPhone and a Windows user, and I never liked macOS. Windows gets leaner, not more bloated.
For me, Mac interfaces still look like they were designed by teens for teens, or by cartoonists for cartoon lovers.
It's just a bit better than it was in the QuickTime player era a while back. A dark mode doesn't change bleep. Programs only look better in fullscreen. This is certainly the reason I only tolerate Apple programs on my iPhone. Why would I want to see my notifications from the iPhone on the desktop too? Does it make me more informed, more cool?
I don't like MacBooks' keyboards. People seem to love them, but for me, typing on them is like typing on paper-thin Mahjong tiles. This is a thing Apple could 'borrow' from Microsoft. The latter knows a thing about surfaces :/ and their keys are state-of-the-art. I've typed better on no-name cheap keyboards than on that macbomination.
The last MacBook Pro I tried had an i7, mind you, and also a Radeon card, and it got hot and huffed a lot. Apple is spot on with swapping the architecture. Intel chips on the desktop consume peanuts in watts when they step down, and they do. Recent ones are implemented poorly (or outright suck) in Windows laptops; the last ones I tried were unnervingly loud, and many buyers felt the same. It's not like Intel has been making chips for mobile devices all this time, though, and some of the models they made before the arrival of the M1 weren't bad at all. It's just that with something like the M1, there is now something to compare them to. So there it is:

The M1 is a game changer. But I actually want it to run Windows, so I could run whatever crosses my mind and not be bothered by alienating design choices. Windows on Intel has not once crashed on me since I put my machine together (what more can one wish for in terms of optimisation?), while lots of M1 owners complain of crashes :(
This last point I make in Intel's defense, because while they might be releasing misleading benchmarks (or, simply put, ads), their merchandise doesn't crash. And thank God it no longer has to run that OS; even though doing so would have brought them some of Apple's money, in the long run the abuse the devs would put it through and the mistreatment from users would leave it with indelible marks.

So, while Windows users do defect to the Mac, I would not.