News Apple Details First Arm Chips for Mac

javiindo

Commendable
Jun 12, 2019
66
7
1,535
0
I understand now why all the new AMD APUs still use the Vega GPU architecture: Apple wanted to be the first with an RDNA GPU. Otherwise, if the specs are true, this is very impressive. Another hit to Intel (and to AMD, by the way).
 
Reactions: Kiril1512

gggplaya

Distinguished
I don't believe any of their claims. Also, they can't compete in the pro line...
Actually they can in niche categories, like photographers and videographers. As long as they can get Adobe, DaVinci, and similar programs to integrate their software with the M1 chip's hardware accelerators, then yes, they can compete. Art and design is still where Apple has its biggest market share in the professional market.

They might also pick up some business people. Let's face it, most business people only need email, Skype-style meeting apps, office apps, and web browsing, which the M1 chip will handle just fine. They don't care what type of processor is in it. But the reason Apple might pick up market share here is the insane battery life on these laptops. When you're on a business trip or in all-day meetings, battery life matters!
 

nofanneeded

Respectable
Sep 29, 2019
1,562
251
2,090
36
Actually they can in niche categories, like photographers and videographers. As long as they can get Adobe, DaVinci, and similar programs to integrate their software with the M1 chip's hardware accelerators, then yes, they can compete. Art and design is still where Apple has its biggest market share in the professional market.
No they can't. The Pro line could use RDNA2 chips from AMD with something like 20 TFLOPS of performance; there is no way they can compete against that.

Also, they are asking a lot for an ARM-chip notebook. No one with a single brain cell would pay the prices they are demanding for the Pro line with that cheap ARM chip. The best ARM chips are cheap, in the $100-$150 price range, unlike Intel's $250-$450 range.

Moreover, it is clear their ARM chip maxes out at 16GB of RAM, which is low for professional work in 2020. Keep in mind that this is unified memory, so with something like 8GB going to the GPU, you are left with only 8GB for the system.

This is a JOKE...
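The worry in the post above is just budget arithmetic, which can be written out. A minimal sketch, with purely illustrative numbers (macOS actually allocates unified memory dynamically between CPU and GPU, not as a fixed split, and the OS overhead figure here is a made-up placeholder):

```python
# Back-of-the-envelope unified-memory budget. Illustrative numbers
# only: macOS shares the pool between CPU and GPU dynamically, so
# there is no fixed split like this in reality.

TOTAL_GB = 16

def left_for_apps(gpu_gb, os_gb=2):
    """What remains for applications after a hypothetical GPU working
    set and OS overhead are carved out of the shared 16GB pool."""
    return TOTAL_GB - gpu_gb - os_gb

print(left_for_apps(gpu_gb=8))  # → 6
```

The point being argued either way: on a unified-memory machine, GPU-heavy workloads and application memory come out of the same pool.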
 
Last edited:

Murissokah

Distinguished
Aug 12, 2007
1,317
21
19,665
135
I'm curious as to how well Apple can execute the x86-to-ARM transition. Microsoft tried it with Windows RT with dismal results. Apple has a more controlled environment, but it's still a huge task. People who buy a very expensive laptop might not react well if they keep hitting compatibility walls, and if it takes too long to fix that, the product's reputation will already be burnt.
 

gggplaya

Distinguished
No they can't. The Pro line could use RDNA2 chips from AMD with something like 20 TFLOPS of performance; there is no way they can compete against that.

Also, they are asking a lot for an ARM-chip notebook. No one with a single brain cell would pay the prices they are demanding for the Pro line with that cheap ARM chip. The best ARM chips are cheap, in the $100-$150 price range, unlike Intel's $250-$450 range.

Moreover, it is clear their ARM chip maxes out at 16GB of RAM, which is low for professional work in 2020. Keep in mind that this is unified memory, so with something like 8GB going to the GPU, you are left with only 8GB for the system.

This is a JOKE...
Do you edit photos? You don't need any of the power you're speaking of for that. Being a "PROFESSIONAL" only means that you get paid for the work you do. It doesn't mean you need the biggest, baddest, most powerful equipment possible, especially at the expense of battery life. When you're out, away from the office, actually doing your work, sometimes battery life with more-than-adequate power is what you need to actually get your work done, not an ultra-beast of a computer. As long as the photo software is quick and snappy, photographers don't care what's under the hood. If they start to notice lag as they move through libraries or click to expand a photo, then yes, they care what's under the hood.

It remains to be seen whether any video-editing bottlenecks exist, but if the hardware-acceleration integration is good, these computers will be fine. I personally wouldn't use one for professional work because I think CPU encoding is still far cleaner than hardware-accelerated encoding, especially AMD's flavor of VCE, which is inferior to Nvidia's Turing NVENC.
 

nofanneeded

Respectable
Sep 29, 2019
1,562
251
2,090
36
Do you edit photos? You don't need any of the power you're speaking of for that. Being a "PROFESSIONAL" only means that you get paid for the work you do. It doesn't mean you need the biggest, baddest, most powerful equipment possible, especially at the expense of battery life. When you're out, away from the office, actually doing your work, sometimes battery life with more-than-adequate power is what you need to actually get your work done, not an ultra-beast of a computer. As long as the photo software is quick and snappy, photographers don't care what's under the hood. If they start to notice lag as they move through libraries or click to expand a photo, then yes, they care what's under the hood.

It remains to be seen whether any video-editing bottlenecks exist, but if the hardware-acceleration integration is good, these computers will be fine. I personally wouldn't use one for professional work because I think CPU encoding is still far cleaner than hardware-accelerated encoding, especially AMD's flavor of VCE, which is inferior to Nvidia's Turing NVENC.
It is not only about encoding; it is about real-time editing, which uses the GPU at 100% all the time.

The majority of Mac users buy it for Final Cut Pro X; almost all YouTubers prefer it over Adobe Premiere Pro as well.

You need at least 32GB of RAM for heavy editing in 4K.

And I am talking about the "Pro" notebook line only, which is supposed to be very powerful.

And you are mistaken about AMD acceleration in Final Cut Pro X; it does make a huge difference in real-time editing.
 
I'm curious as to how well Apple can execute the x86-to-ARM transition. Microsoft tried it with Windows RT with dismal results. Apple has a more controlled environment, but it's still a huge task. People who buy a very expensive laptop might not react well if they keep hitting compatibility walls, and if it takes too long to fix that, the product's reputation will already be burnt.
Apple did manage the PowerPC-to-x86 transition well, though in that case they knew the PowerPC inside and out, enough to make a competent compatibility layer/emulator.

In this case, though, since Intel really hates it when someone makes an x86 emulator, it's possible Apple has developed its software stack well enough that it's easy to compile an app for one platform or the other, something akin to how Microsoft achieves Xbox 360 compatibility on the Xbox One.

There's also the question of how many apps in widespread use in Apple's ecosystem come from the App Store versus from a website.
 
Nov 10, 2020
8
1
15
0
I'm curious as to how well Apple can execute the x86-to-ARM transition. Microsoft tried it with Windows RT with dismal results. Apple has a more controlled environment, but it's still a huge task. People who buy a very expensive laptop might not react well if they keep hitting compatibility walls, and if it takes too long to fix that, the product's reputation will already be burnt.
This is the third CPU transition Apple has made in the Mac lineup: from Motorola to PowerPC, to Intel, and now to ARM. They have a deep bench of tools to automatically convert existing apps to the new CPUs with good performance (Rosetta 2, for example). I'm not trivializing the amount of work involved, but I'll be shocked if they have any significant compatibility issues with mainstream applications.
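The general technique behind a tool like Rosetta 2 is ahead-of-time binary translation: convert the program's instructions once, before it first runs, instead of interpreting them on every execution. Rosetta 2's internals are not public, so this is only a toy illustration of that idea; the mnemonics and the mapping table are entirely made up:

```python
# Toy sketch of ahead-of-time binary translation (made-up mnemonics;
# this is NOT how Rosetta 2 is actually implemented, whose internals
# are not public).

# Hypothetical mapping from source-ISA mnemonics to target-ISA ones.
TRANSLATION_TABLE = {
    "MOV": "mov",
    "ADD": "add",
    "SUB": "sub",
}

def translate(program):
    """Translate a whole program once, up front, so no per-instruction
    interpretation cost is paid at run time (unlike a pure emulator,
    which decodes every instruction every time it executes)."""
    out = []
    for op, args in program:
        target_op = TRANSLATION_TABLE.get(op)
        if target_op is None:
            raise NotImplementedError(f"no translation for {op}")
        out.append((target_op, args))
    return out

source = [("MOV", ("r0", 1)), ("ADD", ("r0", 2))]
print(translate(source))  # → [('mov', ('r0', 1)), ('add', ('r0', 2))]
```

The one-time translation cost is why Rosetta-style apps have a slow first launch but run at near-native speed afterwards, which is also why this approach tends to age better than pure emulation did on Windows RT.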
 
Nov 10, 2020
8
1
15
0
Apple did manage the PowerPC-to-x86 transition well, though in that case they knew the PowerPC inside and out, enough to make a competent compatibility layer/emulator.

In this case, though, since Intel really hates it when someone makes an x86 emulator, it's possible Apple has developed its software stack well enough that it's easy to compile an app for one platform or the other, something akin to how Microsoft achieves Xbox 360 compatibility on the Xbox One.

There's also the question of how many apps in widespread use in Apple's ecosystem come from the App Store versus from a website.
Intel can't stop Apple from using Rosetta 2; it's based on x86-64, which Intel doesn't own.
 
Nov 10, 2020
8
1
15
0
No they can't. The Pro line could use RDNA2 chips from AMD with something like 20 TFLOPS of performance; there is no way they can compete against that.

Also, they are asking a lot for an ARM-chip notebook. No one with a single brain cell would pay the prices they are demanding for the Pro line with that cheap ARM chip. The best ARM chips are cheap, in the $100-$150 price range, unlike Intel's $250-$450 range.

Moreover, it is clear their ARM chip maxes out at 16GB of RAM, which is low for professional work in 2020. Keep in mind that this is unified memory, so with something like 8GB going to the GPU, you are left with only 8GB for the system.

This is a JOKE...
This is their ENTRY LEVEL Apple Silicon chip, and it's as fast as anything AMD or Intel offers for laptops. Early next year, the mid-range round of chips will include much higher RAM amounts and run significantly faster with higher power budgets.

And yes, it's an inexpensive chip. Apple is likely making them for less than $100; one big sign of that is the $100 price cut on the Mac mini. Very likely Apple will significantly cut prices on its high-end iMacs and Mac Pros when the high-end Apple Silicon ships in a year or two. Inexpensive, but also powerful.
 
Nov 10, 2020
8
1
15
0
I don't believe any of their claims. Also, they can't compete in the pro line...
AnandTech just ran their own independent benchmarks on the A14, and this is what they had to say.

"Apple claims the M1 to be the fastest CPU in the world. Given our data on the A14, beating all of Intel’s designs, and just falling short of AMD’s newest 5950X Zen3 – a higher clocked Firestorm above 3GHz, the 50% larger L2 cache, and an unleashed TDP, we can certainly believe Apple and the M1 to be able to achieve that claim."

No one is going to be able to compete with Apple in the pro line with second rate chips.
 
Reactions: San Pedro

nofanneeded

Respectable
Sep 29, 2019
1,562
251
2,090
36
AnandTech just ran their own independent benchmarks on the A14, and this is what they had to say.

"Apple claims the M1 to be the fastest CPU in the world. Given our data on the A14, beating all of Intel’s designs, and just falling short of AMD’s newest 5950X Zen3 – a higher clocked Firestorm above 3GHz, the 50% larger L2 cache, and an unleashed TDP, we can certainly believe Apple and the M1 to be able to achieve that claim."

No one is going to be able to compete with Apple in the pro line with second rate chips.

I did not see any benchmarks over there that prove what he said, and that last statement is 100% false; there is no proof of it whatsoever on the AnandTech site...

An 8-core, 10-watt CPU falls short of a 105-watt, 16-core 5950X? What kind of writer wrote this?

His claims are false! 1000%

I will wait for a real review when the notebook arrives.
 
Nov 10, 2020
8
1
15
0
I did not see any benchmarks over there, and that last statement is 100% false; there is no proof of it whatsoever on the AnandTech site...

An 8-core, 10-watt CPU falls short of a 105-watt, 16-core 5950X? What kind of writer wrote this?

His claims are false! 1000%
Their layout is confusing, so I can't be snarky about it.

The quote is on the last (5th) page of their A14/M1 review article:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/5

The benchmarks are on the 4th page:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4

What he's saying there is that the iPhone 12's A14 is faster than all of Intel's CPUs and just behind AMD's 5950X Zen 3 in performance. Since the M1 is based on the A14 design/silicon, but with a higher clock speed, 50% more L2 cache, and a much higher TDP, they believe the M1 will be the fastest CPU in the world. At least for the moment.
 
Nov 10, 2020
8
1
15
0
However, if any of Rosetta 2 is based on the implementation details of Intel's processors, Intel has a case for litigation against Apple. This is something Intel brought up when it wanted to sue Qualcomm and Microsoft over their x86/x64 emulator.
Sure, anyone can sue anyone for anything in the good old USA. But one would think Intel would have sued 14 years ago, when Apple released Rosetta 1 to run Intel apps on PowerPC. Obviously the situation has changed; Apple is going from their most important customer to a competitor, so you might be proven right.
 

kristoffe

Distinguished
Jul 15, 2010
143
2
18,695
1
Actually they can in niche categories, like photographers and videographers. As long as they can get Adobe, DaVinci, and similar programs to integrate their software with the M1 chip's hardware accelerators, then yes, they can compete. Art and design is still where Apple has its biggest market share in the professional market.

They might also pick up some business people. Let's face it, most business people only need email, Skype-style meeting apps, office apps, and web browsing, which the M1 chip will handle just fine. They don't care what type of processor is in it. But the reason Apple might pick up market share here is the insane battery life on these laptops. When you're on a business trip or in all-day meetings, battery life matters!
You're kind of living in the late '90s/early 2000s... Apple does not hold any lead in computing for art.
 
Sure, anyone can sue anyone for anything in the good old USA. But one would think Intel would have sued 14 years ago, when Apple released Rosetta 1 to run Intel apps on PowerPC. Obviously the situation has changed; Apple is going from their most important customer to a competitor, so you might be proven right.
Pretty sure the first version only translated PowerPC to x86.
 

nofanneeded

Respectable
Sep 29, 2019
1,562
251
2,090
36
Their layout is confusing, so I can't be snarky about it.

The quote is on the last (5th) page of their A14/M1 review article:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/5

The benchmarks are on the 4th page:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4

What he's saying there is that the iPhone 12's A14 is faster than all of Intel's CPUs and just behind AMD's 5950X Zen 3 in performance. Since the M1 is based on the A14 design/silicon, but with a higher clock speed, 50% more L2 cache, and a much higher TDP, they believe the M1 will be the fastest CPU in the world. At least for the moment.
Okay, I saw them. I will wait for the real review and application benchmarks, because the benchmark he is using says even a 28-watt Intel i7-1185G7 is the same speed as an Intel i9-10900K...

NO COMMENT. He is just comparing SINGLE-CORE performance!
 

gggplaya

Distinguished
It is not only about encoding; it is about real-time editing, which uses the GPU at 100% all the time.
That's why I used the term "hardware accelerators." I'm not just talking about the final encoding but decoding as well. Hardware acceleration will make thumbnails and previews load fast, and make real-time editing much quicker.

The majority of Mac users buy it for Final Cut Pro X; almost all YouTubers prefer it over Adobe Premiere Pro as well.
Final Cut is an Apple product, which is why I didn't even bother mentioning it. It's very obvious it would be well optimized for Metal and this M1 chip of theirs.

You need at least 32GB of RAM for heavy editing in 4K.

And I am talking about the "Pro" notebook line only, which is supposed to be very powerful.
Yes, I would want 32GB myself, which is why I edit on a PC. But Apple only recommends 8GB minimum for 4K editing, with most people still editing on 16GB of RAM. Now, if you're trying to edit Hollywood movies with hundreds of scenes to put together, then yes, 32GB all the way. But for the lower-end professionals doing weddings and events with only a few dozen scenes, 16GB is fine. Some of that will be shared with the GPU, so we have to wait until this thing is released and people start using it to see if there are any bottlenecks. These "pro" notebooks start at $1,299, which is not really that expensive for a pro notebook; the one I use for work cost over $2K. So this "pro" laptop is more for the lower-end pro, not the higher-grade pro, who will just buy a larger, more expensive laptop with a discrete GPU anyway. To be honest, they won't edit on a laptop anyway, but rather use a desktop like me. Again, it's about the demographic niche these $1,300 laptops are targeting.

And you are mistaken about AMD acceleration in Final Cut Pro X; it does make a huge difference in real-time editing.
You totally missed the point, because clearly you've never heard of macroblocking. I wasn't talking about real-time editing; I'm talking about the final render. GPUs can encode very quickly thanks to techniques like macroblocking. Instead of rendering every frame in its entirety, the encoder splits the frame into a grid of blocks and looks for changes between the previous, current, and following frames. It then copies the blocks that are the same and only renders the changed blocks. This works well for dramas with slow action, but terribly for anything that moves quickly in the scene or for quick panning of the camera: you'll see macroblocking. Nvidia's Turing does it mostly cleanly and is comparable to CPU encoding at medium-to-low-quality presets. AMD's is worse than even Pascal's video encoding, and Intel's is the worst. I only CPU-encode my final output, which makes it very clean; this ARM silicon will choke trying to do a CPU render vs. my Ryzen 3900X.
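The block-comparison idea described above can be sketched in a few lines. This is a toy illustration of the general technique only, not how NVENC, VCE, or any real encoder is implemented; the block size, threshold, and function names are all made up:

```python
# Toy sketch of block-based change detection between frames (not a
# real encoder): flag only the blocks that changed beyond a threshold
# and copy the rest from the previous frame instead of re-encoding.

BLOCK = 4  # real encoders typically work on 16x16 macroblocks

def changed_blocks(prev, curr, threshold=10):
    """Return (block_row, block_col) indices of blocks whose mean
    absolute pixel difference from the previous frame exceeds the
    threshold; all other blocks can simply be copied."""
    h, w = len(curr), len(curr[0])
    dirty = []
    for by in range(0, h, BLOCK):
        for bx in range(0, w, BLOCK):
            diff = sum(
                abs(curr[y][x] - prev[y][x])
                for y in range(by, min(by + BLOCK, h))
                for x in range(bx, min(bx + BLOCK, w))
            )
            npix = min(BLOCK, h - by) * min(BLOCK, w - bx)
            if diff / npix > threshold:
                dirty.append((by // BLOCK, bx // BLOCK))
    return dirty

# A static 8x8 grayscale frame where one bright 4x4 square appears:
# only the top-left block needs re-encoding; the other three copy over.
prev = [[0] * 8 for _ in range(8)]
curr = [row[:] for row in prev]
for y in range(4):
    for x in range(4):
        curr[y][x] = 255

print(changed_blocks(prev, curr))  # → [(0, 0)]
```

A fast pan or action scene dirties nearly every block at once, so almost nothing can be copied; that is exactly the case where cheap hardware encoders start to show the visible macroblocking artifacts described above.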
 

gggplaya

Distinguished
You're kind of living in the late '90s/early 2000s... Apple does not hold any lead in computing for art.
Never said they did have a lead. I said "their" biggest market share of the professional market. It's like me saying desktop CPUs are where AMD has "their" biggest market share of the consumer market: Intel is still leading, but it's where AMD is doing its best.
 
