News: Apple M1 Ultra SoC Cranks Mac Performance with 20-Core CPU, 64-Core GPU

Articles like this are why Apple is facing a lawsuit for misrepresenting the speed of the M1 processors. They never live up to the hype. Apple products are never worth the cost.
 
I'd be more inclined to believe Apple's claim that the M1 Ultra's GPU can achieve RTX 3090 performance if we had more data to compare it with. Apple says it used "industry-standard benchmarks," but never mentions which ones specifically.

Looking at Notebookcheck's entry on the M1 Max, it certainly beats out the competition... in GFXBench. No 3DMark, no SPECviewperf, nothing else that might be considered "industry standard." Plus, when I looked into this a while ago, GFXBench runs Metal for Apple in all of its tests, but only some of those tests have DirectX 12 or Vulkan versions, which makes the comparison even shakier.

Plus, the whole "perf/watt" chart makes no sense to me if taken at face value. Why should I care about anything other than the part's performance at its maximum rating? If I wanted to consume less power, I'd look for a part specifically designed to consume less power.
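To put hypothetical numbers on that (invented purely for illustration, not Apple's or Nvidia's figures), perf/watt and peak performance answer two different questions:

```
# Invented numbers, illustration only:
#   part A: 100 "relative performance" units at  60 W
#   part B: 160 "relative performance" units at 320 W
echo "scale=2; 100/60"  | bc   # A: ~1.66 perf per watt
echo "scale=2; 160/320" | bc   # B:  0.50 perf per watt
# A wins the perf/watt chart, but B is still the faster part flat out.
```

So a chart that cuts the comparison off at a fixed power level can be technically true and still say nothing about which part is faster at full tilt.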
 
I don't care how good it is as long as it comes with a ridiculous premium and macOS. Take these good, power-efficient CPUs, pair them with Nvidia laptop GPUs, and make a great gaming laptop with Windows 10.
 
I don't care how good it is as long as it comes with a ridiculous premium and macOS. Take these good, power-efficient CPUs, pair them with Nvidia laptop GPUs, and make a great gaming laptop with Windows 10.

Are you that scared of macOS and other Unix/Linux-based operating systems? These days the OSes are all basically the same from a feature point of view, with just a different GUI slapped on top. Other than games, Windows really doesn't offer anything over the other operating systems. I haven't seen a productivity/developer tool made for Windows only in about a decade.
 
Either Apple is greatly inflating its numbers and prowess, or the other specialized companies (Intel, Nvidia, AMD), with lots of experience and years of upgrading and investing in their chips, are badly underperforming or especially incompetent... I tend to believe Apple sells clouds to its followers. From afar, they look huge...
 
Kind of hard to say how good it is when they aren't giving any information on how much power it uses.
It's in the x-axis of this chart:
[Apple's chart: relative performance vs. power consumption, with power on the x-axis]


Are you that scared of macOS and other Unix/Linux-based operating systems? These days the OSes are all basically the same from a feature point of view, with just a different GUI slapped on top. Other than games, Windows really doesn't offer anything over the other operating systems. I haven't seen a productivity/developer tool made for Windows only in about a decade.
I would argue that on the Linux side, it depends. For one thing, I really don't like most Linux-based OSes' default package management and distribution systems. I tried installing a specific version of Python and it took me something like 6-7 steps before I could finally type "python" in the command line and get going. On Windows? Download installer, install, done.

But if you strip it down to the kernel level stuff, then yeah, I'd agree that for the most part, Windows and UNIX have enough similarities that arguing anything is purely academic.
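For what it's worth, the multi-step dance for a specific Python version on most distros looks something like this. It's just a sketch using pyenv; the exact steps (and whether it's really 6-7 of them) vary by distro:

```
# Rough sketch, assuming pyenv and its build dependencies are acceptable:
curl https://pyenv.run | bash        # install pyenv itself
# ...add the pyenv init lines the installer prints to your shell rc, then:
exec "$SHELL"                        # reload the shell so the pyenv shims are on PATH
pyenv install 3.4.10                 # build the exact interpreter version you need
pyenv global 3.4.10                  # make it the default "python"
python --version                     # should now report Python 3.4.10
```

Which is indeed a lot more ceremony than "download installer, install, done."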
 
Even Intel would be ashamed by the amount of nonsense in those slides.

"Our integrated graphics are faster than a 3090 while drawing 100 watts!"
"Oh wow! What are they faster at?"
"You know... things. <_< "

No doubt they could be faster at certain workloads that utilize specific baked-in hardware features, like if their chip has hardware support for encoding a particular video format, while another does not have the same level of support for it, and must utilize general-purpose hardware to perform the task. But I really doubt we're going to see overall graphics performance anywhere remotely close to 3090 in that chip.
 
So I wanted to dig into the GPU part some more...

On Apple's press release they have this in their footnotes:
Testing was conducted by Apple in February 2022 using preproduction Mac Studio systems with Apple M1 Max, 10-core CPU and 32-core GPU, and preproduction Mac Studio systems with Apple M1 Ultra, 20-core CPU and 64-core GPU. Performance was measured using select industry‑standard benchmarks. Popular discrete GPU performance data tested from Core i9-12900K with DDR5 memory and GeForce RTX 3060 Ti. Highest-end discrete GPU performance data tested from Core i9-12900K with DDR5 memory and GeForce RTX 3090. Performance tests are conducted using specific computer systems and reflect the approximate performance of Mac Studio.
If I'm reading this correctly, they compared the M1 Ultra GPU to the M1 Max GPU in one line and the RTX 3090 against an RTX 3060 Ti on the other.

This is really looking more like a jab at NVIDIA.
 
CPU. Plus GPU. Plus 128 gigs of memory. 114 billion transistors. One chip?

If this delivers then let Apple bask in the well-earned glory.

But gut feeling has questions about production yields, and whether they might be trying to do too much.
 
Are you that scared of macOS and other Unix/Linux-based operating systems? These days the OSes are all basically the same from a feature point of view, with just a different GUI slapped on top. Other than games, Windows really doesn't offer anything over the other operating systems. I haven't seen a productivity/developer tool made for Windows only in about a decade.
I have a MacBook for photo and video editing, and you are right that most productivity apps are available for both. But macOS is slim on games. I and many others heavily prefer Windows for various reasons.
 
Even Intel would be ashamed by the amount of nonsense in those slides.

"Our integrated graphics are faster than a 3090 while drawing 100 watts!"
"Oh wow! What are they faster at?"
"You know... things. <_< "

No doubt they could be faster at certain workloads that utilize specific baked-in hardware features, like if their chip has hardware support for encoding a particular video format, while another does not have the same level of support for it, and must utilize general-purpose hardware to perform the task. But I really doubt we're going to see overall graphics performance anywhere remotely close to 3090 in that chip.
Agreed. The M1 SoC has a Neural Engine (AI accelerator) and an image processor on the die. The load they're benchmarking could be an image AI workload, or it could be one that otherwise favors an SoC over a discrete GPU.
 
Smoke and mirrors, nothing more. And at a six-grand price? Not happening. This will be reserved for people who love to waste money.
 
99% more this and 4.7x more that. Bah, nothing will beat the $$$ markup versus comparably specced parts from the competition.

Seriously though, I recently built a 12900K/3090 system (and every other part was a top-tier halo part) and the total was $4K.

Money aside, I do think Apple is doing some amazing engineering with the M1 line. There's just not much I can do with one on a daily basis; from gaming to work, everything is on Windows.
 
Pretty easy to make wild and baseless claims when you know no one can run a benchmark and do a real comparison.

I can do it, too. Phenom II 3-core is 60% faster than M1 Ultra across a variety of workloads. Prove me wrong.
 
I would argue that on the Linux side, it depends. For one thing, I really don't like most Linux-based OSes' default package management and distribution systems. I tried installing a specific version of Python and it took me something like 6-7 steps before I could finally type "python" in the command line and get going. On Windows? Download installer, install, done.

But if you strip it down to the kernel level stuff, then yeah, I'd agree that for the most part, Windows and UNIX have enough similarities that arguing anything is purely academic.

Brew or apt-get python. Done.
 
Brew or apt-get python. Done.
Yes, for whatever's on the repo. But if I need a specific version of Python, then I have to go through a lot of steps to get it installed.

I think in the case where I had to go through this, I wanted to use Python 3.4, but the repo had 3.6. And yes, it had to be that specific version, because what I was targeting was on a closed network and 3.4 was the earliest version I was guaranteed to find on a given computer.

EDIT: Before you go "but I can sudo apt install [app]=[version]", the repo may not have the version I need and I don't expect repos to host more than 2-3 versions of the app.
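For anyone curious, it's easy to check how thin the selection is before committing. These are standard apt commands; the package name and version string here are only examples:

```
# See every version of a package the configured repos actually carry:
apt list -a python3.9 2>/dev/null    # or: apt-cache madison python3.9
# Pin an exact version, but only one that actually appeared above:
sudo apt install python3.9=3.9.2-3   # version string is just an example
```

Usually only one or two versions show up, which is exactly the problem when you need something older than whatever the distro currently ships.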
 