News Apple M1 Max Catches up to RX 6800M, RTX 3080 Mobile in GFXBench 5

Oct 21, 2021
Lmao, I'm glad I never trust benchmarks; I only trust real-life tests. Apple beating the Nvidia 3080 Mobile in a benchmark? I know about Apple's benchmark tricks, and I know Apple can beat Intel, Qualcomm, AMD, and Nvidia only in benchmarks. The A15 scores far higher than the SD 888 in benchmarks, but it only beats the SD 888 in a few things; everywhere else the SD 888 wins. For me, benchmarks are always a joke.
 
Reactions: artk2219

sycoreaper

Reputable
Jan 11, 2018
If the M1 Max is as fast as the claim and has close to 3080 graphics, this thing is going to be a volcano. Sure hope they have a good cooling design.
 
Reactions: artk2219
If I'm not mistaken, isn't its total TDP 60W?
That brings up another point. Barring the M1 Max's total TDP, the 3080 Mobile has a TDP range of 80W-150W. And this isn't the GPU shifting between power levels. This is the TDP being set by the system builder and the GPU is going to base its cooling/clocks/etc. around that power profile. So are we talking about the 80W 3080 Mobile being tested here or the 150W 3080? If it's the 80W 3080, things get even less impressive.

Either way, this comparison has so many variables in it that it should be tossed out.
 

renz496

Champion
I disagree that they would lose massively. At worst it'd just be a blip on their bottom line.

They don't have to make any new hardware. Or at least they just continue on with their Apple TV line. Make it compatible with Bluetooth controllers and boom, they have a console. If you think this idea is silly, NVIDIA basically made the Android version of this and the best NVIDIA could offer outside of standard Android games was its cloud service (which is on iOS).

The only real problem here iOS doesn't have much in the way of a killer app for gaming and a majority of the games are designed with a touch screen in mind. But the latter could just be a chicken/egg problem.
Will Apple's pride allow such a thing to happen? If they go into the console business, they'll most definitely want to charge top dollar for any hardware associated with their brand. Imagine charging $999 just for the controller's charging dock.
 
Reactions: artk2219

renz496

Champion
Well, I'm not surprised, since Apple has a contract for Imagination Technologies' new GPU IP. They have demonstrated the ability to surpass Intel and AMD iGPUs every now and then using less than half the wattage. The sad thing is they refuse to compete in PC gaming cards, and no third party wants to license their IP to do the same. It's a lot easier than any startup coming up with their own tech. Now that manufacturing has been disrupted, it pretty much doesn't matter unless 3D-printed fabrication makes a breakthrough away from the commercial industry.
Because Imagination doesn't want to spend extra money on software, or on developing strong devrel. Even on mobile, Imagination was notorious for not providing drivers beyond the first release, hence some SoC makers decided to ditch them for ARM Mali even if, hardware-wise, PowerVR is better. Apple wants full control of the chip, down to the software, so they have no issue developing their own driver (this is one thing that I heard did not sit well with Nvidia in the past).
 
Reactions: artk2219

JamesJones44

Great
Jan 22, 2021
That brings up another point. Barring the M1 Max's total TDP, the 3080 Mobile has a TDP range of 80W-150W. And this isn't the GPU shifting between power levels. This is the TDP being set by the system builder and the GPU is going to base its cooling/clocks/etc. around that power profile. So are we talking about the 80W 3080 Mobile being tested here or the 150W 3080? If it's the 80W 3080, things get even less impressive.

Either way, this comparison has so many variables in it that it should be tossed out.
Apple showed two tests, against a 100 watt and a 140 watt version. It supposedly beat the 100 watt version but not the 140 watt one. However, recent CUDA benchmarks have shown up in Geekbench 5, and the numbers look closer to a 3050 Ti than to a 100 watt 3080. That makes it more comparable to a 75 watt TDP part.
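The point about the "3080 Mobile" spanning a whole TDP range can be made concrete with a quick performance-per-watt sketch. All scores below are hypothetical placeholders, not real benchmark results; the shape of the comparison is what matters:

```python
# Rough perf-per-watt sketch (all scores are made-up relative numbers;
# the thread's point is that a "3080 Mobile" spans an 80-150 W TDP range,
# so which configuration was tested changes the comparison entirely).

def perf_per_watt(score: float, tdp_watts: float) -> float:
    """Normalize a benchmark score by the part's configured TDP."""
    return score / tdp_watts

# (hypothetical relative score, configured TDP in watts)
gpus = {
    "RTX 3080 Mobile @ 80 W":  (100.0, 80.0),
    "RTX 3080 Mobile @ 150 W": (135.0, 150.0),  # faster, but less efficient
    "M1 Max GPU (~60 W?)":     (70.0, 60.0),
}

for name, (score, tdp) in gpus.items():
    print(f"{name}: {perf_per_watt(score, tdp):.2f} points/W")
```

Same GPU name, very different efficiency depending on the power profile the system builder picked, which is why the comparison needs the TDP stated up front.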
 
Apple showed two tests, against a 100 watt and a 140 watt version. It supposedly beat the 100 watt version but not the 140 watt one. However, recent CUDA benchmarks have shown up in Geekbench 5, and the numbers look closer to a 3050 Ti than to a 100 watt 3080. That makes it more comparable to a 75 watt TDP part.
Being closer to the 3050 Ti in performance makes more sense to me. And even setting aside the API issue with GFXBench, I'm still questioning how much of a graphics load it actually throws at the GPUs.

This would be like saying the Intel Iris Xe GPU in my laptop matches a 3060 in a Source engine game because it can get 200+ FPS or something (never mind that Source has a 300 FPS cap by default).
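The frame-cap point can be sketched with toy numbers (both throughput figures below are hypothetical, purely for illustration):

```python
# Toy illustration (hypothetical numbers): a frame-rate cap hides real GPU headroom.
# If a benchmark caps at 300 FPS, any GPU whose uncapped throughput exceeds the
# cap reports the same score, so the measured ratio collapses toward 1:1.

CAP = 300.0  # FPS cap, e.g. a Source-style default frame limit

def measured_fps(uncapped_fps: float, cap: float = CAP) -> float:
    """FPS the benchmark actually reports under the cap."""
    return min(uncapped_fps, cap)

iris_xe_true = 320.0   # hypothetical uncapped throughput
rtx_3060_true = 900.0  # hypothetical uncapped throughput

true_ratio = rtx_3060_true / iris_xe_true  # ~2.8x real gap
capped_ratio = measured_fps(rtx_3060_true) / measured_fps(iris_xe_true)  # 1.0

print(f"true gap: {true_ratio:.1f}x, measured gap under cap: {capped_ratio:.1f}x")
```

Both GPUs hit the cap, so the capped benchmark reports them as equals even though one is nearly three times faster, which is exactly why a light or capped workload can't rank high-end parts.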
 
Jan 16, 2021
Because Imagination doesn't want to spend extra money on software, or on developing strong devrel. Even on mobile, Imagination was notorious for not providing drivers beyond the first release, hence some SoC makers decided to ditch them for ARM Mali even if, hardware-wise, PowerVR is better. Apple wants full control of the chip, down to the software, so they have no issue developing their own driver (this is one thing that I heard did not sit well with Nvidia in the past).
There's always third-party development for drivers. It's the GPU IP itself that's harder to come by, and it shouldn't be passed up, as there are no other viable alternatives at this point.
 

TCA_ChinChin

Honorable
Feb 15, 2015
Even if the GPU on the M1 Max only matches the lowest-binned, most wattage-constrained RTX-3080 in applications, it would still be a pretty big achievement. I'll be waiting for some geniuses to get Linux to work on these puppies, hopefully with native GPU acceleration.
 
Reactions: King_V

hushnecampus

Prominent
Sep 16, 2020
These specs are all well and good, but they won't change anything regarding the top publishers bringing their games to the Mac. The big publishers have so much invested that some big-budget games are now pushing half a billion dollars. Those games go to the platforms with the largest audiences, and those platforms are the consoles and PCs running Windows.

If Apple were serious about wanting those big-budget AAA games, they'd need to develop middleware to port those games to hardware running Apple Silicon. It can be done, but it would require opening their wallet and then convincing the top publishers why it makes business sense to bring their games to the Mac. So far their only interest is on the TV side, and their gaming interest is in Apple Arcade.
If Apple wanted to become serious about games they'd need to include the good graphics in the cheaper, non-Pro models.
 

JamesJones44

Great
Jan 22, 2021
Even if the GPU on the M1 Max only matches the lowest-binned, most wattage-constrained RTX-3080 in applications, it would still be a pretty big achievement. I'll be waiting for some geniuses to get Linux to work on these puppies, hopefully with native GPU acceleration.
Latest benches suggest it's on par with a 3050 Ti. While that's not bad, not bad at all, it's not equal to an 80 watt RTX-3080.
 

TCA_ChinChin

Honorable
Feb 15, 2015
Latest benches suggest it's on par with a 3050 Ti. While that's not bad, not bad at all, it's not equal to an 80 watt RTX-3080.
I saw some benchmarks comparing it with older Intel + Radeon MacBook Pros, and it was significantly more performant (to be expected) in video editing workloads using both Adobe and Final Cut. But compared with a high-spec Windows laptop (like an i9 + mobile RTX-3080), it is either behind or roughly equivalent in several workloads (using Adobe + RED footage). I'm guessing that for anything non-native to macOS or not Apple-specific, it'll probably perform like an RTX 3060 or 3050 Ti, but for optimized workloads it'll do a lot better. A little disappointing, but in the end still quite respectable, and I think worthy of the "Pro" moniker.
 
