News: Apple debuts M4 processor in new iPad Pros with 38 TOPS on neural engine

abufrejoval

Reputable
Jun 19, 2020
361
247
5,060
And just what good will all that AI do?

Can it do the dishes?

Can you have a meaningful conversation with it?

Judging from what I get from the various Llamas on my RTX 4090, which has plenty more power going in and still produces garbage coming out, all those transistors are a total waste of sand.
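For reference, this is roughly the kind of setup I mean, a minimal sketch using the llama-cpp-python bindings; the model file name and settings are placeholders, not a recommendation:

```python
# Minimal local-LLM sketch with the llama-cpp-python bindings.
# The model path is a placeholder; any GGUF-format Llama build works.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder file name
    n_gpu_layers=-1,   # offload every layer to the GPU (e.g. an RTX 4090)
    n_ctx=4096,        # context window
)

out = llm("Q: Can you do the dishes?\nA:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```

All that hardware, and the quality of what comes back is still the whole argument.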
 
  • Like
Reactions: artk2219

kealii123

Great
Nov 3, 2023
60
34
60
So, TL;DR: for $1,000 the 11-inch iPad will have roughly double the performance of my Steam Deck, but in a fanless design, probably staying under 10 watts sustained load. The screen will be as good as my flagship OLED phone, and their OS isn't full of adware, spyware, bloatware, and other nonsense.

I hate Apple. I hate their OSes. I love using my own software, or at least my own choice of software, like the Brave browser, and running desktop software on a tablet (like Fusion 360, IDEs, etc.). But man, RIP to Intel and AMD. Apple silicon is crushing it, and Windows 11 is now worse than macOS. Even if Qualcomm comes through with all their promises, that's still a chip that's one generation behind Apple AND is married to Windows and Microsoft. It's infuriating.

I have AMD, Intel, and Qualcomm Windows devices, and my work machine is an M1 MacBook. I've used all of them. Hats off to Apple; for everything but gaming it's not even a choice.
 
  • Like
Reactions: NinoPino
M silicon is largely wasted in an iPad. Most of these end up as media consumption devices, expensive but trendy POS terminals, or, wrapped in an industrial case, inventory/inspection terminals. A three-generation-old A series does all that fine. If they want the iPad to do more and actually be relevant, they need to fix the software: make it more like macOS, or allow it to run macOS. I did read that Apple is working towards on-device AI models, so perhaps this is heading towards an even worse HomePod/Siri experience, but on a tablet? Sounds awful to me.
 

The Hardcard

Distinguished
Jul 25, 2014
31
41
18,560
And just what good will all that AI do?

Can it do the dishes?

Can you have a meaningful conversation with it?

Judging from what I get from the various Llamas on my RTX 4090, which has plenty more power going in and still produces garbage coming out, all those transistors are a total waste of sand.
No. AI is going to transform how people use computers in key ways. In five years, most people will insist on AI accelerators. What you have today is AI in its infancy; Llama 3 is like the Commodore PET / Apple I era: hobbyists feeling their way around.

Polished AI that will do complex, tedious tasks will arrive inside of three years and will transform all aspects of computer and device work. I'm astonished by the number of tech enthusiasts who are not seeing the biggest technological transformation of their lifetime.

It should be clear that by 2030 there will be no need to learn Excel, AutoCAD, DaVinci Resolve, VS Code, Photoshop, PowerPoint, or myriad other apps. Today, people take classes and get certificates to show they can do highly complex, technical work using professional applications. Mature AI apps will allow people who know nothing about today's applications to get better results.

We are going to be able to just tell the devices what we want and have the results provided without thinking or even knowing about cells, formulas, filters, grids, macros, formatting, layers, plugins, coding, and the like.
 

The Hardcard

Distinguished
Jul 25, 2014
31
41
18,560
M silicon is largely wasted in an iPad. Most of these end up as media consumption devices, expensive but trendy POS terminals, or, wrapped in an industrial case, inventory/inspection terminals. A three-generation-old A series does all that fine. If they want the iPad to do more and actually be relevant, they need to fix the software: make it more like macOS, or allow it to run macOS. I did read that Apple is working towards on-device AI models, so perhaps this is heading towards an even worse HomePod/Siri experience, but on a tablet? Sounds awful to me.
Silicon waste is not determined by whether there are people who don’t use its full potential. It would be wasted if nobody used it. You could argue that there are people wasting their money by getting more compute than they need just to have a halo device. To be sure, Apple makes extra billions off of those people.

But millions of iPad M4s will often be maxed out, and hundreds of thousands will be regularly maxed out. The M4 is not nearly the end of the road for necessary iPad capacity; many people will have a use for the future chips.
 

JamesJones44

Reputable
Jan 22, 2021
714
668
5,760
Given that TOPS isn't a great metric, and that they're touting a 50% performance increase over a two-generation-old CPU that has two fewer cores (E-cores, but with no SMT that makes a difference), on a node almost half the size of the M2's, this is a lot of "see what we did" without a lot of meat to back it up, IMO.

For all the hype I expected a lot more; this is very much iterative vs. the M3 overall.
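To illustrate why TOPS is slippery, here is a back-of-the-envelope sketch. The MAC counts and clocks below are made-up placeholders, and the INT8-vs-FP16 quoting is an assumption about how headline figures can be derived, not a confirmed Apple spec:

```python
# Back-of-the-envelope: why a headline TOPS number is easy to inflate.
# TOPS = 2 ops per MAC (multiply + add) * MAC units * clock / 1e12.
def tops(mac_units: int, clock_hz: float) -> float:
    return 2 * mac_units * clock_hz / 1e12

# Made-up neural-engine parameters, purely for illustration:
old_ne_fp16 = tops(mac_units=8_192, clock_hz=1.0e9)    # ~16.4 TOPS at FP16
new_ne_fp16 = tops(mac_units=8_192, clock_hz=1.16e9)   # ~19.0 TOPS at FP16

# If the new chip's marketing number is quoted at INT8 instead, the
# per-MAC throughput doubles and the headline doubles with it, even
# though FP16 performance barely moved:
new_ne_int8 = 2 * new_ne_fp16                           # ~38 "TOPS"

print(f"{old_ne_fp16:.1f} vs {new_ne_fp16:.1f} (FP16), {new_ne_int8:.1f} (INT8)")
```

Same silicon either way; the spec-sheet number depends entirely on what precision you quote it at.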
 
  • Like
Reactions: artk2219
Silicon waste is not determined by whether there are people who don’t use its full potential. It would be wasted if nobody used it. You could argue that there are people wasting their money by getting more compute than they need just to have a halo device. To be sure, Apple makes extra billions off of those people.

But millions of iPad M4s will often be maxed out, and hundreds of thousands will be regularly maxed out. The M4 is not nearly the end of the road for necessary iPad capacity; many people will have a use for the future chips.
It's a turn of phrase, nothing more; don't read so far into things. Here it indicates that typical home users will purchase something powerful well beyond their use case simply because it's the "best". I'm sure there are dozens of people who will max these out in a "Pro" capacity. Some may play games that push it to its boundaries; the other several million will purchase something fit for a professional use case, like a full-blown MacBook featuring the same processor, or its PC equivalent. You know, something with actual I/O and a functional keyboard that isn't costing them the equivalent of a second, entry-level iPad. My beef with the iPad is that it is OS-limited; the hardware is capable of so much more. This is a complaint echoed throughout the Appleverse; it's not just me.
 
  • Like
Reactions: NeoMorpheus
It should be clear that by 2030 there will be no need to learn Excel, AutoCAD, DaVinci Resolve, VS Code, Photoshop, PowerPoint, or myriad other apps. Today, people take classes and get certificates to show they can do highly complex, technical work using professional applications. Mature AI apps will allow people who know nothing about today's applications to get better results.
I would bet you X dollars that your claim above will not come to fruition by 2030. You are essentially claiming that a layman with only basic computer-use experience will be able to bust open the aforementioned programs and have the AI do all the work for them, or that AI can produce such things from scratch without the need for said programs. I would guess you are not a programmer...
 
I would bet you X dollars that your claim above will not come to fruition by 2030. You are essentially claiming that a layman with only basic computer-use experience will be able to bust open the aforementioned programs and have the AI do all the work for them, or that AI can produce such things from scratch without the need for said programs. I would guess you are not a programmer...

Yeah, the AI bros are pushing it as the cure for cancer, world hunger, and hair loss. I can tell how much someone knows about the technology based on how they are trying to sell it. "Generative AI" doesn't create anything; it's just copy-pasting content from the humans it stole from. Instead of plagiarizing from one human creator, it's plagiarizing from dozens: using this guy's eyes, that other guy's hair, a third person's background, another person's hands, and so forth. Instead of stealing $1,000,000 from one person, it steals $1 from 1,000,000 people.
 

The Hardcard

Distinguished
Jul 25, 2014
31
41
18,560
I would bet you X dollars that your claim above will not come to fruition by 2030. You are essentially claiming that a layman with only basic computer-use experience will be able to bust open the aforementioned programs and have the AI do all the work for them, or that AI can produce such things from scratch without the need for said programs. I would guess you are not a programmer...

I'm not just saying it will come to fruition by 2030; I'm saying it will be the norm by 2030. Fruition will happen before then. I started programming computers in 1982, though it has never been my main function. Nevertheless, computing has been one of my central passions for more than 40 years.

Sadly, a couple of massive mechanical failures in my trucking enterprise have me currently severely net negative in assets. In fact, I am currently on the brink of total loss. When and if I am able to recover and get back into positive territory, I'd be more than willing to bet it all on that prediction.

I think closely watching what is happening now with Google, Microsoft, OpenAI, Stability AI, Meta, and the like makes the future almost painfully obvious. As I said earlier, the number of tech site commenters who don't see this coming really surprises me. The magnitude of the AI advancement will be equal to all the previous advancements humans have made since the dawn of civilization. I make that statement honestly, and believe there is no hyperbole in that claim.
 
Yeah, the AI bros are pushing it as the cure for cancer, world hunger, and hair loss. I can tell how much someone knows about the technology based on how they are trying to sell it. "Generative AI" doesn't create anything; it's just copy-pasting content from the humans it stole from. Instead of plagiarizing from one human creator, it's plagiarizing from dozens: using this guy's eyes, that other guy's hair, a third person's background, another person's hands, and so forth. Instead of stealing $1,000,000 from one person, it steals $1 from 1,000,000 people.
We have argued in the past about our disagreements on the characterization of what AI is and how it works, and the apparent legalities of using it, but what this guy is saying is ludicrous. I am firmly in the camp that AI is not some cure-all technology, because that is impossible with current AI capabilities. ChatGPT can barely cobble together a competent statement about technical information such as math, programming, or even building a computer, so how is today's, or even 2030's, AI going to be some cure-all?
 
I'm not just saying it will come to fruition by 2030; I'm saying it will be the norm by 2030. Fruition will happen before then. I started programming computers in 1982, though it has never been my main function. Nevertheless, computing has been one of my central passions for more than 40 years.

Sadly, a couple of massive mechanical failures in my trucking enterprise have me currently severely net negative in assets. In fact, I am currently on the brink of total loss. When and if I am able to recover and get back into positive territory, I'd be more than willing to bet it all on that prediction.

I think closely watching what is happening now with Google, Microsoft, OpenAI, Stability AI, Meta, and the like makes the future almost painfully obvious. As I said earlier, the number of tech site commenters who don't see this coming really surprises me. The magnitude of the AI advancement will be equal to all the previous advancements humans have made since the dawn of civilization. I make that statement honestly, and believe there is no hyperbole in that claim.
You cannot make such an extraordinary claim without extraordinary evidence. The only evidence we have today of what AI is capable of is LLMs and some other niche programs, and they all have similar and obvious flaws that not even their programmers can figure out how to fix. AI is definitely a large part of the future as you describe it; just plan on it taking another 30+ years, not six. It's not exactly what we are talking about here, but there is some important context here.
 

JamesJones44

Reputable
Jan 22, 2021
714
668
5,760
I would bet you X dollars that your claim above will not come to fruition by 2030. You are essentially claiming that a layman with only basic computer-use experience will be able to bust open the aforementioned programs and have the AI do all the work for them, or that AI can produce such things from scratch without the need for said programs. I would guess you are not a programmer...
I agree. As someone who works with ML daily: ML can do impressive things, but going from what we have today to being able to run the world is a massive leap that is unlikely to happen in 10 years. I do, however, see lots of uses for what we have today, especially for games and general applications. But being able to run Excel completely for you with zero knowledge of what it's doing... only if you don't mind it bankrupting the company before you know it.
 

The Hardcard

Distinguished
Jul 25, 2014
31
41
18,560
Yeah, the AI bros are pushing it as the cure for cancer, world hunger, and hair loss. I can tell how much someone knows about the technology based on how they are trying to sell it. "Generative AI" doesn't create anything; it's just copy-pasting content from the humans it stole from. Instead of plagiarizing from one human creator, it's plagiarizing from dozens: using this guy's eyes, that other guy's hair, a third person's background, another person's hands, and so forth. Instead of stealing $1,000,000 from one person, it steals $1 from 1,000,000 people.
It is far from just copying and pasting. It can apply human knowledge and experience to unique and novel situations. Of course it is taking from what humans have done; that is basically what it is: concentrated human achievement. What it will be able to do is apply human knowledge to whatever situation you find a need for. Civilization's capability in a box.

How is that not an incredible revolution? A tool that gives you access to all the capabilities and achievements of all of humanity is not transformative? Most people who spend years studying in college or technical schools are doing it because they want results. Nearly everyone who pays someone who has done that studying in some field definitely just wants results. Soon you will be able to hand the device the data you care about and get those results applied to your specific needs and desires.

The world is going to be completely different, and obviously the current economy won't be able to handle it. But people will flock to it once it crosses past these early, rudimentary stages.
 

dimar

Distinguished
Mar 30, 2009
1,045
64
19,360
How can Apple even think about comparing x86 to M stuff? With x86 I can run thousands of apps and games from as far back as the '80s. On iPad, the app will be abandoned in a few years and forever disappear from the App Store.
 
  • Like
Reactions: artk2219

usertests

Distinguished
Mar 8, 2013
529
492
19,260
And yet, that's what most people will want because Pro simply means the best now. It's more of a status symbol than anything.

As it turns out, you get fewer cores enabled on the entry-level models. So there is your $999 media consumption device.
 
  • Like
Reactions: artk2219

newtechldtech

Notable
Sep 21, 2022
307
115
860
How can Apple even think about comparing x86 to M stuff? With x86 I can run thousands of apps and games from as far back as the '80s. On iPad, the app will be abandoned in a few years and forever disappear from the App Store.

Yea... thousands of games that you don't have the time to play before you die... The iPad is not for games; it is for productivity and art and editing, and it's the best choice there, with zero competition.

If you want a gaming tablet, get a Nintendo Switch.
 
  • Like
Reactions: NinoPino

JamesJones44

Reputable
Jan 22, 2021
714
668
5,760
Most people who spend years studying in college or technical schools are doing it because they want results. Nearly everyone who pays someone who has done that studying in some field definitely just wants results. Soon you will be able to hand the device the data you care about and get those results applied to your specific needs and desires.
It doesn't really work like that. ML/LLMs are not going to invent new science to, say, cure cancer simply by being fed data points. They do not have that capability; they do not have cognitive abilities. They can mimic reasoning, but they can't think. ML today requires a human to train it by programming algorithms and then telling it to train on x data.

While that is big, it's not replacing a college student or college research anytime soon. It can certainly augment those activities and make them way more efficient/productive, but it's not a replacement. I don't think we will get to replacing humans this decade, and maybe not even the decade after that.
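For anyone curious what "a human programs the algorithm and points it at data" actually looks like, here's a toy sketch (NumPy only, with made-up data; a real pipeline is this times a thousand):

```python
# Toy supervised learning: every step below is a human decision —
# the data, the labels, the model, the loss, and the training loop.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                # the "x data" a human chose
y = (X[:, 0] + X[:, 1] > 0).astype(float)    # labels a human defined

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):                         # the training loop a human wrote
    p = 1 / (1 + np.exp(-(X @ w + b)))       # sigmoid prediction
    w -= lr * X.T @ (p - y) / len(y)         # gradient step on logistic loss
    b -= lr * np.mean(p - y)

print("in-sample accuracy:", np.mean((p > 0.5) == y))
```

It fits exactly the task it was pointed at, and nothing more; there is no step where it decides to go cure cancer.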
 
It is far from just copying and pasting.

No, it's just copy-pasting. AI models use a multidimensional array to represent the mathematical relationships between all elements of that array. ChatGPT, for example, uses an array of 57 thousand dimensions; bigger models use bigger arrays. Each element of that array is a unique word, phrase, or construct. The training process then "trains" on data by incrementing the value whenever one identified component follows another, and so forth. When you are done, you are left with a mathematical construct that represents the relationships between all the unique objects trained on. Then, during processing, the algorithm can reference that model to predict the probability of any construct following another construct. An example would be that after processing A then B, it can reference that array for [A, B], then choose the value with the highest probability of following it.
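A toy version of that mechanism is a bigram counter, sketched below on a made-up corpus. (A real transformer learns dense vector weights rather than raw co-occurrence counts, so this is a vast simplification, but the predict-the-most-likely-next-token loop has the same shape.)

```python
# Toy next-token predictor: count how often each word follows another,
# then always emit the most probable successor. This is the [A, B] ->
# "highest-probability next value" lookup described above, minus the
# learned embeddings and attention a real LLM adds on top.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the rat".split()

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1                         # "training": increment counts

def predict(word: str) -> str:
    return follows[word].most_common(1)[0][0]  # argmax over successors

word, out = "the", ["the"]
for _ in range(5):
    word = predict(word)
    out.append(word)
print(" ".join(out))  # e.g. "the cat sat on the cat"
```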

There is absolutely nothing special about "AI"; there is no intelligence. It's just a mathematical model that represents the relationships between all the objects referenced by that model. The plagiarism absolutely occurs because the model only references objects by unique values, which are then indexed and put together. With Copilot, entire functions and subroutines were copied and referenced by ID values. Copilot didn't write code; it just decided that function 3477 would be followed by function 2390, then statement 23097, and so forth, all based on the most likely value. It analyzed copyrighted code to create those values, and people have been able to reproduce copyrighted code via text prompts.

There is nothing new about this process; it's been around for decades and is used heavily in plasma physics simulations. The problem was that we simply did not have sufficient computational power to ingest the training data and create that mathematical model. The real innovation is that someone discovered how to use vector coprocessors (GPUs) to compute the relationship weights of ridiculously large multidimensional arrays. That reduced the computational cost drastically, in much the same way that going from software rendering to hardware rendering does.

If stealing $1,000,000 from one person is theft, then stealing $1 from 1,000,000 people is also theft. AI does the latter.
 
Last edited: