No, AI is going to transform how people use computers in key ways. In five years, most people will insist on AI accelerators. What you have today is AI in its infancy. Llama 3 is like the Commodore PET/Apple I era: hobbyists feeling their way around.
Polished AI that will do complex, tedious tasks will arrive inside of 3 years and will transform all aspects of computer and device work. I’m astonished by the number of tech enthusiasts who are not seeing the biggest technological transformation of their lifetime.
It should be clear that by 2030, there will be no need to learn Excel, AutoCAD, DaVinci Resolve, VS Code, Photoshop, PowerPoint or myriad other apps. People take classes and get certificates to show they can do highly complex, technical work using professional applications. Mature AI apps will allow people who know nothing about today’s applications to get better results.
We are going to be able to just tell the devices what we want and have the results provided without thinking or even knowing about cells, formulas, filters, grids, macros, formatting, layers, plugins, coding, and the like.
I've been in the IT industry for forty years now. And from time to time I re-read old predictions, e.g. from BYTE magazine, which I used to positively devour every month when I started out.
Some are total laughs, others right on, some were vastly exceeded. Old science fiction is also a lot of fun.
Over the last twenty years an important part of my job has been to both predict the future of IT and prepare my employer for it.
And let me just say: I don't share your enthusiasm, and in five years I'd like to see you eat your words.
Excel triggered me, to be honest, because spreadsheets remain extremely underused and underrated in terms of capabilities, whilst others damn the people who run their businesses on them and then fail when they run into their limits.
It starts with the spreadsheet formula language, which nearly nobody seems to really appreciate for what it is: a functional programming language. In computer science they told us that while these functional algebras are wonderful in theory, they are unusable in practice, because people just think imperatively.
Well, programmers may think imperatively, but evidently the early VisiCalc, Multiplan, etc. adopters had no issues using that natural functional calculus, perhaps because nobody told them it was in fact a functional programming language considered too complicated for programming.
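To make that concrete, here is a minimal sketch (column names and numbers are made up) of a common conditional-average spreadsheet formula written out as the pure functional expression it already is, in Haskell: no assignments, no loops, just functions composed over ranges.

    -- Spreadsheet: =AVERAGEIF(B2:B5, ">1000", C2:C5)
    -- The same computation over two "columns" as a pure expression:

    sales, margins :: [Double]
    sales   = [1200, 800, 1500, 950]    -- hypothetical column B
    margins = [0.21, 0.35, 0.18, 0.40]  -- hypothetical column C

    avgMarginOfBigSales :: Double
    avgMarginOfBigSales =
      let selected = [m | (s, m) <- zip sales margins, s > 1000]
      in  sum selected / fromIntegral (length selected)

    main :: IO ()
    main = print avgMarginOfBigSales    -- prints 0.195

The point isn't the syntax, it's the shape: the spreadsheet user writes exactly this kind of program without ever being told it is one.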
I've long thought that many HPC challenges are much more naturally expressed in a functional calculus that sits atop giant multidimensional vectors of database tables and is much better computed in a naturally distributed and parallel manner via query languages, which are also mostly functional. I guess some of that today is called Jupyter, some of that found its way into Greenplum, so I wasn't totally off 30 years ago.
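The "distributed and parallel" part is easy to see in the same style: a SUMIF or GROUP BY over a huge table is just an associative fold, so chunks can be reduced on separate nodes and merged afterwards, which is exactly what query engines exploit. A rough sketch, with an invented two-column schema:

    -- GROUP BY / SUMIF as a fold; chunks reduce independently and merge.
    import qualified Data.Map.Strict as M
    import           Data.List       (foldl')

    type Row = (String, Double)             -- (region, revenue): made-up schema

    groupSum :: [Row] -> M.Map String Double
    groupSum = foldl' (\acc (k, v) -> M.insertWith (+) k v acc) M.empty

    mergeChunks :: [M.Map String Double] -> M.Map String Double
    mergeChunks = M.unionsWith (+)          -- associative, so placement is free

    main :: IO ()
    main = do
      let table  = [("EU", 10), ("US", 20), ("EU", 5), ("APAC", 7)]
          (a, b) = splitAt 2 table          -- pretend the halves live on two nodes
      print (mergeChunks [groupSum a, groupSum b])
      -- fromList [("APAC",7.0),("EU",15.0),("US",20.0)]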
But AIs doing HPC well enough to build nuclear fusion devices? Sounds a lot like getting rid of old age.
Or just like the IoT predictions from five to ten years ago.
Nearly everyone who thinks Excel is complicated today may well use an AI to do whatever they currently do with Excel, and do it better.
But such a person is scratching 0.0000001% of what that spreadsheet paradigm could enable them to solve.
It doesn't do HPC models for weather, nuclear physics, etc., which could perhaps be better expressed in a spreadsheet-like functional calculus than in Fortran or C++.
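For what it's worth, even a toy PDE step looks more like a spreadsheet than like Fortran when written functionally: each new cell is a pure function of its neighbours in the row above. A sketch of one explicit step of 1-D heat diffusion, with clamped boundaries and an arbitrary step size of 0.25:

    -- One explicit time step, spreadsheet-style:
    -- every new cell depends only on three cells in the row above.
    step :: Double -> [Double] -> [Double]
    step alpha u = zipWith3 update rights u lefts
      where rights = drop 1 u ++ [last u]   -- right neighbour, clamped at the edge
            lefts  = head u : u             -- left neighbour, clamped at the edge
            update r here l = here + alpha * (l - 2 * here + r)

    main :: IO ()
    main = mapM_ print (take 4 (iterate (step 0.25) [0, 0, 100, 0, 0]))
    -- second line printed: [0.0,25.0,50.0,25.0,0.0]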
I'd even go as far as saying that there is a good chance even current AIs would be better at AutoCAD than I am today.
But that's because I'm a total idiot with AutoCAD.
So in that sense some of what you say may be true, because it's nearly tautological.
The other parts I consider utter bollocks, and I'm already sorry I thought that aloud.
Mostly, I'm not sure which of us I'd rather turn out to be the better predictor of what will happen.