News Linus Torvalds reckons AI is ‘90% marketing and 10% reality’

chaz_music

Distinguished
Dec 12, 2009
103
87
18,690
I COMPLETELY agree with Linus. The market should first find the "must have" use, and that can drive market acceptance as well as the tech development. This does not mean making a cool picture or screwing up a term paper. It should matter to society on the same level as search engines or smartphones.

When I studied neural nets several decades ago, the common feeling in our class was that this was esoteric tech to be used by techies only, for things like adaptive control and fuzzy logic. Fast forward to now and we finally have enough nodes to do some interesting things with images and sound. But the method we are using to do these things is bulky, not cost effective, and certainly not optimized.

Analog nodes use far less power and will probably be the right final solution for many uses. Yes, that is a real thing: analog neural nets (like the ones inside your brain).
 

roba67

Prominent
Sep 16, 2023
3
8
515
Yes, at first we were all impressed with ChatGPT, but then we found it was not very useful. The hype cycle is much larger than I expected. I think leading-edge semiconductor technology is a solution looking for a problem, and many of the large tech companies' [marketers] are easily fooled. Nvidia has done a good job of providing tools like CUDA, but the idea that AI can work without heuristics is nonsense.
 

Dantte

Distinguished
Jul 15, 2011
172
60
18,760
Agree and disagree.

It probably is mostly 90/10, but it's so new that we're in a spot where, as I like to say, "we don't know what we don't know." We need people to jump on that bandwagon, play with it, use it, break it, fix it, and figure it out. Those that do this early will waste a lot of time on the hype, but like any good pyramid scheme, those that get in early, as pointless as it may seem now, will have the most to gain when it becomes mainstream.
 

King_V

Illustrious
Ambassador
90% sounds extremely optimistic. I agree with Li: 99% of them are pure bubble and will fail.

But even that sounds optimistic. I've still yet to see AI do something that actually turns a profit (the only part that actually matters) and isn't just a relabeled service previously done by "big data" or "the algorithm".
Yeah, I'm pretty much in line with this feeling. I thought Torvalds was being slightly on the optimistic side, but Torvalds and Li both have the right idea. Right now "AI" has become a marketing term, and they're applying the moniker to everything they can.

Suddenly parsing documents is "AI," recording video or sound and transcribing the words is "AI." Even the commercials for the new phones talk about how, now with AI, your phone can do something new... except that the examples they give seem to be things phones could already do.

Anyone remember how we were hearing the phrase "big data" all the time? Yeah, kinda feels like that.
 
Reactions: helper800

JamesJones44

Reputable
Jan 22, 2021
824
746
5,760
This statement has been true for all bubble tech: hardware bubble in the 80s, .com bubble in the 90s, cloud bubble in the 00s, crypto bubble in the 10s, AI bubble in the 20s. Thousands of companies make every promise in the world, and 30 or so players, with 3 to 5 super-major players among them, come out with real, usable products and survive when the bubble pops.
 

edzieba

Distinguished
Jul 13, 2016
567
572
19,760
I would argue the ratio should be closer to 95% Marketing, 5% Reality.
5% is probably optimistic.

The big problem is that the <5% of non-nonsense applications are the ones that were already being actively used before the 'deep learning' boom, mainly machine vision and data filtering.
Large Language Models, where the vast majority of investment is currently going, have basically no utility beyond toys. Image generation is in its Photoshop era (or, if you're older, its digital drum-machine era) of artistic panic: the stage where everyone thinks it will take their jobs by letting anyone push a button and receive art, and before the stage where everyone figures out that you still need to be an artist to get anything of actual value out of it. At that point it just becomes another digital tool that some artists will use productively and others will eschew for various reasons to various effect (e.g. the auteur directors who still demand celluloid film vs. the many directors who cannot afford its massive costs). But even then, image generation is a pretty niche use, and will be relegated to mundane non-consumer-facing applications like creating tiling, non-repeating rock textures from a small number of seed textures in game engines, or mapping user-generated-avatar expressions to face models without having to make sure the individual composited face elements can tolerate all possible poses.
 
Reactions: JamesJones44

JRStern

Distinguished
Mar 20, 2017
154
59
18,660
Sounds about right to me!

In the next five years it should get 50%-80% cheaper to field a system at about the current level, it will be better understood by both vendors and customers, and it might even turn a small profit, LOL.
Probably more like ten years to see anything massively better than today, and longer than that, say 20 years, to something more or less human-level.
FWIW I have very little fear of any "super-intelligence", for reasons I can go on about at very great length.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,383
906
20,060
As cynical as he may be, Linus is basically an optimistic guy.
I thought he was a "No Nonsense, 100% Realistic" kind of guy.


edzieba said:
5% is probably optimistic.

The big problem is that the <5% of non-nonsense applications are the ones that were already being actively used before the 'deep learning' boom, mainly machine vision and data filtering.
4% of that 5% belongs to nVIDIA and how they're fleecing so many "hyperscalers" into buying their over-priced hardware.

As with the gold rush, it's the pick-and-shovel sellers that are making the fortune.

Ergo, nVIDIA is getting away with "highway robbery".
 
The hype is not for the infancy of AI; it is for its future.

He is missing the point. All robot automation will pass through AI, not to mention all the automation for smart cities.

There is a reason why these investments are only being made by the big tech companies.

It is literally the 4th industrial revolution.
 

sygreenblum

Distinguished
Feb 25, 2016
31
23
18,535
Yeah, that 10 percent is mostly made up of porn and politically motivated bots. It is the future, but currently it's not that useful outside a handful of niche applications.