Research shows more than 80% of AI projects fail, wasting billions of dollars in capital and resources: Report

chaz_music

Distinguished
Dec 12, 2009
98
78
18,640
AI and neural nets have been around for DECADES. When I was in grad school, we had all sorts of names for what we now call AI: neural networks, fuzzy logic, adaptive control, adaptive observers, etc. These systems have been used for things like trajectory guidance and automobile cruise control for many years. The only thing new now is that we can connect many more nodes than before, and it has been made available to the public to play with making pictures and term papers. And made available to people to do great harm.

If AI were a sure thing, big tech like IBM and Microsoft would have developed it many years ago. The sure thing is the end application, such as medical devices, drugs, and other critical applications. Another is crunching large data sets to get a significant finding, like spotting an asteroid that is going to strike the Earth in a few solar cycles. But these take people who are close to the end application to use AI properly and to build the esoteric models the AI needs to learn correctly. So teach it: "Here are the rules, now go find me a dangerous asteroid using the telescope data."
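To make that concrete, here is a minimal sketch of what a task-specific "go find me a dangerous asteroid" model might look like; the features, the labeling rule, and the data below are entirely hypothetical stand-ins for real telescope survey data:

```python
# Minimal sketch of a task-specific model: classify telescope detections as
# hazardous-asteroid candidates. All features, labels, and data here are
# hypothetical stand-ins for real survey data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical features per detection: brightness, angular velocity,
# orbital eccentricity, minimum orbit-intersection distance (MOID, in AU).
X = rng.random((5000, 4))
y = (X[:, 3] < 0.05).astype(int)  # toy rule: small MOID = potentially hazardous

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```

The post's point shows up in the code: the domain expert's job is choosing the features and the labeling rule; the model itself is the commodity part.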

All it takes is for someone to give a new name to something old and it turns into a fad, draining tons of resources from the venture capital pool that would otherwise have supported other great start-up tech. Example: Digital Twin. This is simply a viewable physical model attached to an already existing mathematical model, i.e., a "simulation" in the VR world. We have been able to create viewable models in the mechanical CAD world for at least 20 years, maybe more. But Digital Twin marketing lets companies make it seem as though we have "new" technology while raking in tons of cash. Makes me think of the pet rock fad.
 

vanadiel007

Distinguished
Oct 21, 2015
314
313
19,060
You mean talking to ChatGPT is not making money? Who would have thought.

No worries, Nvidia et al. has you covered with tons of new GPU solutions arriving at regular intervals.
They are often sold out, so you had better place your million-dollar order right now to avoid missing out on those fancy GPUs.
 

Giroro

Splendid
Nobody has the slightest idea how to use their shiny new billion-dollar supercomputer to make enough money to pay for itself.
That's eventually going to be a big problem for everybody. If you think AI is annoying now, just wait until some of these companies get desperate and start trying to squeeze you for cash to pay off their investors.
It will probably even be a big problem for Nvidia, unless they're always being paid up front, in cash.
 

shtldr

Honorable
Nov 12, 2018
2
2
10,515
It'd be nice if this "waste" were trickling down to the middle and lower classes instead of making a few instant millionaires.
You could have YOLO'd your life savings on Nvidia stock call options, like some d3g3n3r4t3s on r/wallstreetbets, and become a millionaire too. Or did you expect people to become rich while taking zero risk and just standing by?
There are going to be many AI investors (including the biggest corporations) who will lose a lot of money in the end (FOMO investing in AI without a proper use case). Would you like those losses to trickle down to you, too? :D
Every bubble is a zero-sum game. If you don't play, you don't win (or lose).
 

edzieba

Distinguished
Jul 13, 2016
534
528
19,760
The "80%2 figure is not from some deep analysis, the source they cite is just some op-ed piece. And with general failure of startups being between 75% and 90% depending on who you ask (and what industry, and what country, etc), that figure would not be out of the ordinary even if accurate.
Example: Digital Twin. This is simply a viewable physical model attached to an already existing mathematical model, i.e., a "simulation" in the VR world. We have been able to create viewable models in the mechanical CAD world for at least 20 years, maybe more. But Digital Twin marketing lets companies make it seem as though we have "new" technology while raking in tons of cash. Makes me think of the pet rock fad.
There is a little more to a Digital Twin... but only a very little. A proper Digital Twin is a simulation running in parallel with the actual hardware (an assembly line, a building, etc.), with the outputs of that simulation compared against sensor measurements of the actual environment to flag "hey, something is not happening like it's supposed to". Basically, it picks out the error term of a closed-loop feedback system, but takes a system-of-systems approach rather than only applying it to subsystems.
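As a rough illustration of that error-term idea, here is a minimal sketch that steps a model in lockstep with live readings and flags divergence; simulate_step(), the channel names, and the tolerance are all hypothetical:

```python
# Sketch of the digital-twin error-term idea: run a simulation in lockstep
# with the real system and flag channels where measurement and prediction
# diverge. simulate_step() and the sensor values are hypothetical stand-ins.

TOLERANCE = 0.05  # assumed acceptable relative deviation per channel

def simulate_step(state: dict[str, float], dt: float) -> dict[str, float]:
    """Hypothetical process model advancing one timestep (placeholder dynamics)."""
    return {k: v * (1.0 + 0.01 * dt) for k, v in state.items()}

def check_residuals(predicted: dict[str, float], measured: dict[str, float]) -> None:
    """Compare prediction vs. measurement and alert on large residuals."""
    for channel, pred in predicted.items():
        residual = abs(measured[channel] - pred) / max(abs(pred), 1e-9)
        if residual > TOLERANCE:
            print(f"ALERT: {channel} off by {residual:.1%} "
                  f"(expected {pred:.3f}, got {measured[channel]:.3f})")

state = {"line_speed": 1.20, "motor_temp": 65.0}       # last known system state
predicted = simulate_step(state, dt=1.0)               # what the twin expects
measured = {"line_speed": 1.21, "motor_temp": 72.5}    # stand-in sensor readings
check_residuals(predicted, measured)                   # flags motor_temp
```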
But yes, 99% of 'digital twins' sold to businesses are just "here's a copy of the CAD for your building plans, go update it to the as-built state and then keep updating it", and the model is never updated and is left in a drawer somewhere, never to be looked at again, because the buyer has no clue what a Digital Twin actually is or why they want one beyond buzzword-chasing. And there are plenty of unscrupulous 'consultants' happy to sell a company its own CAD files back at a tasty markup.

When everyone figures out that Large Language Models don't have any actual utility to anyone and a bunch of accelerators are offloaded at bargain-basement prices on eBay, the real winners will be those who have been training specific-task models for decades (stellar object classifiers, crop status monitors, etc.) and who now have cheaper iron that runs faster.
 

kealii123

Proper
Nov 3, 2023
91
50
110
Good LLMs like GPT-4 are definitely disrupting the software development industry. I use Copilot every day, and despite the fact that it's a crappy MS product (I often just paste screenshots directly into GPT-4 instead), I'm probably twice as fast. Better products like Cursor can let an 8-year-old child program faster than a CS college freshman on their own:

View: https://x.com/rickyrobinett/status/1825581674870055189?t=pevnj7taM3r8oabugoq4hA&s=19


Iterative, self-reinforcing AI coding projects like Devin show that a good LLM can be as productive as a junior engineer if allowed to be. If you take Llama 3 or the latest Claude, train it on your repos every night, and then allow it to edit code and iterate on the outcome, I bet you could get mid-level (SE2) performance out of it for less than an annual salary's worth of hardware. A rough sketch of that loop is below.
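Something like this edit-test-retry loop, sketched under heavy assumptions: propose_patch() is a hypothetical stand-in for whatever model you call, and the test command is just pytest as an example:

```python
# Hedged sketch of the "edit code and iterate on the outcome" loop described
# above. propose_patch() is a hypothetical stand-in for a call to your
# fine-tuned model; run_tests() assumes a pytest-based project.
import subprocess

MAX_ATTEMPTS = 5

def run_tests() -> tuple[bool, str]:
    """Run the project's test suite; return (passed, combined output)."""
    result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def propose_patch(task: str, feedback: str) -> None:
    """Hypothetical stand-in: ask the model for a patch and apply it to the repo."""
    print(f"[model] proposing patch for {task!r} ({len(feedback)} chars of feedback)")

def iterate(task: str) -> bool:
    feedback = ""
    for attempt in range(MAX_ATTEMPTS):
        propose_patch(task, feedback)
        passed, output = run_tests()
        if passed:
            print(f"task done after {attempt + 1} attempt(s)")
            return True
        feedback = output  # feed the test failures back as context for the next try
    return False

# Example invocation (run inside the target repo):
# iterate("fix the failing date parser in utils.py")
```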

Once the cost of writing software collapses, anything software-dependent (everything) will also shrink significantly in price, and innovation will grow dramatically.
 

Gururu

Upstanding
Jan 4, 2024
175
112
270
You could have YOLO'd your life savings on Nvidia stock call options, like some d3g3n3r4t3s on r/wallstreetbets, and become a millionaire too. Or did you expect people to become rich while taking zero risk and just standing by?
There are going to be many AI investors (including the biggest corporations) who will lose a lot of money in the end (FOMO investing in AI without a proper use case). Would you like those losses to trickle down to you, too? :D
Every bubble is a zero-sum game. If you don't play, you don't win (or lose).
Big investors lose their money; young knuckleheads get rich with most of it. Not sure how you read so much into my comment.
 

Gururu

Upstanding
Jan 4, 2024
175
112
270
According to the National Restaurant Association, 80% of restaurants fail within 5 years. The U.S. Bureau of Labor Statistics (BLS) says that approximately 20% of new businesses fail during the first two years of being open, and 45% during the first five years. An 80% failure rate is not surprising.
I would hope that AI businesses employ as many people as the failed restaurants do. I'm thinking chefs, wait staff, dishwashers, etc. Maybe they employ more engineers at six figures, who knows.
 

USAFRet

Titan
Moderator
Good LLMs like GPT-4 are definitely disrupting the software development industry. I use Copilot every day, and despite the fact that it's a crappy MS product (I often just paste screenshots directly into GPT-4 instead), I'm probably twice as fast. Better products like Cursor can let an 8-year-old child program faster than a CS college freshman on their own:
The problem with those things is the lack of all the upfront work:
Gathering actual requirements, design.

And then, the aftermath.
Deployment and maintenance.

Sure, it "works".
Can it scale?
Does it meet the actual customer's needs?

The "code" is the relatively easy part.
And a person who does not already understand the "code" part does not know when the AI is coughing up junk.