News Research shows more than 80% of AI projects fail, wasting billions of dollars in capital and resources: Report

chaz_music

Distinguished
Dec 12, 2009
106
90
18,690
AI and neural nets have been around for DECADES. When I was in grad school, we had all sorts of names for what we now call AI: neural networks, fuzzy logic, adaptive control, adaptive observers, etc. These systems have been used for things like trajectory guidance and automobile cruise control for many years. The only thing new now is that we can connect many more nodes than before, and it has been made available to the public to play with making pictures and term papers. And available to people to do great harm.

If AI were a sure thing, big tech like IBM and Microsoft would have developed it many, many years ago. The sure thing is the end application, such as medical devices, drugs, and other critical applications. Another is crunching large data sets to get a significant finding, like finding an asteroid that is going to strike the Earth in a few solar cycles. But these take people who are close to the end application to utilize AI properly and build the esoteric models for the AI to learn correctly. So teach it: "Here are the rules, now go find me a dangerous asteroid using the telescope data."

All it takes is for someone to give a new name to something old and it turns into a fad, draining tons of resources from the venture capital pool that would otherwise have supported other great startup tech. Example: Digital Twin. This is simply a viewable physical model attached to an already existing mathematical model = "simulation" in the VR world. We have been able to create viewable models in the mechanical CAD world for at least 20 years, if not more. But Digital Twin marketing lets companies make it seem as though we have "new" technology while raking in tons of cash. Makes me think of the pet rock fad.
 

vanadiel007

Distinguished
Oct 21, 2015
368
361
19,060
You mean talking to ChatGPT is not making money? Who would have thought.

No worries, Nvidia et al. have you covered, with tons of new GPU solutions arriving at regular intervals.
They are often sold out, so you had better place your multimillion-dollar order right now to avoid missing out on those fancy GPUs.
 

Giroro

Splendid
Nobody has the slightest idea how to use their shiny new billion-dollar supercomputer to make enough money to pay for itself.
That's eventually going to be a big problem for everybody. If you think AI is annoying now, just wait until some of these companies start desperately trying to squeeze you for cash to pay off their investors.
It will probably even be a big problem for Nvidia, unless they're always being paid up front, in cash.
 

GenericUsername109

Honorable
Nov 12, 2018
6
5
10,515
It'd be nice if this "waste" were trickling down to the middle and lower classes instead of making a few instant millionaires.
You could have YOLO'd your life savings on Nvidia stock call options, like some d3g3n3r4t3s on r/wallstreetbets, and become a millionaire, too. Or did you expect people to become rich while taking zero risk and just standing by?
There are going to be many AI investors (including the biggest corporations) who will lose a lot of money in the end (FOMO investing in AI without a proper use case). Would you like those losses to trickle down to you, too? :D
Every bubble is a zero-sum game. If you don't play, you don't win (or lose).
 
Last edited:

edzieba

Distinguished
Jul 13, 2016
578
583
19,760
The "80%" figure is not from some deep analysis; the source they cite is just an op-ed piece. And with the general failure rate of startups being between 75% and 90%, depending on who you ask (and what industry, what country, etc.), that figure would not be out of the ordinary even if accurate.
Example: Digital Twin. This is simply making a viewable physical model attached to an already existing mathematical model = "simulation" in the VR world. We have been able to create viewable models in the mechanical CAD world for at least 20 years or even more. But Digital Twin marketing is allowing companies to make it seem as though we have "new" technology and raking in tons of cash. Makes me think of the pet rock fad.
There is a little more to a Digital Twin... but only a very little. A proper Digital Twin is a simulation running in parallel with the actual hardware (an assembly line, a building, etc.), with the outputs of that simulation compared to sensor measurements of the actual environment to flag "hey, something is not happening like it's supposed to". Basically picking out the error term of a closed-loop feedback system, but taking the system-of-systems approach rather than only applying it to subsystems.
But yes, 99% of "digital twins" sold to businesses are just "here's a copy of the CAD for your building plans, go update it to the as-built state and then keep updating it" that is never updated and is left in a drawer somewhere, never to be looked at again, because the buyer has no clue what a Digital Twin actually is or why they want one other than buzzword-chasing, and there are plenty of unscrupulous "consultants" happy to sell a company their own CAD files back to them at a tasty markup.
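The "error term" idea behind a proper digital twin can be sketched in a few lines: run a model of the plant in lockstep with the real system and flag any divergence between predicted and measured values. This is only an illustrative toy, not from the post; all names (`simulate_step`, `monitor`, `THRESHOLD`) and the first-order plant model are hypothetical.

```python
# Toy sketch of digital-twin monitoring: a simulation runs in parallel with
# the real hardware, and the divergence between predicted and measured
# values (the "error term") is flagged when it exceeds a threshold.
# All names and the plant model here are illustrative assumptions.

THRESHOLD = 0.05  # maximum tolerated relative divergence


def simulate_step(state: float, control: float) -> float:
    """Toy plant model: first-order lag toward the control setpoint."""
    return state + 0.1 * (control - state)


def monitor(sensor_readings, control, initial_state=0.0):
    """Yield (step, predicted, measured, flagged) for each sensor sample."""
    predicted = initial_state
    for step, measured in enumerate(sensor_readings):
        predicted = simulate_step(predicted, control)
        error = abs(predicted - measured) / max(abs(predicted), 1e-9)
        yield step, predicted, measured, error > THRESHOLD
```

As long as the sensors track the simulation, nothing is flagged; a reading that drifts away from the model's prediction raises the flag, which is exactly the closed-loop error-term check applied at the system-of-systems level.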

When everyone figures out that Large Language Models don't have any actual utility to anyone and a bunch of accelerators are offloaded at bargain-basement prices on eBay, the real winners will be those who have been training specific-task models for decades (stellar object classifiers, crop status monitors, etc.), who will then have cheaper iron that runs faster.
 
Last edited:
The drug dealer doesn't use drugs.
Nvidia is the top seller of dreams.
Crazy mining, crazy AI... just tons of money... some day we will grab this enterprise hardware cheap on eBay.
Can it run Crysis?
 

kealii123

Proper
Nov 3, 2023
96
51
110
Good LLMs like GPT4 are definitely disrupting the software development industry. I use Copilot every day, and despite the fact that it's a crappy MS product (I often just paste screenshots directly into GPT4 instead), I'm probably twice as fast. Better products like Cursor can allow an 8-year-old child to program faster than a CS college freshman on their own:

View: https://x.com/rickyrobinett/status/1825581674870055189?t=pevnj7taM3r8oabugoq4hA&s=19


Iterative, self-reinforcing AI coding projects like Devin show that a good LLM can be as productive as a junior engineer if allowed to be. If you take Llama3 or the latest Claude, train it on your repos every night, and then allow it to edit code and iterate on the outcome, I bet you could get mid-level, SE2 performance out of it with less than an annual salary's worth of hardware.

Once the cost of writing software collapses, anything software-dependent (everything) will also shrink significantly in price, and innovation will grow dramatically.
 

Gururu

Prominent
Jan 4, 2024
301
202
570
You could have YOLO'd your life savings on Nvidia stock call options, like some d3g3n3r4t3s on r/wallstreetbets, and become a millionaire, too. Or did you expect people to become rich while taking zero risk and just standing by?
There are going to be many AI investors (including the biggest corporations) who will lose a lot of money in the end (FOMO investing in AI without a proper use case). Would you like those losses to trickle down to you, too? :D
Every bubble is a zero-sum game. If you don't play, you don't win (or lose).
Big investors lose their money, young knuckleheads get rich with most of the money. Not sure how you read so much into my comment.
 

Gururu

Prominent
Jan 4, 2024
301
202
570
According to the National Restaurant Association, 80% of restaurants fail within 5 years. The U.S. Bureau of Labor Statistics (BLS) says that approximately 20% of new businesses fail during the first two years of being open, and 45% during the first five years. An 80% failure rate is not surprising.
I would hope that AI businesses employ as many people as the failed restaurants did. I'm thinking chefs, wait staff, dishwashers, etc. Maybe they employ more engineers with six-figure salaries, who knows.
 

USAFRet

Titan
Moderator
Good LLMs like GPT4 are definitely disrupting the software development industry. I use Copilot every day, and despite the fact that it's a crappy MS product (I often just paste screenshots directly into GPT4 instead), I'm probably twice as fast. Better products like Cursor can allow an 8-year-old child to program faster than a CS college freshman on their own:
The problem with those things is a lack of all the upfront work.
Gathering actual requirements, design.

And then, the aftermath.
Deployment and maintenance.

Sure, it "works".
Can it scale?
Does it meet the actual customer's needs?

The "code" is the relatively easy part.
And a person who does not already understand the "code" part does not know when the AI is coughing up junk.
 
  • Like
Reactions: SSGBryan

Wimpers

Distinguished
Feb 26, 2016
28
7
18,535
You could have YOLO'd your life savings on Nvidia stock call options, like some d3g3n3r4t3s on r/wallstreetbets, and become a millionaire, too. Or did you expect people to become rich while taking zero risk and just standing by?
There are going to be many AI investors (including the biggest corporations) who will lose a lot of money in the end (FOMO investing in AI without a proper use case). Would you like those losses to trickle down to you, too? :D
Every bubble is a zero-sum game. If you don't play, you don't win (or lose).
Well, it seems like the investors are a bit disgruntled with NVIDIA's quarterly predictions. The predicted revenue of 32.5 billion dollars now has an error margin of 2%.
Investors were not happy with this, and the stock fell 7%, because "simply 'slightly better than expected' isn't good enough for a company that has been outperforming even the most optimistic estimates in recent quarters."

Can you imagine those greedy stock market capitalist POS bastards and bankers with the mindset of a cancer cell: unlimited growth, with all other things subordinate. I hope they fail miserably.
And I also kind of hope NVIDIA tanks too.
 

Bamda

Distinguished
Apr 17, 2017
114
38
18,610
It's unsurprising, akin to the Crypto craze. Pursuing what's essentially fool's gold in hopes of winning the business lottery. Such investments in futile ventures can lead to a company's downfall.
 

GenericUsername109

Honorable
Nov 12, 2018
6
5
10,515
Good LLMs like GPT4 are definitely disrupting the software development industry. I use Copilot every day, and despite the fact that it's a crappy MS product (I often just paste screenshots directly into GPT4 instead), I'm probably twice as fast. Better products like Cursor can allow an 8-year-old child to program faster than a CS college freshman on their own:

View: https://x.com/rickyrobinett/status/1825581674870055189?t=pevnj7taM3r8oabugoq4hA&s=19


Iterative, self-reinforcing AI coding projects like Devin show that a good LLM can be as productive as a junior engineer if allowed. If you take Llama3 or the latest Claude and train it on your repos every night, then allow it to edit code & iterate on the outcome, I bet you could get mid level, SE2 performance out of it with less than the annual salary worth of hardware.

Once the cost of writing software collapses, anything software-dependent (everythnig) will also shrink significantly in price and innovation will grow dramatically.
I've worked as a software developer, a team lead and a line manager on a datacenter back-end project for a major financial institution. As such, most of the workload was NOT actual programming (typing in lines of code). It was requirements analysis and (incremental) solution design, (unit)testing, issue analysis and bug-fixing (both technical and functional), production support and maintenance, back and forth with business analysts, with a dedicated test team, with production operators etc.
Once the cost of writing software collapses, it's not going to change much (it will be mostly effective before the initial release), since it was just a minor part of the job to begin with. At least when talking about the most complex and labour-intensive SW projects (not talking about simple mobile or web apps).
 

kealii123

Proper
Nov 3, 2023
96
51
110
I've worked as a software developer, a team lead and a line manager on a datacenter back-end project for a major financial institution. As such, most of the workload was NOT actual programming (typing in lines of code). It was requirements analysis and (incremental) solution design, (unit)testing, issue analysis and bug-fixing (both technical and functional), production support and maintenance, back and forth with business analysts, with a dedicated test team, with production operators etc.
Once the cost of writing software collapses, it's not going to change much (it will be mostly effective before the initial release), since it was just a minor part of the job to begin with. At least when talking about the most complex and labour-intensive SW projects (not talking about simple mobile or web apps).
Ha, the #1 use case for LLMs and programming, at least for me, is writing unit tests. Unless you're a big TDD guy (and I bet it would still be game-changing even then), I just tell Copilot to write a bunch of unit tests for my new function/method matching the style in the existing test file, or, for a new file, matching the style of an existing file.

The second biggest use case is essentially documentation lookup. It's so much faster querying GPT4 than searching and trying to read Stripe's documentation (esp. since their types package sucks so bad).