News: Elon Musk says the next-generation Grok 3 model will require 100,000 Nvidia H100 GPUs to train

"In fact, Musk believes than an artificial intelligence smarter than the smartest human will emerge in the next year or two."

Book smart or street smart? I had a calculator in the 1970s that was book smart.
 
So Musk is claiming that we'll develop a whole new type of AI that's several orders of magnitude more powerful than any we have available right now, in 2 years, using currently available technology?

Yeah, even Jensen Huang wouldn't go that far and he's been doing nothing but hyping up AI these last couple of years.
 
I watched Terminator 1 and 2 last week presciently.
It will take a year, maybe 2.
It will get too powerful.
Hope we have a backup plan.

He uses Grok as a synonym for God (2000AD lore).
I hope he's wrong, or knows what he's doing.

edit: Actually, wasn't Grok a synonym for "Sh!t" or "Feck"? Grud was God, wasn't it?
 
So Musk is claiming that we'll develop a whole new type of AI that's several orders of magnitude more powerful than any we have available right now, in 2 years, using currently available technology?

Yeah, even Jensen Huang wouldn't go that far and he's been doing nothing but hyping up AI these last couple of years.
It's computer AI, all it needs is more CPU processing power.
As much as it can get.
Wait until it designs its own processor.
 
The advancement of AI technology, according to Musk, is currently hampered by two main factors: supply shortages on advanced processors — like Nvidia's H100, as it's not easy to get 100,000 of them quickly — and the availability of electricity.

That and the software to run on it.

Does just adding more parameters to an LLM make it any more intelligent, or just more complex and power hungry?
It may give more complex answers, but are they any more intelligent?
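
To put rough numbers on the "power hungry" part, here's a back-of-the-envelope sketch using the commonly cited approximation that training a dense transformer costs about 6 x parameters x training tokens in FLOPs. Every concrete figure below (model sizes, tokens-per-parameter, per-GPU throughput, utilization) is an assumption for illustration, not anything from the article:

```python
# Rough training-cost arithmetic using the widely cited approximation
# FLOPs ~ 6 * N * D for a dense transformer with N parameters trained on D tokens.
# All concrete numbers below are illustrative assumptions.

def training_flops(params, tokens):
    """Approximate total training FLOPs for a dense transformer."""
    return 6.0 * params * tokens

def wall_clock_days(total_flops, gpus, flops_per_gpu=1e15, utilization=0.4):
    """Rough wall-clock days on a cluster, at an assumed sustained throughput per GPU."""
    seconds = total_flops / (gpus * flops_per_gpu * utilization)
    return seconds / 86_400

for n_params in (7e9, 70e9, 700e9):      # assumed model sizes: 7B, 70B, 700B parameters
    tokens = 20 * n_params                # ~20 training tokens per parameter, a common heuristic
    flops = training_flops(n_params, tokens)
    print(f"{n_params/1e9:>4.0f}B params: {flops:.2e} FLOPs, "
          f"~{wall_clock_days(flops, gpus=100_000):.1f} days on 100k GPUs")
```

Since the token count scales with the parameter count in this sketch, the cost grows roughly with the square of the model size, which is the "more complex and power hungry" part made explicit.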
 
It's computer AI, all it needs is more CPU processing power.
As much as it can get.
Wait until it designs its own processor.
Hardly.

It's not just a matter of resources; there are limits to language models and to machine learning as a field.

There's a lot more to developing general AI than just pouring processing power into a language model.

We're not even close yet.
 
Hardly.

It's not just a matter of resources; there are limits to language models and to machine learning as a field.

There's a lot more to developing general AI than just pouring processing power into a language model.

We're not even close yet.
Exponential growth like we have never seen before.
 
Considering AI has been able to synthesize new music from old since the late 50s, that's hardly surprising.

The exponential gains you talked about above are entirely restricted to the output quality of AI.

Image manipulation, response to input, music generation, text generation: AI has been capable of all that for decades, with processing power and larger data sets simply allowing it to do what it already did, better than before.

To get AGI we'd have to add entirely new capabilities to AI, things we simply can't do using current models.

The research required to even find out how to get there in the first place will take time; it's not the sort of thing you can rush.
 
He's right. It's going to happen almost overnight. The AI is going to hit a certain level of sophistication where it can improve itself at an exponential rate. It will be so fast that it's unbelievable.
 
He's right. It's going to happen almost overnight. The AI is going to hit a certain level of sophistication where it can improve itself at an exponential rate. It will be so fast that it's unbelievable.
Doubtful.

And if it happens at all, it certainly won't be one or two years from now like he claims.

General consensus, as much as it exists anyway, is that if it happens it will be decades to centuries from now.
 
They should build a better AI that can be trained with, like, way fewer resources.

If every iteration of your software needs 5x the resources, that's a sign your software is getting worse. Not better. What a weird flex.
 
Knowing Musk, he probably bought a lot of Nvidia stock and is trying to short them for a quick profit.

He did the same with crypto.

Must be fun to generate millions in profit with a few tweets.
 
Regarding "AI", things changed in 2017 with implementation of transformers. That was the turning point that started the current revolution.

There's a great Sora demo where they visually show the effect of more compute power. It's a video of a puppy that is not coherent - hardly recognizable - but as they increase the compute available to the network, the puppy becomes more realistic, until at 8x compute it's easy to mistake it for reality. It's a great way to understand why people believe increasing power will increase usefulness.

AI today can't reason very well, and can't plan responses. Matthew Berman has some great tests that show this. Ask any current LLM how many words are in its reply to your question. It won't be able to plan the response well enough to predict that. Ask it to produce 10 sentences that end with the word apple. Can't do it. Heck, even try mathematics that requires multiple steps to complete and many will falter - something any cell phone app can do with ease.
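
A minimal sketch of the kind of check being described, assuming you already have the model's reply as a plain string (no particular API implied):

```python
import re

def count_words(reply):
    """Number of whitespace-separated words in the model's reply."""
    return len(reply.split())

def sentences_ending_with(reply, word):
    """How many sentences in the reply actually end with `word`, plus the sentence count."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", reply) if s.strip()]
    hits = sum(1 for s in sentences if s.split()[-1].lower() == word.lower())
    return hits, len(sentences)

# Paste in whatever the model actually returned and compare against what it claimed.
reply = "Here is an apple. I also like pie. She handed me a shiny red apple"
print(count_words(reply))                      # compare to the word count the model predicted
print(sentences_ending_with(reply, "apple"))   # (sentences ending in "apple", total sentences)
```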

Modern models are quantized (truncated) down to as few as 2 bits per stored value; even the biggest models running on massive clusters rarely go beyond 16 bits of precision because of the lack of memory and compute. If we could run those models at 64-bit precision (easy to envision), quite a lot would change.
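
To make the quantization point concrete, here's a toy symmetric round-to-nearest integer quantizer at a chosen bit width. Real schemes use per-group scales, outlier handling and so on; this is only a sketch of the basic idea:

```python
import numpy as np

def quantize_roundtrip(weights, bits):
    """Symmetric round-to-nearest integer quantization to `bits`, then back to float."""
    levels = 2 ** (bits - 1) - 1              # e.g. 127 for 8-bit, 1 for 2-bit
    scale = np.abs(weights).max() / levels    # one scale for the whole tensor (a simplification)
    q = np.clip(np.round(weights / scale), -levels, levels)
    return q * scale                          # the dequantized approximation

rng = np.random.default_rng(0)
w = rng.standard_normal(10_000).astype(np.float32)   # stand-in for a weight tensor

for bits in (16, 8, 4, 2):
    err = np.abs(w - quantize_roundtrip(w, bits)).mean()
    print(f"{bits:>2}-bit: mean abs error {err:.4f}")
```

The printed error grows quickly as the bit width shrinks, which is the trade-off being described: less memory and compute per value, at the cost of precision.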

Smarter than a human at logical reasoning? Easy to get there. Better at everything that makes us human? Not likely.
 
It's computer AI, all it needs is more CPU processing power.
As much as it can get.
Wait until it designs its own processor.
Both Synopsys and Cadence have already integrated "AI" into their chip design tools. Machine learning is incredibly useful for chip design. That doesn't mean some malicious sentience is running behind the scenes modifying designs. It doesn't work that way.
 
"In fact, Musk believes than an artificial intelligence smarter than the smartest human will emerge in the next year or two."

Book smart or street smart? I had a calculator in the 1970s that was book smart.
Smart enough to replace the average knowledge worker at easily definable tasks. If there's a clear SOP for a task and it doesn't require physical interaction with the real world (that'll take longer), then let the AIs take over that crap, and free us up for other things.
 
Smart enough to replace the average knowledge worker at easily definable tasks. If there's a clear SOP for a task and it doesn't require physical interaction with the real world (that'll take longer), then let the AIs take over that crap, and free us up for other things.
Yeah, like the unemployment line.
 
I watched Terminator 1 and 2 last week presciently.
It will take a year, maybe 2.
It will get too powerful.
Hope we have a backup plan.

He uses Grok as a synonym for God (2000AD lore).
I hope he's wrong, or knows what he's doing.

edit: Actually, wasn't Grok a synonym for "Sh!t" or "Feck"? Grud was God, wasn't it?
"Grok" is from a Robert Heinlein novel, apparently it means something to the effect of 'understand in a deep, empathetic manner' (haven't read the book myself though).
 