News Jim Keller responds to Sam Altman's plan to raise $7 trillion to make AI chips — I can do it cheaper!

abufrejoval

Reputable
Jun 19, 2020
520
367
5,260
Could someone please explain to me who is going to pay for this, and with what, to generate the return on that investment?

Even if Microsoft manages to push Co-mplot into every product and niche, how does that get everyone to spend enough money on it to make it worthwhile?

A trillion in investment is a thousand bucks for each of a billion people, and buyers typically want a return in value.
And if it puts people out of well-paying jobs, the onus on those who manage to profit only gets bigger.
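
For scale, here's the back-of-the-envelope arithmetic behind that, in plain Python (round numbers only, nothing else assumed):

Code:
# Rough per-capita arithmetic: $1T spread over a billion potential buyers.
total_investment = 1e12   # $1 trillion
buyers = 1e9              # 1 billion people
per_person = total_investment / buyers
print(per_person)         # 1000.0 -> a thousand bucks each
print(2 * per_person)     # 2000.0 -> the value I'd want back for that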

I just don't see myself spending a thousand bucks on AI, because I don't see the two thousand bucks of value in return that I'd expect for it.

At the moment I'd rather spend a thousand bucks on diminishing world conflict than on advancing Altman: far more tangible value.
 

ekio

Reputable
Mar 24, 2021
106
129
4,760
Altman really sounds like a new rich brat.
Has he never heard of the ecological issues, in energy and materials, that such a plan would create? Not to mention how economically wasteful it would be.

Nowadays there are approaches, such as minimizing how far data has to travel inside the processor, that could cut consumption by something like 99x, and better AI designs that don't originate from a tweaked GPU platform could do 10x better per watt.
Jim Keller knows what he's talking about so much more that it hurts.
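
To put rough numbers on the data-movement point, here's a quick sketch; the picojoule figures below are assumed round ballpark estimates, not measurements of any particular chip:

Code:
# Illustrative only: ballpark per-operation energy costs, loosely based on
# commonly cited estimates for recent process nodes (assumed, not measured).
PJ_PER_FP32_MAC  = 1.0      # the arithmetic itself
PJ_PER_SRAM_READ = 5.0      # fetch a 32-bit operand from on-chip SRAM
PJ_PER_DRAM_READ = 640.0    # fetch the same operand from off-chip DRAM

def energy_joules(macs, operands_per_mac, dram_fraction):
    """Total energy when some fraction of operand fetches go all the way to DRAM."""
    fetches = macs * operands_per_mac
    compute_pj = macs * PJ_PER_FP32_MAC
    memory_pj = fetches * (dram_fraction * PJ_PER_DRAM_READ
                           + (1 - dram_fraction) * PJ_PER_SRAM_READ)
    return (compute_pj + memory_pj) * 1e-12

macs = 1e9  # a billion multiply-accumulates
naive = energy_joules(macs, operands_per_mac=2, dram_fraction=0.5)   # poor data reuse
local = energy_joules(macs, operands_per_mac=2, dram_fraction=0.01)  # data kept close
print(naive, local, naive / local)  # roughly 0.65 J vs 0.02 J, ~27x from data locality alone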
 

vanadiel007

Distinguished
Oct 21, 2015
359
356
19,060
Altman really sounds like a new rich brat.
Has he never heard of the ecological issues, in energy and materials, that such a plan would create? Not to mention how economically wasteful it would be.

Nowadays there are approaches, such as minimizing how far data has to travel inside the processor, that could cut consumption by something like 99x, and better AI designs that don't originate from a tweaked GPU platform could do 10x better per watt.
Jim Keller knows what he's talking about so much more that it hurts.

The future of AI is with quantum computing. You need far fewer cells if a single cell can store 0, 1, and anything in between.

So a quantum brain is the future, not a high-end digital tensor-core processor.
 
Feb 17, 2024
1
4
10
The future of AI is with quantum computing. You need far fewer cells if a single cell can store 0, 1, and anything in between.

So a quantum brain is the future, not a high-end digital tensor-core processor.
Don't get ahead of yourself. Quantum computing won't work exactly analogously to classical computing, and this storage argument isn't the whole story. First, you need to repeat the computation over and over to build up enough statistics to determine the correlations, and that is far slower than just flipping bits in a register. Second, performing the manipulations for a quantum calculation is underpinned by interfacing with classical computing; the two aren't fundamentally separate as it is. Considering that classical-computing experts keep finding ways to show that quantum computing isn't yet an advantage, the future is definitely not set. The reality will be a hybrid, at best.
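
A toy illustration of the "repeat it to build statistics" point: reading a classical bit tells you its value once, while estimating the amplitude stored in a qubit takes many shots. Plain Python simulating the measurement statistics, not a real quantum program; the stored probability is made up:

Code:
import random

# A classical bit: one read tells you its value.
classical_bit = 1
print("classical read:", classical_bit)

# A qubit holding "anything between 0 and 1" as an amplitude: each measurement
# collapses to 0 or 1, so you repeat many shots to estimate what was stored.
p_one = 0.37  # hidden probability encoded in the state (made-up for the toy)

def measure():
    return 1 if random.random() < p_one else 0

for shots in (10, 1000, 100_000):
    estimate = sum(measure() for _ in range(shots)) / shots
    print(shots, "shots -> estimate", round(estimate, 3), "(true", p_one, ")")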
 

vanadiel007

Distinguished
Oct 21, 2015
359
356
19,060
Don't get ahead of yourself. Quantum computing won't work exactly analogously to classical computing, and this storage argument isn't the whole story. First, you need to repeat the computation over and over to build up enough statistics to determine the correlations, and that is far slower than just flipping bits in a register. Second, performing the manipulations for a quantum calculation is underpinned by interfacing with classical computing; the two aren't fundamentally separate as it is. Considering that classical-computing experts keep finding ways to show that quantum computing isn't yet an advantage, the future is definitely not set. The reality will be a hybrid, at best.

Well, if you look at the human brain, you will notice it has much more in common with quantum computing than with classical computing.

We don't store anything down to the last bit. We store only what we need.
 
In a way they are all right and wrong at the same time. I'm more than sure Jim Keller has the chops to design an incredibly efficient architecture, more so than most...with a caveat: within the limits of what is physically possible. As someone here in the forum correctly pointed out, efficiency improvements are quickly coming to an end. It's why we see 30% jumps at most instead of the 100% jumps we used to get. Meanwhile, the number of parameters being evaluated keeps growing exponentially. So we have exponentially decaying improvements alongside an exponentially increasing demand for compute.

But neither of them is a fab maker. I'm sure Altman is just looking at the flops and ops he needs based on some projection graph, then looking at what's currently available, putting THAT into another graph, and that's how he came to his conclusion. (Interestingly, AI inference can also predict this.) But as the TSMC fabs in Arizona showed us, things are hardly predictable when it comes to building fabs. And as Intel showed us, a die shrink to 10nm was harder than they thought. 2nm and smaller will be insane.

So both are making SWAGs (simple wild-arse guesses) based on predictions of future tech and needs.

In all honesty, it's always better to ask for more and deliver than to ask for less and not deliver. (The latter happens with a lot of government contracts. Cost overruns are insane, and the government always relents and pays out more.) So if you want to be seen as reliable and predictable, you buffer for unforeseen situations.

I love Jim Keller. But I think Altman is taking the safe approach. Neither has proven himself at funding large-scale fabs.

And Jensen will seize on every opportunity he can to dismiss them both so he doesn't have to compete. Ignore the insecure billionaire.
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
533
222
19,270
Could someone please explain to me who is going to pay for this, and with what, to generate the return on that investment?

Even if Microsoft manages to push Co-mplot into every product and niche, how does that get everyone to spend enough money on it to make it worthwhile?

A trillion in investment is a thousand bucks for each of a billion people, and buyers typically want a return in value.
And if it puts people out of well-paying jobs, the onus on those who manage to profit only gets bigger.

I just don't see myself spending a thousand bucks on AI, because I don't see the two thousand bucks of value in return that I'd expect for it.

At the moment I'd rather spend a thousand bucks on diminishing world conflict than on advancing Altman: far more tangible value.
You see, not that this would ever put people out of a job, but if it did leave most people with little money to spend, there would be no onus: they'd just buy the governments of the world even faster and have people do menial tasks until they drop dead from disease and from the climate disasters driven by all that energy consumption.

These people are sick.
 

George³

Prominent
Oct 1, 2022
228
124
760
Put Sam Altman in a mental hospital for life and you'll save $7 trillion that could be spent on good causes. Put Jim Keller in the same room to keep Altman company.
 

bit_user

Titan
Ambassador
A trillion in investment is a thousand bucks for each of a billion people, and buyers typically want a return in value.
I think the disconnect is happening at a couple of points:
  1. It was previously stated that the investments will be spent over several years - possibly a decade - which brings it much more in line with (i.e. within an order of magnitude of) the roughly ~$100B/year of investments the semiconductor industry is currently making (see the rough arithmetic after this list).
  2. You're thinking too narrowly about the applications of AI - they clearly are targeting things like robotics, self-driving cars, surveillance, and probably other sectors like telecoms and industrial automation.
  3. Probably not all of the chips will even be AI chips - many will be DRAM, storage, and the various infrastructure ICs (CPUs, networking, etc.) needed to support the actual AI processors. Just look at the various glue that Nvidia needs to tie together & support its huge networks of H100s.
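
The rough arithmetic behind point 1, as plain Python (the $7T headline and the ~$100B/year industry capex figures come from the discussion above; the 10-year build-out period is an assumption):

Code:
# Back-of-the-envelope: headline figure vs. current industry spending.
headline_total = 7e12     # the $7T figure under discussion
years = 10                # assumed build-out period
industry_capex = 100e9    # ~$100B/year current semiconductor capex (rough figure from above)

per_year = headline_total / years
print(per_year / 1e9, "billion per year")            # 700.0 billion per year
print(per_year / industry_capex, "x current capex")  # 7.0x -> within an order of magnitude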

And if it puts people out of well-paying jobs, the onus on those who manage to profit only gets bigger.
This is also a very good point. It's easier to see how he could find a return on investment if he's thinking AI will replace millions of information workers with an average salary on the order of $100k or so.
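
The same sort of arithmetic for the labor angle (the 10 million worker count is purely hypothetical; the ~$100k salary is the figure used above):

Code:
# Hypothetical: yearly value of displaced information work.
workers_replaced = 10e6   # assumed 10 million workers, purely illustrative
avg_salary = 100e3        # ~$100k average salary, as above
annual_saving = workers_replaced * avg_salary
print(annual_saving / 1e12, "trillion per year")  # 1.0 trillion per year in labor cost

On that logic, a multi-trillion-dollar build-out starts to pencil out, which is presumably the pitch.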

At the moment I'd rather spend a thousand bucks on diminishing world conflict than on advancing Altman: far more tangible value.
This is an investment, so they're going to spend it wherever they think they can get a good return on it.
 

ivan_vy

Respectable
Apr 22, 2022
194
209
1,960
he's thinking AI will replace tens of millions of information workers with an average salary on the order of $100k.
People will not disappear overnight. Using AI to make people more productive (like Excel did for accounting, AutoCAD for architecture, safer cars and trailers for drivers, etc.) is the way to go; replacing people is a dangerous path toward precarious jobs and potential hacking of every aspect of our lives.
 

bit_user

Titan
Ambassador
Has he never heard of the ecological issues, in energy and materials, that such a plan would create? Not to mention how economically wasteful it would be.
He might be banking on using AI to reduce waste and improve efficiency? Also, the carbon footprint of a well-paid IT worker is pretty big, so replacing us could yield a net eco savings.

Nowadays there are approaches, such as minimizing how far data has to travel inside the processor, that could cut consumption by something like 99x, and better AI designs that don't originate from a tweaked GPU platform could do 10x better per watt.
Why do you assume the designs of whatever chips are ultimately built won't also improve beyond current ones?

It will take a long time for this new production capacity to get built out and come online, and it won't happen all at once. In the intervening time & beyond, the industry will obviously continue refining and optimizing both silicon and software designs. Whatever the fabs end up making will probably be the state of the art at that time, just as current fabs aren't wedded to chip designs from 5 years ago, when construction of those fabs was first planned.
 

bit_user

Titan
Ambassador
The future of AI is with quantum computing. You need far fewer cells if a single cell can store 0, 1, and anything in between.
Quantum computers can train neural networks, but their lack of scale is a huge hurdle, with no clear roadmap that could achieve Altman's apparent vision, or anything remotely close.

They're also incredibly expensive to build and operate, so far. If you also want to focus on scaling them up, I don't imagine those costs will come down anytime soon.

So a quantum brain is the future, not a high-end digital tensor-core processor.
As for inferencing, the answer is quite simply no. And most AI compute power is spent on inferencing. Luckily, it's a simpler problem and easier to scale & optimize.

Well, if you look at the human brain, you will notice it has much more in common with quantum computing than with classical computing.
Perhaps you mean analog computers, then? People have been dabbling with analog neural networks for at least as long as this branch of AI has existed. We continue to hear sporadic announcements in this area, but I'm not sure how much of that is just a fringe thing vs. really viable to deploy at scale.

We don't store anything down to the last bit. We store only what we need.
You don't think digital neural networks also utilize abstractions and approximations? They very much do!
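
As a concrete example of the approximations digital networks already rely on, here is plain INT8 weight quantization, a standard technique; the tensor here is random, purely for illustration:

Code:
import numpy as np

# Toy symmetric INT8 quantization: the network keeps an approximation of its
# FP32 weights, not the exact values, and usually works fine anyway.
weights = np.random.randn(4, 4).astype(np.float32)

scale = np.abs(weights).max() / 127.0                              # one scale for the tensor
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)  # what gets stored
dequant = q.astype(np.float32) * scale                             # what gets "remembered"

print("max abs error:", float(np.abs(weights - dequant).max()))
print("bits per weight: 8 instead of 32")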
 

bit_user

Titan
Ambassador
I'm sure Altman is just looking at the flops and ops he needs based on some projection graph, then looking at what's currently available, putting THAT into another graph, and that's how he came to his conclusion.
OpenAI have some of the smartest AI researchers in the world. I'm sure they're making projections, but I'm sure those projections are based on how capable they expect the state-of-the-art AI silicon to be by then, not naively basing their assumptions on the performance & cost of today's silicon.

But as the TSMC fabs showed us in Arizona, things are hardly predictable when it comes to building fabs.
That's probably more of an outlier. Even then, its schedule only slipped by a year or so. What they're talking about building won't even start coming online for probably 3-5 years from now, at the earliest. And it'll take probably at least another 5 years for that initial buildout to complete.

And as Intel showed us, a die shrink to 10nm was harder than they thought.
Again, that's another outlier. Intel shot themselves in the foot by trying to use DUV for a 7nm-class node, and it was pushing the technology too far.

In all honesty, it's always better to ask for more and deliver than to ask for less and not deliver.
Sure, I could see Altman asking for $8T, but actually expecting to achieve more like $2-3T.
 

bit_user

Titan
Ambassador
People will not disappear overnight. Using AI to make people more productive (like Excel did for accounting, AutoCAD for architecture, safer cars and trailers for drivers, etc.) is the way to go; replacing people is a dangerous path toward precarious jobs and potential hacking of every aspect of our lives.
I'm not talking about what I want to happen, but we should consider what happened to factory workers as automation improved. Today, you need like 1/10th as many factory workers as you did for the same factory output, back in the 1970's or so, yet the quality of manufactured goods has markedly improved. By analogy, enhancing IT workers of the future should mean the industry needs fewer of them/us. Businesses will embrace AI, in order to save money or increase revenues. I think those savings will come in the form of reduced labor costs, more than increased revenues, especially if everyone is doing it (because the total addressable market of most industries probably won't get much bigger).

As for the ecological impact, the carbon footprint of someone roughly scales with their salary - particularly in the range between the poor & middle class. So, if large numbers of us drop out of the middle class, you'd expect to see us have a smaller ecological impact. I'm not saying that's a worthwhile tradeoff, but if you imagine how certain people could justify the carbon footprint of deploying AI at greater scales, perhaps it's something they're taking into account. Who knows?
 
Feb 18, 2024
6
4
15
Altman wants to solve the problem by just throwing money at it...looks like the solution came from ChatGPT.
Not from GPT; it's the standard playbook every MBA is taught. It guarantees accelerated industry growth, commoditization, and monopoly. The strategy is typically used when you don't have a clear use case (i.e. a vision) but see huge market upside.

Ford was successful at it with the Model T, and Apple with the iPhone, but customers knew exactly what those products were for when they unboxed them. AI's not there yet.
 