News ChatGPT 5 power consumption could be as much as eight times higher than GPT 4 — research institute estimates medium-sized GPT-5 response can consum...

Have they released any info on how large their AI model is? For example, can they get much of it to run locally on a user's PC, and have it automatically send work to their servers when a more sophisticated set of models and extensions are needed? It may help offset some power demands and server resources needed per user.
 
That is the reason they try to move as much AI workload to local machines as possible. But it seems the local version is much, much slower than the online one. At the moment, big companies pay the electric bills and collect user data to train their AIs (and for the value of the user data itself), but one day the bill will come due, so developing local operation is a high priority even though it is rather slow...
 
This is the problem with university researchers and estimates. Is OpenAI actually using 45 GWh of electricity, purely for GPT-5 queries? It should be possible to determine approximately how many data centers and such OpenAI uses, but without insider knowledge, it's impossible to say whether those GPUs and servers are being used for:
  1. Running GPT-5
  2. Running GPT-4 and older models
  3. Running non-GPT models
  4. Training new models
  5. Running other infrastructure
  6. Basically anything else
45 GWh per day would mean that OpenAI is consistently drawing the equivalent of 1.875 GW of power, all day long, every day. That's easily the equivalent of a couple dozen large 75~100 megawatt data centers doing nothing other than running OpenAI. That figure might be viable, but even if OpenAI is using that much power, it's a safe bet that a large chunk of it isn't currently being used to serve up GPT-5 responses.
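That arithmetic can be spelled out in a few lines. To be clear, the 45 GWh/day figure is the research institute's estimate, not a confirmed number, and the 75~100 MW per-facility range is just a rough assumption for a "large" data center:

```python
# Sanity-checking the 45 GWh/day estimate (assumed figure, not confirmed).
GWH_PER_DAY = 45      # estimated daily consumption, GWh
HOURS_PER_DAY = 24

avg_power_gw = GWH_PER_DAY / HOURS_PER_DAY   # continuous average draw, GW
avg_power_mw = avg_power_gw * 1000

# How many large data centers (assumed 75-100 MW each) that draw equals
dc_if_100mw = avg_power_mw / 100
dc_if_75mw = avg_power_mw / 75

print(f"Average draw: {avg_power_gw} GW ({avg_power_mw:.0f} MW)")
print(f"Equivalent large data centers: {dc_if_100mw:.1f} to {dc_if_75mw:.1f}")
```

Running it confirms the numbers in the comment: 1.875 GW of continuous draw, or roughly 19 to 25 large data centers.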

Realistically? I'd guess no more than 10~20 percent is for GPT-5 inference. You could also argue that training GPT-5 probably took thousands (tens of thousands?) of GPUs running for a couple of months. Again, that's just estimating, but obviously a ton of electricity gets consumed in the process. However, if that's factored into the estimates at all, it would also mean the cost per query goes down over time, as the training power gets diffused across billions (trillions?) of queries.
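The amortization point is easy to illustrate. Every number below is a made-up assumption for the sake of the sketch (GPU count, per-GPU draw including cooling overhead, training duration), not anything OpenAI has disclosed:

```python
# Hypothetical amortization of training energy across lifetime queries.
# All inputs are illustrative assumptions, not OpenAI figures.
TRAIN_GPUS = 25_000       # assumed GPUs used for training
GPU_POWER_KW = 1.0        # assumed draw per GPU incl. overhead, kW
TRAIN_DAYS = 60           # assumed training duration, days

train_energy_kwh = TRAIN_GPUS * GPU_POWER_KW * 24 * TRAIN_DAYS  # total kWh

for queries in (1e9, 1e10, 1e12):
    wh_per_query = train_energy_kwh * 1000 / queries  # kWh -> Wh, per query
    print(f"{queries:.0e} lifetime queries -> {wh_per_query:.4f} Wh of training energy each")
```

Under these assumptions the one-time training cost is ~36 GWh, which works out to 36 Wh per query over a billion queries but only 0.036 Wh per query over a trillion, so the amortized share shrinks toward nothing as query volume grows.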

Anyway, a quick estimate here. GPT-5 can respond to a typical request in about 17 seconds. That's based on me just running a query right now asking it to write me a short, funny story about training GPT-5. So, about 1,000 tokens took 17 seconds. To get up to 18 Wh per query with that sort of napkin math, the hardware serving my query would have to be drawing about 3,800 watts for the full 17 seconds. Getting up to 3,800 watts isn't hard. But Nvidia's GPUs are designed to run multiple concurrent workloads, so if a single H100-based server (eight H100 GPUs) is running my query, my guess is it's also running a dozen other queries at the same time.
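That back-calculation looks like this. The 18 Wh figure is the estimate being questioned, the 17-second response time is my single sample, and the dozen-concurrent-queries batch size is pure guesswork:

```python
# Working backwards from an 18 Wh-per-query estimate and an observed
# ~17-second response time (both taken from the comment above).
ENERGY_WH = 18        # estimated energy per query, Wh
RESPONSE_S = 17       # observed response time, seconds

# Wh -> watt-seconds, divided by duration, gives implied continuous draw
implied_watts = ENERGY_WH * 3600 / RESPONSE_S
print(f"Implied continuous draw: {implied_watts:.0f} W")

# But servers batch queries; if the hardware handles many at once, the
# per-query share of that draw is much smaller. Batch size is a guess.
CONCURRENT_QUERIES = 12
print(f"Per-query share at that concurrency: {implied_watts / CONCURRENT_QUERIES:.0f} W")
```

So the estimate only holds if a ~3,800 W slice of hardware is dedicated to one query for its whole duration; with any realistic batching, the per-query draw drops proportionally.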

But I'll admit, I could be wrong. I'm not researching OpenAI power use or anything else. I just suspect that a lot of these estimates are more of a worst-case figure than a real-world one. Average power use for both GPT-4 and GPT-5 is probably a lot lower than these estimates, but proportionately GPT-5 likely does use 8X more power.
 
It's like, five years ago every company was spending half its marketing budget on being perceived as "ESG friendly" because it attracted stock buyers. Now we're drinking through paper straws, everyone has forgotten ESG, the private planes are literally commuting the rich around like it's nobody's business, dumping millions of litres of jet fuel into the air, and AI is soon consuming more power than all the households,

and somehow your analysis ends up in the realm of ... "nothing to see here"

The modern West is a circus waiting to break down. We have speedrun into oligarchy, and half the people like you go "yo man stop being so conspiracy boizzzz everything is just fine !!!!!!!". Nobody can afford a home 😀 People are literally buying each other out of the market with money they borrow just to afford a home in the first place. It's the most systematic abuse of every single human who can't simply provide security for the loans. We're told "it's so that we can afford it," but really we're just enabling and accelerating the very dynamic pushing normal folks out of owning a home in the first place.

I find it rather depressing how many sloths make any change completely unfeasible, because conformity has gotten so over the top that half the population will sloth around until their comfort and conformity risk being compromised. Machiavelli wrote a book about this 500 years ago, arguing that black/white statements are so much more comfortable for the voters, promising some sort of "order" in the chaos. As such, someone with grandiose claims will always have a vast advantage over anyone being honest about the complexity.

Today the biggest enemy of society is not the companies, or the politicians. They are just doing what everyone has always done: looking out for themselves. The problem is the vast part of the Western population who are so comfortable they will not realise the problem until it has gotten really bad. It's similar to how a privileged addict falls harder, because they get so much farther away from reality before feeling the consequences and, hopefully, being able to change. If you don't feel any consequence until your house is actually on fire, then it's too late to do much.
 
The takeaway from the article for me is that we're using the output of nuclear power plants to deliver incorrect, hallucinated results. I'm sure AI will be nice when it's refined and useful, but right now it appears to be sucking up such ridiculous resources that the return on investment promises won't come true. This power study amps up the risk AI poses not just to investors but also to the entire world. The AI bubble needs to pop.

I think someone wrote a book about this that got turned into a movie. A civilization poured all their resources into building the best computer and it returned 42 as a result.
 
40 watt-hours for a 30 second, 1000 word response? No way. Unless they amortize the power used to train the model, and build the building, and resettle all the Native Americans who used to live on the sacred grounds where they built the powerplant to South Carolina and the energy they used to build and operate the casino there for the last ten years.

Oh, btw, the free ChatGPT tier apparently rolled back GPT-5, brought back GPT-4o, and when you exceed the quota it now locks you out cold for 24 hours instead of dropping you to a weaker model for 4 hours. SMH