News Using GPT-4 to generate 100 words consumes up to 3 bottles of water — AI data centers also raise power and water bills for nearby residents


pixelpusher220

Distinguished
Jun 4, 2008
221
107
18,760
The only place they tend to use big AC systems is in areas where water isn't plentiful. The Dalles is located right off the Columbia River hence the water usage.

Here's what I think is the pertinent Oregonian article on it, if you're interested; they elaborate a lot on the local deals, water usage, and climate: https://www.oregonlive.com/silicon-...-show-with-two-more-data-centers-to-come.html
Thanks, didn't really see an explanation other than that Google can use water or electricity, but not how it's done via water. It really does seem insane that they'd use 'single-use' cooling vs. a closed loop.

I have a kinda vested interest here as my house will have ELEVEN data centers within a 1/4 mile before long.

 
Thanks, didn't really see an explanation other than that Google can use water or electricity, but not how it's done via water.
I'm not really knowledgeable about how datacenter cooling systems work, but I think in this case it's largely related to a chiller and managing room temperatures. It definitely has nothing to do with the cooling directly attached to the hardware itself.
 
Mar 19, 2024
33
13
35
The production cost of one TSMC wafer with chips printed on it is around $20,000, and each wafer yields approximately 100 NVIDIA A100/H100 GPUs. One A100/H100 GPU module is then priced at $20,000 to the customer. NVIDIA's development costs for all their GPUs have been paid back several times over already. These days every cat and dog develops GPUs, and some are already producing them successfully. So NVIDIA/AMD/Intel GPU prices are inflated by a factor of 10 at least. The high-bandwidth memory used there is also inflated, even though it is the same tech as consumer memory; only the number of channels differs (a small cost difference), plus the stacked packaging, which again adds only minor cost nowadays.

Consumer GPUs and server GPUs have almost the same silicon but differ 10x in price tags, even though both, like hot dogs, are sold in huge quantities. Tenstorrent's GPUs, for example, have almost the same performance as NVIDIA/AMD GPUs but sell at a 10x lower price tag, at almost $1,000 apiece, so what the heck are NVIDIA and the RAM companies doing?

Conclusion: if even Elon Musk OKs buying order-of-magnitude-overpriced NVIDIA GPUs, then the whole world is doomed, since that indicates that for sure no single sane man is left.
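As a quick sanity check of the margin claim above (every input below is the post's own claimed figure, not a verified cost), the implied silicon-only markup works out like this:

```python
# Back-of-the-envelope margin check, using the post's claimed figures.
wafer_cost = 20_000        # claimed TSMC production cost per wafer, USD
dies_per_wafer = 100       # claimed usable A100/H100-class dies per wafer
retail_price = 20_000      # claimed price of one GPU module, USD

cost_per_die = wafer_cost / dies_per_wafer   # silicon cost per GPU die
markup = retail_price / cost_per_die         # retail price vs. raw silicon

print(f"silicon cost per die: ${cost_per_die:,.0f}")   # $200
print(f"retail/silicon ratio: {markup:.0f}x")          # 100x
```

Note this compares the retail price to raw wafer cost only; HBM, packaging, testing, software, and R&D are not in the denominator, so the real margin is smaller than this ratio suggests.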
 
Last edited:
  • Like
Reactions: Nitrate55

DavidLejdar

Respectable
Sep 11, 2022
284
178
1,860
Closed loop liquid cooling would help a lot with that, wouldn't it? Would need even more energy though.

Here in Berlin, we have a lot of municipal heating. That means that there are pipes across the city, which bring warm water for heating and general purposes to the households.

Transformation to renewables is also going on around Berlin.

Combine the two, and I'd say, that there is the technical option of (pre)heating the water with the heat from datacenters, before it goes on its way as usual. That wouldn't be a completely closed loop. But basically, warm water used in a shower as part of a somewhat larger custom cooling system than some may know from PCs. And with the infrastructure, one can already talk somewhat big numbers in terms of heat transport capability.

Also, considering that latency to servers around the world will likely still somewhat decrease, and that night time here in Berlin means evening time in the U.S., while a lot of the electricity generated at night by wind still goes unused here, one may perhaps talk about the "exciting world of data hosting in Berlin".
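For scale, the heat-transport idea above can be sized with basic thermodynamics. The data-center size and temperature rise below are illustrative assumptions, not figures from Berlin's actual network:

```python
# Rough sizing: water flow needed to carry a data center's waste heat
# into a district-heating network. All inputs are illustrative assumptions.
waste_heat_w = 10e6     # 10 MW of waste heat, a mid-size data center
c_p = 4186              # specific heat of water, J/(kg*K)
delta_t = 30            # assumed temperature rise across the heat exchanger, K

flow_kg_s = waste_heat_w / (c_p * delta_t)
print(f"required water flow: {flow_kg_s:.0f} kg/s")   # ~80 kg/s
```

So a single mid-size facility could usefully preheat on the order of tens of liters of network water per second, which is why the idea pairs well with existing municipal-heating infrastructure.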
 

pixelpusher220

Distinguished
Jun 4, 2008
221
107
18,760
Closed loop liquid cooling would help a lot with that, wouldn't it? Would need even more energy though.

Here in Berlin, we have a lot of municipal heating. That means that there are pipes across the city, which bring warm water for heating and general purposes to the households.

Transformation to renewables is also going on around Berlin.

Combine the two, and I'd say, that there is the technical option of (pre)heating the water with the heat from datacenters, before it goes on its way as usual. That wouldn't be a completely closed loop. But basically, warm water used in a shower as part of a somewhat larger custom cooling system than some may know from PCs. And with the infrastructure, one can already talk somewhat big numbers in terms of heat transport capability.

Also, considering that latency to servers around the world will likely still somewhat decrease, and that night time here in Berlin means evening time in the U.S., while a lot of the electricity generated at night by wind still goes unused here, one may perhaps talk about the "exciting world of data hosting in Berlin".
Some of the big US cities have it, but we've done so much suburban sprawl in the US that municipal heating would be impractical, I think.
 
Sep 20, 2024
1
0
10
The article should define water usage.

Is your water-cooled gaming PC or car engine releasing waste water?

It's all just running forever in a loop with pumps and heat exchangers...
 
Sep 20, 2024
1
0
10
Rather than complaining about power usage of data centres that power our lives, why don't you complain about governments and energy companies not spending enough on researching clean energy. If all fossil fuels suddenly ran out today, you can guarantee within a day or two we would suddenly have access to limitless energy. Wake up people.
 
Sep 20, 2024
1
2
15
You disappoint me, Tom's Hardware; please think about where you have come from and what you have stood for in the past. With reporting like this you will definitely lose your position as a tech news/info outlet that can be trusted.

You appear to have received a pretty good tongue-lashing already from other folks who are calling you out on this story, so I won't repeat all the facts that have been pushed back at you already.

Please consider actually doing real fact-finding when it comes to reporting claimed facts; you SHOULD be reporting a balanced viewpoint and also showing your workings in detail.

I would like to see a follow-up story linked to this one showing what you actually researched and investigated, with real interviews, not what other news outlets produced for clickbait. You get paid plenty for your advert-drenched site, so please use those monies to report balanced stories backed by facts and credible viewpoints. Thank you.
 

RUSerious

Honorable
Aug 9, 2019
65
25
10,570
I'm a bit confused by the WP article. I can't figure out why the data centers aren't using refrigerant-based closed-loop liquid cooling for GPUs and CPUs. Cooling the additional excess heat given off by the other components via evaporation may make sense as a lower-cost solution. Of course, refrigerant-based cooling just ups the total electricity requirements, so power cost increases to other businesses and consumers would still be challenging. Sounds like a good case for surcharges to offset consumer costs.
 

Notton

Commendable
Dec 29, 2023
864
764
1,260
Closed loop liquid cooling would help a lot with that, wouldn't it? Would need even more energy though.

Here in Berlin, we have a lot of municipal heating. That means that there are pipes across the city, which bring warm water for heating and general purposes to the households.

Transformation to renewables is also going on around Berlin.

Combine the two, and I'd say, that there is the technical option of (pre)heating the water with the heat from datacenters, before it goes on its way as usual. That wouldn't be a completely closed loop. But basically, warm water used in a shower as part of a somewhat larger custom cooling system than some may know from PCs. And with the infrastructure, one can already talk somewhat big numbers in terms of heat transport capability.

Also, considering that latency to servers around the world will likely still somewhat decrease, and that night time here in Berlin means evening time in the U.S., while a lot of the electricity generated at night by wind still goes unused here, one may perhaps talk about the "exciting world of data hosting in Berlin".
US suburbia is not designed to work in conjunction with its neighboring properties. Instead, it is several construction projects walled off by 6-lane+ roads, with zero consideration of the surrounding properties.
That's why you need a car to get anywhere, and why you have to walk 5 km to get from point A to point B when the direct path is only 500 m.
 

AcostaJA

Distinguished
Dec 23, 2014
18
5
18,515
So soon the oceans will start receding.
Good news, right?
Until the whole Earth is as dry as Mars.
Don't worry, Jensen Huang will pay Elon Musk to fetch a few giant ice asteroids and crash them into the dry ocean beds.
EGBOK

Just fear-mongering.
AI IS USING ALL OUR DRINKING WATER!!!!
All the "study" purpose is to build fear and scandal, it don't overcome basic fact checking, seeing this blatant BS here (where mean reader is over mean IQ) is what disgust me.
They collect the water now? Last I heard the data centers dump the hot and humid air into the atmosphere, rather than having a closed loop system.

The whole point of evaporative cooling is to reduce electricity bills, and you'd need a dehumidifier running to turn humid air into water. Dehumidifiers need electricity to run.

If by system, you mean the surrounding environment, the wind would blow all that extra humidity away and it won't go directly back into the water source that was used.
Most data centers don't pump water for cooling; actually, water is a server's #1 enemy. Closed-loop liquid cooling (similar to gaming rigs) is not common, and it doesn't imply water consumption anyway. Most data centers cool by moving fresh air in and out of the server rooms, and they are extremely dry (a data center could actually dry tons of wet clothes a day).
Seems like a lot of folks in this thread are rolling .
Lots of people arguing here have barely ever seen a computer with a loaded GPU, and few have ever tried to run an LLM locally. I run Llama 3 70B on my personal workstation all day, and the only thing using water is me; my workstation (an Apple Mac Studio) idles at 60W or less and never peaks above 400W while inferencing. A typical inference session requires 2-3 seconds at full power on my Mac Studio (writing program code is way more demanding than the 100 words...). Inferencing at industrial scale is even more efficient, FYI, so it's clear the 'study' figures are a blatant lie.
The overall problem is hardly unique or new however and it's a great example of how government inaction and the sensationalism of politics hurts everyone. Requiring companies to minimize their surrounding impact should be part of the agreements that allows them to build in the first place.

Waiting for business to fix it for you is never a good solution. Look at all the semiconductor manufacturing that goes on in Arizona as a great example. They didn't really start worrying about the water consumption until more recently, when shortages became real, so they're all building systems that are as closed as they can be. This should have been obvious to anyone who signed off on these facilities, had they bothered to take it into account.
Data centers and most industrial deployments require clearance from the local authorities; otherwise you can't lay a single brick. And among industrial activities, data centers are the ones with the least environmental impact; compare them to the manufacturing or chemical industries. The problem (now) is how much money they do (or don't) hand to local politicians, who commission BS studies like this.
I'm curious where you came up with this seeing as it wasn't actually in the article. You're also conflating population with water allocation which... yeah...

Here's the actual quote regarding The Dalles:
The original article at WP states that 6% figure.
Literal BS you're pushing. Plz stop.
The BS is this 'research'; everyone defending it should be ashamed. I won't stop while I have free speech.
Data Centers are not covered in solar panels, and if they did it wouldn't provide even a small fraction of the power they need - which is why they don't.
Apple and Facebook at least claim theirs are 100% solar powered, but the roofs don't need to be covered; actually, the solar panels are deployed kilometers/miles away and plugged into the grid, so 100% power compensation is achieved.
The other big issue is the cooling towers are on the roof so solar isn't nearly as viable since you've now .

I live in Northern VA, capital of data centers so I see plenty every day driving around. NOVA is not a 'cool' place most of the year.
Germany and Finland right now use data center waste heat for public/domestic heating. Bad luck, NOVA.

companies that are responsible for such wasteful use of stuff should be required to pay into a fund to help improve water/power development.
They do. Have you ever filed to build even a single-room business in America? Do you know the long list of environmental regulations you have to satisfy?
BS
I'm a bit confused by the WP article. I can't figure out why the data centers aren't using refrigerant-based closed-loop liquid cooling for GPUs and CPUs. Cooling the additional excess heat given off by the other components via evaporation may make sense as a lower-cost solution. Of course, refrigerant-based cooling just ups the total electricity requirements, so power cost increases to other businesses and consumers would still be challenging. Sounds like a good case for surcharges to offset consumer costs.
Industrial-scale data center cooling doesn't use vapor-cycle cooling (like HVAC); it just moves fresh air in and out of the server rooms (different from an office building's server room, where servers are enclosed in climate-controlled rooms). Recently, liquid cooling and immersion cooling were introduced at data centers in order to reduce maintenance costs or recycle waste heat, not because server rooms need to eat water to cool the infinitely power-hungry evil servers.
 
Last edited:
  • Like
Reactions: coolitic

AcostaJA

Distinguished
Dec 23, 2014
18
5
18,515
Got it, you're just trolling; no need for further engagement. I quite literally copied and pasted from the Washington Post article you claim says this.
"that facilities run by companies like Microsoft used around 6 percent of all the district’s water"

I misquoted Google as MS; it doesn't change that the whole article is made up and ridiculous.
 

DS426

Upstanding
May 15, 2024
254
190
360
The researchers have an agenda because it's a narrative that's not "ok"? I need a little more convincing than that. Yes, it's easy to say that anything that has "California" in the name must be a gang of greenwashers, but these kinds of studies are extremely important to understand the current, future short-term, and future long-term impacts of these AI datacenters on our local communities. Does anyone care about their utility costs and grid stability? If so, don't brush this kind of stuff off. If issues like grid instability weren't real, Tesla wouldn't have put in all those dual-fuel generators (which BTW still don't cover their entire electricity demand during training), as the fuel costs and investment are obviously more expensive than purely benefiting from public utilities.

There are plenty of other studies out there that present all sorts of different figures, but none disagree that electrical demand in particular is explosively high and the current trajectory looks even more grim. As for water use, no, not all consume significant amounts of water as it just depends on the cooling design and location. That said, as the article pointed out, water can be consumed to reduce electrical demand. Yes, that evap will just eventually rain back down, but the point was the stress on local utilities and water reserve sources.

My question is this: why isn't geothermal used more extensively with datacenters? I know Google is ramping up use of it in Nevada, and I'm sure it's utilized here and there, but it seems to remain relatively underutilized.

Lastly, solar power on datacenter roofs usually isn't done due to the cooling system, as some mentioned, plus there's not enough surface area to provide for the full demand, i.e. it's an offset. That's good and bad news, as it doesn't require batteries but means no offset overnight. Now, there are some that have several acres of solar panels on the ground.
 
Last edited:

anoldnewb

Distinguished
Apr 29, 2011
29
3
18,535
I subscribe to others' prior opinions: the study's figures are opinionated at best, or made up in search of a political headline. I'm familiar with GPU (AI) server farms; some of them don't even use water, but direct air, or hardware submerged in cooling oil (the most modern), and those on liquid cooling use filtered water mixed with additives to avoid clogging, being intrinsically closed circuits. So if any water is spent, it is only when filling the circuits; nothing is wasted during operation, as the biased "study" poisonously suggests.

Data centers expend tons of energy, but they often generate their own energy with solar panels. While this is something we all want to improve for a number of reasons, the truth is that this power often comes from renewable sources such as solar, nuclear, or wind, or a combination, which is much cleaner than a single rock star's concert tour.

I'm disappointed to read how some universities waste money on untrue narratives, often hiding political pursuits, and also disappointed when free media replicate these narratives. Politicians call for censorship (or 'misinfo/hate speech' correction); I want politicians calling for accountability for faked or biased 'studies' presented as 'science', or backed by scientists, which is worse; they know they are biased and care nothing about truth, only about their political goals.
Read the original source article https://arxiv.org/pdf/2304.03271 to better understand the water usage.
Those figures don't seem made up as they have references to Google's and Microsoft's published data for water and energy usage. For example, training GPT-3 in Microsoft’s state-of-the-art U.S. data centers can directly EVAPORATE 700,000 liters of clean freshwater.

The Washington Post article incorrectly states GPT-4, whereas the source article says GPT-3. Also, the source states that "GPT-3 needs to 'drink' (i.e., consume) a 500ml bottle of water for roughly 10-50 responses, depending on when and where it is deployed." I'm not sure how the WP calculated the 235 to 1,408 ml needed to generate one 100-word email.
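For reference, the per-response range implied by the paper's bottle figure is easy to work out (the 500 ml per 10-50 responses numbers come from the quoted source):

```python
# Per-response water consumption implied by the source paper's figure:
# one 500 ml bottle of water per roughly 10-50 GPT-3 responses.
bottle_ml = 500
best_case = bottle_ml / 50    # 10 ml per response
worst_case = bottle_ml / 10   # 50 ml per response
print(f"{best_case:.0f}-{worst_case:.0f} ml per response")   # 10-50 ml
```

That 10-50 ml per response sits well below the WP's 235-1,408 ml per 100-word email, which is presumably why the derivation is being questioned here.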
 

JRStern

Distinguished
Mar 20, 2017
170
64
18,660
All the "study" purpose is to build fear and scandal, it don't overcome basic fact checking, seeing this blatant BS here (where mean reader is over mean IQ) is what disgust me.
There might be something to it. A nuclear power plant, and even some large fossil-fuel power plants, use passing water as a heat sink; I'm not sure that's "using it up," but it can be a problem. I believe that a decade or more ago the abundant cold water of Washington state rivers was used that way, and Microsoft even sank a data center in the ocean for its abundant cooling power.
 

Notton

Commendable
Dec 29, 2023
864
764
1,260
Most data centers don't pump water for cooling; actually, water is a server's #1 enemy. Closed-loop liquid cooling (similar to gaming rigs) is not common, and it doesn't imply water consumption anyway. Most data centers cool by moving fresh air in and out of the server rooms, and they are extremely dry (a data center could actually dry tons of wet clothes a day).
Okay, but you admit most is not all.
Let me explain to you how you can use evaporative cooling without adding any humidity to the cooling side.

It's pretty simple, really. All you do is add a misting system to the hot side of a regular HVAC. The water droplets hit the hot radiator, absorb the heat, and turn into a gas. It works well in a hot and arid environment, where a regular HVAC will struggle to dump heat into an already-hot environment.
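The reason misting the hot side helps can be quantified via water's latent heat of vaporization (a standard physical constant; the 1 MW load below is just an example figure):

```python
# Each kilogram of water evaporated absorbs ~2.26 MJ (latent heat of
# vaporization), which is why evaporative cooling saves so much electricity
# compared to rejecting the same heat with compressors alone.
latent_heat = 2.26e6    # J/kg, water near atmospheric conditions
heat_load_w = 1e6       # 1 MW of heat to reject (example figure)

evap_kg_s = heat_load_w / latent_heat
print(f"{evap_kg_s:.2f} kg/s evaporated per MW "
      f"(~{evap_kg_s * 3600:.0f} L/h)")
```

So roughly half a liter per second of evaporated water can carry away a megawatt of heat, energy that would otherwise have to be moved by electrically driven refrigeration.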
 

AcostaJA

Distinguished
Dec 23, 2014
18
5
18,515
Read the original source article https://arxiv.org/pdf/2304.03271 to better understand the water usage.
Those figures don't seem made up as they have references to Google's and Microsoft's published data for water and energy usage. For example, training GPT-3 in Microsoft’s state-of-the-art U.S. data centers can directly EVAPORATE 700,000 liters of clean freshwater.

I understand some industrial chilling systems spray free water on cooling towers to improve performance when the system's capacity is exceeded, but the methodology of this study couldn't be more biased and unfair. First, it doesn't consider the datacenter's specific energy source; it levels everything to the US national average of 73% non-renewable. It even includes water consumed by hydroelectric power generation: "but water consumption due to expedited water evaporation from hydropower generation is often included [6]."

Their average energy consumption per request is wrong, made up as high as the hidden math allows: "Thus, we consider 0.004kWh as the inference energy consumption per request."
A single 350W NVIDIA H100 GPU delivers about 1,200 tokens/second, and an average Llama 3 request requires 75 tokens (up to 8,000). Doing a bit of math, one hour of an H100 serves roughly 57,000 short requests (or about 500 very long ones) at around 400W (400 including CPU, cooling, etc.; a typical server hosts at least 8 GPUs). That's somewhere between about 0.000007 kWh and 0.0001 kWh (peak) per request, so even at the highest theoretically possible figure, the study's number is at least 40 times too high.
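Redoing that arithmetic explicitly (every input below is the poster's claim, not a measurement; note how strongly the gap to the study's 0.004 kWh figure depends on the assumed request length):

```python
# Per-request inference energy from the post's assumptions: one H100 at
# ~400 W (including its share of CPU/cooling) generating ~1,200 tokens/s.
gpu_power_w = 400
tokens_per_s = 1200

def kwh_per_request(tokens: int) -> float:
    seconds = tokens / tokens_per_s
    return gpu_power_w * seconds / 3.6e6   # joules -> kWh

short = kwh_per_request(75)     # typical short request (poster's figure)
long = kwh_per_request(8000)    # very long request (poster's upper bound)
study = 0.004                   # kWh/request assumed by the study
print(f"short: {short:.1e} kWh, long: {long:.1e} kWh")
print(f"study vs short: {study / short:.0f}x, vs long: {study / long:.0f}x")
```

Under these assumptions, the study's 0.004 kWh sits a few times above even an 8,000-token request and several hundred times above a short one, so the size of the disagreement hinges on what a "typical" request looks like.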
The Washington Post article incorrectly states GPT-4, whereas the source article says GPT-3. Also, the source states that "GPT-3 needs to 'drink' (i.e., consume) a 500ml bottle of water for roughly 10-50 responses, depending on when and where it is deployed." I'm not sure how the WP calculated the 235 to 1,408 ml needed to generate one 100-word email.
In no way would a 4,000-token request (served in 4 seconds) take that amount. I have no doubt that industry representatives will soon refute this 'study', but I'm pessimistic that the WP, or even Tom's Hardware, will amend it or publish an article with the actual facts.
 
Last edited:

AcostaJA

Distinguished
Dec 23, 2014
18
5
18,515
There might be something to it. A nuclear power plant, and even some large fossil-fuel power plants, use passing water as a heat sink; I'm not sure that's "using it up," but it can be a problem. I believe that a decade or more ago the abundant cold water of Washington state rivers was used that way, and Microsoft even sank a data center in the ocean for its abundant cooling power.
Water's cooling power is one thing; water consumption is another.
 

AcostaJA

Distinguished
Dec 23, 2014
18
5
18,515
Okay, but you admit most is not all.
Let me explain to you how you can use evaporative cooling without adding any humidity to the cooling side.

It's pretty simple, really. All you do is add a misting system to the hot side of a regular HVAC. The water droplets hit the hot radiator, absorb the heat, and turn into a gas. It works well in a hot and arid environment, where a regular HVAC will struggle to dump heat into an already-hot environment.
I considered it in my last answer; please read the study PDF. I find two big wrongs: it assumes national-average power sources instead of the datacenters' own power sources (often nuclear, hydro, solar, etc., the cheapest), and it uses wrong figures for power per request, at least 40x too high, and up to 6,000x higher than the current mean. Those figures are so crazy they even eclipse the bitcoin mining industry.
 

RUSerious

Honorable
Aug 9, 2019
65
25
10,570
Industrial-scale data center cooling doesn't use vapor-cycle cooling (like HVAC); it just moves fresh air in and out of the server rooms (different from an office building's server room, where servers are enclosed in climate-controlled rooms). Recently, liquid cooling and immersion cooling were introduced at data centers in order to reduce maintenance costs or recycle waste heat, not because server rooms need to eat water to cool the infinitely power-hungry evil servers.
Well, water cooling in HPC/AI servers has been a thing and will continue to be - especially as Blackwell OAMs will hit 0.75kW each.
 

Zerk2012

Titan
Ambassador
"that facilities run by companies like Microsoft used around 6 percent of all the district’s water"

I misquoted Google as MS; it doesn't change that the whole article is made up and ridiculous.
Not really; you can't dispute the numbers, along with the NDA agreements made with the utilities.
xAI gave completely false information about everything to the local government and is now saying it's none of your business, even down to the huge diesel generators running to help hide power usage, which isn't legal under the law.
https://time.com/7021709/elon-musk-xai-grok-memphis/

But other local officials and community members soon became frustrated with the project’s lack of details. The Greater Memphis Chamber and Memphis Light, Gas and Water Division (MLGW) signed a non-disclosure agreement with xAI, citing privacy of economic development. Some Memphis council members heard about the project on the news. “It's been pretty astounding the lack of transparency and the pace at which this project has proceeded,” Amanda Garcia, a senior attorney at the Southern Environmental Law Center, says. “We learn something new every week.”