Companies that are responsible for such wasteful use of resources should be required to pay into a fund to help improve water/power development.
Thanks, didn't really see an explanation other than Google can use water or electricity, but not how it's done via water. It really does seem insane that they'd 'single use' cool vs a closed loop.

The only place they tend to use big AC systems is in areas where water isn't plentiful. The Dalles is located right off the Columbia River, hence the water usage.
Here's what I think is the pertinent Oregonian article on it, if you're interested; they elaborate a lot on the local deals, water usage, and climate: https://www.oregonlive.com/silicon-...-show-with-two-more-data-centers-to-come.html
I'm not really knowledgeable with regard to how datacenter cooling systems work, but I think in this case it's largely related to a chiller and managing room temperatures. It's definitely got nothing to do with the cooling directly attached to the hardware itself.

Thanks, didn't really see an explanation other than Google can use water or electricity, but not how it's done via water.
Some of the big US cities have it, but we've done so much suburban sprawl in the US that municipal heating would be impractical, I think.

Closed loop liquid cooling would help a lot with that, wouldn't it? Would need even more energy though.
Here in Berlin, we have a lot of municipal heating. That means that there are pipes across the city, which bring warm water for heating and general purposes to the households.
Transformation to renewables is also going on around Berlin.
Combine the two, and I'd say there is the technical option of (pre)heating the water with the waste heat from datacenters before it goes on its way as usual. That wouldn't be a completely closed loop, but basically the warm water used in a shower would be part of a somewhat larger custom cooling system than the ones some may know from PCs. And with that infrastructure, one can already talk about fairly big numbers in terms of heat transport capability.
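To put rough numbers on that heat transport capability: the transportable heat is just mass flow times the specific heat of water times the temperature lift. A minimal sketch in Python; the flow rate and temperature lift below are illustrative assumptions, not figures for Berlin's actual network.

```python
# Back-of-the-envelope estimate of district-heating heat transport.
# Flow rate and temperature lift are assumed values for illustration only.

CP_WATER = 4186          # specific heat of water, J/(kg*K)

flow_kg_per_s = 500      # assumed mass flow in a large trunk pipe, kg/s
delta_t_kelvin = 30      # assumed temperature lift from datacenter waste heat, K

heat_power_watts = flow_kg_per_s * CP_WATER * delta_t_kelvin
print(f"Transported heat: {heat_power_watts / 1e6:.0f} MW")
# -> about 63 MW for these assumed numbers, i.e. on the order of a large
#    datacenter campus's waste heat fitting into a single trunk pipe.
```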
Also, considering that latency to servers around the world will likely still somewhat decrease, and that night time here in Berlin means evening time in the U.S., while a lot of the electricity generated at night by wind still goes unused here, one may perhaps talk about the "exciting world of data hosting in Berlin".
US suburbia is not designed to work in conjunction with its neighboring properties. Instead, it is several construction projects walled off by 6-lane+ roads, with zero consideration for the surrounding properties.

Closed loop liquid cooling would help a lot with that, wouldn't it? Would need even more energy though.
Here in Berlin, we have a lot of municipal heating. That means that there are pipes across the city, which bring warm water for heating and general purposes to the households.
Transformation to renewables is also going on around Berlin.
Combine the two, and I'd say there is the technical option of (pre)heating the water with the waste heat from datacenters before it goes on its way as usual. That wouldn't be a completely closed loop, but basically the warm water used in a shower would be part of a somewhat larger custom cooling system than the ones some may know from PCs. And with that infrastructure, one can already talk about fairly big numbers in terms of heat transport capability.
Also, considering that latency to servers around the world will likely still somewhat decrease, and that night time here in Berlin means evening time in the U.S., while a lot of the electricity generated at night by wind still goes unused here, one may perhaps talk about the "exciting world of data hosting in Berlin".
So soon the oceans will start receding.
Good news, right?
Until the whole Earth is as dry as Mars.
Don't worry, Jensen Huang will pay Elon Musk to fetch a few giant ice asteroids and crash them into the dry ocean beds.
EGBOK
All the "study" purpose is to build fear and scandal, it don't overcome basic fact checking, seeing this blatant BS here (where mean reader is over mean IQ) is what disgust me.Just fear-mongering.
AI IS USING ALL OUR DRINKING WATER!!!!
Most data centers don't pump water for cooling; actually, water is a server's number-one enemy. Closed-loop liquid cooling (similar to gaming rigs) is not common, and it doesn't imply water consumption anyway. Most data centers cool by moving fresh air in and out of the server rooms, and they are extremely dry (a data center could actually dry tons of wet clothes a day).

They collect the water now? Last I heard the data centers dump the hot and humid air into the atmosphere, rather than having a closed loop system.
The whole point of evaporative cooling is to reduce electricity bills, and you'd need a dehumidifier running to turn humid air into water. Dehumidifiers need electricity to run.
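For a sense of scale, here's a simple physics sketch of that trade-off (not any particular facility's numbers): every kilogram of water that evaporates carries away its latent heat, which is why evaporative cooling saves electricity at the cost of water.

```python
# Water evaporated per kWh of heat rejected, in the ideal case.

LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water, ~2.26 MJ/kg
KWH_IN_JOULES = 3.6e6

heat_to_reject_kwh = 1.0
water_evaporated_kg = heat_to_reject_kwh * KWH_IN_JOULES / LATENT_HEAT_J_PER_KG
print(f"~{water_evaporated_kg:.2f} kg (~litres) of water per kWh of heat rejected")
# -> ~1.6 L/kWh as a theoretical floor; real cooling towers also bleed off
#    water to control mineral buildup, so actual consumption runs higher.
```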
If by 'system' you mean the surrounding environment, the wind would blow all that extra humidity away, and it won't go directly back into the water source that was used.
Lots of people arguing here have barely ever seen a computer with a loaded GPU, and few have ever tried to run an LLM locally. I run Llama 3 70B on my personal workstation all day, and the only thing using water is me. My workstation (an Apple Mac Studio) idles at 60W or less and never peaks above 400W while inferencing, and a typical inference session takes 2-3 seconds at full power (writing program code is way more demanding than the 100-word email....). Inferencing at industrial scale is even more efficient, FYI, so it's clear the 'study' figures are a blatant lie.

Seems like a lot of folks in this thread are rolling .
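Taking the Mac Studio numbers from the reply above at face value, the per-response energy and water work out to very little; a quick sketch (the water-intensity figure is an assumption for illustration, not a measured value):

```python
# Back-of-the-envelope check using the figures quoted in the post above.

peak_power_watts = 400          # Mac Studio peak while inferencing (from the post)
seconds_per_response = 3        # full-power time per response (from the post)
water_l_per_kwh = 2.0           # ASSUMED combined cooling + electricity water intensity

energy_kwh = peak_power_watts * seconds_per_response / 3600 / 1000
water_ml = energy_kwh * water_l_per_kwh * 1000
print(f"~{energy_kwh * 1000:.2f} Wh and ~{water_ml:.2f} ml of water per response")
# -> ~0.33 Wh and well under 1 ml per response for this local setup.
#    Datacenter GPUs draw far more power per request, so the figures differ,
#    but the arithmetic is the same.
```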
Data centers and most industrial deployments require clearance from the local authorities; otherwise you can't lay a single brick. And among industrial activities, data centers are the ones with the least environmental impact; compare them to manufacturing or the chemical industry. The problem (now) is how much money they do (or don't) hand to local politicians, who commission BS studies like this.

The overall problem is hardly unique or new, however, and it's a great example of how government inaction and the sensationalism of politics hurt everyone. Requiring companies to minimize their surrounding impact should be part of the agreements that allow them to build in the first place.
Waiting for business to fix it for you is never a good solution. Look at all the semiconductor manufacturing that goes on in Arizona as a great example. They didn't really start worrying about water consumption until recently, when shortages became real, so now they're all building systems that are as closed as they can be. This is something that should have been obvious to anyone who signed off on these facilities, had they bothered to take it into account.
The original WP article states that 6% figure.

I'm curious where you came up with this, seeing as it wasn't actually in the article. You're also conflating population with water allocation which... yeah...
Here's the actual quote regarding The Dalles:
This 'research' is BS, and everyone defending it should be ashamed; I won't stop while I have free speech.

Literal BS you're pushing. Plz stop.
Apple and Facebook at least claim theirs are 100% solar powered, but they don't need to have the roof covered; in practice the solar panels are deployed kilometers/miles away and plugged into the grid, so 100% power compensation is achieved.

Data centers are not covered in solar panels, and if they were, it wouldn't provide even a small fraction of the power they need - which is why they don't.
Germany and Finland right now use data center waste heat for public/domestic heating; bad luck, NoVA.

The other big issue is the cooling towers are on the roof so solar isn't nearly as viable since you've now .
I live in Northern VA, capital of data centers so I see plenty every day driving around. NOVA is not a 'cool' place most of the year.
They do. Have you ever filed to build even a single-room business in America? Do you know the long list of environmental regulations you have to satisfy?

Companies that are responsible for such wasteful use of resources should be required to pay into a fund to help improve water/power development.
Industrial-scale data center cooling doesn't use vapor-cycle cooling (like HVAC); it just moves fresh air in and out of the server rooms (different from an office building's server room, where servers are enclosed in climate-controlled rooms). Recently, liquid cooling and immersion cooling were introduced at data centers in order to reduce maintenance costs or recycle waste heat, not because server rooms need to eat water to cool the infinitely power-hungry evil servers.

I'm a bit confused by the WP article. Can't figure out why the data centers aren't using refrigerant-based closed-loop LC for the GPU and CPU. Cooling the additional excess heat given off by other components via evaporation may make sense as a lower-cost solution. Of course, refrigerant-based cooling just ups the total electricity requirements, so power cost increases to other businesses and consumers would still be challenging. Sounds like a good case for surcharges to offset consumer costs.
Got it, you're just trolling; no need for further engagement. I quite literally copied and pasted from the Washington Post article you claim says this.

The original WP article states that 6% figure.
"that facilities run by companies like Microsoft used around 6 percent of all the district’s water"Got it you're just trolling no need for further engagement. I quite literally copy and pasted from the Washington Post article you claim says this.
Read the original source article https://arxiv.org/pdf/2304.03271 to better understand the water usage.

I subscribe to the others' prior opinions; the study figures are opinionated at the very least, or made up in search of a political headline. I'm familiar with GPU (AI) server farms; some of them don't even use water, but direct air or hardware submerged in cooling oil (the most modern), and those on liquid cooling use filtered water mixed with additives to avoid clogging, intrinsically being closed circuits. So if any water is spent, it's only when filling the circuits; nothing is wasted during operation, as the biased "study" poisonously suggests.
Datacenters expend tons of energy, but they often generate their own energy with solar panels. While this is something we all want to improve for a number of reasons, the truth is that this power often comes from renewable sources such as solar, nuclear, or wind, or a combination, much cleaner than a single rock star's concert tour.
I'm disappointed to read how some universities waste money on untrue narratives that often hide political pursuits, and also disappointed when the free media replicate these narratives and politicians call for censorship (or 'misinfo/hate speech' correction). I want politicians calling for accountability for faked or biased 'studies' presented as 'science', or backed by scientists, which is worse; they know they are biased and care nothing about truth, only about their political goals.
There might be something to it. A nuclear power plant - even some large fossil-fuel power plants - uses passing water as a heat sink; I'm not sure that's "using it up", but it can be a problem. I believe that a decade or more ago the abundant cold water of Washington state rivers was used that way, and Microsoft even sank a data center in the ocean for its abundant cooling power.

The whole purpose of the "study" is to build fear and scandal; it doesn't survive basic fact checking. Seeing this blatant BS here (where the average reader is above the average IQ) is what disgusts me.
Okay, but you admit most is not all.

Most data centers don't pump water for cooling; actually, water is a server's number-one enemy. Closed-loop liquid cooling (similar to gaming rigs) is not common, and it doesn't imply water consumption anyway. Most data centers cool by moving fresh air in and out of the server rooms, and they are extremely dry (a data center could actually dry tons of wet clothes a day).
Read the original source article https://arxiv.org/pdf/2304.03271 to better understand the water usage.
Those figures don't seem made up as they have references to Google's and Microsoft's published data for water and energy usage. For example, training GPT-3 in Microsoft’s state-of-the-art U.S. data centers can directly EVAPORATE 700,000 liters of clean freshwater.
In no way would a 4000-token request (served in 4 seconds) take that amount. I have no doubt industry representatives will soon refute this 'study', but I'm pessimistic that the WP or even Tom's Hardware will amend or publish an article with the actual facts.

The Washington Post article incorrectly states GPT-4, whereas the source article says GPT-3. Also, the source states "GPT-3 needs to 'drink' (i.e., consume) a 500ml bottle of water for roughly 10-50 responses, depending on when and where it is deployed." Not sure how the WP calculated the 235 to 1408 ml needed to generate one 100-word email,
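For what it's worth, the per-response range implied by the sentence quoted above is easy to work out; a quick sketch of that arithmetic:

```python
# "A 500 ml bottle of water for roughly 10-50 responses" implies this range.

bottle_ml = 500
responses_low, responses_high = 10, 50

per_response_high = bottle_ml / responses_low    # worst case in the quote
per_response_low = bottle_ml / responses_high    # best case in the quote
print(f"{per_response_low:.0f}-{per_response_high:.0f} ml per response, per the paper's wording")
# -> 10-50 ml per response, noticeably lower than the 235-1408 ml per
#    100-word email figure attributed to the WP piece above.
```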
Using water for cooling is one thing; water consumption is another.

There might be something to it. A nuclear power plant - even some large fossil-fuel power plants - uses passing water as a heat sink; I'm not sure that's "using it up", but it can be a problem. I believe that a decade or more ago the abundant cold water of Washington state rivers was used that way, and Microsoft even sank a data center in the ocean for its abundant cooling power.
I addressed that in my last answer; please read the study PDF. I find two big errors: it assumes national-average power sources instead of the datacenters' own power sources (often nuclear, hydro, solar, etc., the cheapest), and it also uses wrong figures for power usage per request, at least 40x higher, and up to 6000x higher than the current mean. Those figures are so crazy they even eclipse the bitcoin mining industry.

Okay, but you admit most is not all.
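To illustrate how much the headline number hinges on those input assumptions, here's a rough sketch. The formula roughly follows the shape of the accounting the paper describes as I read it (on-site evaporation plus water embedded in the electricity supply, scaled by PUE), but every number below is a placeholder chosen for illustration, not a value taken from the study.

```python
# How strongly per-request water depends on the assumed energy and water intensities.
# All inputs are illustrative placeholders, not figures from the study.

def water_per_request_ml(energy_kwh, wue_onsite_l_per_kwh, pue, offsite_l_per_kwh):
    """On-site cooling water plus water embedded in off-site electricity generation."""
    return energy_kwh * (wue_onsite_l_per_kwh + pue * offsite_l_per_kwh) * 1000

low = water_per_request_ml(energy_kwh=0.001, wue_onsite_l_per_kwh=0.2, pue=1.1, offsite_l_per_kwh=0.5)
high = water_per_request_ml(energy_kwh=0.04, wue_onsite_l_per_kwh=2.0, pue=1.5, offsite_l_per_kwh=3.0)
print(f"~{low:.1f} ml to ~{high:.0f} ml per request")
# -> ~0.8 ml to ~260 ml: a spread of more than 300x just from plausible-looking
#    inputs, which is why the choice of energy and water-intensity figures matters.
```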
Let me explain to you how you can use evaporative cooling without adding any humidity to the cooling side.
It's pretty simple, really. All you do is add a misting system to the hot side of a regular HVAC. The water droplets hit the hot radiator, absorb the heat, and turn into a gas. It works well in a hot and arid environment, where a regular HVAC will struggle to dump heat into an already hot environment.
Well, water cooling in HPC/AI servers has been a thing and will continue to be - especially as Blackwell OAMs will hit 0.75kW each.

Industrial-scale data center cooling doesn't use vapor-cycle cooling (like HVAC); it just moves fresh air in and out of the server rooms (different from an office building's server room, where servers are enclosed in climate-controlled rooms). Recently, liquid cooling and immersion cooling were introduced at data centers in order to reduce maintenance costs or recycle waste heat, not because server rooms need to eat water to cool the infinitely power-hungry evil servers.
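Picking up the 0.75 kW-per-OAM figure from the reply above, a rough sizing sketch shows how little coolant a closed loop actually needs to circulate; the temperature rise is an assumed value for illustration.

```python
# Rough sizing of a closed liquid-cooling loop for one 0.75 kW accelerator module.

CP_WATER = 4186                 # specific heat of water/coolant, J/(kg*K) (approx.)

module_power_watts = 750        # per-OAM heat load mentioned above
coolant_delta_t_kelvin = 10     # ASSUMED inlet-to-outlet temperature rise

flow_kg_per_s = module_power_watts / (CP_WATER * coolant_delta_t_kelvin)
print(f"~{flow_kg_per_s * 60:.1f} kg/min of coolant per module")
# -> roughly 1.1 kg/min (about a litre a minute) circulating in a closed loop.
#    The loop itself consumes no water; the question is how the heat leaves the
#    building (dry coolers, chillers, or evaporative towers).
```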
Not really; you can't dispute the numbers, especially with the NDA agreements they make with the utilities.

"that facilities run by companies like Microsoft used around 6 percent of all the district's water"
I mixed up Google and MS; it doesn't change the fact that the whole article is made up and ridiculous.