News: Google and Microsoft consume more power than some countries

I would love to see these same statistics, with the same charts, side by side against 2010. Social media and AI are the biggest changes since 2010; you still have the same silly day-wasting cat videos today that you had in 2010. It would be interesting to see that comparison.
 

gtarthur

Reputable
Jun 24, 2021
5
8
4,515
Why wasn't Amazon, the world's largest cloud provider, included in this story? For that matter, why not also include Meta/Facebook? Singling out Microsoft and Google actually diminishes the magnitude of the issue. It also gives the appearance of prejudice.
 

ThomasKinsley

Notable
Oct 4, 2023
385
384
1,060
The power requirements are absurd, and the ironic part is that it's not that hard to fix. We could cut the power consumed by AI data centers by 9/10ths if we introduced better AI that runs locally on the device. The advent of ARM and RISC-V will also naturally lower power requirements for desktops and servers by a large percentage. Privacy legislation can go a long way toward reducing power requirements too since, let's face it, a decent chunk of the Internet today is used to send ads, acquire user data, and build profiles.

Experimenting with Samsung DeX was a huge eye-opener on this subject. It can handle most basic computing tasks while sipping only 7–25 W of power. I'm not going to say you don't need a faster computer (many do!), but there IS an argument to be made that we're not being efficient with most of the current tech we have. Windows bloat can be removed so it frees up resources. Programmers can improve their apps to better allocate RAM and prevent memory leaks. We can cut power without losing any performance simply by being smart.
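To put the DeX comparison in rough numbers: a quick back-of-envelope sketch. The 25 W figure is the upper end of the range cited above; the 150 W desktop draw and 8 hours/day of use are my own illustrative assumptions, not measurements.

```python
# Rough annual energy comparison: a conventional desktop vs. a low-power
# device like Samsung DeX. The 25 W figure comes from the post above; the
# 150 W desktop draw and 8 h/day usage are illustrative assumptions.
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365

def annual_kwh(watts: float) -> float:
    """Convert a steady power draw into kWh consumed per year."""
    return watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000

desktop_kwh = annual_kwh(150)   # assumed typical desktop under light load
dex_kwh = annual_kwh(25)        # upper end of the 7-25 W range cited above

print(f"Desktop: {desktop_kwh:.0f} kWh/yr, DeX: {dex_kwh:.0f} kWh/yr")
print(f"Reduction: {1 - dex_kwh / desktop_kwh:.0%}")
```

Even taking the worst case of the DeX range, that's roughly an 83% cut in energy for basic tasks under these assumptions.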
 

ThomasKinsley

Notable
Oct 4, 2023
385
384
1,060
Do you have any verification/corroboration of this "9/10ths"?
That number is an estimate meant to be within the right order of magnitude. There are multiple variables, including the AI capabilities of desktop processors and how popular local AI will be. The more it is used instead of online models, the more power we will save. Local AI is doing away with dedicated GPU requirements, which should significantly lower power draw.

There is a growing body of data on how AI will increase power requirements in datacenters. It's something we can fix.

https://www.goldmansachs.com/intelligence/pages/AI-poised-to-drive-160-increase-in-power-demand.html
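For what it's worth, the order-of-magnitude claim can be sketched as a Fermi estimate. Every input below is an illustrative assumption of mine (per-stream GPU draw, datacenter PUE, local-device draw), not a figure from the article:

```python
# Fermi-style sketch of the local-vs-cloud inference power claim.
# All inputs are illustrative assumptions: a datacenter GPU drawing
# ~300 W per inference stream, multiplied by a PUE (power usage
# effectiveness) factor of ~1.5 for cooling/distribution overhead,
# versus a local NPU/CPU drawing ~30 W for a small quantized model.
CLOUD_GPU_WATTS = 300      # assumed per-stream GPU draw
PUE = 1.5                  # assumed datacenter overhead factor
LOCAL_WATTS = 30           # assumed local-device draw

cloud_total = CLOUD_GPU_WATTS * PUE          # effective watts per stream
savings = 1 - LOCAL_WATTS / cloud_total      # fraction saved going local

print(f"Cloud: {cloud_total:.0f} W effective, local: {LOCAL_WATTS} W")
print(f"Estimated savings: {savings:.0%}")
```

With those (debatable) inputs the savings come out around 93%, i.e. the same order of magnitude as the "9/10ths" figure. Change any input by 2–3x and you still land near an order of magnitude, which is the point of a Fermi estimate.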
 

USAFRet

Titan
Moderator
That number is an estimate meant to be within the right order of magnitude. There are multiple variables, including the AI capabilities of desktop processors and how popular local AI will be. The more it is used instead of online models, the more power we will save. Local AI is doing away with dedicated GPU requirements, which should significantly lower power draw.

There is a growing body of data on how AI will increase power requirements in datacenters. It's something we can fix.

https://www.goldmansachs.com/intelligence/pages/AI-poised-to-drive-160-increase-in-power-demand.html
So, just a guesstimate.
OK.

"AI" depends on LOTS of data, readily accessible. LOTS = more data than your PC or laptop can hold. By several orders of magnitude.
 

ThomasKinsley

Notable
Oct 4, 2023
385
384
1,060
So, just a guesstimate.
OK.

"AI" depends on LOTS of data, readily accessible. LOTS = more data than your PC or laptop can hold. By several orders of magnitude.
I prefer to call it a Fermi estimate. 😛

Google is offering "quantized models" small enough to fit on local devices. While it says it's local, it leverages Cloud Workstations, so I presume this is a hybrid model. For a true offline experience, FreedomGPT requires around 5 GB of storage (as of 2023) and 16 GB of RAM.
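The ~5 GB figure is consistent with simple quantization arithmetic: raw weight storage is roughly parameters × bits per weight ÷ 8. The 7-billion-parameter count below is my assumption (the size class of the open LLaMA/Alpaca models), and real model files add some overhead for metadata and per-block scale factors:

```python
# Approximate on-disk weight size of a quantized LLM.
# The 7B parameter count is an assumption matching the size class of
# the open LLaMA/Alpaca models; real files are somewhat larger due to
# metadata and per-block quantization scale factors.
def model_size_gb(params: float, bits_per_weight: int) -> float:
    """Raw weight storage in gigabytes (decimal GB)."""
    return params * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: ~{model_size_gb(7e9, bits):.1f} GB")
```

A 4-bit build works out to about 3.5 GB of raw weights, which with format overhead lands in the ballpark of the ~5 GB cited above; at 16-bit the same model would be ~14 GB, which is why quantization is what makes local devices viable.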
 

ThomasKinsley

Notable
Oct 4, 2023
385
384
1,060
Really?

To hold a representation of the entire planet's knowledge base, completely offline, in less space than a typical movie.

Someone is blowing some smoke.
Some models are larger (and better) than others. FreedomGPT really functions as a host platform for the open-source Alpaca and LLaMA LLM models, but it could be sketchy (edit: some are complaining that it is). I'm not endorsing the app, just offline models in general.
 
Last edited: