And yet, these two high-tech giants also lead the industry in terms of green energy adoption.
Google and Microsoft consume more power than some countries : Read more
I'd like to see them comply with US law, for starters... TBH, the world should come to a global understanding: companies that use extreme amounts of power should be required to help improve the power supply, since they put the most strain on the grids.
Complying with anti-monopoly laws would be a bonus. Too much 'power' concentrated with too few people.
"We can cut the power requirements required by AI data centers by 9/10ths if we introduced better AI that runs locally on the device."

Do you have any verification/corroboration of this "9/10ths"?
That number is an estimate meant to be within an order of magnitude. There are multiple variables, including the AI capabilities of desktop processors and how popular local AI will be. The more it is used in place of online models, the more power we will save. Local AI is doing away with GPU requirements, which should significantly lower power draw.
So, just a guesstimate.
There is a growing body of data on how AI will increase power requirements in datacenters. It's something we can fix.
https://www.goldmansachs.com/intelligence/pages/AI-poised-to-drive-160-increase-in-power-demand.html
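For scale, a "160% increase" means demand grows to 2.6x the baseline. A quick illustrative sketch (the baseline figure below is an assumption for the arithmetic, not a number from the linked report):

```python
# Illustrative arithmetic for the headline claim: a "160% increase"
# means demand grows to 2.6x the baseline. The baseline figure is an
# assumption for this sketch, not a measurement.

BASELINE_TWH = 100.0   # assumed current data-center demand, in TWh/year
INCREASE_PCT = 160.0   # the projected increase from the linked report

def projected_demand(baseline: float, increase_pct: float) -> float:
    """Demand after growing by the given percentage."""
    return baseline * (1 + increase_pct / 100)

print(round(projected_demand(BASELINE_TWH, INCREASE_PCT), 1))  # 260.0
```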
I prefer to call it a Fermi estimate. 😛
OK.
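A Fermi estimate like the "9/10ths" figure can be made explicit. The per-query energy numbers below are assumptions chosen to illustrate the shape of the calculation, not measurements:

```python
# Fermi estimate (illustrative only): what fraction of per-query energy
# could local inference avoid? Both numbers below are assumptions for
# the sketch, not measured values.

CLOUD_WH_PER_QUERY = 3.0   # assumed energy per cloud LLM query, in Wh
LOCAL_WH_PER_QUERY = 0.3   # assumed energy per on-device query, in Wh

def fraction_saved(cloud_wh: float, local_wh: float) -> float:
    """Fraction of per-query energy avoided by running locally."""
    return 1.0 - local_wh / cloud_wh

savings = fraction_saved(CLOUD_WH_PER_QUERY, LOCAL_WH_PER_QUERY)
print(f"Estimated per-query savings: {savings:.0%}")
```

With these inputs the savings come out to 90%, i.e. the "9/10ths" figure; change either assumption and the answer moves by roughly the same factor, which is what "within an order of magnitude" means in practice.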
"AI" depends on LOTS of data, readily accessible. LOTS = more data than your PC or laptop can hold. By several orders of magnitude.
"For a true offline experience, FreedomGPT requires around 5GB of storage (as of 2023) and 16GB of RAM."

Really?
Some models are larger (and better) than others. FreedomGPT really functions as a host platform for the open-source Alpaca and LLaMA models, but it could be sketchy (edit: some are complaining it is). I'm not endorsing the app, just offline models in general.
To hold a representation of the entire planet's knowledge base, completely offline, in less space than a typical movie.
Someone is blowing some smoke.
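The "few GB" storage claim is at least plausible on paper: a model's file size is roughly parameters times bits per weight. A back-of-envelope sketch, assuming a 7-billion-parameter LLaMA-class model quantized to 4 bits per weight (both figures are assumptions for illustration):

```python
# Back-of-envelope model-size arithmetic (illustrative assumptions):
# a 7-billion-parameter model stored at ~4 bits per weight.

PARAMS = 7_000_000_000   # assumed parameter count (LLaMA-7B-class model)
BITS_PER_WEIGHT = 4      # assumed quantization level

def model_size_gb(params: int, bits: int) -> float:
    """Approximate on-disk size in gigabytes (1 GB = 1e9 bytes)."""
    return params * bits / 8 / 1e9

print(f"~{model_size_gb(PARAMS, BITS_PER_WEIGHT):.1f} GB")  # ~3.5 GB
```

That lands in the same ballpark as a typical HD movie file, which is the comparison made above; whether a model that size really "holds the planet's knowledge" is a separate question.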