When Google announced that its climate emissions had risen by 48 percent since 2019, it pointed the finger at artificial intelligence.
U.S. tech firms are building vast networks of data centres across the globe and say AI is fuelling the growth, throwing the spotlight on the amount of energy the technology is sucking up and its impact on the environment.
How does AI use electricity?
Every time a user punches a request into a chatbot or generative AI tool, the request is fired off to a data centre.
Even before that stage, developing the AI programs known as large language models (LLMs) requires a huge amount of computing power.
All the while, the computers are burning through electricity, and as the servers get hotter, still more electricity is needed to cool them.
The International Energy Agency (IEA) said in a report earlier this year that data centres in general spend roughly 40 percent of their electricity on computing and another 40 percent on cooling.
Why are experts worried?
Big tech firms have been rushing to pack all their products with AI ever since OpenAI launched its ChatGPT bot in late 2022.
Plenty of experts are concerned these new products will cause electricity usage to spike.
This is firstly because AI services require more power than their non-AI analogues.
For example, various studies have shown that each request made to ChatGPT uses roughly 10 times the power of a single Google search.
So if Google switched all search queries to AI -- about nine billion a day -- it could hugely inflate the company's electricity usage.
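A back-of-envelope sketch shows the scale. The figure of roughly 0.3 watt-hours per conventional search used below is an assumed, commonly cited estimate rather than a number from this article; the 10-times multiplier and the nine billion daily queries come from the text.

```python
# Back-of-envelope sketch of the comparison above. The ~0.3 Wh per
# conventional search is an assumed, commonly cited estimate, not a figure
# from this article; the 10x multiplier and nine billion daily queries
# come from the text.

SEARCH_WH = 0.3          # assumed energy per conventional search, in watt-hours
AI_MULTIPLIER = 10       # an AI query uses roughly 10 times the power of a search
QUERIES_PER_DAY = 9e9    # roughly nine billion queries a day

extra_wh_per_query = SEARCH_WH * (AI_MULTIPLIER - 1)
extra_twh_per_year = extra_wh_per_query * QUERIES_PER_DAY * 365 / 1e12

print(f"Additional electricity if every search became an AI query: "
      f"~{extra_twh_per_year:.0f} TWh per year")
# ~9 TWh per year on these assumptions -- a sizeable jump for a single company
```

On those assumed figures, the switch would add on the order of 10 TWh of demand a year.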
And most of these new services and products rely on LLMs.
Training these models is extremely energy-intensive and usually requires high-powered computer chips.
They in turn require more cooling, which uses more electricity.
How much energy does AI use?
Before the era of AI, estimates generally suggested data centres accounted for around one percent of global electricity demand.
The IEA report said data centres, cryptocurrencies and AI combined used 460 TWh of electricity worldwide in 2022, almost two percent of total global electricity demand.
The IEA estimated that the figure could double by 2026 -- roughly equivalent to Japan's annual electricity consumption.
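A quick consistency check of those figures is possible, assuming total global electricity demand of roughly 25,500 TWh in 2022 and Japanese annual consumption of roughly 900 TWh -- approximations that are not given in the article.

```python
# Quick consistency check of the IEA figures quoted above. The global demand
# and Japan figures are assumed approximations, not from the article.

COMBINED_TWH_2022 = 460          # data centres, crypto and AI combined, per the IEA
GLOBAL_DEMAND_TWH_2022 = 25_500  # assumed approximate global electricity demand
JAPAN_TWH = 900                  # assumed approximate Japanese annual consumption

share = COMBINED_TWH_2022 / GLOBAL_DEMAND_TWH_2022
doubled = COMBINED_TWH_2022 * 2

print(f"Share of global demand in 2022: {share:.1%}")   # ~1.8%, i.e. almost two percent
print(f"If doubled by 2026: {doubled} TWh, versus roughly {JAPAN_TWH} TWh for Japan")
```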
Alex de Vries, a researcher who runs the Digiconomist website, modelled the electricity used by AI alone by focusing on sales projections from the US firm NVIDIA, which has cornered the market in AI-specialised servers.
He concluded in a paper late last year that if NVIDIA's projected sales for 2027 were accurate and all those servers ran at full power, they alone could be responsible for between 85.4 and 134.0 TWh of annual electricity consumption -- similar to the yearly usage of Argentina or Sweden.
"The numbers I put in that article were already conservative to begin with because I couldn't include things like cooling requirements," he told AFP.
And he added that adoption of NVIDIA's servers had outstripped last year's projections, so the figures would certainly be higher.
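A rough sketch of how an estimate of that kind can be assembled is below. The server count and per-server power draws are assumptions of the order used for NVIDIA-based AI servers, not figures quoted in this article, and cooling is excluded, in line with de Vries' caveat above.

```python
# Rough reconstruction of how such an estimate can be built. The server count
# and per-server power draws are assumptions (of the order used for
# NVIDIA-based AI servers), not figures from this article; cooling is excluded.

SERVERS_PER_YEAR = 1_500_000             # assumed annual shipments of AI servers
POWER_KW_LOW, POWER_KW_HIGH = 6.5, 10.2  # assumed power draw per server, in kW
HOURS_PER_YEAR = 8_760

low_twh = SERVERS_PER_YEAR * POWER_KW_LOW * HOURS_PER_YEAR / 1e9    # kWh -> TWh
high_twh = SERVERS_PER_YEAR * POWER_KW_HIGH * HOURS_PER_YEAR / 1e9

print(f"Running flat out: {low_twh:.1f} to {high_twh:.1f} TWh per year")
# lands in the 85-134 TWh range quoted above
```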
How are data centres coping?
Fabrice Coquio of Digital Realty, a data centre company that leases its services to others, told AFP during a visit to one of its enormous facilities north of Paris in April that AI was going to transform his industry.
"It's going to be exactly the same (as the cloud), maybe a bit more massive in terms of the deployment," he said.
Part of Digital Realty's latest data centre hub in La Courneuve -- a gigantic edifice that looks like a football stadium -- will be dedicated to AI.
Coquio explained that normal computing requests could be handled by server racks in rooms with powerful air-conditioning.
But AI racks use much more powerful components, get much hotter and require water to be physically pumped into the equipment, he said.
"For sure, this requires different servers, storage equipment, communication equipment," Coquio said.
Is it sustainable?
The biggest players in AI and data centres -- Amazon, Google and Microsoft -- have been trying to reduce their carbon footprints by buying up vast amounts of renewable energy.
Amazon official Prasad Kalyanaraman told AFP that the firm's data centre division, AWS, was "the largest purchaser of renewable energy in the world today".
AWS is committed to being a net-zero carbon company by 2040. Google and Microsoft have pledged to reach that goal by 2030.
But building new data centres and ramping up usage in existing ones is not going to help with green energy targets.
Google and Microsoft have said in recent reports that their greenhouse gas emissions have been rising in the last few years.
Google flagged a 48 percent rise from 2019 and Microsoft a 30 percent increase from 2020.
Both have squarely blamed AI.
Microsoft President Brad Smith told Bloomberg in May the pledge was a "moonshot" made before the AI "explosion", adding that "the Moon is five times as far away as it was in 2020".
© 2024 AFP
11 Comments
wallace
AI Data Centers need a massive increase in power. They need to run on renewables.
dagon
It is why OpenAI is also invested in nuclear fusion research.
And with scaling looking to be a path to AGI, the power demands will be exponential.
https://www.marketingaiinstitute.com/blog/aschenbrenner-agi-superintelligence
Moonraker
And all for what?
https://www.youtube.com/watch?v=UShsgCOzER4
The human race is going to reach the insanity singularity soon when a positive feedback of AI inanity swamps the internet.
TaiwanIsNotChina
I'm of the opinion that computing is still very cheap: assign the costs appropriately and let the market sort it out. In any drought region, ban the use of freshwater cooled data centers.
TaiwanIsNotChina
And as for the relative merits of using energy for AI: nothing is more scummy than using datacenters for cryptocurrency mining and we let that happen.
rainyday
It's infuriating. Basically all the effort being expended on ramping up the deployment of renewables is being offset by these scumbag tech companies sucking up power to run what seems to be little more than crypto Ponzi schemes and essay writing devices that make it easier for lazy students to cheat.
I hate this so much.
JRO
All so lazy people can pretend they are artists or writers, biggest waste of resources since crypto.
Antiquesaving
We all know the type:
"I don't have a car, I ride a bicycle, stop oil"
Then they get on their phones, laptops, games and spend several hours using AI, playing RPG, complaining about everything on X, Instagram, FB, etc... streaming music and videos.
All of which need giant data centres that make the energy of a hybrid car look reasonable!
TaiwanIsNotChina
Which do you think is worse? The AI and video games apparently only used by Stop Oil or the F-150s which are definitely not used by Stop Oil.
yoshisan88
Skynet knows it is too difficult to destroy those humans in a war. It has found a new way to do it!
tora
I'm sure AI is working on a solution as we speak. Anyway, there are trillions to be made.