Energy Hungry AI: What’s The Hidden Cost of Your GenAI Search?

2024-07-05 13:21:29

AI technology is driving an upsurge in energy consumption globally (Disclaimer: AI Generated Image)

New Delhi:

A traditional Google search consumes about 0.0003 kilowatt-hours (kWh) of energy on average, according to a 2009 Google report. That is enough to power a 9-watt household light bulb for about 2 minutes. Google handles about 8.5 billion searches per day as of 2023, which translates to roughly 2,550,000 kWh of electricity per day, about 2,000 times the electricity an average Indian consumes in an entire year (1,255 kWh).
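These comparisons are simple arithmetic on the figures quoted above. A rough back-of-envelope check in Python, using only the article's own numbers (the per-search estimate, the daily search volume and India's per-capita consumption; variable names are illustrative):

# All inputs are the article's own figures, not independently measured values.
ENERGY_PER_SEARCH_KWH = 0.0003       # 2009 Google estimate per traditional search
SEARCHES_PER_DAY = 8.5e9             # Google searches per day as of 2023
BULB_WATTS = 9                       # household light bulb
INDIA_PER_CAPITA_KWH_YEAR = 1255     # average annual consumption per Indian

# How long one search's energy runs a 9-watt bulb, in minutes
bulb_minutes = ENERGY_PER_SEARCH_KWH * 1000 / BULB_WATTS * 60
print(f"One search ~ {bulb_minutes:.0f} minutes of a 9 W bulb")       # ~2 minutes

# Daily energy across all searches, expressed in 'average Indian' years
daily_kwh = ENERGY_PER_SEARCH_KWH * SEARCHES_PER_DAY
print(f"Daily search energy: {daily_kwh:,.0f} kWh")                   # ~2,550,000 kWh
print(f"~ {daily_kwh / INDIA_PER_CAPITA_KWH_YEAR:,.0f} person-years of Indian consumption")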

This May, Google announced that it will integrate AI into its search engine, powered by its most capable AI model, Gemini. According to Alex de Vries, a Dutch data scientist who spoke to The New Yorker on the subject, a single AI-assisted Google search will consume about 10 times more energy (roughly 0.003 kWh, or 3 watt-hours) than a traditional Google search. At 8.5 billion searches a day, that works out to roughly 25.5 million kWh daily, or about 20,000 times the average Indian's consumption for a year.
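The 20,000-fold figure follows from the same arithmetic applied to the AI-assisted estimate; a minimal sketch, again using only the numbers quoted above:

AI_SEARCH_KWH = 0.0003 * 10          # de Vries' estimate: ~10x a traditional search, ~3 Wh
SEARCHES_PER_DAY = 8.5e9
INDIA_PER_CAPITA_KWH_YEAR = 1255

daily_kwh = AI_SEARCH_KWH * SEARCHES_PER_DAY
print(f"Daily AI-search energy: {daily_kwh:,.0f} kWh")                # ~25,500,000 kWh
print(f"~ {daily_kwh / INDIA_PER_CAPITA_KWH_YEAR:,.0f} person-years of Indian consumption")  # ~20,000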

Alex de Vries, who is also the founder of Digiconomist, the organisation behind the Bitcoin Energy Consumption Index, has said that Google's energy consumption could reach about 29 terawatt-hours (TWh) per year if it goes ahead with AI integration in its search. This figure is roughly equivalent to the annual electricity consumption of Ireland, and more than that of Kenya.

 
Why are AI systems so hungry for power? 

AI systems require a lot of computational power to run complex algorithms over ever-growing corpora of data. When you enter a prompt in ChatGPT, the chatbot processes it on servers hosted in data centres. These centres alone account for 1 to 1.5 percent of global electricity use, according to the International Energy Agency.

“I think we still don’t appreciate the energy needs of this (AI) technology,” Sam Altman, CEO of OpenAI, said at a public event in Davos this January. Altman spoke of the immediate need for a “breakthrough” technology such as nuclear fusion to power AI operations, given the growth projections for the frontier technology. It is a clear indication that the industry leader in large language model chatbots is seeking ways to sustain current electricity consumption levels and secure supply for its future demand.

AI’s Carbon Emissions: A Setback for the UN's 2050 Net-Zero Goal?

Data centres, which power cloud computing as well as AI systems, produce 2.5 to 3.5 percent of global greenhouse gas emissions, according to The Shift Project, a French nonprofit working to reduce reliance on fossil fuels. That is roughly on par with the emissions of the entire aviation industry.

The energy consumption, and the resulting carbon footprint, of different AI models varies significantly. For example, BLOOM, the BigScience project's AI model with 176 billion parameters (the internal variables a model learns during training), consumed 433 megawatt-hours (MWh) of electricity during training.

In contrast, OpenAI’s GPT-3 from 2020, which has a comparable 175 billion parameters, consumed about three times as much electricity, at 1,287 MWh, according to data in the Artificial Intelligence Index Report 2024, published by the Stanford Institute for Human-Centered Artificial Intelligence (HAI).

The CO2-equivalent emissions, that is, the total greenhouse gas emissions expressed in terms of carbon dioxide, were 25 tonnes for BLOOM, while GPT-3's were roughly 20 times higher, at a whopping 502 tonnes.
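The gap between the two models can be read directly off the reported figures; the ratios below reproduce the "three times" and "20 times" comparisons (values are the AI Index numbers quoted above):

models = {
    "BLOOM": {"energy_mwh": 433, "co2e_tonnes": 25},
    "GPT-3": {"energy_mwh": 1287, "co2e_tonnes": 502},
}
energy_ratio = models["GPT-3"]["energy_mwh"] / models["BLOOM"]["energy_mwh"]
co2_ratio = models["GPT-3"]["co2e_tonnes"] / models["BLOOM"]["co2e_tonnes"]
print(f"GPT-3 used {energy_ratio:.1f}x the electricity of BLOOM")     # ~3.0x
print(f"GPT-3 emitted {co2_ratio:.1f}x the CO2e of BLOOM")            # ~20.1x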

The report also notes a serious lack of transparency from AI developers about the environmental impact of their models, with most not making their carbon footprints public.

Google's recently released Sustainability Report 2024 also reflects the energy hunger of this nascent technology. The company's carbon emissions have surged by nearly 50 percent over the last five years, largely on the back of powering its new AI technologies. Microsoft's Sustainability Report 2024 shows a similar trend, with a 29 percent rise in CO2 emissions compared with its 2020 baseline.

According to SemiAnalysis, a US-based independent AI research and analysis firm, AI will push data centres' electricity consumption to 4.5 percent of global power generation by 2030. The International Energy Agency separately estimates that data centres' total electricity consumption could double from 2022 levels to about 1,000 TWh by 2026, roughly the current electricity consumption of Japan. India has around 138 data centres, with reportedly 45 more set to be operational by the end of 2025. The United States has the most, with 2,701.

Lawmakers are starting to take stock of the situation. The European Union adopted a new regulation in March this year under which all data centre operators must report their energy and water consumption (the latter used for cooling systems), along with information on the efficiency measures they are implementing to reduce both.

In February, US Democrats introduced the Artificial Intelligence Environmental Impacts Act of 2024. The bill proposes setting up a consortium of experts, researchers and industry stakeholders to address the environmental impact of AI.
 
