When we discuss artificial intelligence (AI) and language models, we often focus on the vast amounts of data required to train and operate them. However, another important factor to consider is energy consumption.
According to an article by data scientist Alex de Vries of the Free University of Amsterdam, the AI industry could consume as much energy as a country the size of Argentina or the Netherlands by 2027.
Google’s AI, in particular, has been under scrutiny. The study found that if Google were to fully integrate AI into its search engine, its annual energy consumption could rise to 29.2 terawatt-hours (TWh), comparable to the electricity consumption of a country like Ireland.
In 2021, Google used 18.3 TWh of electricity, with AI accounting for approximately 10% to 15% of that consumption. However, Google’s use of AI is expanding rapidly, with new developments such as the launch of Bard and the integration of AI into its search engine.
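A quick back-of-the-envelope check puts those figures in perspective. The sketch below uses only the numbers already cited above, so the only work it does is arithmetic:

```python
# Back-of-the-envelope check of Google's AI energy figures (all inputs cited above).
google_total_2021_twh = 18.3              # Google's total electricity use in 2021
ai_share_low, ai_share_high = 0.10, 0.15  # estimated AI share of that total

ai_low = google_total_2021_twh * ai_share_low    # ~1.8 TWh
ai_high = google_total_2021_twh * ai_share_high  # ~2.7 TWh
projected_full_ai_twh = 29.2                     # projection if every search used AI

print(f"AI consumption in 2021: {ai_low:.1f}-{ai_high:.1f} TWh")
print(f"Full-AI projection is ~{projected_full_ai_twh / ai_high:.0f}x the high estimate")
```

In other words, the full-integration scenario implies roughly a tenfold increase over Google’s current AI-related consumption.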
Despite these alarming figures, the study points out that large-scale implementation of AI using current hardware and software technology is unlikely to happen quickly.
One of the main challenges is the scarcity of graphics processing units (GPUs) with enough processing power to handle AI workloads at scale. De Vries suggests that developers should not only focus on optimizing AI but also critically consider whether AI is needed in the first place.
The environmental impact of AI is often overlooked. Currently, data centers consume around 1-1.3% of global electricity, and integrating AI into existing applications such as search engines could significantly increase this percentage.
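To put that percentage into absolute terms, assume global electricity consumption on the order of 25,000 TWh per year (a round figure used here purely for scale, not taken from the study):

```python
# Rough absolute range behind the "1-1.3% of global electricity" claim.
global_electricity_twh = 25_000           # assumed global annual consumption, order of magnitude
dc_share_low, dc_share_high = 0.010, 0.013

low = global_electricity_twh * dc_share_low    # ~250 TWh
high = global_electricity_twh * dc_share_high  # ~325 TWh
print(f"Data centers: ~{low:.0f}-{high:.0f} TWh per year")
```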
For example, training ChatGPT, which ran on around 10,000 NVIDIA GPUs, consumed 1,287 megawatt-hours (MWh) of energy, enough to power about 121 homes for a year.
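The homes comparison follows directly if one assumes an average household consumption of roughly 10.6 MWh per year, approximately the U.S. average; that assumption is the only input not stated above:

```python
# How 1,287 MWh translates into "121 homes for a year".
training_mwh = 1_287             # reported training energy for ChatGPT
avg_home_mwh_per_year = 10.6     # assumed average U.S. household consumption

homes = training_mwh / avg_home_mwh_per_year
print(f"Equivalent to ~{homes:.0f} homes for a year")  # ~121
```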
This staggering energy consumption is a concern, especially as AI and language models become increasingly sophisticated and widely used. As a result, it is essential to develop and implement sustainable practices for training and operating these models.
Chatbots, such as ChatGPT, have been a significant contributor to this trend. The demand for AI chips has grown substantially since the arrival of ChatGPT in 2022.
NVIDIA, a leader in the sector, reported revenue of US$16 billion for the quarter that ended in July, underscoring the trend. This demand has led companies like Google and Amazon to develop their own chips.
The study warns that integrating generative AI into every search on Google’s search engine would significantly increase energy demand. It is estimated that doing so would require hundreds of thousands of servers, with the processing power of more than 4 million video cards (GPUs).
This could result in daily electricity consumption of 80 GWh, or 29.2 TWh annually.
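The annual figure is simply the daily figure scaled up to a year, and the GPU count can be reconstructed by assuming the study’s scenario of roughly 512,800 NVIDIA A100 HGX servers with 8 GPUs each (those two hardware figures are taken as assumptions here, since the text above only gives the totals):

```python
# From daily to annual consumption, and from servers to GPUs.
daily_gwh = 80                        # projected daily electricity use
annual_twh = daily_gwh * 365 / 1_000  # 1 TWh = 1,000 GWh
print(f"Annual consumption: ~{annual_twh:.1f} TWh")  # ~29.2 TWh

servers = 512_800                     # assumed server count from the study's scenario
gpus_per_server = 8                   # an NVIDIA A100 HGX server holds 8 GPUs
print(f"Total GPUs: ~{servers * gpus_per_server:,}")  # ~4.1 million
```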
The study also highlights the two phases of AI tools: the initial training phase and the inference phase. Training is commonly assumed to account for most of a model’s energy use, but over its lifetime the inference phase can end up consuming up to 500 times more energy.
De Vries therefore urges the scientific community to pay closer attention to the inference phase and its energy consumption.
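To see why inference can dominate, consider the SemiAnalysis estimate cited by de Vries that running ChatGPT demands roughly 564 MWh per day. Set against the 1,287 MWh training figure mentioned earlier (both numbers are external estimates, not measurements), the crossover comes within days:

```python
# How quickly cumulative inference energy overtakes the one-off training cost.
training_mwh = 1_287            # one-off training energy (reported)
inference_mwh_per_day = 564     # estimated daily inference energy for ChatGPT

days_to_match = training_mwh / inference_mwh_per_day
yearly_ratio = inference_mwh_per_day * 365 / training_mwh
print(f"Inference matches training energy in ~{days_to_match:.1f} days")   # ~2.3 days
print(f"One year of inference: ~{yearly_ratio:.0f}x the training energy")  # ~160x
```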
Despite these concerns, there are potential solutions. Christopher Alexander, director of analytics at Pioneer Development Group, suggests that developers can mitigate energy consumption through the creative use of resources, for example by drawing on alternative energy sources such as excess natural gas from oil drilling or biogas from landfills.