Artificial intelligence-powered systems not only consume huge amounts of data for training but also require tremendous amounts of electricity to run. A recent study calculated the energy use and carbon footprint of several recent large language models.
One of them, GPT-3 – the model behind ChatGPT – was trained on some 10,000 NVIDIA GPUs and was estimated to have consumed 1,287 megawatt hours of electricity in the process – the equivalent of the energy used by 121 homes in the United States for a year.
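That household comparison lines up with published averages: the US Energy Information Administration puts typical annual household consumption at roughly 10,600kWh (an approximate figure, used here purely for illustration). A quick Python check of the arithmetic:

```python
# Back-of-the-envelope check of the "121 homes" comparison.
TRAINING_MWH = 1_287            # estimated GPT-3 training energy, as above
AVG_HOME_KWH_PER_YEAR = 10_632  # approx. US average annual household use (EIA)

homes = TRAINING_MWH * 1_000 / AVG_HOME_KWH_PER_YEAR
print(f"Equivalent to about {homes:.0f} US homes for a year")  # ~121 homes
```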
As we accelerate towards building one of the greatest technological developments humanity has ever achieved, we need to ask ourselves: what is the cost of this development?
In a commentary published in the journal Joule, author Alex de Vries argues that the electricity needed to power AI tools may in future exceed the demand of some small nations.
“In 2021, Google’s total electricity consumption was 18.3TWh, with AI accounting for 10%-15% of this total. The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland [29.3TWh per year], which is a significant increase compared to its historical AI-related energy consumption,” said de Vries.
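To put those numbers side by side: 10%-15% of Google’s 18.3TWh comes to roughly 1.8-2.7TWh of AI-related consumption in 2021, so the Ireland-sized worst case would be more than a tenfold jump. A short sketch of the comparison, using only the figures in the quote:

```python
# Compare Google's 2021 AI-related consumption with the worst-case scenario.
GOOGLE_TOTAL_TWH = 18.3   # Google's total 2021 electricity use
AI_SHARE = (0.10, 0.15)   # AI's share of that total, per de Vries
IRELAND_TWH = 29.3        # annual consumption of Ireland, the worst case

low, high = (GOOGLE_TOTAL_TWH * s for s in AI_SHARE)
print(f"2021 AI-related use: {low:.1f}-{high:.1f} TWh")  # 1.8-2.7 TWh
print(f"Worst case vs 2021: {IRELAND_TWH / high:.0f}x-{IRELAND_TWH / low:.0f}x")
# -> roughly 11x-16x the 2021 AI-related consumption
```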
Over time, AI will use more energy
There’s been a boom in generative AI since AI company OpenAI introduced ChatGPT to the world in late 2022, and with it a surge in demand for AI chips. NVIDIA, which is at the forefront of supplying high-end chips, reported record revenue of $13.5bn in the quarter ending July 2023 and forecast around $16bn for the quarter after – a clear sign that demand for AI chips is only rising.
There has also recently been a shift towards companies developing their own chips to meet heavy AI requirements. Google and Amazon already have their own AI chips, and rumours are rife that Microsoft will unveil its in-house chip hardware next month.
Microsoft is also heavily invested in OpenAI, which, according to reports, is in the early stages of either developing its own chips or acquiring a semiconductor company to build them.
All of this points to a significant rise in the energy footprint of the AI industry. De Vries said: “For example, companies such as Alphabet’s Google could substantially increase their power demand if generative AI is integrated into every Google search.”
Next energy consumption boogeyman
According to an estimate by the leading semiconductor and AI blog SemiAnalysis, integrating a ChatGPT-like chatbot into every Google search would require 512,820 of NVIDIA’s A100 HGX servers – more than four million GPUs in total. At a power demand of 6.5kW per server, this would translate into a daily electricity consumption of 80GWh and an annual consumption of 29.2TWh.
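As a sanity check, the figures are internally consistent – here is the arithmetic in a short Python sketch (the eight-GPUs-per-server figure is the standard HGX A100 configuration implied by the “more than four million GPUs” total; the rest are SemiAnalysis’s numbers as quoted above):

```python
# Reproduce the SemiAnalysis estimate quoted above.
SERVERS = 512_820          # ChatGPT-like chatbot in every Google search
GPUS_PER_SERVER = 8        # standard NVIDIA HGX A100 configuration
POWER_PER_SERVER_KW = 6.5  # power demand per server, as quoted

total_gpus = SERVERS * GPUS_PER_SERVER
daily_gwh = SERVERS * POWER_PER_SERVER_KW * 24 / 1e6  # kWh -> GWh
annual_twh = daily_gwh * 365 / 1e3                    # GWh -> TWh

print(f"GPUs: {total_gpus:,}")          # GPUs: 4,102,560
print(f"Daily: {daily_gwh:.0f} GWh")    # Daily: 80 GWh
print(f"Annual: {annual_twh:.1f} TWh")  # Annual: 29.2 TWh
```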
The author noted that AI tools have an initial training phase followed by an inference phase. The training phase is widely seen as the most energy-intensive and has been the focus of AI sustainability research thus far.
The inference phase is when these tools generate output based on the data they are trained on. The author has called on the scientific community to pay more attention to this phase.
“...OpenAI required 3,617 of NVIDIA’s HGX A100 servers, with a total of 28,936 GPUs, to support ChatGPT, implying an energy demand of 564MWh per day,” said de Vries. That figure covers simply running the chatbot for its users each day – on top of the energy already spent training it.
“Compared to the estimated 1,287MWh used in GPT-3’s training phase, the inference phase’s energy demand appears considerably higher,” he added.
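Taken together, those two estimates imply that ChatGPT’s daily serving load would surpass GPT-3’s entire training bill in well under a week. A short sketch of the comparison, using only the figures quoted above:

```python
# How quickly daily inference energy overtakes the one-off training cost.
TRAINING_MWH = 1_287         # estimated GPT-3 training energy, per de Vries
SERVERS = 3_617              # HGX A100 servers supporting ChatGPT
GPUS_PER_SERVER = 8          # 3,617 x 8 = 28,936 GPUs, as quoted
INFERENCE_MWH_PER_DAY = 564  # estimated daily inference energy

assert SERVERS * GPUS_PER_SERVER == 28_936  # consistent with the quote

days_to_match = TRAINING_MWH / INFERENCE_MWH_PER_DAY
print(f"Inference exceeds total training energy after {days_to_match:.1f} days")
# -> Inference exceeds total training energy after 2.3 days
```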
The author lastly noted that it is too optimistic to expect improvements in hardware and software efficiency to fully offset the long-term growth in AI-related electricity consumption. But efforts are being made.
It was recently reported that a team of MIT researchers is finding ways to reduce the power consumption of AI models. By capping the power drawn by the GPUs running a model, they were able to cut its energy consumption by 12-15%.
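The power-capping mechanism itself is straightforward to try. Below is a minimal, illustrative sketch – not the MIT team’s actual tooling – that lowers a GPU’s power limit via NVIDIA’s NVML bindings (assumes the `pynvml` package is installed and the process has administrator privileges; the 250W cap is an arbitrary example value):

```python
# Illustrative sketch: cap a GPU's power draw via NVML.
# Not the MIT researchers' actual method - just the basic mechanism.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power limits in milliwatts.
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"Current power limit: {current_mw / 1000:.0f} W")

# Cap the GPU at 250 W (example value; requires admin privileges).
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 250_000)

pynvml.nvmlShutdown()
```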
De Vries also suggested that older, now-idle GPUs formerly used to mine the cryptocurrency Ethereum could be repurposed for AI.