GenAI’s Giant Energy Appetite
The training and operation of generative AI models are consuming ever-greater quantities of the world's energy. What's the scale of the problem, and what are the potential solutions?
As GenAI models scale up, their energy requirements have sparked growing alarm among observers. Running complex algorithms, processing vast amounts of data, and training ever-larger models all demand enormous quantities of electricity.
However, not every AI task is created equal.
MIT Technology Review reports, “...creating images is thousands of times more energy-intensive than generating text.” Additionally, a smaller model with a narrower focus uses fewer resources than an all-purpose generative AI model. Still, the perceived severity of the larger problem depends on whom you ask. How serious is it, and what are the possible solutions?
Transparency in AI energy consumption: how much do we know?
Transparency around the energy usage of AI firms is crucial for addressing the environmental impact of GenAI models. However, exact numbers are hard to come by.
The Verge reports, “Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.”
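The arithmetic behind that comparison is easy to verify. Here is a quick back-of-the-envelope check in Python, using only the figures from the quote above (the per-home figure of roughly 10,000 kWh per year is the one implied by the quote itself, not an independent measurement):

```python
# Back-of-the-envelope check of The Verge's GPT-3 comparison.
gpt3_training_mwh = 1_300        # estimated energy to train GPT-3 (MWh)
netflix_hour_kwh = 0.8           # energy to stream one hour of Netflix (kWh)

gpt3_training_kwh = gpt3_training_mwh * 1_000          # 1 MWh = 1,000 kWh
print(f"{gpt3_training_kwh / netflix_hour_kwh:,.0f} hours of streaming")
# -> 1,625,000 hours

us_home_kwh_per_year = 10_000    # implied annual consumption per US home
print(f"{gpt3_training_kwh / us_home_kwh_per_year:.0f} homes for a year")
# -> 130 homes
```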
As these models grow and proliferate, it's unclear which way the trend is heading. Energy consumption may have climbed as the tasks these models perform become more complex and adoption increases. “On the other hand, companies might be using some of the proven methods to make these systems more energy efficient — which would dampen the upward trend of energy costs,” says The Verge. Knowing for sure is nearly impossible, as the companies creating these tools do not share their energy bills with the public.
As the companies behind large language models have become more profitable, they have also become more secretive. Just a few years ago, OpenAI published details such as what hardware it used and for how long.
These days, none other than Bill Gates is downplaying the importance of this issue. The Financial Times reports, “Speaking in London, Gates urged environmentalists and governments to ‘not go overboard’ on concerns about the huge amounts of power required to run new generative AI systems, as Big Tech companies such as Microsoft race to invest tens of billions of dollars in vast new data centres.”
He acknowledged that AI's consumption would increase global electricity usage by 2-6% but asserted that the technology could ultimately offset its own usage. Not everyone is as laissez-faire about the situation.
Projected growth in AI energy consumption and its implications
At the 2024 World Economic Forum meeting in Davos, OpenAI’s Sam Altman admitted what researchers have known for years: AI’s energy consumption is an issue. The scientific journal Nature reports, “Altman warned that the next wave of generative AI systems will consume vastly more power than expected, and that energy systems will struggle to cope. ‘There’s no way to get there without a breakthrough,’ he said.”
This echoes what experts have been saying. According to Nature, “It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search.” Meanwhile, a report from the International Energy Agency (IEA) pointed to “data centres, artificial intelligence (AI) and the cryptocurrency sector” as a significant driver of growth in energy consumption in the coming years, saying data centres’ electricity use could double by 2026. That is significant: “After globally consuming an estimated 460 terawatt-hours (TWh) in 2022, data centres’ total electricity consumption could reach more than 1 000 TWh in 2026. This demand is roughly equivalent to the electricity consumption of Japan.”
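Those IEA figures imply a steep rate of growth. A small sketch of the implied compound annual growth rate, using only the numbers quoted above:

```python
# Compound annual growth rate implied by the IEA projection.
twh_2022 = 460      # estimated global data-centre consumption in 2022 (TWh)
twh_2026 = 1_000    # projected consumption in 2026 (TWh)
years = 2026 - 2022

cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")   # roughly 21% per year
```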
Meeting these kinds of energy demands at a crucial time in the fight against climate change generally means looking for viable alternatives to fossil fuels. Nature asks, “So what energy breakthrough is Altman banking on? Not the design and deployment of more sustainable AI systems — but nuclear fusion. He has skin in that game, too: in 2021, Altman started investing in fusion company Helion Energy in Everett, Washington.”
That may be wishful thinking, however, as experts tend to agree that nuclear fusion will not contribute significantly to decarbonization in the near term. So, if new energy sources cannot solve the problem of AI’s ravenous energy consumption, what can?
Strategies for reducing the energy consumption of GenAI models
It’s possible that the industry could simply decide to do what’s right for the planet. “As the BigScience project in France demonstrated with its BLOOM model,” reports Nature, “it is possible to build a model of a similar size to OpenAI’s GPT-3 with a much lower carbon footprint.” However, leaving an industry to regulate itself rarely leads to companies doing the sensible thing.
The IEA advocates for intervention: “Updated regulations and technological improvements, including on efficiency, will be crucial to moderate the surge in energy consumption from data centres.”
MIT News points out the lack of transparency in the industry: “When searching for flights on Google, you may have noticed that each flight's carbon-emission estimate is now presented next to its cost… A similar kind of transparency doesn't yet exist for the computing industry, despite its carbon emissions exceeding those of the entire airline industry.”
However, MIT also points out that researchers are working on tools to rein in AI’s runaway energy consumption. MIT’s Lincoln Laboratory Supercomputing Center (LLSC) is developing techniques such as capping the power that hardware can draw and stopping AI training runs early when further training promises little gain. “Crucially, they have found that these techniques have a minimal impact on model performance,” MIT reports.
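MIT has not published the LLSC tooling as a public library, so the following is only a generic sketch of the early-stopping idea under simple assumptions; the `EarlyStopper` class, its patience setting, and the loss values are all illustrative:

```python
# Minimal early-stopping sketch: halt training once validation loss stops
# improving, saving the energy the remaining epochs would otherwise burn.
class EarlyStopper:
    def __init__(self, patience: int = 3, min_delta: float = 0.0):
        self.patience = patience      # epochs to tolerate without improvement
        self.min_delta = min_delta    # minimum drop that counts as progress
        self.best_loss = float("inf")
        self.stale_epochs = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.stale_epochs = 0
        else:
            self.stale_epochs += 1
        return self.stale_epochs >= self.patience

stopper = EarlyStopper(patience=3)
for epoch, val_loss in enumerate([0.9, 0.7, 0.6, 0.61, 0.60, 0.62]):
    if stopper.should_stop(val_loss):
        print(f"Stopping at epoch {epoch}: no improvement for 3 epochs")
        break
```

Power capping, the other technique mentioned, is usually applied at the hardware level; NVIDIA GPUs, for instance, accept a power limit via `nvidia-smi -pl <watts>`.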
Harvard Magazine presents a different approach. Sara Hooker, head of the nonprofit research group Cohere For AI, suggests that smaller, smarter models may be the answer. “Some benefits of size can also be achieved through other techniques, researchers say, such as efficient parameterizations (activating only the relevant parameters for a given input) and meta-learning (teaching models to learn independently),” reports Harvard Magazine. “Instead of pouring immense resources into the pursuit of ever-bigger models, AI companies could contribute to research developing these efficiency methods.”
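To make “activating only the relevant parameters” concrete, here is a toy sketch of one common form of that idea: a mixture-of-experts-style router that sends each input through a single small expert network. The layer sizes and the top-1 routing rule are illustrative assumptions, not a description of any particular company's model:

```python
import torch
import torch.nn as nn

# Toy conditional computation: a router picks one small "expert" per input,
# so only a fraction of the model's parameters do work on any given example.
class TinyMoE(nn.Module):
    def __init__(self, dim: int = 16, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)   # scores each expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        expert_ids = self.router(x).argmax(dim=-1)   # top-1 expert per input
        out = torch.empty_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_ids == i
            if mask.any():
                out[mask] = expert(x[mask])          # compute only this subset
        return out

model = TinyMoE()
print(model(torch.randn(8, 16)).shape)   # torch.Size([8, 16])
```

Here only one expert in four runs for each input, so roughly a quarter of the expert parameters are exercised per example; production systems apply the same principle at far larger scale.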
In addition to the suggestions above, Harvard Business Review suggests practical ways to cut down on AI’s escalating energy needs for both creators and users:
- Use and fine-tune existing large generative models instead of training new ones (see the sketch after this list).
- Use computational methods that are more energy-efficient.
- Use a large model only when it offers distinct value.
- Be thoughtful about when you use generative AI.
- Investigate the energy sources used by cloud providers and data centers.
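On the first point, reusing and fine-tuning an existing model can be as simple as freezing a pretrained network and training only a small new head. Below is a minimal PyTorch sketch, assuming torchvision is installed; the choice of ResNet-18 and a 10-class task are illustrative:

```python
import torch.nn as nn
from torchvision import models

# Reuse a pretrained network instead of training one from scratch:
# freeze every existing parameter, then train only a small new head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False              # frozen: no gradient updates

model.fc = nn.Linear(model.fc.in_features, 10)   # new task-specific head

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Training {trainable:,} of {total:,} parameters")
```

Only the new head's few thousand parameters receive gradient updates, which makes the training run far cheaper in compute, and therefore energy, than building a comparable model from scratch.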
The tools for reducing the energy footprint of GenAI models already exist. Employing these tools is essential for sustainable AI development, and the main question is whether the will to do so exists.
The role of legislation in managing AI energy use
Legislation could play a critical role in managing and reducing the energy consumption of GenAI models. With tools and strategies already available, it may take political action to compel companies to employ them.
In February, U.S. lawmakers introduced the Artificial Intelligence Environmental Impacts Act of 2024. The bill calls on the National Institute of Standards and Technology (NIST) to begin addressing the issue by collaborating with academics, industry stakeholders, and society at large to establish standards for assessing AI’s environmental impact. It would also create a voluntary reporting framework for AI developers and operators. The fate of that bill could be a harbinger of what’s to come.