Energy Consumption of Artificial Intelligence

Artificial intelligence has the potential to bring positive change across many facets of society. It can create new economic opportunities and help resolve pressing problems at both the micro and macro levels. However, with the rapid pace at which the field is developing and the expansion of its real-world applications, concerns over its environmental impact have emerged. Developing and deploying artificial intelligence systems remain energy-intensive pursuits that contribute to carbon emissions or result in misplaced or displaced energy consumption.

How Developing and Deploying Artificial Intelligence Systems Are Increasing Global Energy Consumption

General Overview

The energy consumption profile of artificial intelligence resembles that of most data centers, but its expanding range of applications has pushed its energy requirements toward those of power-hungry blockchain systems and cryptocurrency mining operations based on proof-of-work algorithms.

Developing and deploying AI systems involve computation. Training and running specific AI models impose high computational requirements, which translate into banks of processors housed in data centers. Keeping these data centers running at optimal performance also requires efficient cooling systems.
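One way to make the cooling overhead concrete is the power usage effectiveness (PUE) ratio, the standard data center metric that divides total facility energy by the energy delivered to the computing equipment. The following is a minimal Python sketch of that relationship; the input figures are illustrative assumptions, not measurements of any real facility.

```python
# Illustrative sketch: how cooling and other overhead inflate total
# facility energy, using the standard PUE (power usage effectiveness)
# ratio. All figures below are assumed for illustration only.

it_energy_mwh = 1_000.0   # assumed energy drawn by servers and storage
pue = 1.5                 # assumed PUE; 1.0 would mean zero overhead

total_facility_energy_mwh = it_energy_mwh * pue
overhead_mwh = total_facility_energy_mwh - it_energy_mwh  # cooling, power delivery, etc.

print(f"Total facility energy: {total_facility_energy_mwh:.0f} MWh")
print(f"Overhead (cooling and power delivery): {overhead_mwh:.0f} MWh")
```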

Moreover, the continuous development of experimental and production models for various end-use applications, together with the widespread adoption of specific applications across industries and markets, has driven up the total energy consumption of artificial intelligence.

Understanding the energy consumption of artificial intelligence requires understanding how exactly the field consumes energy. Most of the energy requirement comes from training AI models, but building supercomputers, collecting and storing data, and running data centers also consume substantial energy. Below are the details:

• Consumption During Development: Developing AI systems consumes computational resources. Training a specific machine learning or deep learning model involves collecting and storing huge amounts of data on storage media and processing these data with processors. Both the storage media and the processors consume electricity. The data centers that house these hardware components need cooling systems to keep them operating at suitable temperatures, and these cooling systems need uninterrupted access to electricity. A rough back-of-the-envelope estimate of training energy appears after this list.

• Consumption During Deployment: Deploying AI systems involves on-demand access to computational resources to make real-time predictions or decisions. This is the inference phase. AI models still depend on storage media and processors for retrieving stored data, processing input data, and generating output data. The inference phase might not be as energy-intensive as the training phase, but deployed models need uninterrupted access to electricity to remain operational and available to end users.
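To make the development-phase consumption concrete, here is a rough back-of-the-envelope sketch of training energy and emissions. It simply multiplies accelerator count, average per-device power draw, and training time, then applies a PUE overhead factor and a grid carbon intensity. Every input is an assumed, illustrative value; real training runs vary widely.

```python
# Back-of-the-envelope estimate of training energy and emissions.
# All inputs are illustrative assumptions, not measured values.

num_accelerators = 1_000        # assumed number of GPUs/TPUs in the cluster
avg_power_kw = 0.3              # assumed average draw per accelerator (300 W)
training_hours = 30 * 24        # assumed 30-day training run
pue = 1.2                       # assumed data center overhead factor
grid_kg_co2_per_kwh = 0.4       # assumed grid carbon intensity

it_energy_kwh = num_accelerators * avg_power_kw * training_hours
facility_energy_kwh = it_energy_kwh * pue
emissions_tonnes = facility_energy_kwh * grid_kg_co2_per_kwh / 1_000

print(f"IT energy:        {it_energy_kwh / 1_000:,.0f} MWh")
print(f"Facility energy:  {facility_energy_kwh / 1_000:,.0f} MWh")
print(f"Emissions:        {emissions_tonnes:,.0f} tonnes CO2e")
```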

Specific Examples

The European Commission of the European Union has noted that artificial intelligence can help address the ongoing climate emergency, but it also stresses that the field has an energy problem of its own. Pushing further advances in the field and promoting its widespread adoption mean increasing its energy requirements and carbon footprint.

Several reports have attempted to quantify the energy consumption of artificial intelligence. The same report from the European Commission explained that self-driving cars require up to 20 percent more energy than regular cars, while the recommendation algorithms behind the content delivery features of social networking sites and streaming services require additional energy inputs.

Researchers from Google concluded that research and development activities related to AI and the utilization of AI applications as part of the company's core technological capabilities accounted for 10 to 15 percent of its total electricity consumption in 2021. That total was about 18.3 terawatt-hours.
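Taken together, those two figures imply the following range for AI-related consumption (a simple arithmetic check, under the assumption that 18.3 terawatt-hours was the company-wide total):

```python
# Arithmetic check: AI's share of an 18.3 TWh company-wide total,
# assuming the 10-15 percent range reported for 2021.
total_twh = 18.3
low, high = 0.10, 0.15
print(f"AI share: {total_twh * low:.1f} to {total_twh * high:.1f} TWh")
# AI share: 1.8 to 2.7 TWh
```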

Another report from Stanford University assessed the carbon footprint of four large language models: Gopher from DeepMind, OPT from Meta Platforms, BLOOM from the BigScience project, and GPT-3 from OpenAI. GPT-3 had the biggest footprint. Training this model released an estimated 502 metric tons of carbon dioxide equivalent due to its energy consumption.

Take note that the training of GPT-3 specifically consumed about 1,287 megawatt-hours of electricity. This was roughly equivalent to the annual electricity consumption of 120 average households in the United States. GPT-3 is part of the GPT family of transformer-based large language and foundation models that power the popular ChatGPT chatbot.
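The household comparison follows from straightforward division, assuming an average household in the United States uses roughly 10.6 megawatt-hours of electricity per year (an approximation of the figure reported by the U.S. Energy Information Administration):

```python
# Arithmetic check: GPT-3 training energy expressed in household-years.
# Assumes ~10.6 MWh/year per average U.S. household (approximate EIA figure).
gpt3_training_mwh = 1_287
household_mwh_per_year = 10.6
print(f"Equivalent households: {gpt3_training_mwh / household_mwh_per_year:.0f}")
# Equivalent households: 121
```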

Karen Hao of the MIT Technology Review explained that training a single AI model can emit carbon equivalent to about five times the lifetime emissions of an average American car, including the manufacture of the car itself. The collective energy consumption of the field has even been compared to the fuel consumption of the aviation industry.
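The five-cars comparison can be checked against the figures in the Strubell et al. study that Hao's article draws on, which reported roughly 626,155 pounds of carbon dioxide equivalent for training a large transformer with neural architecture search, against about 126,000 pounds for the lifetime of an average car, including its fuel and manufacture:

```python
# Quick check of the "five cars" comparison, using the figures from
# Strubell et al. (2019), the study behind Hao's article. Values are
# as reported there: pounds of CO2-equivalent.
model_training_lbs_co2e = 626_155   # transformer + neural architecture search
car_lifetime_lbs_co2e = 126_000     # average car incl. fuel and manufacture
print(f"Ratio: {model_training_lbs_co2e / car_lifetime_lbs_co2e:.1f}x")
# Ratio: 5.0x
```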

FURTHER READINGS AND REFERENCES

  • Ekin, A. 2019. “AI Can Help Us Fight Climate Change. But It Has an Energy Problem, Too.” Horizon: The EU Research and Innovation Magazine. European Commission, European Union. Available online
  • Hao, K. 2019. “Training a Single AI Model Can Emit as Much Carbon as Five Cars in Their Lifetimes.” MIT Technology Review. Available online
  • Stanford University Human-Centered Artificial Intelligence. 2023. Measuring Trends in Artificial Intelligence: The AI Index Report. Stanford University. Available online