
AI Consumes Electricity and Generates CO2 Emissions

Artificial Intelligence at Munich University of Applied Sciences: Latest Research Reveals That Greater AI Capability Correlates with a Larger Ecological Footprint.

A groundbreaking study, titled "Energy costs of communicating with AI," has been published in the prestigious journal Frontiers in Communication. The study, led by researchers at Munich University of Applied Sciences in Germany, sheds light on the environmental impact of large language models (LLMs) and offers insights into optimising their energy consumption and response quality.

The research (DOI: 10.3389/fcomm.2025.1572947) evaluates various open-source AI models, including models comparable to ChatGPT or Claude. Key findings reveal a correlation between energy consumption, response quality, and the number of model parameters.

Advanced models, such as DeepSeek, require more computational power due to their size and greater number of parameters. This increased energy consumption leads to higher CO2 emissions. For instance, DeepSeek was found to emit approximately 2,000 grams of CO2 to answer 1,000 test questions, while less advanced models like Qwen emitted only around 27 grams, albeit with lower accuracy.
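For a rough sense of scale, those totals can be broken down to per-question figures. The short Python sketch below simply reuses the numbers quoted above; the labels and the calculation are illustrative and are not code from the study.

    # Rough per-question CO2 estimates derived from the figures quoted above
    # (CO2 equivalents for answering 1,000 test questions). Illustrative only.
    emissions_g_per_1000_questions = {
        "DeepSeek (large reasoning model)": 2000.0,  # ~2,000 g CO2
        "Qwen (smaller model)": 27.0,                # ~27 g CO2
    }

    for model, total_g in emissions_g_per_1000_questions.items():
        print(f"{model}: ~{total_g / 1000:.3f} g CO2 per question")

    # Ratio between the two models implied by these figures (about 74x).
    ratio = (emissions_g_per_1000_questions["DeepSeek (large reasoning model)"]
             / emissions_g_per_1000_questions["Qwen (smaller model)"])
    print(f"The larger model emits roughly {ratio:.0f} times more CO2 per answer.")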

Smaller models, however, often produce less reliable responses than larger ones. Yet, for simple tasks, they can still achieve comparable results while significantly reducing environmental impact. For example, using a smaller model for straightforward questions can cut the environmental impact to roughly a quarter without a significant loss in accuracy.

The study also highlights the trade-off between energy consumption, response quality, and model size. Larger models with more parameters generally produce more accurate responses, but at the cost of higher energy consumption and CO2 emissions. To minimise environmental impact without compromising performance, the research team recommends using high-performance AI models only for complex tasks.
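In practice, that recommendation amounts to a simple routing policy: send easy questions to a small, efficient model and reserve the large model for genuinely complex tasks. The sketch below is a minimal illustration of this idea; the complexity heuristic and the model names are assumptions made for the example and do not come from the study.

    # Minimal sketch of task-based model routing: use a small model for simple
    # questions and reserve the large, energy-hungry model for complex ones.
    # The heuristic and the model names are illustrative assumptions only.

    SMALL_MODEL = "small-efficient-model"   # hypothetical placeholder
    LARGE_MODEL = "large-reasoning-model"   # hypothetical placeholder

    def looks_complex(prompt: str) -> bool:
        """Crude complexity heuristic (an assumption, not from the study)."""
        keywords = ("prove", "derive", "step by step", "analyse", "compare")
        return len(prompt.split()) > 60 or any(k in prompt.lower() for k in keywords)

    def choose_model(prompt: str) -> str:
        """Route the prompt to the cheaper model unless it looks complex."""
        return LARGE_MODEL if looks_complex(prompt) else SMALL_MODEL

    print(choose_model("What is the capital of Bavaria?"))                # small model
    print(choose_model("Derive the gradient of the loss step by step."))  # large model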

The study's authors, Prof. Dr. Gudrun Socher and Maximilian Dauner from the Munich Center for Digital Sciences and AI (MUC.DAI) at Munich University of Applied Sciences (HM), call for increased transparency about the energy consumption of AI systems during development and application.

Short and precise user inputs can also help reduce AI-related emissions. The study found that no model reached 80% accuracy without emitting more than 500 grams of CO2 equivalents per 500 responses. It also showed that stronger AI performance in "reasoning" (logical inference) comes with higher energy consumption.
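To make that trade-off concrete, the following sketch shows one way to check, for a set of measured results, which models meet both an accuracy target and an emissions budget. The helper function is hypothetical; only the thresholds mirror the figures quoted above.

    # Hypothetical helper: given per-model accuracy and CO2-equivalent emissions
    # (per batch of responses), list the models that meet both an accuracy target
    # and an emissions budget. Thresholds mirror the figures quoted above.
    def models_within_budget(results, min_accuracy=0.80, max_emissions_g=500.0):
        """results maps model name -> (accuracy in [0, 1], emissions in grams)."""
        return [
            name
            for name, (accuracy, emissions_g) in results.items()
            if accuracy >= min_accuracy and emissions_g <= max_emissions_g
        ]

    # According to the study, this list stays empty for the evaluated models:
    # none reached 80% accuracy within that emissions budget.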

For further information, contact Ralf Kastner by phone at 089 1265-1922 or by email. Interviews with Maximilian Dauner or Prof. Dr. Gudrun Socher can be arranged upon request.

  1. The study on the energy costs of communicating with AI examines the environmental impact of advanced models such as DeepSeek, whose high energy consumption leads to substantial CO2 emissions.
  2. While smaller AI models may produce less reliable responses, they significantly reduce environmental impact and can still achieve comparable results for simple tasks.
  3. As the study conducted at Munich University of Applied Sciences underlines, advances in artificial intelligence call for increased transparency about energy consumption during development and application.
