The increasing energy use driven by AI is beginning to affect the climate goals of major tech firms. In their latest sustainability reports, these companies acknowledge that the development of AI is a primary factor behind their rising energy consumption. One leading firm, for instance, has seen its greenhouse gas emissions increase by nearly 50% since 2019, complicating its target of achieving carbon neutrality by the end of the decade.
A report from the International Energy Agency (IEA) found that data centers consumed about 415 terawatt-hours of electricity in 2024—roughly equal to the annual demand of a large country. Projections indicate this figure will exceed 900 TWh by 2030. Data center electricity use is rising several times faster than overall global demand, largely driven by heavy investment in energy-intensive AI operations.
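As a back-of-the-envelope check, the growth rate implied by these two figures (415 TWh in 2024, just over 900 TWh by 2030) can be computed directly; the six-year horizon and the round 900 TWh endpoint are simplifying assumptions here, not the IEA's own calculation.

```python
# Implied compound annual growth rate (CAGR) from the quoted IEA figures.
# Assumes a six-year span (2024 -> 2030) and a 900 TWh endpoint.
start_twh = 415   # 2024 data-center consumption
end_twh = 900     # projected 2030 consumption (lower bound)
years = 6

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 14% per year
```

A sustained rate in that range is indeed several times faster than the low single-digit annual growth typical of overall electricity demand.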
However, it's still unclear what portion of this electricity is specifically used for AI. Data centers host a wide range of services, many unrelated to the heavy computational needs of artificial intelligence. Additionally, most tech companies keep detailed information about their software and hardware energy use confidential.

One recent study approached this issue by examining the AI hardware supply chain, focusing on the production of high-performance components required for AI processing. Drawing on publicly available manufacturing estimates, investor transcripts, and technical specifications, the study estimated that AI could consume up to 82 terawatt-hours of electricity in 2025—comparable to the annual consumption of a medium-sized European country. If chip production capacity doubles this year as analysts predict, energy demand from AI could reach nearly half of all data center usage.
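The supply-chain approach amounts to multiplying a fleet size by per-unit power, utilization, and data-center overhead. The sketch below illustrates that arithmetic; every parameter value is a hypothetical placeholder chosen for illustration, not an input from the study itself.

```python
# Illustrative sketch of a supply-chain-style energy estimate.
# All parameter values below are hypothetical, not the study's inputs.

HOURS_PER_YEAR = 8760

def estimate_ai_twh(accelerators, watts_per_system, utilization, pue):
    """Rough annual electricity use, in TWh, for a fleet of AI systems.

    accelerators     -- number of AI accelerator systems in operation
    watts_per_system -- average power draw per system at load (W)
    utilization      -- fraction of the year spent at that load
    pue              -- data-center overhead (power usage effectiveness)
    """
    watt_hours = (accelerators * watts_per_system * utilization
                  * pue * HOURS_PER_YEAR)
    return watt_hours / 1e12  # Wh -> TWh

# Hypothetical fleet: 4 million systems at 1,500 W, 75% utilization, PUE 1.3
twh = estimate_ai_twh(4_000_000, 1500, 0.75, 1.3)
print(f"{twh:.1f} TWh")                               # -> 51.2 TWh
print(f"{twh / 415:.0%} of 2024 data-center demand")  # vs. the IEA total
```

The real uncertainty lies in exactly these inputs: fleet size, typical utilization, and overhead are the figures companies do not disclose, which is why such estimates carry wide error bars.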
Despite the use of publicly available data, many unknowns remain—such as actual utilization rates of AI hardware, efficiency levels, and future trends in the industry. The lack of transparency around AI's energy footprint makes precise analysis difficult and has led to calls for greater openness from tech companies. Although some earlier corporate reports included specific figures for machine learning electricity consumption, more recent reports have omitted them, hindering further research.
Without access to real-world operational data from computing systems, researchers face significant challenges in accurately assessing AI's environmental impact. If companies were more forthcoming with technical details, estimates of AI’s energy demand would be far more reliable.