The Environmental Cost of Artificial Intelligence (AI)
Mohamad Faiz Bin Zainuddin
INTRODUCTION
The launch of ChatGPT in November 2022 sparked a public frenzy around generative artificial intelligence (AI). In just five days, ChatGPT logged one million new users. Initially, ChatGPT was merely a conversational AI; today it can do far more than converse with users, generating images, poems, short essays, and articles.
The term artificial intelligence was coined by computer scientist John McCarthy in 1956, building on theoretical foundations laid by mathematician Alan Turing. Early efforts in the field focused on symbolic reasoning and rule-based systems. Progress stalled in the 1970s and 1980s, hampered by limited computing power, before interest revived in the 1990s and 2000s. The 2010s then witnessed breakthroughs in deep learning, driven by big data and GPUs, which produced significant advances in image recognition, natural language processing, and autonomous systems.
Before the launch of ChatGPT, AI technology was largely inaccessible to the public. With ChatGPT's popularity, however, major tech companies such as Google, Microsoft, and Meta raced to release or improve their own generative AI tools. Startups and investors flooded the space, fueling a surge in generative AI products. It is now safe to say that AI has become central to discussions in education, business, law, art, and software development.
ENERGY CONSUMPTION AND ENVIRONMENTAL IMPACT
At the heart of AI are machine learning models, especially deep learning models, which require vast computational resources. Training these models can demand thousands of powerful GPUs running continuously for days or even weeks. According to an article published by the MIT Technology Review, “training several common large AI models, specifically the natural-language processing (NLP), can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).”
An article published in The Verge stated that “training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.”
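The comparisons in that quote can be checked with simple arithmetic. The sketch below uses only the figures cited by The Verge (the 1,300 MWh training estimate and the 0.8 kWh-per-hour Netflix estimate); it is an illustrative sanity check, not an independent measurement.

```python
# Sanity-check the figures quoted from The Verge.
GPT3_TRAINING_MWH = 1300      # estimated electricity to train GPT-3
US_HOME_ANNUAL_MWH = 10       # implied by the quote: 1,300 MWh / 130 homes
NETFLIX_HOUR_KWH = 0.8        # estimated electricity per streamed hour

homes_powered_for_a_year = GPT3_TRAINING_MWH / US_HOME_ANNUAL_MWH
netflix_hours = GPT3_TRAINING_MWH * 1000 / NETFLIX_HOUR_KWH  # convert MWh -> kWh

print(f"{homes_powered_for_a_year:.0f} US homes powered for a year")  # 130
print(f"{netflix_hours:,.0f} hours of Netflix streaming")             # 1,625,000
```

Both derived numbers match the article's claims, so the quote is internally consistent.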
One major contributor to AI's environmental footprint is data center energy consumption. Data centers that host AI workloads require electricity not only for computation but also for cooling systems to prevent overheating. The International Energy Agency (IEA) projected that “electricity demand from data centres worldwide is set to more than double by 2030 to around 945 terawatt-hours (TWh), slightly more than the entire electricity consumption of Japan today. AI will be the most significant driver of this increase, with electricity demand from AI-optimised data centres projected to more than quadruple by 2030.”
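The IEA projection also implies a rough baseline for today's demand. Since 945 TWh by 2030 represents "more than double" current consumption, current worldwide data-center demand must be under about 473 TWh per year. This back-of-envelope bound, derived only from the quoted figures, can be sketched as:

```python
# Back-of-envelope bound implied by the IEA projection quoted above.
PROJECTED_2030_TWH = 945   # IEA: projected worldwide data-centre demand in 2030
MIN_GROWTH_FACTOR = 2      # "more than double" by 2030

# If 2030 demand is more than double today's, today's demand is below this bound.
implied_current_twh = PROJECTED_2030_TWH / MIN_GROWTH_FACTOR

print(f"Implied current demand: under ~{implied_current_twh:.0f} TWh/year")
```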
The environmental impact also extends to the hardware lifecycle. Manufacturing GPUs and other specialized chips used in AI systems requires mining rare earth elements, a process associated with habitat destruction, pollution, and significant carbon emissions. Additionally, rapid obsolescence in the tech industry leads to increased electronic waste, much of which ends up in landfills or is improperly recycled.
INDUSTRY’S RESPONSE
According to an article published in Arbor, leading AI players are actively working to reduce the environmental impact of their technologies. OpenAI, for example, collaborates with Microsoft to utilize Azure’s carbon-neutral infrastructure, aiming to lower emissions from large-scale AI operations. “Azure plans to become carbon-negative by 2030, using renewable energy and carbon offset initiatives to reduce its environmental footprint.” Meanwhile, Amazon Web Services (AWS) is “targeting 100% renewable energy usage by 2025 and provides tools to help clients track and manage their carbon emissions.” Google Cloud “powers its operations with sustainable data centers and matches all energy consumption with renewable energy purchases.”
REFERENCES
Hao, K. (2019) “Training a single AI model can emit as much carbon as five cars in their lifetimes.” Source: https://www.technologyreview.com/2019/06/06/239031/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/ (accessed on 11th June 2025).
IEA (2025) “AI is set to drive surging electricity demand from data centres while offering the potential to transform how the energy sector works.” Source: https://www.iea.org/news/ai-is-set-to-drive-surging-electricity-demand-from-data-centres-while-offering-the-potential-to-transform-how-the-energy-sector-works (accessed on 11th June 2025).
Vereb, M. (2025) “AI’s Environmental Impact: Calculated and Explained.” Source: https://www.arbor.eco/blog/ai-environmental-impact (accessed on 11th June 2025).
Vincent, J. (2024) “How much electricity does AI consume?” Source: https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption (accessed on 11th June 2025).