
Ecological Impact of AI’s Life Cycle: Is the Solution to the Climate Crisis Becoming Part of the Problem?

  • Jan 21
  • 7 min read

The conversation around Artificial Intelligence (AI) and sustainability is fragmented. While AI has the potential to advance sustainability goals, we often ignore the massive, hidden ecological cost of its entire life cycle.

Behind every search query, every ChatGPT prompt, every Netflix binge and every scroll through social media lies a massive and surprisingly thirsty mechanism: the global data centre network. 


These data centres consume huge amounts of electricity and water to keep machines running and cool. The ecological costs are increasing every day: data centres now account for a growing share of global carbon emissions, and their reliance on energy-intensive cooling systems strains local water supplies, often in drought-prone regions.


That being said, AI is also a powerful tool for good. It is often regarded as a key to solving the climate crisis. AI applications are being used to optimise energy consumption, reduce emissions and promote efficient resource management. For instance, AI-driven systems are employed to enhance energy efficiency in buildings, streamline transportation logistics to minimise fuel use and improve waste management through predictive analytics and automated sorting. It is also being used for tracking deforestation and assessing biodiversity loss, enabling more informed decision-making for conservation initiatives.


AI is a double-edged sword. It works both as a means of achieving sustainability and as a technology whose own footprint continuously challenges our ability to achieve a sustainable future.


This is why we need to work towards Sustainable AI. The idea is not just about using AI to aid more sustainable futures, but about ensuring the entire life cycle of AI products is compatible with a sustainable future.


This movement looks beyond the applications of AI to address its whole sociotechnical system: how AI is built, trained, deployed and eventually discarded, and at what cost in energy, water and materials.



Understanding AI’s Environmental Impact and How Viral Trends Add to the Cost


Let us look at some figures to understand how the entire AI life cycle generates environmental costs.


The environmental footprint of AI extends across its full life cycle and can broadly be categorised into embodied emissions and operational emissions.


Embodied emissions stem from the production of hardware such as GPUs and semiconductors, as well as the construction of energy-intensive data centres, all of which rely on resource-heavy processes like mining raw materials, fabricating complex circuits, and large-scale construction. 


Operational emissions arise from the electricity consumed during both training and inference phases of AI models. Recent evidence shows that inference is now the dominant source of emissions.


Google reported in 2022 that 60% of its machine-learning-related energy use was attributable to inference alone. This impact is further magnified by the surging demand for computing power (measured in FLOPS), which has increased 150-fold since 2004, with global compute usage expanding by 550% in just the last decade.


OpenAI CEO Sam Altman has warned that AI in the future will consume vastly more power than anticipated, making a major energy breakthrough an absolute necessity. 


But How Much Power Are We Talking About?


Industry commentary suggests a single data centre can consume as much electricity as 50,000 homes in a year, though exact numbers vary and public data is limited. In Ireland, data centres already account for over 20% of the country’s total electricity consumption. The process is so energy-intensive that training GPT-3 alone produced the equivalent of around 500 tons of carbon dioxide. 
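
As a rough sanity check on the "50,000 homes" comparison, here is a back-of-envelope sketch in Python. The 10,000 kWh per year household figure is our assumption, not a number from the article:

# Convert "as much electricity as 50,000 homes in a year" into an
# implied continuous power draw for one data centre.
HOME_KWH_PER_YEAR = 10_000      # assumed average household consumption
HOMES = 50_000
HOURS_PER_YEAR = 365 * 24

avg_draw_kw = HOME_KWH_PER_YEAR / HOURS_PER_YEAR      # ~1.14 kW per home
datacentre_mw = HOMES * avg_draw_kw / 1_000
print(f"Implied continuous draw: ~{datacentre_mw:.0f} MW")   # ~57 MW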


Different AI applications vary in intensity: text processing is relatively less energy-demanding, while tasks like image generation require more. Creating one AI-generated image with a powerful model uses as much energy as fully charging a smartphone, whereas running 1,000 text prompts uses about 16% of a smartphone charge. Social media trends like the Studio Ghibli-inspired AI images created with ChatGPT's 4o image generation, which rose to popularity a few months ago, consume disproportionately more energy. A similar trend can be expected from Google Gemini's Nano Banana image feature, which has seen a spike in adoption for generating increasingly natural-looking images.
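
The ratio implied by these figures is striking. A quick sketch, assuming a typical smartphone battery of about 19 Wh (an assumed value; note that the battery size cancels out of the final ratio):

# Compare the energy implied by "one image ~ one full phone charge"
# against "1,000 text prompts ~ 16% of a phone charge".
PHONE_CHARGE_WH = 19.0                             # assumed battery capacity

image_wh = 1.0 * PHONE_CHARGE_WH                   # one image ~ one full charge
text_prompt_wh = 0.16 * PHONE_CHARGE_WH / 1_000    # per text prompt

print(f"One image ~ {image_wh / text_prompt_wh:,.0f} text prompts")   # ~6,250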


OpenAI CEO Tweeted How ChatGPT 4o Image Creation is Melting GPUs

Water Consumption


The environmental footprint of our digital ecosystem extends far beyond electricity consumption. Data centres also consume vast quantities of water for cooling, particularly in hotter regions. Studies estimate that every 5-50 prompts to ChatGPT require roughly 500 ml of water for cooling, and ChatGPT reportedly receives an average of 2.5 billion requests per day. OpenAI has not published precise per-prompt or per-day water figures, but at 0.5 litres per 5-50 prompts, this works out to somewhere between roughly 25 and 250 million litres of water per day, with around 50 million litres as a mid-range estimate.
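
Because the per-prompt figure is a range, the daily total is better read as a range. A minimal sketch of the arithmetic:

# Daily cooling-water estimate from "0.5 L per 5-50 prompts"
# at 2.5 billion prompts per day.
PROMPTS_PER_DAY = 2.5e9
LITRES_PER_BATCH = 0.5

for prompts_per_batch in (5, 25, 50):
    litres = PROMPTS_PER_DAY / prompts_per_batch * LITRES_PER_BATCH
    print(f"{prompts_per_batch:>2} prompts per 0.5 L -> "
          f"{litres / 1e6:,.0f} million litres/day")
# 5 prompts per 0.5 L  -> 250 million litres/day
# 25 prompts per 0.5 L ->  50 million litres/day
# 50 prompts per 0.5 L ->  25 million litres/day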


Data Centres Need a Vast Amount of Water for Cooling


For instance, Google’s facility in The Dalles, Oregon, reportedly used 355 million gallons of water in a single year, accounting for nearly 29% of the city’s overall water use. It has also been reported that TSMC’s new Fab 25 plant near Taichung in Taiwan is expected to use 100,000 metric tons of water a day, equivalent to about 7% of the municipal water demand of Taichung’s 2.8 million residents. The plant will also require at least 1 gigawatt of power, roughly the demand of 750,000 urban households. TSMC claims 90% of its wastewater will be reused.


However, concerns remain about what the leftover wastewater contains, as well as about the transparency of these operations.
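
For what it is worth, the published Fab 25 figures are roughly self-consistent, provided "municipal demand" includes commercial and industrial use alongside households. A quick cross-check:

# Cross-check: 100,000 metric tons of water/day (~100 million litres)
# as 7% of municipal demand from 2.8 million residents.
FAB_LITRES_PER_DAY = 100_000 * 1_000   # 1 metric ton of water ~ 1,000 L
SHARE_OF_DEMAND = 0.07
RESIDENTS = 2_800_000

total_demand = FAB_LITRES_PER_DAY / SHARE_OF_DEMAND
per_capita = total_demand / RESIDENTS
print(f"Implied municipal demand: ~{total_demand / 1e9:.2f} billion L/day")  # ~1.43
print(f"Per resident: ~{per_capita:.0f} L/day")                              # ~510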


E-waste Problem 


AI’s life cycle creates large volumes of electronic waste as servers and chips are discarded, and produces localised pollution concentrated in regions hosting data centres and semiconductor industries. This unevenly distributes ecological burdens across communities and geographies. 



Designing Sustainable AI? 


There have been attempts to design AI in a way that reduces its overall ecological footprint. However, efficiency gains in one stage of the life cycle do not always carry over to others: a model that is cheap to train can still be expensive to run at inference time.


In this context, DeepSeek’s Mixture-of-Experts (MoE) model was regarded as a breakthrough for reducing training costs and emissions, completing training on just 2,000 NVIDIA H800 chips at a fraction of the cost of comparable models developed in the United States. While this approach lowered the carbon footprint of development, the efficiency gains did not carry over to deployment. Its reliance on Chain-of-Thought reasoning produces longer, more detailed answers, which makes inference highly energy-intensive. As a result, even though training was frugal, real-world use generates significant emissions.
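
To see why longer answers matter, consider a toy model in which inference energy scales roughly with the number of tokens generated. The per-token cost and token counts below are hypothetical, chosen only for illustration:

# Illustrative only: chain-of-thought answers that "think out loud"
# multiply the per-question inference cost.
WH_PER_1K_TOKENS = 0.3          # hypothetical inference cost per 1,000 tokens

answers = {
    "concise answer": 200,                  # direct response only
    "chain-of-thought answer": 200 + 1_500, # response plus visible reasoning
}
for style, tokens in answers.items():
    print(f"{style}: ~{tokens / 1_000 * WH_PER_1K_TOKENS:.2f} Wh per question")
# Under these assumptions the reasoning-heavy answer costs ~8.5x more.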


DeepSeek generates a significant amount of carbon emissions when used at scale. A study found that having DeepSeek answer around 600,000 questions would produce roughly the same amount of CO₂ emissions as a round-trip flight from London to New York. In contrast, Qwen 2.5, a slightly larger model, demonstrated far greater energy efficiency. For the same overall carbon cost, Qwen 2.5 could answer approximately 1.9 million questions while maintaining comparable accuracy levels. This difference shows that efficiency is not simply a matter of model size; it depends heavily on how the model is designed, trained and optimised. Some models are built to deliver concise responses with fewer thinking steps, while others engage in more energy-intensive reasoning. 
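
Taking a commonly cited figure of roughly one tonne of CO₂ per passenger for a London-New York round trip (an external assumption, not taken from the study), the per-question emissions implied by these numbers are easy to work out:

# Per-question emissions implied by "X questions ~ one round-trip flight".
FLIGHT_G_CO2 = 1_000_000    # ~1 tonne of CO2 per passenger, in grams (assumed)

for model, questions in [("DeepSeek", 600_000), ("Qwen 2.5", 1_900_000)]:
    print(f"{model}: ~{FLIGHT_G_CO2 / questions:.1f} g CO2 per question")
# DeepSeek: ~1.7 g per question; Qwen 2.5: ~0.5 g per question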


While market-based solutions such as renewable energy certificates and carbon credits have emerged to offset AI’s growing environmental toll, their voluntary and weakly regulated nature often renders them ineffective and prone to greenwashing. 


Researchers have suggested that AI’s environmental impact can be mitigated through deliberate design choices: adopting energy-efficient programming languages, aligning cooling methods with local weather conditions, improving hardware sustainability, and mandating transparent reporting and certification requirements for AI systems. They also emphasise the need to assess impacts beyond human-centred concerns, including harms to ecosystems, habitats and wildlife.


The Path to a Sustainable Digital Future


The good news is that tech companies and innovators are actively seeking solutions through various techniques. 


There has been an increasing reliance on Carbon-Aware Computing. This approach involves training AI models at specific times when more renewable energy is available on the power grid. Carbon-aware scheduling reduces emissions by running AI tasks during periods of low-carbon energy, using strategies such as time shifting, geographic load shifting and multi-centre scheduling (splitting tasks across renewable-rich data centres). These techniques have achieved reported reductions of 16-62% (Read more at STET Review 2024; arXiv, 2025). Cooling systems are also optimised via AI (e.g. EcoTaskSched, Deep-Q Scheduler), lowering emissions while maintaining safe temperatures. Even partial adoption can have a significant impact: if just 50% of AI workloads used green scheduling, millions of tons of CO₂ could be saved annually.
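
Here is a minimal sketch of the simplest of these strategies, time shifting: given an hourly forecast of grid carbon intensity, a deferrable job is scheduled into the lowest-carbon window. The forecast values below are made up for illustration:

# Pick the contiguous window with the lowest average carbon intensity
# (gCO2/kWh) for a deferrable AI job.

def best_window(forecast, job_hours):
    """Return (start_hour, avg_intensity) of the lowest-carbon window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Made-up 24-hour forecast: intensity dips when renewable output peaks.
hourly_gco2_per_kwh = [420, 400, 390, 350, 300, 260, 220, 200,
                       210, 250, 300, 340, 380, 400, 410, 430,
                       450, 470, 460, 440, 430, 425, 420, 415]

start, avg = best_window(hourly_gco2_per_kwh, job_hours=4)
print(f"Run the 4-hour job at hour {start} (~{avg:.0f} gCO2/kWh on average)")
# -> hour 6, ~220 gCO2/kWh, versus ~455 in the evening peak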


Visual of time-shifting workload to periods of low-carbon energy. Source: White Paper by Microsoft, UBS, WattTime and Green Software Foundation.

Further, modular data centres allow for incremental expansion, enabling operators to add capacity as needed without overbuilding. They can accommodate various hardware configurations, including hyper-converged infrastructure, which integrates multiple functions into a single device, leading to more efficient power usage. By optimising design and operation, modular data centres contribute to improved energy efficiency in data storage and transmission. Experts predict that by the end of the decade, approximately 75% of the world's data centres will source more than half of their power from renewable sources, up from about 10% currently.


Sustainable hardware is also one of the keys to driving this change. The right-to-repair movement and extended producer responsibility programmes are gaining momentum, aiming to extend the lifespan of devices and dramatically reduce e-waste.


Concluding Remarks


Building a sustainable digital ecosystem is not just up to tech giants, and its burden cannot be placed only on individuals. It is a collaborative effort that requires multidisciplinary participation and research. At an everyday level, it requires society to think differently, accounting for the impact of the subtle forms of digital engagement that have become a big part of our lives.

