
AI is being hailed as a game-changer across industries, from health care to finance, with predictions of its transformative potential dominating headlines. There are extensive debates on its impact on employment, with studies forecasting both mass displacement and the creation of new jobs.

In defence, discussions have turned towards the ethics and regulation of AI in Lethal Autonomous Weapon Systems (LAWS). Meanwhile, policymakers are focused on how to regulate AI’s use to ensure safety and ethical compliance. But few discussions focus on a key issue: the carbon footprint of AI systems themselves.

In 2019, researchers at the University of Massachusetts Amherst published a study that revealed an alarming fact: training a single large AI model with neural architecture search could emit as much as 626,000 pounds of CO2, equivalent to the lifetime emissions of five average cars. This startling finding underscores a critical yet overlooked aspect of AI’s development: its massive energy consumption and the resulting environmental impact.

OpenAI’s GPT-3, with its 175 billion parameters, required an estimated 1,287 MWh to train, translating into approximately 552 metric tons of CO2 emissions. To put this into perspective, this is equivalent to the annual emissions of 60 average American households. As the demand for more sophisticated AI models grows, so does the need for large-scale computing power, driving up energy use in data centres that often rely on fossil fuels.


Significant environmental footprint

Data centres, essential for AI and digital services, have a significant environmental footprint, especially in terms of water usage. On average, a data centre consumes between 1.1 and 1.2 million litres (11-12 lakh litres) of water daily to cool its servers. As the demand for AI and cloud storage grows, this strain on water resources will intensify, particularly in regions already facing water scarcity.

While the energy demands of data centres are well known, their water consumption is often overlooked. Cooling is vital to keep servers running efficiently. According to the Uptime Institute (2021), the rise of AI-driven data storage has led to greater reliance on water-cooled systems, raising concerns about the sustainability of these operations.

Google (2018) reported that its data centres use water-intensive evaporative cooling systems, adding to the environmental cost. According to the Financial Times (2023), power demand from data centres could reach 1,000 TWh by 2026, putting additional pressure on water supplies as facilities expand.

The paradox is that while developing countries are urged to adopt strict climate commitments, advanced economies are expanding their water-intensive AI ecosystems. AI technologies like machine learning consume vast resources.

The International Energy Agency (2023) estimates that training a single AI model consumes as much energy and water as hundreds of homes in a day. Stechemesser et al. (2024) highlight that AI’s energy and water demands are outpacing green energy production, creating a conflict between technological growth and sustainability goals.

This expansion exacerbates water scarcity in developing countries, many of which are already under stress. The World Resources Institute (2021) warns that as data centres proliferate, water crises in these regions could worsen. Meanwhile, S&P Global Commodity Insights (2024) reports that the power and water demands from AI infrastructure may exceed what developing nations can sustainably supply.

By 2034, global data centre energy consumption is expected to reach 1,580 TWh, equivalent to India’s current total electricity use. Elon Musk has cautioned that AI and EV expansion will create electricity and infrastructure shortages. Without breakthroughs in water-efficient cooling or alternative energy sources, AI’s environmental costs will continue to rise.

AI’s energy appetite

Microsoft’s recent 20-year deal to restart the Three Mile Island nuclear plant highlights a growing trend of Big Tech seeking sustainable energy to meet the rising demands of AI infrastructure. The deal, under which the plant is expected to supply over 835 MW of carbon-free electricity from 2028, is an innovative step towards addressing the enormous power needs of Microsoft’s AI operations. That output is roughly enough to power 800,000 homes a year, emphasising the scale of AI’s energy appetite.

However, such examples remain exceptions rather than the norm. Most tech companies are still heavily reliant on conventional energy sources, and the grid’s capacity to handle AI-driven demand remains a critical issue. The North American Electric Reliability Corporation (NERC) has raised concerns about the strain this rapid growth places on power grids, warning that major upgrades are necessary to prevent reliability issues.

AI data centres are projected to consume up to 1,580 TWh of electricity globally by 2034, a figure comparable to India’s entire electricity consumption, indicating that current sustainable energy solutions, while commendable, are far from widespread.

The discrepancy between the push for AI expansion and climate commitments is increasingly evident, especially as developing nations face pressure to meet stringent environmental goals without the same access to advanced energy solutions like nuclear power.

To mitigate the environmental costs of AI, particularly in energy and water consumption, a shift towards more efficient AI model architectures and advanced cooling systems is essential.

Techniques such as model pruning, quantisation, and knowledge distillation can significantly reduce the computational complexity of training and inference without compromising performance, thereby decreasing energy requirements.
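To illustrate one of these techniques, magnitude-based pruning simply zeroes out a model’s smallest weights, since the large surviving weights carry most of the model’s behaviour and the zeroed entries can be skipped by sparse hardware. The function below is a hypothetical, minimal sketch of the idea on a plain list of weights, not production code:

```python
def prune_weights(weights, sparsity=0.5):
    """Zero out the fraction `sparsity` of weights with smallest magnitude.

    Toy illustration of magnitude-based pruning: computation on the
    zeroed entries can be skipped at inference, cutting energy use.
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # Magnitude below which weights are considered unimportant.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

# Half of these six weights fall at or below the threshold and are zeroed.
pruned = prune_weights([0.9, -0.05, 0.4, 0.01, -0.7, 0.02], sparsity=0.5)
```

In practice, frameworks apply the same idea tensor-by-tensor and often retrain briefly afterwards to recover any lost accuracy.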

Furthermore, transitioning to edge computing can distribute computational loads, reducing the reliance on centralised data centres.

In cooling, moving from traditional water-based systems to direct-to-chip liquid cooling and phase-change materials can minimise water usage while improving energy efficiency.

Additionally, integrating real-time AI workload management systems that dynamically allocate resources based on energy availability and system demands can optimise energy consumption.
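One simple form such a workload manager could take is carbon-aware scheduling: deferring flexible training jobs to the hours when the grid’s forecast carbon intensity is lowest. The sketch below is purely illustrative, and the forecast values are made up:

```python
def pick_low_carbon_hours(intensity_forecast, hours_needed):
    """Return the indices of the `hours_needed` hours with the lowest
    forecast grid carbon intensity (gCO2/kWh), in chronological order."""
    ranked = sorted(range(len(intensity_forecast)),
                    key=lambda h: intensity_forecast[h])
    return sorted(ranked[:hours_needed])

# Hypothetical six-hour forecast: the scheduler defers two flexible
# jobs to the hours when solar output pushes carbon intensity down.
run_hours = pick_low_carbon_hours([480, 350, 120, 90, 400, 300],
                                  hours_needed=2)
```

Real systems layer the same principle over live grid data and job deadlines, but the core decision is this ranking of hours by carbon cost.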

Finally, investment in AI-specific hardware accelerators, such as neuromorphic chips and optical processors, which are inherently more energy-efficient than traditional hardware, is critical for long-term sustainability.

Aditya Sinha (X: @adityasinha004) is an Officer on Special Duty, Research at the Economic Advisory Council to the Prime Minister of India. Views are personal.