Data centre electricity demand will more than double by 2030, reaching approximately 945 terawatt-hours, slightly above Japan’s total annual consumption. That figure, from the International Energy Agency, sits at the centre of a challenge that International Data Centre Day (25 March) is meant to spotlight: the tension between rapid digital expansion and finite energy capacity.
Professor Aoife Foley, IEEE Senior Member and Chair in Net Zero Infrastructure at the University of Manchester, argues that the industry needs a “spring clean” rather than just more efficient cooling.
Foley notes that, although it is impossible to calculate precisely, the ICT sector as a whole is estimated to account for about 1.4 per cent of global CO2 emissions. Infrastructure and operations leaders, she argues, share responsibility here: they need to confront the unnecessary waste associated with data storage and commit to sourcing more of their power from renewables.
Her argument goes beyond the usual focus on cooling innovation. While liquid immersion and direct-to-chip systems improve efficiency, Foley contends they address symptoms rather than the deeper inefficiencies in model design and compute intensity. AI workloads consume significantly more energy than traditional cloud computing tasks, and growing dependence on water-based cooling places additional pressure on local resources.
The practical recommendation is straightforward: eliminate unnecessary unstructured data. Doing so cuts storage energy, improves operational efficiency, reduces maintenance costs and strengthens regulatory compliance. Sometimes the most effective energy saving is not running the workload at all.
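As a concrete illustration of what a storage "spring clean" can mean in practice, the sketch below groups files by content hash to surface byte-identical duplicates, one common form of the unnecessary unstructured data the article describes. This is a minimal, hypothetical example, not a tool mentioned in the article; the function name and approach are assumptions for illustration.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root: str) -> dict[str, list[str]]:
    """Group files under `root` by SHA-256 content hash.

    A hash that maps to more than one path marks byte-identical
    duplicates: candidates for removal in a storage "spring clean".
    """
    by_hash: dict[str, list[str]] = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            # Read in 1 MiB chunks so large files do not load into memory.
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
            by_hash[digest.hexdigest()].append(path)
    # Keep only hashes with two or more paths, i.e. actual duplicates.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Real retention tooling would also weigh file age, access frequency and regulatory holds before deleting anything, but even this simple pass shows how much stored data may be redundant.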