The Express Gazette
Sunday, December 28, 2025

AI data centres spark trillion-dollar investment, energy and water concerns

Industry-scale AI computing drives massive construction and hardware spending, while energy, water and regulatory questions shape the pace and location of new facilities


Global spending on data centers that support artificial intelligence is projected to reach about $3 trillion by 2029, according to investment bank Morgan Stanley. Roughly half of that total is expected to fund construction, with the other half covering the specialized hardware that powers AI workloads. The estimate highlights how AI is reshaping the economics and geography of data infrastructure, with implications for local grids, labor markets and government policy. It also underscores a shift toward hyperscale facilities designed to house thousands of high-power processors in a single, dense footprint.

In the United Kingdom, analysts expect roughly 100 additional AI-focused data centers to be built in the coming years to meet demand for processing. Microsoft has announced a $30 billion investment in the UK's AI sector, underscoring the corporate push behind the new generation of infrastructure. What makes these centers different from traditional server farms is not only their scale but their need for extreme proximity and density. AI workloads rely on Nvidia GPUs arranged in tightly packed cabinets, with only short distances between units to minimize latency. The result is a computing environment that operates as one very large machine, and its density increasingly sets the practical limits on how and where capital can be deployed.

The Nvidia GPUs that power AI training are packed into large cabinets that can cost around $4 million each. These cabinets are organized into tightly spaced rows to reduce the tiny but cumulative delays that would otherwise slow processing. Large language models, which break language down into vast numbers of small units of meaning, work best when thousands of processors operate in near lockstep. That near-real-time coordination is what researchers and engineers describe as parallel processing, and it hinges on keeping components as close together as possible. This emphasis on proximity and density is often called the density imperative, a defining feature of AI data centers as they scale.
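For readers curious about the mechanics, the short Python sketch below is a minimal, hypothetical illustration of that parallel-processing idea: a large job is split into chunks, each chunk is handled by a separate worker at the same time, and the partial results are combined. The worker count and workload are invented for illustration and are not drawn from any real data-center system.

```python
# Toy illustration of parallel processing: split a large numerical job into
# chunks, process each chunk in a separate worker process, then combine the
# partial results. All numbers here are hypothetical.
from multiprocessing import Pool


def process_chunk(chunk):
    # Stand-in for the heavy numerical work a single processor would do
    # on its slice of the data (here, just summing squares).
    return sum(x * x for x in chunk)


if __name__ == "__main__":
    data = list(range(1_000_000))
    workers = 8  # hypothetical number of parallel workers
    chunk_size = len(data) // workers
    chunks = [data[i * chunk_size:(i + 1) * chunk_size] for i in range(workers)]

    # Each worker handles its chunk simultaneously; the slowest worker and the
    # cost of exchanging results set the overall speed, which is why keeping
    # communication delays small matters more as the number of workers grows.
    with Pool(processes=workers) as pool:
        total = sum(pool.map(process_chunk, chunks))

    print(f"Combined result: {total}")
```

In a real AI data center the "workers" are thousands of GPUs rather than a handful of processes, which is why the physical distance between cabinets, and the delays it introduces, becomes such a pressing design constraint.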

But the same density that lets AI train and run enormous models also makes electrical demand both higher and more erratic than in traditional facilities, so the capacity to cool and power these systems becomes a central engineering challenge. Data-center experts describe AI workloads as imposing a jagged, high-intensity load on local grids, with spikes that resemble thousands of kettles boiling at once. The Uptime Institute's Daniel Bizo characterizes AI demand as a singular workload on the grid, unprecedented in scale and variability. The industry is responding with a mix of on-site and off-grid energy supplies, advanced cooling, and smarter grid interactions to keep outages and voltage dips at bay.

In response to these energy and cooling challenges, major operators are pursuing a range of strategies. Nvidia CEO Jensen Huang has told the media that, in the near term, off-grid gas-turbine capacity could reduce the burden on electricity networks, and that AI itself could help design more efficient energy systems, including turbines, solar, wind and possibly future fusion energy. The energy push is visible in the commitments of the large tech players. Microsoft has backed a broad energy program, including deals to advance nuclear power projects, with the aim of supplying its AI data centers with lower-carbon power. Google, owned by Alphabet, is pursuing nuclear power as part of its strategy to run on carbon-free energy by 2030. And Amazon, the parent of cloud unit Amazon Web Services, remains the largest corporate buyer of renewable energy, a reflection of the sector's effort to align AI growth with environmental objectives.

As data centers spread, regulators and local utilities are weighing the environmental externalities that could accompany a wave of AI facilities. In the United States, Virginia lawmakers are considering a bill that would tie approval of new data centers to disclosure of their expected water use, an acknowledgement of how much cooling these facilities require. In the United Kingdom, a proposed AI factory in northern Lincolnshire has drawn objections from Anglian Water, which operates the region's water system. The utility notes that it is not obliged to supply water for non-domestic use, and it has suggested that cooling rely on recycled water from the final stage of effluent treatment rather than on drinking water, underlining the practical constraints around cooling and resource use.

Some observers question whether the current spending spree is sustainable over the long term. A data-center specialist at a tech investment advisory firm has warned that there has been a lot of public hype around AI capacity. Still, others argue that AI represents a strategic opportunity with lasting impact, possibly greater than that of the internet itself. They describe AI data centers as the real estate of the tech world, providing the physical platform for rapid advances in software and services. Yet the market will need to prove that a return on investment can be sustained across cycles, or the expansion could cool as costs and regulatory pressures rise. The debate reflects a broader tension between extraordinary potential and the practical realities of energy and infrastructure.

Even with caution about the pace and scale, many analysts expect AI data centers to remain a foundational piece of the computing landscape for years to come. The industry will be judged on its ability to balance explosive demand for AI compute with the realities of energy consumption, water use, and the resilience of local grids. If the necessary improvements in efficiency and energy sourcing can keep pace with growth, AI data centers may solidify their place as a durable backbone for future technology, blending performance with sustainability. The road ahead will likely shape where and how new centers are built, influencing regional economies and national strategies around technology and energy.
