An opinion piece in PV Magazine Australia on May 4, 2026 argues that AI is reshaping data centre design, citing forecasts that generative AI could reach $1.8 trillion in market size by 2032 and consume around 36% of global data centre capacity. The article discusses how AI‑driven optimisation and liquid cooling can help manage high‑density, GPU‑heavy workloads sustainably.
This piece captures the physical reality behind the AI hype: next‑generation models are a power and cooling problem as much as a software story. With AI training racks pulling 100–140 kW each, and AI poised to claim more than a third of global data‑centre capacity, operators are forced to rethink how compute is designed, cooled and orchestrated. The article argues that AI is both the driver of that demand and part of the solution, via predictive load management, workload shifting and smarter cooling strategies. ([pv-magazine-australia.com](https://www.pv-magazine-australia.com/2026/05/04/energy-for-ai-ai-for-energy-designing-ai-ready-data-centres/))
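The workload‑shifting idea mentioned above can be made concrete with a toy scheduler: a deferrable training job is slotted into the window of the grid's carbon‑intensity forecast where its footprint is lowest. This is a minimal sketch under assumed inputs; the function name, forecast values and units are illustrative, not drawn from the article.

```python
# Toy carbon-aware workload shifting: pick the start hour for a
# deferrable job that minimises total forecast grid carbon intensity.
# All numbers below are illustrative assumptions, not article data.

def best_start_hour(intensity, duration):
    """Return the start hour whose `duration`-hour window has the
    lowest summed carbon intensity.

    intensity: hourly forecast values (e.g. gCO2/kWh).
    duration:  job length in whole hours (must fit in the forecast).
    """
    if duration <= 0 or duration > len(intensity):
        raise ValueError("duration must fit inside the forecast window")
    window = sum(intensity[:duration])  # cost of starting at hour 0
    best_cost, best_hour = window, 0
    for start in range(1, len(intensity) - duration + 1):
        # Slide the window: drop the hour leaving, add the hour entering.
        window += intensity[start + duration - 1] - intensity[start - 1]
        if window < best_cost:
            best_cost, best_hour = window, start
    return best_hour

# Hypothetical 12-hour forecast with an overnight dip in intensity.
forecast = [420, 410, 390, 310, 250, 240, 260, 330, 400, 450, 470, 460]
print(best_start_hour(forecast, 3))  # → 4: the job lands in the dip
```

Real schedulers would also weigh electricity price, cooling headroom and job deadlines, but the sliding‑window core is the same idea the article gestures at.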
In AGI terms, none of this advances capabilities directly, but it defines the feasibility frontier. If we can’t build and power enough AI‑ready data centres without blowing past grid and climate constraints, AGI timelines will be dictated by infrastructure bottlenecks rather than algorithms. Conversely, if AI‑optimised energy systems can “bend the energy curve” for data centres, as Schneider Electric suggests, that makes more aggressive scaling of parameter counts and training runs politically and economically palatable.


