On May 7, Australian outlet iTnews, citing Bloomberg, reported that Microsoft is considering delaying or abandoning its 2030 goal of matching all its electricity use hourly with renewable energy as AI data‑center demand surges. The company said it is still looking for ways to meet the target but acknowledged that massive new AI infrastructure is straining earlier climate commitments.
If one of the world’s most aggressive corporate buyers of clean power is re‑evaluating its flagship 2030 target because of AI, that signals how dramatic the compute build‑out has become. Microsoft, like its peers, is racing to stand up multi‑gigawatt data‑center campuses to support Copilot and Azure AI. Matching that load with clean energy on an hourly basis is technically and commercially challenging, especially when grid upgrades and renewable projects move far slower than GPU procurement. The uncomfortable trade‑off is that accelerating AI, and by extension the path toward AGI, may come at the cost of slower decarbonization.
Strategically, this underscores that compute scale is now constrained less by chips and more by power, permitting, and public tolerance for mega‑projects. Labs and hyperscalers that can secure long‑term, low‑carbon energy at scale will enjoy a structural advantage; those that can’t may lean harder on gas and nuclear. For the AGI community, this should be a wake‑up call: discussions about model safety and governance need to sit alongside hard questions about energy systems, climate risk, and the political optics of burning more fuel to train ever‑larger models.