In a Dwarkesh Podcast episode reported on Feb. 6, 2026, Elon Musk said that within roughly 36 months, space will become the cheapest place to host AI data centres, citing superior solar power and the lack of any need for batteries. He linked the vision to SpaceX’s recent acquisition of xAI and to plans to launch orbital AI compute using Starship.
This article aggregates reporting from two news sources. The TL;DR is AI-generated from the original reporting, and Race to AGI’s analysis adds editorial context on the implications for AGI development.
Musk’s latest prediction pushes the AI infrastructure arms race into literal orbit. The underlying diagnosis—that terrestrial power grids and permitting are becoming binding constraints on hyperscale compute—is widely shared, even if his three‑year timeline for shifting AI data centres to space is highly speculative.([m.economictimes.com](https://m.economictimes.com/tech/artificial-intelligence/elon-musk-says-space-will-be-the-cheapest-place-to-put-ai-data-centres-in-3-years/articleshow/127976335.cms)) What matters strategically is that one of the largest private holders of AI and space assets is openly orienting its roadmap around orbital compute as a differentiator for xAI and its broader ecosystem.
If even part of this vision materializes, it could create a two‑tier compute world: labs with access to vertically integrated launch and space infrastructure, and everyone else fighting over constrained terrestrial power and land. That would tilt the AGI race toward actors that can finance both advanced models and massive physical capital projects, further raising the bar for new entrants. At the same time, space‑based compute would introduce fresh governance questions around export controls, safety, orbital debris, and dual‑use military applications. For now, this is more a statement of intent than an operational reality—but it underscores how far leading players are willing to go to escape the bottlenecks that currently limit scaling.