Micron said on 18 December that tight global memory supplies, driven by AI data centre demand, will push its next-quarter adjusted profit to nearly double Wall Street expectations. The company’s shares jumped about 14% in premarket trading, with management forecasting that memory market tightness will persist well past 2026.
This article aggregates reporting from one news source. The TL;DR is AI-generated from the original reporting. Race to AGI's analysis provides editorial context on the implications for AGI development.
Micron’s blowout forecast is another data point suggesting we’ve entered the AI memory supercycle many analysts have been predicting. High‑bandwidth memory (HBM) is the critical consumable for training and serving large models, and Micron is one of just three suppliers, alongside Samsung and SK Hynix. When Micron tells investors that demand from AI data centres is tight enough to drive record profits and justify $20 billion in capex for 2026, it is effectively saying the industry is betting on years of sustained model scaling. ([reuters.com](https://www.reuters.com/business/micron-shares-up-12-europe-after-blowout-forecast-2025-12-18/))
For the race to AGI, this matters because the binding constraint is shifting from GPUs in isolation to full stacks: accelerators plus HBM plus advanced packaging, all under power and yield constraints. Whoever can secure long‑term HBM supply and co‑optimise models with emerging memory technologies will enjoy a structural advantage. It also reinforces the geopolitical lens: AI scaling is now tightly coupled to a handful of East Asian memory fabs and their supply chains, which is why Chinese upstarts like MetaX and Moore Threads are drawing speculative capital even while lagging technologically. Investors should read Micron’s commentary as a leading indicator that AI infrastructure demand is nowhere near saturation, even if valuations get choppy.