NEW YORK: A growing chorus of tech leaders – from Elon Musk to Tim Cook – is warning that the world is sliding into a full‑blown memory‑chip crisis.
A shortage of DRAM, the basic component underpinning almost every modern device, is beginning to erode profits, disrupt production plans and push up prices across consumer electronics, automobiles and data centre infrastructure. And the squeeze is tightening.
Since early 2026, Tesla, Apple and a string of major manufacturers have cautioned that limited DRAM supply will constrain output.
Cook has warned of margin pressure on the iPhone, while Micron Technology has described the bottleneck as “unprecedented”.
Musk, never one to understate a structural threat, said Tesla may have no choice but to build its own memory‑chip fabrication plant. “We’ve got two choices: hit the chip wall or make a fab,” he said in January.
The root cause is the explosive buildout of artificial intelligence (AI) data centres. Companies such as Alphabet and OpenAI are buying millions of Nvidia AI accelerators, each packed with vast amounts of high‑end memory, to power chatbots and other generative‑AI applications.
That surge has diverted supply away from consumer electronics makers, leaving firms from Samsung to Micron struggling to meet demand.
The resulting price shock has been dramatic. One category of DRAM jumped 75% between December and January, accelerating a trend of daily price adjustments by retailers and intermediaries.
Industry insiders have begun referring to the phenomenon as “RAMmageddon”.
Infrastructure plans
Tim Archer, chief executive officer (CEO) of Lam Research, captured the scale of the moment: “We stand at the cusp of something bigger than anything we’ve faced before.
“What is ahead of us between now and the end of this decade will overwhelm all other sources of demand.”
What alarms analysts is that the crunch is emerging even before the AI giants fully ramp up their infrastructure plans.
Alphabet and Amazon have announced record‑shattering capital expenditure programmes for 2026 – US$185bil and US$200bil, respectively – the largest annual corporate spending commitments in history. As those data centre projects accelerate, memory demand is expected to soar further.
Bernstein analyst Mark Li warns that DRAM prices are going “parabolic”.
That spells windfall profits for Samsung, Micron and SK Hynix, but a painful reckoning for the rest of the electronics sector.
Lenovo CEO Yang Yuanqing said the imbalance between supply and demand is structural, not cyclical, and will persist through at least the end of the year.
The fallout is already reshaping product strategies.
Sony is considering delaying the launch of its next PlayStation console to 2028 or 2029, a major disruption to its carefully sequenced hardware roadmap.
Nintendo, which helped fuel demand in 2025 with its Switch 2 launch, is weighing a price increase for the device in 2026.
Smartphone makers, including Xiaomi, Oppo and Transsion, are cutting shipment targets, with Oppo reportedly slashing its forecast by up to 20%.
Manufacturers are scrambling to secure supply. A laptop‑maker executive said Samsung has shifted from annual to quarterly reviews of memory supply contracts. Norwegian IT firm Atea described the situation as a storm that must be managed “hour by hour and day by day”.
Cisco cited the memory squeeze when issuing a weak profit outlook that triggered its steepest share drop in nearly four years. Qualcomm and Arm have also warned of further disruption.
The strain is visible on the ground. At Sunin Plaza, Seoul’s DIY PC hub, the usual weekday bustle has evaporated.
“It’s wiser to hold off doing business today, as prices are almost certain to be higher tomorrow,” said Suh Young‑hwan, who runs three PC shops in the area.
“Unless Steve Jobs rises from the dead to declare that AI is nothing but a bubble, this trend is likely to persist.”
The premium PC segment has been hit particularly hard.
Micron’s decision last year to discontinue its long‑running Crucial brand of consumer memory sticks triggered a rush to secure remaining inventory.
Falcon Northwest CEO Kelt Reeves said the scramble pushed memory prices to new highs in January, lifting the company’s average selling price by US$1,500 to around US$8,000 per custom‑built machine.
All of this carries unmistakable echoes of the Covid‑era semiconductor shortages that paralysed global supply chains.
Back then, an unexpected surge in demand for home‑office equipment and contact‑free technology overwhelmed production of basic auto and power chips, forcing automakers from Ford to Volkswagen to halt assembly lines and pushing smartphone makers into costly stockpiling.
Training models
That crisis triggered a worldwide push – especially in the United States – to build domestic chip manufacturing capacity.
This time, the disruption is rooted not in a temporary demand shock but in a structural pivot towards AI.
Meta, Microsoft, Amazon and Alphabet are pouring unprecedented sums into data centres capable of training and hosting AI models.
Their combined capital expenditure has surged from US$217bil in 2024 to about US$360bil last year, and is projected to reach an extraordinary US$650bil in 2026 – a level of investment that rivals the most expensive human endeavours in history.
The race is existential: each company is spending aggressively to outbuild its rivals and secure the infrastructure that will define its future.
Few industries have been reshaped more dramatically by this arms race than global memory.
In the three years since the launch of ChatGPT, Samsung, SK Hynix and Micron have redirected the bulk of their manufacturing capacity, research and development, and capital spending towards high‑bandwidth memory (HBM), the specialised stacked DRAM used in Nvidia and AMD’s AI accelerators.
The shift has come at the expense of conventional DRAM production – the basic memory used in phones, PCs, cars and countless everyday devices.
The logic is simple. Every Nvidia AI accelerator purchased by hyperscalers requires a large volume of HBM, often stacked in eight or 12 layers. Nvidia’s latest Blackwell chip carries 192GB of RAM, roughly six times the memory of a high‑end PC.
Higher margins
An NVL72 rack‑scale system, which bundles 72 Blackwell chips, contains 13.4 terabytes of memory.
Each system consumes enough DRAM to supply a thousand premium smartphones or several hundred powerful PCs.
As hyperscalers buy these systems by the thousands, the strain on global memory supply becomes obvious.
TrendForce estimates that demand for HBM will jump 70% in 2026 alone. By then, HBM will account for 23% of total DRAM wafer output, up from 19% last year.
The economics reinforce the shift: HBM commands far higher margins than standard DRAM, allowing Samsung, SK Hynix and Micron to capitalise on the imbalance. Micron’s revenue is expected to more than double this fiscal year, while SK Hynix’s sales – which already doubled in 2024 – are likely to double again.
But the boom in HBM spells trouble for the rest of the world. The diversion of capacity is leaving consumers and manufacturers short of the memory needed to store photos, run apps, steer vehicles and power industrial systems. — Bloomberg
