Doug O'Laughlin: rapid AI deployment could trigger broad deflation, and policy is nowhere near ready
Feb 25, 2026 with Doug O'Laughlin
Key Points
- Doug O'Laughlin argues rapid AI deployment could trigger broad deflation within years, distributing instantly via the internet unlike historical tech shifts that took decades to ripple through economies.
- Software engineering job postings remain strong despite AI coding advances because existing engineers become vastly more productive, while junior hires and new entrants face a shrinking ladder.
- Server CPU shortages and power constraints are becoming critical infrastructure bottlenecks; NVIDIA has more chips allocated than customers can power, giving Google a structural advantage.
Summary
Doug O'Laughlin, founder of SemiAnalysis, warns that rapid AI deployment could trigger broad deflation while policymakers remain unprepared. The mechanism differs from historical technology shifts. If AI agents become 20 times more productive, the indexed cost of knowledge work could collapse from $100 to $20 within years. Railroads took 50 years to build out; electricity and the printing press required decades of physical infrastructure. AI distributes instantly through the internet: "You just press a button, you download it, and it's there." Deflation, if it occurs, would broadcast everywhere simultaneously rather than ripple through the economy over generations. The risk is not gradual workforce retraining but sudden, broad disruption that outpaces policy response.
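The cost-index arithmetic above can be sketched in a few lines. This is an illustration, not from the episode: the function name is ours, and it assumes productivity gains pass through fully to prices.

```python
# Toy model: how a k-fold productivity gain maps to an indexed cost
# of knowledge work, assuming full pass-through of savings to prices.

def indexed_cost(baseline: float, productivity_multiple: float) -> float:
    """Unit cost index after a k-fold productivity gain (full pass-through)."""
    return baseline / productivity_multiple

# Full pass-through of a 20x gain would push a $100 index down to $5;
# the $100 -> $20 figure quoted above corresponds to an effective 5x gain,
# i.e. partial pass-through or partial automation of the workflow.
print(indexed_cost(100, 20))  # 5.0
print(indexed_cost(100, 5))   # 20.0
```

Either way, the point stands: the index falls by the effective multiple, and it falls everywhere at once rather than region by region.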
Employment lag
O'Laughlin notes a real tension in the data. AI coding is already the most advanced application of large language models in knowledge work, yet software engineering job postings have risen year over year and layoffs explicitly attributed to AI remain rare. His explanation departs from a pure "no impact" reading. He describes a "sugar high" phase in which existing engineers become dramatically more productive, running at 100% utilization, working harder than ever while capturing outsized output gains. The pain concentrates on net-new entrants: junior engineers and new college graduates face a shrinking ladder because companies optimize the installed base rather than hire to expand. "It really is great for the install base of people who've been doing it, but very terrible for net new." Layoffs will follow, but with a lag. Executives typically decide on cuts weeks or months before announcing them. O'Laughlin also notes that admitting large-scale AI-driven job elimination carries political and reputational cost. CEOs avoid the framing even when true.
Market moats under pressure
The broader market has absorbed the AI narrative unevenly. Indexes sit near all-time highs, but beneath the surface, software names have been "very, very, very painful." Insurance brokers, real estate services, and other network-effect businesses are being "nuked" by the possibility of AI disruption. O'Laughlin argues that the number two, three, or five players in concentrated markets have the most to gain from defecting fastest to agentic tools to win share from the leader. DoorDash might defend through exclusivity. Walmart should go all-in on agentic commerce precisely because it is not Amazon. The moat question is not whether network effects disappear but whether the incumbent can move faster than the insurgent armed with cheap AI.
Agentic commerce adoption
O'Laughlin points to China as the bellwether for agentic commerce adoption, citing the country's lead on mobile payments, livestream shopping, and e-commerce generally. A recent Tencent promotion involving millions of subsidized bubble tea purchases may be the first wave of meaningful adoption. The parallel is the 2015 Lunar New Year gala, when Alipay and WeChat Pay gave away mobile payment credits and catalyzed a shift. U.S. adoption remains in the 0.01% range for agentic checkouts on Shopify. Worth watching, but still negligible.
Server CPU shortage
O'Laughlin identifies a secondary shortage of server CPUs, not GPUs, that compounds the AI compute bottleneck. Three factors drive it: genuine demand from reinforcement-learning gym simulations, such as simulated e-commerce environments, used to train agentic systems; the surge of new AI apps requiring infrastructure; and a supply-side lag from the five-year depreciation cycle on 2020–2021 server hardware. Companies deliberately skipped CPU purchases to redirect capital to GPUs. Now even modest CPU demand exhausts inventory.
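The supply-side timing can be made concrete with a trivial calculation. This is a sketch with illustrative assumptions, not figures from the episode: it assumes a straight five-year replacement schedule.

```python
# Sketch: why a five-year depreciation cycle on 2020-2021 server CPU
# purchases concentrates refresh demand in 2025-2026 (assumed schedule).

DEPRECIATION_YEARS = 5  # assumed straight-line depreciation period

def refresh_year(purchase_year: int) -> int:
    """Year a fully depreciated server comes up for replacement."""
    return purchase_year + DEPRECIATION_YEARS

cohorts = [2020, 2021]
print([refresh_year(y) for y in cohorts])  # [2025, 2026]
```

Because companies also deferred interim CPU purchases to fund GPUs, the refresh wave arrives against unusually thin inventory.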
TSMC is capacity-constrained across all nodes. NVIDIA occupies all available N3, N2, and N2b capacity. Intel becomes the swing supplier for CPUs that cannot be made at TSMC because accelerators take priority. Graviton from AWS, Google's Axion, and other custom ARM CPUs compete for TSMC slots but face the same bottleneck.
Power becomes the ceiling
O'Laughlin flags power availability as an underappreciated bottleneck for 2025–2026. NVIDIA has more chips allocated than customers have power to consume. Google, by contrast, has both chips and power, giving it a structural advantage. NVIDIA will likely sell out anyway; the GPU shortage is real. But power infrastructure will emerge as the hard ceiling once chip supply stabilizes.
H100 chips, now four years old, are completely sold out today. That is the single most bullish signal: prior-generation hardware is no longer available even at premium prices, suggesting demand for next-generation hardware will be even more supply-constrained.