Patrick Bet-David said AI data centers are consuming 70% of global DRAM production, causing consumer memory prices to skyrocket and forcing technology giants to consider nuclear power solutions.

Bet-David argued this massive diversion of memory chips to artificial intelligence infrastructure has created a severe shortage for consumer electronics. "AI data centers will absorb 70% of all global DRAM production," he said. He claimed chips typically destined for smartphones and computers are being redirected to power AI systems.

The market impact has been dramatic, with the price of a 16GB DDR5 RAM module climbing from approximately $125 in July 2024 to $1,250 currently. This tenfold jump, a 900% increase, reflects intense competition for limited memory resources as AI development accelerates globally.
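Percentage arithmetic on price jumps is easy to mangle; a quick sketch using the prices stated above shows why a tenfold jump corresponds to a 900% increase rather than 1,000%:

```python
# Percent increase from an old price to a new price (figures as reported above).
old_price = 125.0    # 16GB DDR5 module, July 2024
new_price = 1250.0   # current price

multiple = new_price / old_price                               # 10.0x
percent_increase = (new_price - old_price) / old_price * 100   # 900.0%

print(multiple, percent_increase)
```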

The scale of AI infrastructure growth has been extraordinary, with the number of AI data centers in America expanding from zero in 2016 to between 4,000 and 5,400 today. This rapid expansion has created unprecedented demand for memory chips; approximately 1,500 additional AI data centers are planned, though roughly half are currently paused because chip production cannot keep pace.

Semiconductor manufacturing costs have escalated dramatically alongside this demand surge. TSMC's Arizona project has seen its estimated cost balloon from $11 billion to approximately $165 billion, reflecting the immense capital requirements of advanced chip production.

The global RAM market remains highly concentrated, with Samsung, SK Hynix, and Micron collectively controlling 93-95% of production. This oligopoly structure has given these companies significant pricing power as demand outstrips supply. In October 2025, OpenAI's Sam Altman reportedly signed letters of intent with Samsung and SK Hynix to reserve 900,000 RAM wafers per month by 2029, representing roughly 40% of global DRAM production.
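The figures in this paragraph imply a global baseline: if 900,000 wafers per month is roughly 40% of DRAM output, total output works out to about 2.25 million wafers per month. A quick check of that implied total:

```python
# Implied global DRAM wafer output, derived from the reported figures above.
reserved_wafers = 900_000  # wafers per month (reported letters of intent)
share = 0.40               # reported share of global DRAM production

implied_global = reserved_wafers / share  # wafers per month, worldwide
print(implied_global)
```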

Bet-David contended that Google's recent technological breakthrough could alleviate this memory crisis. The company announced a software compression algorithm on March 25, 2026 that, per Bet-David, reduces AI memory demand sixfold, a figure he rendered as "600%," with no loss in accuracy. "Software level compression algorithm for large language models reduce AI memory demand by 600% with zero loss in accuracy," he said.
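Read literally, a "600%" reduction is impossible, since nothing can shrink by more than 100% of itself; the claim only parses as a sixfold compression factor. A small sketch of the distinction, using a purely illustrative 600 GB memory footprint (not a figure from the article):

```python
# "Reduced by a factor of 6" versus "reduced by a percentage".
# The 600 GB footprint is illustrative only, not from the article.
footprint_gb = 600.0

compressed_gb = footprint_gb / 6                       # sixfold compression
saved_pct = (1 - compressed_gb / footprint_gb) * 100   # percentage actually saved

print(compressed_gb, round(saved_pct, 1))
```

A sixfold compression saves about 83.3% of the memory; expressing that as "600%" conflates the multiplier with the percentage.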

The energy requirements of these massive AI data centers have become so substantial that major technology companies including Microsoft are exploring nuclear power solutions. Bet-David said Microsoft has been "quietly saying we got to go nuclear" to meet the electricity demands of its AI operations.

Summing up his central claim, Bet-David said, "70% of these chips that they're building that we would use for consumer electronics are being used right now for AI data centers."

China has been aggressively building nuclear power facilities while other countries have hesitated, creating potential competitive advantages in energy-intensive chip manufacturing.