
On January 27, 2025, a Chinese AI lab most investors had never heard of broke the market's brain.
DeepSeek released R1 — a reasoning model that matched or exceeded the performance of OpenAI's best offerings. That alone would have been a headline. What made it a market event was the price tag. DeepSeek claimed it trained R1 for approximately $6 million, using roughly 2,000 Nvidia H800 chips — export-controlled, older-generation hardware that American labs had dismissed as inadequate. OpenAI reportedly spent over $100 million training GPT-4, using approximately 25,000 A100 GPUs. The gap was staggering.
Nvidia dropped nearly 17% in a single session, erasing roughly $593 billion in market capitalization — the largest single-day loss for any company in stock market history at that point. The broader AI trade sold off hard. The dominant market narrative — that AI supremacy required spending unlimited money on the most cutting-edge silicon — was, at minimum, called into serious question.
For income investors who had been methodically building positions in AI-adjacent dividend payers — data center REITs, utilities with power purchase agreements, infrastructure plays — January 27 posed an uncomfortable question. Was the entire thesis broken? Was this a warning to sell, a rare buying opportunity, or something more nuanced?
Fourteen months later, the answer is clearer than it was in the panic. But it requires separating what DeepSeek actually threatened from what it didn't.
Understanding the Commoditization Thesis
"AI commoditization" sounds abstract. It isn't. Think of it through a historical lens.
In the early internet era, companies paid enormous premiums for bandwidth. Owning fiber was a license to print money. Then bandwidth became a commodity. Prices collapsed. The value migrated from the pipes to the applications running on them — Google, Amazon, Facebook. The infrastructure providers that survived had to find new moats or accept utility-level margins.
The same pattern is now playing out in AI. The cost of training a model to a given capability level has been falling roughly 10x every 12–18 months. DeepSeek accelerated that trajectory by demonstrating that algorithmic efficiency — smarter training techniques, better data curation, architectural innovations — can compensate for brute-force hardware spending. If this trend continues, the "intelligence layer" becomes cheap and ubiquitous. The value migrates to whoever owns the proprietary data, the distribution channels, or the physical infrastructure underneath all of it.
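To see how fast that curve compounds, here is a back-of-the-envelope sketch. The $100 million starting figure echoes the GPT-4 estimate cited earlier; the steady 10x decline every 15 months is an assumed simplification of the trend, not a measured law:

```python
# Illustrative only: how a 10x cost decline every 12-18 months compounds.
# The $100M starting point echoes the GPT-4 training cost estimate above;
# the decline rate is an assumed rough trend, not a law.

def cost_after(initial_cost: float, months: int, months_per_10x: float = 15.0) -> float:
    """Cost to reach a fixed capability level after `months`,
    assuming a steady 10x decline every `months_per_10x` months."""
    return initial_cost / 10 ** (months / months_per_10x)

start = 100_000_000  # ~GPT-4-era training cost estimate
for m in (0, 15, 30):
    print(f"month {m:2d}: ${cost_after(start, m):,.0f}")
```

On these assumptions, the cost of a GPT-4-class run falls below DeepSeek's claimed $6 million within about a year and a half, which is why a single efficiency breakthrough can make an entire capex narrative look stale overnight.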
This is a fundamental challenge to the thesis that the biggest AI spender always wins. It doesn't mean Nvidia's chips are worthless. It means the market was pricing them as if every AI lab on the planet would need unlimited quantities forever, at any price. DeepSeek showed that assumption was fragile.
But here's where the analysis gets interesting — and where most commentary went wrong.
The Jevons Paradox
In the 19th century, economist William Stanley Jevons observed something counterintuitive: as steam engines became more fuel-efficient, total coal consumption increased, not decreased. Why? Because cheaper energy made more applications economically viable. Factories that couldn't afford steam power at the old price suddenly could.
The same logic applies to AI. If the cost of running a large language model drops by 90%, you don't get 90% less spending on compute. You get AI deployed in a thousand new use cases that weren't economical before — customer service automation, real-time translation, drug discovery, code generation at scale, autonomous agents running 24/7. Total compute demand could grow even as per-query costs fall.
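The arithmetic behind that claim is worth making explicit. In this toy sketch, every number — the per-query cost, the baseline volume, the 25x demand response — is hypothetical; the point is the mechanism, not a forecast:

```python
# Toy Jevons paradox arithmetic (all numbers hypothetical, not a forecast):
# per-query cost falls 90%, but the lower price unlocks far more usage.

old_cost_per_query = 0.010   # $ per query (assumed baseline)
new_cost_per_query = 0.001   # 90% cheaper

old_queries = 1_000_000_000        # baseline annual volume (assumed)
new_queries = 25 * old_queries     # assumed demand response: usage grows 25x

old_spend = old_cost_per_query * old_queries   # total compute spend before
new_spend = new_cost_per_query * new_queries   # total compute spend after

print(f"old total compute spend: ${old_spend:,.0f}")
print(f"new total compute spend: ${new_spend:,.0f}")
# Whenever usage grows by more than costs fall (here, 25x vs 10x),
# total spending on compute rises even as each query gets cheaper.
```

The crossover condition is simple: if a 10x cost reduction triggers more than a 10x increase in usage, aggregate demand for compute, power, and rack space goes up, not down.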
This is the central tension in the AI commoditization story, and it's the key to understanding which investments survive and which don't.
What DeepSeek Did Not Destroy
Let's walk through the AI stack and separate the vulnerable from the resilient.
What Commoditization Threatens
GPU makers at extreme valuations. Not the chips themselves — the valuations predicated on infinite spending growth. When the market assumed every hyperscaler would spend without limit on the latest silicon, it priced Nvidia like a company with a permanent monopoly on intelligence. DeepSeek showed that monopoly has cracks.
Software companies charging AI-premium pricing without defensible moats. If you slapped "AI-powered" on your SaaS product and raised prices 30%, but your AI features are built on commodity models that anyone can replicate, your pricing power is on borrowed time.
Cloud AI API providers whose margins erode as model costs fall. Selling API access to frontier models is a good business — until the models themselves become interchangeable and the price per token races toward zero.
What Commoditization Does NOT Threaten
Power infrastructure. Whether a model costs $6 million or $600 million to train, it still needs electricity. Whether it runs in Shenzhen or Santa Clara, it still needs cooling, substations, and grid connections. And if the Jevons Paradox holds — if cheaper AI means dramatically more AI usage — then electricity demand from data centers continues to grow regardless of model efficiency improvements.
Data center real estate. The physical space for servers is constrained by geography, permitting timelines, and power availability. You cannot download a data center. Efficiency gains in AI models don't build new facilities, don't fast-track municipal permits, and don't create new power interconnections. REITs like Equinix (EQIX), Digital Realty (DLR), and Iron Mountain (IRM) are physical scarcity plays. Equinix signed a record number of leases in Q1 2025 — not despite the efficiency revolution, but arguably because of it.
Network infrastructure. More AI queries — even at dramatically lower cost per query — require more bandwidth and lower latency. Fiber networks, edge computing facilities, and networking equipment see demand growth regardless of model cost trajectory. Cell tower REITs like American Tower (AMT) and Crown Castle (CCI) sit on infrastructure that becomes more valuable as AI pushes to the edge.
Any AI-adjacent investment whose thesis depends on model training costs staying high is exposed. DeepSeek proved that algorithmic breakthroughs can collapse cost curves overnight. If your dividend stock's revenue projections assume hyperscalers will spend exponentially more on GPUs every year, indefinitely, you're holding a thesis that has already been challenged by real-world evidence. Stress-test every AI holding against a world where intelligence is cheap.
The Hyperscaler Spending Paradox
Here's what actually happened after DeepSeek rattled the market.
Microsoft, Amazon, Alphabet, and Meta didn't cut their infrastructure budgets. They accelerated them. In Q1 2025, these four companies collectively announced or reconfirmed over $320 billion in combined CapEx guidance for 2025 — all higher than 2024 levels. The highest combined infrastructure spend in corporate history.
Why would the companies with the deepest AI insight respond to cheaper models by spending more on infrastructure?
Because they read the Jevons Paradox and bet on it massively. Cheaper models mean more deployment, more use cases, more enterprise adoption, more consumer integration. Every cost reduction in the intelligence layer makes the physical layer more valuable, not less. When Microsoft knows the model cost is falling and still doubles its data center CapEx, that's an extraordinarily strong signal about where the value is flowing.
For income investors, this is the critical data point. The companies selling services to the hyperscalers — power, land, cooling, fiber — are seeing their demand validated by the most informed buyers on the planet.
The most resilient income plays in the AI stack are the ones you can touch. Power plants, data center buildings, fiber cables, transformer substations. Model architectures change every six months. A 50-megawatt power interconnection and a 200,000 square foot data center hall take years to build and decades to depreciate. Physical scarcity doesn't get disrupted by a better algorithm.
Reassessing the AI Infrastructure Dividend Plays
With fourteen months of post-DeepSeek data, let's categorize the income-relevant AI plays by conviction level.
High Conviction: Physical Scarcity
Data center REITs remain the core holding. EQIX, DLR, and IRM are not going anywhere — in fact, commoditization accelerates their leasing demand as more companies deploy AI workloads at lower cost. Their revenue is tied to rack space leased and power consumed, not to the price of the model running on the hardware.
Utilities with data center exposure — particularly those with power purchase agreements (PPAs) attached to hyperscaler campuses — have pricing power that is locked in for years. The grid bottleneck means new data centers can't get power connections fast enough. Utilities that can deliver megawatts have leverage.
Midstream infrastructure — fiber conduit owners, tower REITs — benefit from the same physical scarcity dynamics. More AI usage means more data moving through more pipes.
Moderate Conviction: Repricing Required
Semiconductor equipment companies sell to chipmakers, who keep building chips regardless of model cost trajectories. But these names were priced at peak AI hysteria multiples. The thesis works at a reasonable valuation; it doesn't work at 40x earnings.
Cloud platform providers (AWS, Azure, GCP as business segments) are still growing overall revenue, but AI API pricing will compress over time. The gross margin on serving AI inference is already eroding. These are great businesses — just not at any price.
Low Conviction: Directly Exposed
Pure-play AI software vendors without proprietary data moats are the most vulnerable. If your product is a thin wrapper around a commodity model, your competitive advantage has a half-life measured in months.
Companies charging AI premiums on SaaS products without demonstrated user retention at those price points are running a pricing experiment. Some will succeed. Many won't. The income investor's job is to demand proof before allocating capital.
The Income Investor's DeepSeek Checklist
Before adding or holding any AI-adjacent position in an income portfolio, run it through these four questions:
1. Does this company's revenue depend on AI model costs staying high, or on AI usage staying high? Model costs are falling. Usage is rising. You want exposure to the second, not the first. A data center REIT benefits from more AI usage. A GPU maker at 50x earnings needs costs to stay high to justify the multiple.
2. Is this company a landlord or a tenant? Landlords own the physical infrastructure — power, real estate, fiber. Tenants depend on software moats that may or may not survive the next model generation. Landlords collect rent regardless of which tenant is winning.
3. What is the payout ratio, and is the dividend funded by today's cash flow or by future AI revenue projections? A REIT paying dividends from existing lease revenue is fundamentally different from a tech company that initiated a dividend based on projected AI-driven growth. One is backed by signed contracts. The other is backed by a forecast.
4. Did management accelerate or reduce CapEx after DeepSeek? This is the revealed preference test. Companies that accelerated spending are telling you they believe total AI demand is growing. Companies that pulled back are telling you their thesis depended on the old cost structure. Listen to what they do, not what they say on earnings calls.
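For readers who track their holdings programmatically, the four questions above can be sketched as a simple screen. Everything here — the field names, the pass/fail scoring, the example holdings — is my own illustration of the checklist, not a published methodology, and the inputs would come from your own research:

```python
# A minimal sketch of the four-question checklist as a screening function.
# Field names and scoring are illustrative; inputs require your own research.

from dataclasses import dataclass

@dataclass
class Holding:
    name: str
    depends_on_usage_not_cost: bool        # Q1: revenue tied to AI usage, not high model costs
    is_landlord: bool                      # Q2: owns physical infrastructure, not a software moat
    dividend_from_current_cash: bool       # Q3: payout covered by existing contracted cash flow
    accelerated_capex_post_deepseek: bool  # Q4: the revealed-preference test

def checklist_score(h: Holding) -> int:
    """Count how many of the four checklist questions the holding passes (0-4)."""
    return sum([
        h.depends_on_usage_not_cost,
        h.is_landlord,
        h.dividend_from_current_cash,
        h.accelerated_capex_post_deepseek,
    ])

# Hypothetical examples, not recommendations:
reit = Holding("Example data center REIT", True, True, True, True)
saas = Holding("Example AI-premium SaaS vendor", False, False, False, False)
print(checklist_score(reit), checklist_score(saas))  # 4 0
```

A score is no substitute for judgment, but forcing each position through the same four questions makes it harder to hold a thesis that quietly depends on model costs staying high.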
DeepSeek didn't kill the AI infrastructure trade. It refined it. The money is still flowing — it's just flowing to the physical layer, where scarcity is real and rent is due on the first of every month. For the income investor, that's not a crisis. It's a clarification.
Disclaimer: This blog post is for informational and educational purposes only and should not be construed as financial, investment, or tax advice. The financial markets involve risk, and past performance is not indicative of future results. Always conduct your own thorough research and consult with a qualified financial advisor or tax professional before making any investment decisions. The tools and information provided are not a substitute for professional advice tailored to your individual circumstances.