Why AI’s Data Center Economics Don’t Work
Our analysis of IBM CEO Arvind Krishna’s recent warning on AI’s data center economics.
AI’s Core Economic Problem, Summarized
IBM CEO Arvind Krishna just issued one of the most uncomfortable truths in tech: the math behind the AI boom may be fundamentally broken (Fortune, 2025; Yahoo Finance, 2025).
In a direct, unusually candid interview on The Verge’s Decoder, Krishna laid out a set of calculations that should make investors, founders, and operators pause (The Verge, 2025).
His core argument:
If the tech industry spends roughly 8 trillion dollars building AI data centers over the next several years, those centers would need to generate approximately 800 billion dollars in annual profit just to service the capital, the equivalent of interest-only payments (Fortune, 2025; Tom's Hardware, 2025).
No company on Earth comes close to that profit level. Not Apple.
Not Microsoft.
Not Google.
None of them earn anywhere near 800 billion dollars in profit today (CNBC, 2025).
This is not theoretical pessimism. It is a hard economic reality built on infrastructure math, five-year GPU replacement cycles, and a widening gap between what AI costs to build and what AI currently earns (McKinsey, 2025).
Why Founders Should Care
For operators and startups building on top of this compute layer, this tension creates:
Enormous macro risk, and
Surprising tactical opportunity (McKinsey, 2025).
This article breaks down:
The real numbers behind the global AI infrastructure bet,
What is actually delivering ROI today,
And what all of this means for anyone who is not writing trillion-dollar checks.
The Warning: When a Legacy CEO Calls Out the Entire Industry
Arvind Krishna, IBM CEO since 2020, has spent over three decades in enterprise tech (IBM, 2025).
IBM has invested billions in AI and automation and reports 4.5 billion dollars in productivity savings by the end of 2025 (IBM, 2025).
He is not an outsider critic, which is why his statement that “there is no way” current AI data center spending makes economic sense carries weight (Fortune, 2025; Yahoo Finance, 2025).
Krishna’s Cost Calculation
Krishna says a 1 gigawatt AI data center costs around 80 billion dollars to build and equip with compute (Fortune, 2025; Tom’s Hardware, 2025).
Other estimates vary:
35 billion dollars per gigawatt from Bernstein Research
8 to 40 billion dollars from consulting analyses and LinkedIn data (LinkedIn, 2025)
Even the low end is massive.
The Scale Problem
Big Tech’s public plans collectively point toward 100 gigawatts of AI data center capacity intended for AGI efforts (Fortune, 2025; Business Chief, 2025).
At Krishna’s estimate, this equals 8 trillion dollars in capex.
Even at Bernstein’s lower estimate, it still totals 3.5 trillion dollars, larger than most national infrastructure programs.
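The scale math above is simple enough to check directly. A quick sketch, using the two per-gigawatt estimates cited in the article:

```python
# Total capex implied by ~100 GW of planned AI data center capacity,
# under the two per-gigawatt cost estimates cited above.

planned_gw = 100
cost_per_gw = {
    "Krishna (~$80B/GW)": 80e9,
    "Bernstein (~$35B/GW)": 35e9,
}

for label, cost in cost_per_gw.items():
    total = planned_gw * cost
    print(f"{label}: ${total / 1e12:.1f} trillion total")
```

At 100 gigawatts, the gap between the two estimates is itself measured in trillions, which is why the per-gigawatt figure matters so much.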
The Profitability Barrier
Servicing 8 trillion dollars of capital requires 800 billion dollars in annual profit, assuming a 10 percent return hurdle (Fortune, 2025).
For comparison, 2024 full-year revenue:
Apple: 383 billion dollars
Microsoft: 245 billion dollars
Alphabet: 307 billion dollars (CNBC, 2025; Statista, 2025)
And those are revenue figures; net profits are a fraction of each. None of these companies, alone or even combined, produces anywhere near 800 billion dollars in annual profit.
Krishna’s point: the profit requirement does not exist in today’s economy.
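The capital-servicing arithmetic is a single multiplication; a minimal sketch, where the 10 percent hurdle rate is the assumption reported in Krishna's calculation:

```python
# Capital-servicing math behind Krishna's warning: a 10% annual return
# hurdle applied to an $8T build-out (the hurdle rate is his reported assumption).

capex = 8e12          # projected AI data center spending, in dollars
hurdle_rate = 0.10    # annual return required just to service the capital

required_annual_profit = capex * hurdle_rate
print(f"Required annual profit: ${required_annual_profit / 1e9:,.0f} billion")
```

Any lower hurdle rate shrinks the required profit proportionally, but even at 5 percent the industry would need 400 billion dollars a year.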
The Five-Year Treadmill: Why AI Infrastructure Never Stops Costing Money
If AI data centers were a one-time expense, the economics might be manageable.
The core problem is the replacement cycle. Krishna notes that GPU-based infrastructure must be refreshed every five years because chips age, architectures advance, and model requirements escalate (Fortune, 2025; Tom's Hardware, 2025).
Accelerating Hardware Cycles
Nvidia now releases a major GPU generation every year, each delivering significant gains in performance per watt and cost per token (Business Insider, 2025).
Older GPUs remain useful. Cloud providers like Lambda and Crusoe still profitably run H100s and A100s.
However, frontier model training requires the newest hardware, creating constant competitive pressure to upgrade (Business Insider, 2025; WhiteFiber, 2025).
Depreciation Squeeze
Hyperscalers officially depreciate GPUs over five to six years.
Analysts argue the true economic lifespan for top tier training hardware may be one to three years (CNBC, 2025; Business Insider, 2025; Reddit, 2025; GITSupport, 2025).
Operators disagree with the shortest estimates, but all acknowledge one reality: hardware ages fast.
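The lifespan question drives the whole cost picture. A sketch of annualized hardware cost for a single ~$80B, 1 GW site, assuming straight-line depreciation with no salvage value (both simplifications):

```python
# How economic lifespan changes annualized hardware cost for one ~$80B,
# 1 GW site. Straight-line depreciation, zero salvage value (simplifications).

site_cost = 80e9  # Krishna's ~$80B per gigawatt estimate

for lifespan in (6, 5, 3, 1):  # official schedules vs. shorter economic lifespans
    annual_cost = site_cost / lifespan
    print(f"{lifespan}-year lifespan: ${annual_cost / 1e9:,.1f}B per year")
```

Moving from a six-year schedule to a three-year economic lifespan roughly doubles the annual cost of the same hardware, which is the squeeze the analysts are describing.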
The Financial Reality
An 8 trillion dollar infrastructure build is not like a 30 year bridge or a 50 year power plant.
AI infrastructure must be rebuilt every five years to stay competitive (Fortune, 2025; McKinsey, 2025).
The commitment is not 8 trillion once. It is:
8 trillion at year zero
8 trillion at year five
8 trillion at year ten
The result is a compounding treadmill that almost no industry in history has ever faced.
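The treadmill above can be sketched as a running total. This ignores falling hardware prices and partial refreshes, both simplifications:

```python
# The refresh treadmill: if competitive AI infrastructure must be rebuilt
# every five years, the commitment compounds rather than amortizes.
# Ignores hardware cost declines and partial refreshes (simplifications).

build_cost = 8e12      # per full build-out, at Krishna's estimate
refresh_years = 5
horizon_years = 15     # builds land at years 0, 5, and 10

num_builds = horizon_years // refresh_years
cumulative_capex = num_builds * build_cost
print(f"{num_builds} builds over {horizon_years} years: "
      f"${cumulative_capex / 1e12:.0f} trillion cumulative")
```

Over a fifteen-year horizon, the original 8 trillion dollar figure triples, before accounting for growth in capacity itself.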
What Are We Actually Getting? The Expensive Entertainment Problem