#186 - AI Investment: Boom, Bubble, or Breakthrough?
“A tree is known by its fruit; a man by his deeds. A good deed is never lost; he who sows courtesy reaps friendship, and he who plants kindness gathers love.”― Saint Basil
We’ve seen this movie before. Railroads, electrification, the internet—each sparked massive capital cycles long before the payoff was clear. Today’s AI buildout is following that script, with hyperscalers pouring $217 billion in 2024—and an estimated $320 billion in 2025—into GPUs, data centers, and hyperscale compute. Yet the business case is shaky: MIT’s Project NANDA found 95% of enterprise AI pilots fail to create measurable value. Most corporate AI spend today fuels tools that act as personal assistants, not engines of transformation.
Shadow AI Is the Real Success Story. While only about 40% of firms have purchased official enterprise LLM subscriptions, employees at over 90% of companies already use AI tools—ChatGPT, Claude, Copilot—on personal accounts to automate large parts of their jobs, often without IT oversight. This “shadow AI economy” has delivered the most tangible ROI to date, showing that AI is quietly reshaping workflows. But these gains are not yet the kind of enterprise transformation needed to justify trillion-dollar infrastructure bets.
One of the biggest problems with the AI arms race today is that none of the hyperscalers has a clear moat. Today’s LLM race is driven more by capital than by unique intellectual property. Whoever can source GPUs, energy, and data infrastructure at the lowest cost will dominate. This is a scale-and-cost game—think Standard Oil’s refining dominance, but with data and energy. The hyperscalers’ strategy is to spend aggressively to achieve platform lock-in, but the sheer pace of hardware obsolescence (training GPUs last under three years) makes this an expensive gamble.
Meanwhile, small language models (SLMs) running more efficiently on edge devices could rewrite the economics of AI. If inference moves from hyperscale clusters to smartphones, laptops, and local servers, today’s GPU megafarms could become tomorrow’s stranded assets. Open-source communities are also catching up fast, with models that match or exceed proprietary performance at a fraction of the cost. Together, these forces threaten to flatten the hyperscaler capex curve.
Why does this all matter? The implications ripple far beyond AI capex. Energy producers, chip manufacturers, geopolitical negotiations, and data center developers are all riding this capex wave. The easy wins—writing aids, code generation, analytics—have been widely deployed. But the hard use cases—automating supply chains, complex business processes, and enterprise workflows—remain elusive. Much of today’s spending is a moonshot on AGI and market dominance, not a response to proven ROI.
The bottom line: AI is in the midst of a historic investment cycle, but its future will be shaped by cost curves, model efficiency, and where inference happens. The winners will be those who bridge the gap between hype and durable, measurable returns before capital costs catch up. This is a boom with massive potential—but also massive downside risk if the promised transformation doesn’t materialize.
What are your thoughts on the current AI Landscape? Shoot me an email at bmoss@sh-cre.com
Investor’s Corner
Why is the yield curve steepening? Because of coming Fed cuts and fiscal challenges - Apollo - Link
CMBS issuance surges amid investor demand - Link
Daily data on consumer spending - Apollo - Link
GMO’s quarterly letter - American Unexceptionalism - Link
What the super investors bought and sold in Q2 - Link
Technology Corner
An MIT report finding 95% of AI pilots fail - Link
Politics, Philosophy, and Theology Corner
Tyranny is here - the Church in America must fight back - Link
The rise and fall of English exceptionalism - Link
Steven Sinofsky and Balaji Srinivasan on the future of M&A, tech, and AI - Link