GM. Welcome back to Milk Road AI, where we track how AI innovation turns into real infrastructure and real money.
Here’s what we’ve got for you today:
- ✍️ Why the AI boom just ran headfirst into a memory shortage
- 🎙️ The Milk Road AI Show: AI Isn’t Overpriced... The Market Is Still Massively Underestimating It w/ Eric Jackson
- 🍪 Alphabet buys Intersect to power Google’s AI expansion
Bridge helps businesses send, store, accept, and launch stablecoins instantly. Learn how Bridge is powering stablecoin-backed money movement.

Prices as of 10:00 AM ET.

THE AI BOOM JUST RAN INTO A MEMORY WALL
You know that moment when your phone says “Storage Almost Full” and you immediately start deleting photos like you’re erasing evidence?

That’s basically the global semiconductor industry right now.
Except instead of your camera roll, it’s memory chips.
And instead of cute dog pictures, it’s the fuel that powers AI data centers, smartphones, PCs, and basically every device with a pulse.
Late 2025 has delivered a full-blown memory chip shortage, and the impact is spilling over from the tech giants to the average consumer.
We just got the official confirmation from Micron, one of the three kings of the global memory market.
Their latest earnings report was a total mic drop.
Revenue hit $13.64 billion, completely crushing the $12.88 billion forecast.

The stock is already up 200%+ this year, and investors are basically saying, “Yeah... memory isn’t getting cheaper anytime soon.”
Micron isn’t some obscure chip name you squint at on an earnings calendar.
If you use a smartphone, a laptop, or anything connected to an AI data center, there’s a very good chance Micron’s chips are inside it.
So the obvious question is: why is this happening now, and why does it feel different than the usual boom-bust memory circus?
Why this time is different
Historically, memory is one of the messiest industries on earth.
The cycle usually goes like this:
Demand rises → companies panic-build factories → chips flood the market → prices collapse → everyone cries → repeat.

But this time, there’s a new character that breaks the script: AI infrastructure.
For decades, consumer electronics (your iPhone, your Dell laptop, your PlayStation) were the main drivers of demand.
Now AI data centers have shown up, and they don’t shop like normal customers.
They don’t buy “a little more memory.”
They lock it up years in advance and dare the rest of the market to figure it out.
If you think that is an exaggeration, look at what OpenAI just did.
They signed a partnership with Samsung and SK Hynix (the two biggest suppliers in the world) for their massive “Stargate” initiative.
(A $500B effort to build AI supercomputers at national-infrastructure scale.)
The deal secures them 900,000 DRAM wafers per month.
DRAM is a computer’s short-term memory. It’s where information lives while the machine is actively thinking.
That single agreement represents around 40% of global DRAM output.

One single AI project just swallowed nearly half of the global supply.
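For the curious, that 40% figure also implies a rough size for the whole market. A quick back-of-the-envelope sketch (the implied global total is our extrapolation from the reported numbers, not a published figure):

```python
# Back-of-the-envelope: the OpenAI deal reportedly locks up
# 900,000 DRAM wafers per month, said to be ~40% of global output.
# Dividing one by the other gives the implied global total.

DEAL_WAFERS_PER_MONTH = 900_000   # from the reported agreement
DEAL_SHARE_OF_GLOBAL = 0.40       # ~40% of global DRAM output

implied_global = DEAL_WAFERS_PER_MONTH / DEAL_SHARE_OF_GLOBAL
print(f"Implied global DRAM output: {implied_global:,.0f} wafers/month")
# → Implied global DRAM output: 2,250,000 wafers/month
```

In other words, one customer's order is sized against a market of roughly 2.25 million wafers a month.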
That is how you get shortages that spread across everything from data centers to consumer devices.
And the next part is where most people completely misunderstand how AI hardware actually works (keep reading).
STRIPE JUST BET BIG ON STABLECOINS
Stablecoins are arguably the biggest use case in crypto right now.
But let’s be real, using them on-chain is still way too complex.
That’s exactly the problem Bridge is solving.
Recently acquired by Stripe (yep, that Stripe), Bridge helps businesses send, store, accept, and launch their own stablecoins instantly.
Here are three facts about Bridge:
- Users can seamlessly transition between fiat and stablecoins
- Businesses can launch a custom stablecoin in as little as one day
- All Bridge-built stablecoins are interoperable with one another
Stablecoins are going global, and Bridge is powering that shift.
👉 Learn how Bridge is powering stablecoin-backed money movement.

THE AI BOOM JUST RAN INTO A MEMORY WALL (P2)
Most people think AI training is limited by the GPU (the brain).
Not exactly.
A lot of the time, that $30,000 Nvidia chip is just sitting there, filing its nails, waiting for data to arrive.
Quick refresher: a GPU is the part of the computer that does the heavy math. It’s insanely fast at calculations, which is why AI runs on GPUs in the first place.
But computers really only do two things:
- Think (calculate).
- Fetch (move data).
Over the last 10 years, the “thinking” part got about 1,000x faster.
The “fetching” part? It only got a little faster.
If you want to see exactly why this matters, look at this chart:

The red line represents the demand coming from AI, specifically how fast model sizes are growing.
We are seeing roughly a 410x increase every two years. That is exponential growth on steroids.
Now look at the green line, which represents the actual physical improvement of memory hardware: it’s only growing at 2x every two years.
That massive, widening gap between the two lines is the “Memory Wall,” and it’s one of the biggest bottlenecks in tech right now.
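To see how fast that gap compounds, here’s a minimal sketch using the two growth rates above (410x and 2x per two years come from the chart; the 10-year horizon is our choice for illustration):

```python
# Compound the two growth rates behind the "Memory Wall":
# AI model sizes grow ~410x every two years, while memory
# hardware improves only ~2x every two years.

MODEL_GROWTH_PER_2YR = 410
MEMORY_GROWTH_PER_2YR = 2

for years in (2, 4, 6, 8, 10):
    periods = years // 2
    demand = MODEL_GROWTH_PER_2YR ** periods   # model-size growth factor
    supply = MEMORY_GROWTH_PER_2YR ** periods  # memory hardware growth factor
    print(f"{years:>2} years: demand {demand:.1e}x vs hardware {supply}x "
          f"(gap {demand / supply:.1e}x)")
```

After a single decade, demand has outgrown hardware by a factor of hundreds of billions. That’s why “just wait for memory to catch up” isn’t a plan.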
To fix this, companies are buying High Bandwidth Memory (HBM), which sits right next to the GPU and feeds it data at warp speed.
But HBM is brutal to make.
It uses roughly 3x more silicon than standard DRAM because instead of building a single-story house (phone memory), you’re stacking a skyscraper and running elevators between every floor.
The industry is effectively choosing to feed the rich kid (AI) first, while everyone else gets rationed.
And it’s not just the chips.
Storage is getting crushed too.
Because AI needs to read massive datasets instantly, data centers are buying up all the high-end enterprise SSDs (Solid State Drives, the super-fast hard drives that make your laptop wake up instantly).
When those run out, they start buying the flash memory (NAND) usually reserved for consumers.
That is how a shortage in a server farm ends up making your next laptop more expensive.
And now, the Hyperscalers (Google, Microsoft, Meta) are treating memory like a strategic weapon.
They are locking down supply with multi-year contracts like they’re hoarding concert tickets.
And because they cleared out the inventory, the shortage is finally spilling over into your life.
For example, memory is a huge chunk of a phone’s cost.
When prices spike, makers like Samsung or Xiaomi have three bad options:
- Raise the price.
- Make the phone dumber (cut memory specs).
- Eat the cost (spoiler: they won't).
Mid-range phones in 2026 are about to get pricier or worse.
The premium players (Apple) might survive because they lock in supply years in advance, but even they might slow down their annual upgrades if costs stay this crazy.
How long does this last?
This is the trillion-dollar question.
Most credible timelines point to tight conditions lasting through 2026 and potentially into 2027.
Goldman Sachs is modeling DRAM supply staying tight through 2026, with shortages actually worsening before they get better, and only easing meaningfully in 2027.

In other words, the industry isn’t about to flip from shortage to glut overnight.
Even in Wall Street’s base case, memory stays constrained for another couple of years.
Could it end sooner? Sure, but it takes one of two things:
- A real slowdown in AI capex.
- A supply surge that arrives faster than expected, which is hard in memory, especially for HBM.
Until then, we are in the scarcity era.
As long as AI data center capex keeps running hot (which we think is the most likely outcome), memory prices stay elevated and we remain bullish on Micron.
If AI capex slows, memory snaps back fast, because this industry has a long history of swinging from famine to oversupply.
So this might not be a forever trade.
The question is whether that timer is two years or five.
Alright, that’s it for this edition of Milk Road AI, but we’d love to know what you think.
Hit reply and tell us:
Are you buying Micron here?
- Yes, riding the AI memory wave.
- No, cycle always turns.
- Already in.

AI ISN’T OVERPRICED, IT’S UNDERHYPED 🤖
In today’s episode, we sat down with Eric Jackson (EMJ Capital) to talk about why the market is still massively underestimating AI, and what that means for crypto, data centers, and investing.
Take a look at what surprised us:
- Why Eric thinks there is no AI bubble, and why agentic AI is the real unlock
- The crypto + AI thesis: Bitcoin as collateral, Ethereum as the execution layer for agents
- The bitcoin-miner pivot into AI data centers, and what energy constraints mean from here
- The Nextdoor thesis, plus how EMJX aims to add an AI hedging overlay to BTC and ETH exposure
Tune in and see for yourself 👇
YouTube | Spotify | Apple Podcasts

Fuse is a $300M ARR energy company launching The Energy Network on Solana. Discover the future of energy now.
Chainlink is the industry-standard oracle platform bringing the capital markets onchain, and the market leader powering the majority of decentralized finance. Learn more about Chainlink.
Arc is a new Layer-1 blockchain designed to support real-world economic activity and it’s now live in testnet. Dive into Arc and start experimenting.

BITE-SIZED COOKIES FOR THE ROAD 🍪
Nvidia plans to ship H200 AI chips to China by mid-February, pending approval. The move follows a U.S. policy shift allowing sales with a 25% fee.
Alphabet is buying clean energy developer Intersect for $4.75 billion. The deal supports Google’s growing AI and data center power needs.
Waymo resumed robotaxi service in San Francisco after a blackout stalled vehicles across the city. The outage disrupted traffic signals and temporarily forced service to pause.

MILKY MEMES 🤣


ROADIE REVIEW OF THE DAY 🥛

