GM. Welcome back to Milk Road AI, where we track the technologies shaping the next decade as we head into a new year.
Here’s what we’ve got for you today:
- ✍️ Jensen Huang’s next AI power grab
- 🎙️ The Milk Road AI Show: Why AI Is the Biggest Investment Shift Since the Internet w/ Kris Patel
- 🍪 Meta goes shopping for AI agents
Bridge helps businesses send, store, accept, and launch stablecoins instantly. Learn how Bridge is powering stablecoin-backed money movement.

Prices as of 10:00 AM ET.

WHY NVIDIA JUST PAID $20 BILLION FOR GROQ
We all know the story of Thanos.
To become the most powerful being in the universe, the guy had to galaxy-hop, beat up the Avengers, and sacrifice his daughter just to collect the Infinity Stones.
Each stone gave him control over a different part of reality: Time, Space, Power, Soul, Reality, and Mind.
Jensen Huang just did the exact same thing.
But instead of fighting Iron Man on Titan, he fought a corporate cold war.
And instead of a magic gold gauntlet, he used a $20 billion check.

Source: Gemini
Nvidia announced a massive deal to absorb the brains and technology behind a startup called Groq.
(Before you get all confused: Yes, we mean Groq with a Q and not Elon's chatbot with a K.)
So why would Jensen spend $20 billion on a company that is projected to make just $500M this year?
Because the CEO of Groq, Jonathan Ross, isn't just some guy.
He is the "Mind Stone" of the AI world.
He is the engineer who invented the TPU (Tensor Processing Unit) while he was at Google.
Then he left to start Groq.
And now, Nvidia just snatched him up to complete their collection.
To understand why this check had so many zeros, you have to understand the secret war happening inside data centers.
The "Swiss army knife" problem 🔪
For the last 10 years, Nvidia has been selling the world the ultimate tool: The GPU.
Think of a GPU like a Swiss army knife.
It is an incredible piece of engineering:
- It can render the pores on a soldier’s face in Call of Duty.
- It can simulate complex weather patterns for climate scientists.
- It can mine Bitcoin for crypto bros.
- And obviously, it can run AI.

But because it does everything, it carries a lot of "baggage" (like your wallet still holding those 2021 altcoins).
A GPU has thousands of tiny parts designed for video games that AI simply doesn’t care about.
It’s heavy, complex, and wastes energy figuring out which tool to pull out.
Google realized this back in 2013.
Their Chief Scientist, Jeff Dean, ran the numbers on a terrifying scenario:
If every Android user used the new "Voice Search" feature for just 3 minutes a day, Google would have to double its global data centers.
That would have cost billions.
So Jeff Dean sounded the alarm, and guess who answered that call to save Google’s universe:
Jonathan Ross.
Ross and his team of engineers built a different tool. He called it the TPU (Tensor Processing Unit).
It couldn't run video games. It couldn't mine crypto.
But it did exactly one thing: AI Math (Matrix Multiplication).
And because it had only one job, it was insanely fast, insanely cheap, and used a fraction of the power.
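That "one job" — matrix multiplication — is less exotic than it sounds. A neural network layer is basically inputs times weights, and every token an LLM generates is millions of these chained together. A minimal NumPy sketch (toy-sized numbers, obviously; real chips do this on enormous matrices, billions of times per second):

```python
import numpy as np

# A neural network layer is, at its core, one matrix multiplication:
# outputs = inputs @ weights.

inputs = np.array([[1.0, 2.0]])          # a tiny "activation" vector
weights = np.array([[0.5, -1.0, 0.25],   # a tiny 2x3 "layer" of weights
                    [1.5,  0.0, 2.0]])

outputs = inputs @ weights
print(outputs)  # [[ 3.5  -1.    4.25]]
```

A chip that only has to do this one operation can throw out everything else a GPU carries around.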
The team moved at lightspeed, going from a whiteboard sketch to actual silicon in data centers in just 15 months.
By 2015, these secret chips were powering Google Maps and Translate before the world even knew they existed.
Google used this secret weapon to survive and build their empire.
But Ross left to conquer the next frontier because he knew what was coming.
STRIPE JUST BET BIG ON STABLECOINS
Stablecoins are arguably the biggest use case in crypto right now.
But let’s be real, using them on-chain is still way too complex.
That’s exactly the problem Bridge is solving.
Recently acquired by Stripe (yep, that Stripe), Bridge helps businesses send, store, accept, and launch their own stablecoins instantly.
Here are three facts about Bridge:
- Users can seamlessly transition between fiat and stablecoins.
- Businesses can launch a custom stablecoin in as little as one day.
- All Bridge-built stablecoins are interoperable with one another.
Stablecoins are going global, and Bridge is powering that shift.
👉 Learn how Bridge is powering stablecoin-backed money movement.

WHY NVIDIA JUST PAID $20 BILLION FOR GROQ (P2)
What Jonathan Ross saw coming was a massive shift in how the world uses AI.
For the last few years, everyone has been obsessed with "Training" (The Learning Phase).
This is where the model goes to university.
You feed it massive datasets, and it adjusts its brain to learn patterns.
It’s like building a spam filter: you feed the computer thousands of Nigerian Prince emails so it learns what a scam looks like.
This takes massive computing power.
Nvidia’s big GPUs are perfect for this.
They are freight trains, great at hauling massive loads of data while the model builds its brain.
But now, we are entering the "Inference" era (The Application Phase).
The model is deployed to the real world to make split-second decisions.
When a new email hits your inbox, the model has to instantly decide: Is this spam?
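The two phases in that spam example can be sketched in a few lines (a toy word-count model for illustration, nothing like a real filter):

```python
# Toy spam filter: "training" counts words in labeled emails once,
# "inference" scores each new email instantly against those counts.

from collections import Counter

def train(emails: list[tuple[str, bool]]) -> tuple[Counter, Counter]:
    """Training phase: heavy, done once over the whole dataset."""
    spam_words, ham_words = Counter(), Counter()
    for text, is_spam in emails:
        (spam_words if is_spam else ham_words).update(text.lower().split())
    return spam_words, ham_words

def is_spam(text: str, spam_words: Counter, ham_words: Counter) -> bool:
    """Inference phase: light, runs on every new email as it arrives."""
    words = text.lower().split()
    return sum(spam_words[w] for w in words) > sum(ham_words[w] for w in words)

model = train([
    ("prince needs your bank transfer", True),
    ("urgent transfer fee prince", True),
    ("lunch meeting tomorrow", False),
    ("see you at the meeting", False),
])

print(is_spam("urgent prince transfer", *model))  # True
print(is_spam("meeting tomorrow", *model))        # False
```

Training happens once and can take as long as it takes; inference happens constantly and has to be instant. Different jobs, different hardware.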
For Inference, you don't need a freight train. You need a Formula 1 car.
And that’s not just a metaphor. Groq literally sponsors the McLaren Formula 1 team.

Anyways, you need speed. You need instant reaction times. You need low latency (latency being that awkward pause between asking a question and getting an answer).
We already established that Nvidia built a "Swiss army knife", and that Groq built a precision "Scalpel".
Here is the difference in raw numbers:
- Nvidia H100: Built for raw power. It burns 700 Watts per chip and takes 8-10 milliseconds to process a thought.
- Groq LPU: Built for speed. Because it strips out the "baggage", it runs on just 200 Watts and processes thoughts in 1-2 milliseconds.
On paper that's roughly a 5x latency gap and 3.5x better power draw per chip. In end-to-end inference benchmarks, Groq has claimed up to 18x faster throughput and about 10x better energy efficiency per token.

And why is it faster? It comes down to memory:
- The GPU acts like a library. It has massive storage (80GB), but every time it needs a number, it has to walk to the shelf and back. That walk takes time.
- The LPU acts like a desk. It uses a special memory called SRAM to keep the exact data it needs open right in front of its face. Zero commute time.
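The library-vs-desk gap can be sized with a back-of-envelope model. Inference is memory-bound: every generated token requires streaming the model's weights through the chip, so memory bandwidth sets the speed limit. The bandwidth figures below are illustrative spec-sheet ballparks (roughly H100-class off-chip HBM vs. LPU-class on-chip SRAM), and the sketch ignores that a large model must be sharded across many SRAM-based chips:

```python
# Back-of-envelope: time to stream a model's weights once per token.

def ms_per_token(weight_gb: float, bandwidth_tb_s: float) -> float:
    """Milliseconds to read weight_gb of weights at bandwidth_tb_s."""
    seconds = (weight_gb / 1000) / bandwidth_tb_s
    return seconds * 1000

# Illustrative figures for a ~70 GB model:
hbm = ms_per_token(weight_gb=70, bandwidth_tb_s=3.35)  # off-chip HBM "library"
sram = ms_per_token(weight_gb=70, bandwidth_tb_s=80)   # on-chip SRAM "desk"

print(f"HBM:  {hbm:.1f} ms/token")   # ~20.9 ms/token
print(f"SRAM: {sram:.2f} ms/token")  # ~0.88 ms/token
```

Same math, same weights; the only thing that changed is the commute.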

How Nvidia bought Groq without "buying" Groq
So, why buy them?
Jensen Huang laid out the corporate-speak version in a leaked email: “We plan to integrate Groq’s low-latency processors into the NVIDIA AI factory architecture...” blah, blah, blah.
Translation: We are crushing the competition.
Nvidia knows the wind is changing.
Jensen wants to ensure that whether you are training a model or using one, you never leave the Nvidia ecosystem:
- Need raw power? Use Nvidia GPUs.
- Need instant speed? Use Nvidia (Groq) LPUs.
He is building a walled garden so high that nobody can climb out.
But here is the absolute genius part.
He didn't just "buy" the company in a standard merger that would get stuck in court for years.
Instead, he structured it as a licensing deal and a mass "acquihire" of the engineering team (including Jonathan Ross).
By "licensing" the tech rather than buying the corporate entity, he bypasses the regulatory headaches (like the FTC) that blocked his previous acquisitions.
He gets the Infinity Stone and the wizard who made it, without asking for permission.
That is why I remain bullish on Nvidia into 2026.
Thanos would be proud. Snap.
Now we want to hear from you. Hit reply and tell us:
Did Jensen make the right move?
- Yes, he locked down inference.
- No, speed alone won’t matter.
- Ask me again in 2026.

AI IS STILL JUST STARTING 🤖
In today's episode, we sat down with Kris Patel, a high-profile retail investor, to talk about why AI could be the biggest investment shift since the internet.
Here's what you'll hear:
- Why AI is still early in a decades-long shift, with real value built over 10 to 20 years (and some hype along the way).
- How to invest through the cycle with gradual allocation, and why heavy leverage and short duration options can blow up fast.
- Who can win the AI race, plus a Meta deep dive on distribution, cash flow, and turning AI into ad dollars.
- The energy angle most people miss: data center power demand, nuclear and SMRs longer term, and gas and LNG nearer term (with key risks).
Don't sleep on this one 👇
YouTube | Spotify | Apple Podcasts

Fuse is a $300M ARR energy company launching The Energy Network on Solana. Discover the future of energy now.
Chainlink is the industry-standard oracle platform bringing the capital markets onchain, and the market leader powering the majority of decentralized finance. Learn more about Chainlink.
Arc is a new Layer-1 blockchain designed to support real-world economic activity and it’s now live in testnet. Dive into Arc and start experimenting.

BITE-SIZED COOKIES FOR THE ROAD 🍪
Meta acquired AI startup Manus in a deal reportedly worth about $2 billion. The buzzy AI agent startup brings real revenue and new tools Meta plans to integrate across its apps.
SoftBank has completed its $40 billion investment in OpenAI. The deal ranks among the largest private funding rounds ever and deepens its AI bet.
The U.S. approved Samsung and SK Hynix to ship chipmaking equipment to China through 2026. The move offers temporary relief as export controls tighten.

MILKY MEMES 🤣


ROADIE REVIEW OF THE DAY 🥛













