Can the artificial intelligence (AI) boom continue?
That’s the #1 investing question as we enter 2025.
AI stocks topped the performance charts again last year. Palantir Technologies (PLTR), which uses AI to catch “bad guys,” was the best-performing S&P 500 stock, surging 340%.
#2 was Vistra Corp. (VST), the boring ol’ power provider profiting from AI’s bottomless appetite for energy.
#3 was Nvidia (NVDA).
We own PLTR and NVDA in Disruption Investor and booked big gains on both by taking “Free Rides” in 2024. Upgrade here.
Can AI stocks really burn bright in 2025? Yes, but…
There will be a whole new class of winners. A group of stocks considered “dogs” today will be the AI champs of 2025.
Our job as investors isn’t to predict the future—it’s to see the present clearly. And a recent reveal from ChatGPT creator OpenAI signals the next phase of the AI boom.
- Imagine if your iPhone had a “think hard” button…
Press it, and suddenly your phone becomes as smart as a room full of PhDs. It could solve physics puzzles and decode biology riddles that stump most experts.
ChatGPT creator OpenAI made this real with its new AI model, o3. It “thinks” before answering. And the more it thinks, the smarter it gets.
On a competitive programming exam, o3 placed among the world's top 175 coders. It already solves math problems that baffle professional mathematicians. And it tackles reasoning challenges that weren't expected to be solvable for years.
Under the hood, o3 breaks down complex tasks into smaller steps. If one approach doesn't work, it tries something else. The AI often spends 15 minutes on a single problem, keeping track of everything it tries along the way.
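For the technically curious, that loop can be sketched in a few lines of Python. This is a purely illustrative sketch of the behavior described above, not OpenAI's actual algorithm, which hasn't been published; the function and strategy names are hypothetical.

```python
# Purely illustrative sketch -- OpenAI has not published o3's internals.
# Models the described loop: try one approach, log it, backtrack to the
# next approach if it fails, and keep the full trace of attempts.

def solve(task, approaches):
    scratchpad = []                         # everything tried along the way
    for approach in approaches:
        result = approach(task)             # attempt one line of attack
        scratchpad.append((approach.__name__, result))
        if result is not None:              # success: answer + full trace
            return result, scratchpad
    return None, scratchpad                 # every approach failed

# Toy problem: "halve this number." The first strategy is a dead end,
# which forces the solver to backtrack and try the next one.
def try_guess(n):
    return None

def try_halve(n):
    return n // 2 if n % 2 == 0 else None

answer, trace = solve(12, [try_guess, try_halve])
```

The scratchpad is the point: the model's "memory" of every failed attempt grows with each step, which is exactly what drives the appetite for memory chips discussed below.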
All this “thinking” takes a lot of “brain power,” supplied by computer chips. Industry insiders tell me models like o3 use roughly 50X more “compute” than the latest ChatGPT.
The key investing insight from o3: spending on AI chips is about to surge (again). But this time, it won’t be Nvidia GPUs slurping up all the money.
- AI has amnesia.
ChatGPT can’t remember past conversations. Each time I start a new chat, I must tell it who I am and what I want.
Thinking models like o3 overcome AI’s forgetfulness. But they need mind-bending amounts of memory to do it.
Picture yourself as a detective trying to solve a murder. Your desk is covered with photos, witness statements, and sticky notes showing connections between clues. As you dig deeper and unearth more evidence, you need bigger and bigger desks.
That's essentially how o3 works. It must store and process enough data to fill thousands of iPhones. That’s triggering a boom in memory chips.
There are two main types of computer chips:
#1: Logic chips, like Nvidia GPUs, which process data.
#2: Memory chips, which store data.
- Memory has always been the “dog” of the chip world.
Memory is cheap and commoditized. And the companies that make it are notoriously boom-and-bust.
Nvidia sells its latest AI chips for $40,000 each. Memory chips often cost just a few dollars.
But AI needs huge banks of memory. AI servers use 8X more memory than classic computers. And that figure is roughly doubling with each new chip generation.
The memory part of Nvidia’s new AI racks costs more than the actual processing unit!
And we're just getting started. Today's models handle about 20,000 tokens (think words) in their reasoning chain. AI that can analyze entire medical journals or design new drugs will need to “remember” the equivalent of a small library all at once.
- The world’s richest companies are racing to create a “digital god.”
That’s why Amazon (AMZN)… Google (GOOG)… Microsoft (MSFT), and Meta Platforms (META) will spend $200 billion+ this year building their AI data centers.
With o3, OpenAI showed that one path to building superhuman intelligence is to get AI to think long and hard about a problem.
Every AI lab will copy it. Google is set to release its own “thinking” model soon. By the end of 2025, every AI company will have its own.
Read: Memory chip sales are about to go through the roof.
But forget the cheap, slow chips of yore. AI needs high-performance memory that can keep up with its rapid thinking.
Companies making high-bandwidth memory (HBM) chips are about to have their “Nvidia moment.”
Memory is a key bottleneck today. More than 90% of the time it takes ChatGPT to answer your question is spent shuffling data between logic and memory chips. It's like having the world's fastest chef stuck with a tiny kitchen, spending most of their time moving ingredients around.
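A quick back-of-the-envelope model shows why data movement dominates. Every number below is an illustrative assumption for the sake of the arithmetic, not the spec of any real chip:

```python
# Back-of-the-envelope model of the "memory wall." The bandwidth,
# throughput, and workload numbers are assumptions for illustration,
# not measurements of any real chip or model.

def step_times(bytes_moved, flops,
               bandwidth=1e12,        # assumed 1 TB/s memory bandwidth
               throughput=1e15):      # assumed 1 PFLOP/s of compute
    move_s = bytes_moved / bandwidth       # time spent shuffling data
    compute_s = flops / throughput         # time spent doing math
    return move_s, compute_s

# Hypothetical token step: read 100 GB of weights, do 200 GFLOPs of math.
move_s, compute_s = step_times(100e9, 200e9)
memory_share = move_s / (move_s + compute_s)   # fraction spent moving data
```

Under these assumed numbers, over 99% of the step is spent moving data, not computing. Make the chef faster (a better GPU) and almost nothing changes; make the kitchen bigger and closer (faster memory) and everything speeds up.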
HBM turbocharges the process by stacking memory chips vertically and placing the stack right next to the logic chip, slashing the distance data has to travel. That’s why HBM memory sales are forecast to jump from $4 billion last year to $81 billion next year.
SK Hynix (listed on the Korea Exchange) is the leading HBM chipmaker. It has already sold out its entire production. US rival Micron Technology (MU) is backed up through 2025.
SK Hynix and Micron are selling the essential ingredient that makes this new class of AI models possible. I predict they’ll be two of the best-performing stocks of 2025.
The AI boom isn't slowing down. It's evolving. And in this next phase, memory will leap from “dog” to “top dog” of the chip world.
Stephen McBride
Chief Analyst, RiskHedge
PS: We held the S&P 500’s #1 best-performing stock in our Disruption Investor portfolio in both 2024 (Palantir) and 2023 (Nvidia). If you’d like to join us for 2025, we just published our brand-new issue, which lays out our game plan for this year and what to expect. It’s an important issue, and you can access it (and our entire portfolio) by becoming a Disruption Investor member today. Upgrade here.