Warning: We’re touching on some thorny issues in today’s Jolt.
I’ve said many times, “This is the WORST artificial intelligence is ever going to be.”
Google (GOOG) just proved me wrong.
Last week, Google launched Gemini, its much-anticipated ChatGPT competitor.
If you own Google because you think it will be a leader in AI, think again. It has almost no chance of winning the AI race.
Google has everything it takes to win: money, data, and chips. But it’s been hijacked by what I call the “PCP”—political correctness police.
Like ChatGPT, Google Gemini can generate images. Here’s what you get when you ask it to portray America’s Founding Fathers. Notice anything weird?
There are dozens more examples of Gemini rewriting history (see them on X).
What’s going on?
Remember, Google had an AI chatbot ready to go before ChatGPT. But senior execs were afraid to release it because the AI might say something “politically incorrect.”
So, Google fine-tuned Gemini to remove stereotypes. The result: Gemini refuses to depict white people. I dug into Google’s Gemini documentation to see exactly how it works.
In short, when you give Gemini a prompt, it first gets routed to an AI model trained on diversity and “anti-harm” rhetoric. This model secretly edits the prompt before passing it to the image generator (you can’t see the edits).
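To picture the architecture being described, here is a minimal, purely illustrative Python sketch of a two-stage pipeline in which an intermediate model rewrites the user's prompt before the image generator ever sees it. Every function name below is hypothetical; it shows the general pattern, not Google's actual code.

```python
# Illustrative sketch of a two-stage text-to-image pipeline where a
# "prompt rewriter" silently edits the user's prompt before the image
# model sees it. All names are hypothetical stand-ins, not a real API.

def rewrite_prompt(user_prompt: str) -> str:
    """Stand-in for the intermediate model that edits prompts to follow
    policy guidelines (for example, injecting diversity language)."""
    # A production system would call a fine-tuned language model here.
    return user_prompt + ", depicted as a diverse group of people"

def generate_image(final_prompt: str) -> bytes:
    """Stand-in for the image-generation model."""
    # A production system would call a diffusion model here.
    return f"<image generated from: {final_prompt!r}>".encode()

def handle_request(user_prompt: str) -> bytes:
    # The user only ever sees their original prompt; the rewritten
    # version is an internal detail they cannot inspect.
    final_prompt = rewrite_prompt(user_prompt)
    return generate_image(final_prompt)

if __name__ == "__main__":
    print(handle_request("Portrait of America's Founding Fathers").decode())
```

The point of the sketch is simply that the edit happens between the user and the generator, which is why the output can diverge from the prompt without the user seeing why.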
Gemini’s refusal to depict white people, or even to say positive things about them, isn’t a silly mistake. The AI works as designed and reflects the political leanings of those who built it.
Gemini is trained on ideological datasets whose stated purpose is to oppose “stereotypes,” even if those stereotypes are true.
Gemini prioritizes diversity over facts. It sacrifices accuracy for ideology.
This affects you, even if you plan never to touch AI.
First off, we’re investors. And Google is the world’s fourth most valuable company.
Many of you own Google to profit from AI. And even if you don’t own it directly, you’re exposed through popular ETFs like SPY and QQQ.
I’ve suggested avoiding Google many times. With this latest embarrassment, Google has irreparably damaged its reputation in AI. Oh, how the mighty have fallen.
Google still dominates search and will continue to rake in billions of dollars selling ads. But as for being an AI leader? “Forgetaboutit,” to quote the great philosopher Tony Soprano.
Google’s stock is still below its 2021 highs. Avoid it.
But the reason this really matters is that by blatantly rewriting history, Google could speedrun us into a dystopian nightmare.
Here’s George Orwell in his classic 1984:
Every record has been destroyed or falsified, every book has been rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered.
… I know, of course, that the past is falsified, but it would never be possible for me to prove it, even when I did the falsification myself. After the thing is done, no evidence ever remains.
The solution isn’t to shut down or ban AI. No, no, the total opposite. We must fight to keep AI open and push back against regulation.
Big AI players like Google and OpenAI tell us this tech is dangerous. That it must be heavily regulated to keep it out of the hands of the bad guys.
What they’re trying to do is craft regulation in their favor, so they have full control over AI.
This would mean a handful of companies, unlikely to be staffed by freedom lovers, would be able to censor what we see online without most people even realizing it.
Imagine if people like this got full control of AI and used it to present their ideology as fact.
I remain pro-AI.
Every technology brings tradeoffs. Fire warms us and lets us cook food. Armies have also used it to burn down cities.
Let’s embrace the “good”: personalized AI tutors, robo-doctors, and self-driving cars.
Let’s also fight to keep AI open. We have freedom of speech. We need freedom of “compute.”
I’m interested to hear your thoughts on this topic. Write me at stephen@riskhedge.com.
The biggest nuclear breakthrough in 50 years just happened.
A UK-based engineering firm developed a technique that could slash the time it takes to “weld” a nuclear reactor from 150 days to 24 hours.
Welding is a major bottleneck for building nuclear plants. For safety reasons, reactor vessels are made of thick, heavy-duty metal that requires specialized welding.
The welding of these vessels can take up to a year. Sheffield Forgemasters figured out how to do it within a day.
Nuclear offers the cleanest, safest source of energy known to man. That’s indisputable.
The problem is that it takes deep pockets and decades to build a new plant. America’s newest reactor in Georgia came in seven years late and $17 billion over budget. Yikes.
This breakthrough was made on a small modular reactor (SMR), which is essentially a mini nuclear reactor. It looks less like a traditional power plant with giant concrete cooling towers, and more like an oil tank.
SMRs have fewer moving parts and are far cheaper to build. We can build many more of them, much more quickly and easily.
SMRs already power many military submarines. Crews sleep just a few feet away from the reactor, which holds enough energy to power the submarine for years without refueling.
Once regulation catches up with the technology, each town could have its own SMR. Clean, safe energy that’s too cheap to meter. Sign me up.
This will also make it much cheaper to manufacture stuff across America. And that will unlock a golden era of disruption.
My wife gave me the greatest gift in the world: two beautiful kids.
It’s no coincidence that most ultra-successful people are parents. Kids are the best reason to jump out of bed in the morning and work your butt off.
But you’d think parenthood was a hellscape judging by these two recent pieces from Vox and The New Yorker.
The media has gone so nutty that something as wholesome as a family is to be “dreaded.”
Sources: Vox; The New Yorker
I hope no one falls for this nonsense. We need more kids, not fewer.
More people = more brainpower to invent new things and drive humanity forward.
And remember, don’t read the news.
Stephen McBride
Chief Analyst, RiskHedge