What superforecasters and extinction experts are saying
Ric Edelman: It's Tuesday, August 8th. Yesterday we talked a lot about AI and all the cool things that it's going to do, along with some creepy things that are scary. And yeah, I've gotten a lot of questions about all this. The most common question I've been getting: is AI going to kill humans? Are we headed for extinction? You know, Skynet and the Terminator, The Matrix, and all the other dystopian sci-fi movies you've seen?
Well, you're not the only one asking that question. And now we have an answer from the world's experts. In a new paper, researchers from the Federal Reserve Bank of Chicago and the University of Pennsylvania surveyed two groups of experts. The first group consists of experts in nuclear war, bioweapons, AI, and extinction itself. The other group consists of people known broadly as futurists or superforecasters. These are consultants whose careers are devoted to making incredibly accurate predictions on all sorts of topics, from election results to whether a product will be popular.
The researchers interviewed 89 superforecasters, 65 domain experts, and 15 extinction risk experts. They were all asked to consider two different kinds of disasters: a catastrophe that kills 10% of all people (and by the way, World War II killed about 3%), and an extinction event that kills everybody except maybe about 5,000 people.
The experts in nuclear war, bioweapons, and AI collectively said there's a 20% chance of a catastrophe and a 6% chance of extinction. The superforecasters were much less pessimistic. They said there's only a 9% chance of catastrophe and a tiny 1% chance of extinction. And what about the AI experts specifically? They were in the middle: a 12% chance of catastrophe and a 3% chance of extinction.
The superforecasters were the least pessimistic of the bunch. They note that the more you focus on only one thing, the more important you think that one thing is. Since AI experts spend all their time looking at AI, they see it as the worst possible threat. But people who put AI into a broader context don't hold such negative views.
In fact, they made this point by looking at nuclear weapons. The world has lived with nuclear weapons for 80 years, and we've still never had a nuclear war. But that doesn't stop nuclear experts from being convinced that it's going to happen. And by the way, which group of experts gets the most media coverage? The ones who are the most pessimistic. So when you see an expert on TV telling you that he thinks AI is going to kill us all, remember that a lot of other experts don't agree at all. The reason I'm mentioning all of this is that when people start to fear extinction, or worry that AI will steal their jobs and leave them homeless, they start to change how they invest. Don't let fears of technological innovation panic you into selling everything.
Instead, invest in the tech, capitalize on it, put it to work for you. Use it to create wealth and get rich. And hey, if you're wrong and the world does come to an end, well, it won't matter where you had invested anyway.
So instead of thinking about mattresses and holes in the ground for your money, consider this.
Grand View Research says the global AI market will grow tenfold by the end of the decade. The Marketing AI Institute says AI will boost corporate profits 38%. Accenture says AI will add $14 trillion to global GDP by 2035.
So instead of thinking about mattresses, I'm thinking about an ETF: the Global X Artificial Intelligence and Technology ETF, symbol AIQ. This ETF invests in companies worldwide, both household names and newcomers, across information technology, communication services, consumer discretionary, industrials, financials, and healthcare. Yeah, healthcare stocks. AI is going to touch everything, and AIQ is your opportunity to be part of it. I encourage you to take a look at that. You can learn more at GlobalXETFs.com. The link's in your show notes, or talk to your financial advisor. And yes, you can stop worrying that AI is going to cause the extinction of humans.
Hey, today at 2 p.m. Eastern, I'm presenting a special webinar on the Bitcoin ETF applications that are right now before the SEC. Will the SEC say yes? Why might they say yes now when they've said no to these applications for the past 7 or 8 years? And what does it mean for the price of Bitcoin if the SEC says yes, or maybe says no? We're going to explore all of this. It's today, 2 p.m. Eastern. You can attend for free. Just register at DACFP.com. See you there at 2 p.m. today.