In Donald Trump, America voted for chaos. That’s even more dangerous than it seems.

November 13, 2024:

It doesn’t take a political genius — whose ranks seem to have grown lately, based on the sheer number of very confident post-election takes over the past week — to see that many, many Americans have voted to blow up the system. Donald Trump has, if nothing else, incarnated a belief that the way America was being run was fundamentally broken and needed to be overhauled from top to bottom.

That, more than any policy specifics around taxes or immigration or foreign policy, was my takeaway from November 5. A (bare) majority of Americans wants to take a wrecking ball to everything.

But those feelings, and the anger that feeds them, extend well beyond Trump voters. One bit of news that caught my attention this week was Rep. Alexandria Ocasio-Cortez (D-NY) asking her Instagram followers why some of her constituents cast ballots both for her and for Trump.

What I see in these answers is that frustration with the system isn’t something that can be attributed just to one party or another, even if it is currently concentrated in the GOP. An avowed leftist like AOC and President-elect Trump are about as far apart as two American politicians can be, but large segments of their supporters are united by anger at the way things are and by a thirst for radical change of some sort.

I can understand their point. In the nearly 25 years that I’ve been a professional journalist, I’ve seen a catastrophic overreaction to 9/11 lead to a two-decade war on terror; thousands of dead American soldiers and hundreds of thousands of dead civilians in Iraq, Afghanistan, and elsewhere; and a Middle East that remains chaotic. I’ve seen the 2008 Great Recession and the years of economic misery that followed.

I’ve seen the failure to prepare for a major pandemic that many people saw coming, and I’ve seen the failure to learn from it in a way that prepares us for the next one. I’ve seen political barriers harden against economic and technological progress that could meaningfully improve people’s lives. And I’ve seen very few people in power held accountable for those failures.

Depending on where you fall on the political spectrum, you can undoubtedly add your own points to this list. I may believe, as I have written repeatedly, that the long run has seen human life improve immeasurably, and I retain confidence that better days ultimately lie before us. Yet I can still understand why voters on both the right and the left would look at the wreckage of the past 20 years and pull a lever for radical change, consequences be damned.

Here’s the thing, however, about radical change. It is, as our more numerate readers might say, a “high-variance strategy,” meaning that the range of possible outcomes is far wider than what we might expect from more incremental, within-the-system change.

Perhaps we hit the jackpot and land on the political choices that really can create something meaningfully better out of a broken system. But just as likely — perhaps more likely if you know anything about political revolutions in recent history — is that radical change will leave us worse off, and it will turn out that the system so many had come to despise was, in fact, our last line of defense against something much, much worse.
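For the more numerate readers mentioned above, here is a minimal sketch of what “high variance” means in this context. It is a toy simulation with invented numbers (the spreads below are purely illustrative, not estimates of anything real): both strategies are given the same average outcome, but the radical one swings far more wildly in both directions, including toward catastrophic losses.

```python
import random

# Toy illustration only: invented parameters, not a model of real politics.
# Both strategies are centered on the same expected outcome (zero);
# only the spread of possible results differs.
random.seed(0)

def incremental_change():
    # Small wins or small losses, tightly clustered around the status quo.
    return random.gauss(mu=0, sigma=1)

def radical_change():
    # Same average, but occasional huge wins and equally huge disasters.
    return random.gauss(mu=0, sigma=10)

trials = 100_000
inc = [incremental_change() for _ in range(trials)]
rad = [radical_change() for _ in range(trials)]

print(f"incremental: mean {sum(inc)/trials:+.2f}, worst {min(inc):+.1f}, best {max(inc):+.1f}")
print(f"radical:     mean {sum(rad)/trials:+.2f}, worst {min(rad):+.1f}, best {max(rad):+.1f}")
```

The point of the sketch is simply that identical averages can hide very different worst cases, which is why the downside of a high-variance bet deserves as much attention as its jackpot.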

The night is dark and full of terrors

If you, like much of the electorate, think things couldn’t possibly get worse, I have some reading for you.

Less than a week before the election, the pointy-heads at the RAND Corporation published a 237-page report on Global Catastrophic Risk Assessment. (I did not say it would be light reading.)

The report is a response to the 2022 Global Catastrophic Risk Management Act, which required the Secretary of Homeland Security and the administrator of the Federal Emergency Management Agency to assess really big risks to human survival and to develop and validate a strategy to safeguard the civilian population in the face of those risks. If the ultimate purpose of government is to keep us safe in a dangerous world, that law is meant to prompt the US government to anticipate and prepare for the most dangerous risks of all.

The RAND report breaks down catastrophic risk into six main possibilities: asteroid and comet impacts; supervolcanoes; major pandemics (both natural and human-made); rapid and severe climate change; nuclear conflict; and, of course, artificial intelligence. (I’d call them the Sinister Six, but I suspect that might bring Marvel’s trademark lawyers calling.)

What these six have in common, the report notes, is that they could “significantly harm or set back human civilization at the global scale … or even result in human extinction.”

It’s important to pause for a moment on what that really means. We just finished an election in which a majority of Americans indicated that they were very unhappy with the way things are going. They’re mad about high prices, mad about immigration, mad about Joe Biden, or mad about Donald Trump.

Despite all the fury, however, these are fairly ordinary things to be mad about, ordinary political and economic problems to suffer through. Thinking about catastrophic risks helps put them in some perspective. A nuclear war — a possibility that is more likely now than it has been in decades — could kill hundreds of millions of people, and leave the planet so battered that the living would envy the dead.

We already know from Covid the damage a pandemic with a relatively low death rate could do; something more virulent, especially if it were engineered, could resemble something out of dystopian fiction — except the possibility is very real. The risk from out-of-control powerful artificial intelligence is almost entirely unknowable, but we would be fools to completely dismiss the dire warnings of those in the field.

And with the exception of asteroids and comets — where actual, intelligent space policy has helped us better understand the threat and even begin to develop countermeasures — the RAND report judges that all of these risks are either static or increasing. (The supervolcano threat, the one that remains unchanged, is largely outside human prediction or control, but thankfully we know enough to judge that the probability is very, very low.)

So why are the risks from nuclear conflict, major pandemics, extreme climate change, and artificial intelligence all increasing? Because of human decisions, otherwise known as policy.

Will we act as though climate change is the catastrophic threat so many of us believe it to be and engineer our society and economy to mitigate and adapt to it? Will we reverse the collapse of global arms control treaties and edge back from the brink of nuclear conflict? Will we actually learn from Covid and empower the policies and unleash the science to stop the next pandemic, wherever it comes from? Will we do anything about AI — and can we?

The answers aren’t easy, and no one political party or candidate has a monopoly on all the best ways to handle catastrophic risk. Reducing the risk of extreme climate change may mean getting serious about the consequences of what we eat and what we drive, in a way sure to anger Republicans — but it may also mean releasing the brakes on rapid energy development and housing construction, brakes that Democrats have too often defended. Minimizing the danger of future pandemics may require defending the global health system, but it may also demand cutting the red tape that often strangles science.

Above all, it will demand dedication and professionalism in those we choose to lead us, here in a country where that’s still possible; men and women who have the skill and the understanding to know when caution is required and when action is inescapable. And from us, it will demand the wisdom to recognize what we need to be defended from.

The system has failed us. But there are far worse things than the failure we’ve experienced. As we move deeper into a 21st century that is shaping up to be the most existentially dangerous one humanity has ever faced, we should temper the pull of radical change with an awareness of what can go wrong when we pull down all that we have built.
