Unapocalyptic

A Guiding Principle for Foresight

Milan Ćirković reads Nick Bostrom, and finds some optimism for our long-term future

Karl Schroeder
Jan 03, 2025

Is there a way to convert what I’ve called ‘earned optimism’—optimism generated by taking actual, real-world steps to secure a better future—into policy? Well, I just received a wonderful Christmas present from astrophysicist and Stapledonian speculator Milan Ćirković that goes a long way to answering that question.

Milan and I have been corresponding for over twenty years. He’s written critical papers about my ideas, and I’ve incorporated his analyses into my work. If you want a sense of how far out on the cutting edge Milan is, check out his recent paper “Gaia as Solaris: An Alternative Default Evolutionary Trajectory,” written with Srdja Janković and Ana Katić. It’s mind-blowing stuff, but par for the course for Milan.

Now, in “Metatechnological Mapping and the Ye Wenjie Effect: Mitigating Civilizational Vulnerabilities,” he’s applied his incisive and creative mind to the question of preventing extinction-level apocalypse. The paper is behind a paywall, but in essence it takes an important idea of Nick Bostrom’s and extends it in what I would call an ‘unapocalyptic’ direction.

Bostrom’s 2019 paper, “The Vulnerable World Hypothesis,” introduces a rather depressing possibility:

VWH: If technological development continues, then a set of capabilities will at some point be attained that make the devastation of civilization extremely likely, unless civilization sufficiently exits the semianarchic default condition.

The ‘semianarchic default condition’ is where we are now. It’s the situation where technologies are developed with no oversight, essentially at random, and there is no overarching governance to prevent the creation of potentially extinction-driving tech. Bostrom uses the example of ‘easy nukes,’ an idea I explored in my short story “Laika’s Ghost” (2011). Another example is public-domain genetic libraries and democratized gene-hacking, which could make possible artificial pathogens such as the one in Frank Herbert’s The White Plague (1982), which targets only women. Even seemingly benign technologies, such as fast nuclear drives for interplanetary travel, can be weaponized; any of the spacecraft in The Expanse series would carry the kinetic energy of a nuclear bomb when traveling at speed between worlds. To safeguard a busy solar system, Earth would need a policy of destroying any rapidly moving ship whose trajectory might intersect the planet.

This is Bostrom’s conclusion: that surveillance and oversight will have to progressively tighten to prevent increasingly dangerous inventions from escaping into a world where what he calls the ‘apocalyptic residue’ of society might use them. This residue is all the people who have lost any investment in civilization, for whatever reason. They are the nihilists, the doomsters, and all it takes is one ‘super-empowered individual’ to spoil the party for the rest of us.

I like Bostrom, but he tends toward black-and-white thinking. His conclusions are always rigorously worked out but sometimes lack a broader meta-level of analysis. This is where Milan’s new paper comes in. He points out that the tightening and restriction of society’s access to information that Bostrom prescribes could have the opposite of its intended effect. Totalitarianism is the perfect breeding ground for an apocalyptic residue. Milan Ćirković calls this the ‘Ye Wenjie Effect,’ after the radicalized anti-heroine of Cixin Liu’s novel The Three-Body Problem (2014). The injustices built into totalitarianism naturally create disaffected and traumatized people, any of whom could decide to burn the whole edifice to the ground. If Milan is right, this puts us on the horns of a dilemma: how do we restrict the knowledge that can produce Frank Herbert’s rogue geneticist John Roe O’Neill, while avoiding the totalitarian level of control that will surely produce a Ye Wenjie?

Ćirković’s (partial) solution is brilliant and shows why he and I get along so well. If you think about it, Bostrom is proposing that we try to root out or defend against the apocalyptic residue once it exists. Milan suggests instead that we make it as unlikely as possible for the residue to exist in the first place.

This flips the narrative of social change, from a focus on restrictions on knowledge by some monolithic authority (which could itself become the apocalyptic residue!) to a focus on removing conditions of social injustice and psychological trauma that could lead an individual or group to develop apocalyptic goals. Bostrom is making an assumption akin to the Biblical dismissal that “the poor will always be with us.” It takes for granted that the residue itself is an inevitability; what is not inevitable is our response to it. Milan Ćirković lifts the curtain on this assumption. His conclusion is that, while it’s clearly a good idea to restrict the accessibility of “easy nukes” and other trivially simple routes to an apocalypse, the best overall strategy is to ensure that people live in a just, equitable society and don’t desire its destruction.

An Ethical Compass for Foresight
