Secure Things
Tim Maly
Senior Lead
Strategic Design & Communications
RISD
1. Some good advice about storytelling: Keep your causes tightly linked to their effects. A triggers B, which leads to C. But wait! D happens and therefore E. Chekhov’s gun is the idea in dramatic narrative that if there is a gun on the wall in the first act of the play, it must go off before the end.
This premise encourages compact storytelling: don’t introduce something if it won’t be relevant. Keep the promise that your mise-en-scène makes. Deus ex machina is the resolution of a dramatic situation through some intervention that was previously outside the narrative. Such contrivances are deeply unsatisfying and best avoided.
A more advanced storytelling move is to have two or more threads going at the same time. That way, you can jump back and forth between them. You can skip over the boring parts. If you’re going to have multiple threads, they need to come together in a satisfying way.
2. In 1983, the story goes, global civilization came within one judgment call of total annihilation.
3. Towards the end of 2020, the Center for Complexity ran a studio to examine notions of security. The idea of security is at the heart of debates about the future of nuclear weapons. As we’ve worked in the field, it has become clear to us that security takes on an enormous number of different meanings. It’s not at all clear that people in the field mean the same thing when they are arguing with one another about the best way to ensure “security.”
To cue up the bigger question, we asked participants to bring in some examples of how they ensured their own security. Contributions ranged from investment accounts for a comfortable retirement to backup generators in case civilization collapses.
4. There is a chart that haunts me. It was made in the early days of the pandemic. It shows the rise of cases reported in Hubei province, overlaid with a timeline of administrative decisions. It shows how, even after the lockdown, cases continued to rise. Of course they did. There is a delay between infection and the onset of symptoms. There is a delay between the onset of symptoms and visiting a doctor for a diagnosis. A medical system can only find out about that trajectory after the visit to a doctor. Armed with that retrospective knowledge, the chart’s makers added another overlay. It shows the true explosion of COVID-19 cases in Hubei, about two weeks before they could be diagnosed. When Wuhan was locked down, officials knew of hundreds of cases. In reality, there were thousands.
The true pandemic is a shadow pandemic, happening weeks ahead of the medical system’s ability to detect it. Decision makers are perpetually operating on old information and must wait weeks to evaluate the effects of their actions. The people diagnosed right after lockdown were already doomed.
5. It takes about 30 minutes for a nuclear-armed intercontinental ballistic missile to fly from the US to Russia.
6. In “Who’s Invading Whom? Zika and Intergenerational Public Health,” Nathaniel Hupert contrasts a public health strategy with more direct intervention. Using smoking to illustrate the difference, he writes, “successful anti-smoking public health interventions target both antecedent (cigarette production and marketing) and downstream effects (behavioral and pharmacological aids for cessation), typically keeping their hands off cigarettes that have already made it into circulation.”
There is a kind of heartlessness to this approach. You mean you’re going to just let the cigarettes that are out there stay out there and kill again? Shouldn’t someone do something?
7. I’ve read that HBO’s Chernobyl miniseries is best understood as a horror story. The reactor core is a cosmic being beyond human comprehension. To gaze upon it is to be disintegrated. To even catch a glimpse of it is to risk doom by a creeping sickness that kills you years later. One part of the clean-up required human crews to work a 90-second shift, once in their lives.
Poor decision after poor decision drives the story forward. Audience members, knowing more about what’s coming than the characters do, grip their armrests and squirm in their seats. The whole story is, “Don’t go in there!” performed at an institutional scale.
It turns out, as in all horror stories, that humans were the real monsters all along.
8. Some security things offered their owners greater independence. Some bound people tightly to their community. Some depended on institutions continuing to function reliably. Others were for people who clearly did not trust institutions at all.
9. Here’s the story about how we almost lost global civilization. Stanislav Petrov was the Soviet officer on duty on September 26, 1983. A Soviet early warning system detected an American intercontinental ballistic missile inbound. In order to deter such an attack, the Soviets had a strict “launch on warning” protocol: if they detected an incoming attack, they would retaliate. The Americans, also looking to deter a nuclear attack, had a similar strategy. If they’d detected the USSR’s counterattack, protocol would have dictated launching everything. The idea was that this mutual assurance of destruction would keep both powers in check and prevent any launch from occurring at all.
Yet, here was Petrov with the early warning system screaming that deterrence had failed and it was time to make good on the promises they’d made.
This is a moment for Hollywood if ever there was one. A lone hero, faced with an uncertain situation in a moment of total mortal peril, must make a choice. The fate of the world hangs in the balance. The rules say he must report the detection, which would trigger a nearly mechanical series of events that ends in total destruction. Is it a real attack? A false alarm? What should he do?
Close-up on his face. Tense music is playing. A bead of sweat trickles out from under his officer’s hat. Cut to the monitor showing the apparent missiles arcing closer and closer…
10. Reality is nothing like stories.
11. In the studio, we saw security things meant to prevent bad outcomes alongside items meant to ensure capacity in case things went bad anyway. Some sought to avoid danger entirely, while others were threat detection systems. Some were meant to hide from threats. Some were meant to meet them head-on.
12. The butterfly effect is a fable of chaos theory used to illustrate the idea that large complex systems are incredibly sensitive to initial conditions. You’re probably familiar with a version of it. “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” is the title of a talk given by Edward Norton Lorenz in 1972.
In its enduring form as a fable, the idea ends up retreating to the cause and effect mechanisms of narrative: this butterfly flapped its wings, so that tornado was set off. But the point is that there are actually many butterflies and all kinds of weather all the time.
13. Retellings of the 1983 false alarm almost always treat the detection of the missiles en route as the inciting incident. There’s such a tight cause and effect to that story. There’s such a clear sense of beginning, middle and end. But narrative structures are contrivances.
Was the inciting incident the false alarm? Maybe it was the decision to launch the satellites that triggered the alarm. Maybe it was the decision to design the sensors the way they were designed. Maybe it was the decision to have a policy of mutually assured destruction. Maybe it was the decision to develop nuclear weapons. Maybe it was Hitler’s decision to invade Europe.
14. “Humans, it seems, have an extremely poor capacity to adjudicate short- versus long-term risks, consistently prioritizing those that are sitting in front of our noses,” writes Hupert. “In this manner, we act more like doctors trying to maximize the health of our next patient—sometimes in unthinking ways and at the expense of future patients—rather than as practitioners of public health who are supposed to consciously and conscientiously apportion limited resources in order to attain improvements in overall population health, even if this may not maximize the health of every individual member of that community.”
15. Petrov decided, correctly, that he was witness to a false alarm and elected not to set in motion the destruction of global civilization. He was first hailed as a hero and then pushed aside, since his situation cast some rather embarrassing light on the Soviet early warning systems and on the people who had conceived of, authorized, designed and built them.
16. The public health approach requires a kind of inhumane compassion. You ignore the suffering in front of you in order to take action on a much wider, far more diffuse form of suffering.
It is also a profoundly anti-narrative approach. Causes and effects are loosely connected by probabilistic factors. They are years or decades apart. It is very difficult to determine if any promises that have been made were kept. Exogenous factors come out of nowhere all the time to disturb the course of events.
17. Petrov’s story only becomes a story because he intervened. If he’d followed the rules and the whole terrible apparatus of mutually assured destruction had been permitted to run its course, his anonymous contribution would not have been noted at all. Whatever human survivors there were might have said the inciting incident was the decision to build nuclear weapons in the first place. In that telling, the disaster was set in motion decades before the consequences played out and we were—all of us—doomed from the start.
18. Some security things were reassuring. Some were uncomfortable but necessary to have around the house. Some security things were used on a daily basis. Others were held in the hope that they would never be used at all.
19. The near destruction of global civilization was not made public until 1998. For 15 years, decision makers deployed weapon systems, set postures and debated strategy without access to that knowledge.
20. What must those moments have been like for Petrov and his subordinates? Waiting between the detection and the projected moment of impact, wondering if you were already doomed.