Most of the time, we worry far too much about tail risk.
We worry about terrorist attacks and necrotizing fasciitis, but not much about heart disease or car crashes. Yet in 2011, 17 US citizens worldwide died as a result of terrorism and approximately 150 from necrotizing fasciitis, while nearly 600,000 deaths resulted from heart disease and over 32,000 from car crashes.
Based on current data, you are about 35,000 times more likely to die from heart disease than from a terrorist attack. So everyone smart says that we worry about terrorism way too much, and so far, they’ve been right.
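That "35,000 times" figure is just the two death counts divided; a quick sanity check using the rounded numbers quoted above:

```python
# Rounded 2011 US figures from the paragraph above.
terrorism_deaths = 17           # US citizens killed by terrorism worldwide
heart_disease_deaths = 600_000  # approximate US heart-disease deaths

ratio = heart_disease_deaths / terrorism_deaths
print(f"~{ratio:,.0f}x")  # ~35,294x, i.e. "about 35,000 times"
```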
For whatever reason, we seem to be wired to overweight dramatic, scary, but very unlikely risks and to underweight mundane, familiar, and probable ones.
But maybe there are some tail risks we should really worry about.
Our risk-evaluation miscalibration leads to important blind spots. We’ve seen images of a nuclear explosion; we know how terrifying that is, and so we fear it. Most people have had the flu, and so we don’t fear that—we know it’s possible to die from the flu, but most people don’t. Death from the flu doesn't trigger most people's panic sensors because the version of it we know is boring and familiar.
However, I don’t think we have collectively thought enough about how biotechnology is going to change the landscape. Of all “technologies”, it’s the one that really scares me. Biotech has incredible potential to improve our lives, probably even more so than computers, but of course that comes with a much graver downside.
Also in 2011, some researchers figured out how to reengineer H5N1 (avian influenza) to make it much scarier by introducing five simultaneous mutations that together made the virus both easy to spread and quite lethal. Each of these five mutations could occur in nature, but all of them arising in the same copy of the virus would be unlikely. I have no doubt that the media overstated the danger, but it’s still worth thinking about.
We now have the tools to create viruses in labs. What happens when someone creates a virus that spreads extremely easily, has greater than 50% mortality, and has an incubation period of several weeks? Something like this, released by a bad guy and without the world having time to prepare, could wipe out more than half the population in a matter of months. Misguided biotech could effectively end the world as we know it.
When the H5N1 work happened, there was a lot of debate about whether or not to release the research. The researchers put a voluntary moratorium on releasing the information, which they lifted earlier this year.
Trying to keep things secret is not the answer. Trying to criminalize knowledge of dangerous things (we tried this with the atomic bomb) is definitely not the answer.
But ignoring real danger is not the answer either. The world is very bad at coordinated action. Unlike an atomic bomb, which has grave local consequences, the first of these pathogens that gets released could have grave global consequences almost instantly, and give us very little time to react. While enriching uranium requires the resources of nations, biotech development is already routinely privately funded.
We should place a very high priority on proactive defense against bioattacks.
When we first became able to create software programs in garages, it changed the world in very fundamental (mostly positive!) ways. As we begin to be able to create biology programs in garages, we should remember that bigger changes are likely coming; hacking our bodies will likely be more powerful than hacking bits. We may have to adapt our society even faster than we did during the computer revolution.
Biotechnology is scary in a lot of non-obvious ways. Sure, it’s easy to understand why superviruses are scary. But another possibility is that we engineer the perfect happiness drug, with no bad side effects, and no one wants to do anything but lie in bed and take it all day, sapping all ambition from the human race. There are a lot of other possibilities too, and they’re very hard to anticipate because we don’t have much experience with what's about to happen.

Thanks to Patrick Collison, Connie Gibstine, and Nick Sivo for reading drafts of this.