A new paper published in the European Journal of Risk Regulation considers the danger from existential terrorism, defined as acts that threaten the existence of humanity. The authors highlight what they term ‘spoiler attacks’ involving AI or other new technology, which might enable a group with limited resources to cause unprecedented destruction.
“I don’t expect existential terrorism to be at the top of global agendas, nor do I believe it should be,” Zachary Kallenborn, one of the report’s authors, told Forbes. “But global discourse is clearly changing around existential risk.”
Kallenborn is a Policy Fellow at the Schar School of Policy and Government, a member of the U.S. Army’s Mad Scientist team, and a national security consultant. The paper is part of a special issue on long-term risks and their governance, in which the unexpected effects of emerging technology are a key consideration.
“Technology is definitely bringing more power to the people,” says Kallenborn. “The open question is how much capability is really needed to generate existential harm.”
Kallenborn notes that, unlike state actors, terrorist groups generally lack the capacity to build effective weapons of mass destruction such as nuclear warheads. The best-known apocalyptic group, the Aum Shinrikyo cult, carried out several research projects, including work on biological warfare. But the cult was forced to scale back its ambitions, and its final effort was a nerve gas attack on the Tokyo subway in 1995 which caused fourteen deaths and affected thousands more. This was an appalling toll, but still far short of the group’s apocalyptic goal.
Rather than developing a superweapon themselves, a modern terrorist group could carry out a form of sabotage, a spoiler attack, to cause a cataclysm.
For example, terrorists could leverage the potential risks in advanced AI research, an area which some warn carries a “risk of extinction” and which has led to calls for strict safeguards on research. Rather than building their own super-intelligent AI, terrorists might carry out a spoiler attack to break through the safeguards preventing an AI from being developed beyond a certain stage or released. This might be done remotely via hacking, on the spot by recruiting or subverting researchers, or by an armed intrusion into a research facility.
Spoiler attacks might also target biological research or nanotechnology projects, both areas where high levels of safeguarding are required. The authors note that new tools such as CRISPR, rapid DNA sequencing and DNA/RNA synthesis mean that far more groups are now working on potentially hazardous biological projects. The unproven lab leak theory that COVID-19 escaped from a Chinese research facility could be a blueprint for a spoiler attack.
A spoiler attack breaching safeguards will not necessarily bring about the end of the world, or even cause casualties. A super-AI might be entirely benevolent, and a virus might be relatively harmless or easily brought under control. Escaping nanotechnology might not bring about the sort of world-ending gray goo nightmare that technologists fear and commentators, including the now-King Charles, have warned about. But a spoiler attack is a low-cost approach with a small but significant chance of triggering a global catastrophe. It is a risk that governments need to be aware of.
“To combat existential terrorism, governments should focus on incorporating terrorism-related risks into broader existential risk mitigation efforts,” says Kallenborn. “For example, when thinking about artificial super intelligence risks, governments should think about how terrorists might throw a wrench in their plans or simply ignore safeguards.”
This is not so different from the requirement that nuclear power stations be robust enough to withstand terrorist attack, except that the threat is broader and the stakes even higher.
“Governments should dedicate resources to more effectively characterizing and assessing the threat and response options,” says Kallenborn. “That’s not a big investment.”
It might be argued that the risk of existential terror attacks has receded as millennial cults have declined. The 1990s saw a slew of such groups obsessed with the end of the world, and in some cases they were involved with loss of life on a large scale, including Aum Shinrikyo and Heaven’s Gate.
Gary Ackerman, an associate professor and associate dean at the College of Emergency Preparedness, Homeland Security and Cybersecurity at the University at Albany and the report’s co-author, told Forbes that many of the conspiracy-minded, internet-based movements of today are modern incarnations of the same philosophies.
“There are several ideologies that foresee doom, whether these are environmentally-based or technology-based,” says Ackerman. “A lot of the more modern movements are also more syncretic in that they tend to blend, often in a contradictory manner, a variety of strains of thought…Many of these groups are simply lumped in with all the other far-right extremist groups, when they actually have a much more apocalyptic outlook that encompasses many of the worldviews of previous cults.”
As the paper notes, world-ending terrorists might be motivated by something other than religion, such as extreme environmentalism. The Voluntary Human Extinction Movement seeks to phase out humans, and it is a small step from there to genocide to save the planet. The authors also mention Strong Negative Utilitarianism, the philosophical view that human suffering can best be ended by ending humanity.
Existential terror may sound like the stuff of Hollywood thrillers rather than real life, something for people to worry about in the far future. But it would be a mistake to ignore it.
“There are lots of uncertainties exactly when the threat might grow to something that is significant,” says Ackerman. “But if we don’t start at least thinking about it and monitoring the threat fairly regularly, it might be too late to do anything about it whenever the inflection point is reached.”
Until recently, a global pandemic was also considered a theoretical risk, one which experts said was possible but which only happened in the movies. Now that we know how easily such threats can become reality, perhaps existential terrorism will get the attention it needs.
Source: https://www.forbes.com/sites/davidhambling/2023/06/23/new-report-warns-terrorists-could-cause-human-extinction-with-spoiler-attacks/