AFRL is researching a large air-breathing hypersonic vehicle for both ISR and strike missions, called Mayhem. (AFRL/Leidos rendering)

WASHINGTON — The US and other nations risk global catastrophe if they do not adopt a new overarching framework to regulate disruptive technologies like artificial intelligence, autonomy, robotics, cyberweapons and hypersonic missiles, a new report from the Arms Control Association warns.

The report, released Tuesday [PDF], says that without adopting such measures, “cutting-edge technologies will be converted into military systems at an ever-increasing tempo, and the dangers to world security will grow apace. 

“A more thorough understanding of the distinctive threats to strategic stability posed by these technologies and the imposition of restraints on their military use would go a long way toward reducing the risks of Armageddon,” according to the report. 

The report lays out the challenges posed by emerging technologies and calls out the Defense Department for shortfalls in its recent AI guidelines. It specifically points to how DoD failed to incorporate department-wide implementation of AI regulation procedures in its AI “ethical principles” and “merely reiterated the principles incorporated into the original guidelines and attached a blueprint for further action by DoD agencies” in its Responsible AI Strategy.

“But even as the Department of Defense—and the militaries of the other major powers—rush ahead with the weaponization of advanced technologies, many analysts and policymakers have cautioned against moving with such haste until more is known about how the various military capabilities stemming from these technologies may lead to unintended and hazardous outcomes,” according to the report. 

“Non-military devices governed by AI, such as self-driving cars and facial-recognition systems, have been known to fail in dangerous and unpredictable ways; should similar failures occur among AI-empowered weaponry during wartime, the outcomes could include the unintended slaughter of civilians or the outbreak of nuclear war,” the report continues.

So what do the authors propose? A framework that reduces “the risk to strategic stability” and includes educating policymakers and the public about the risks posed by unregulated use of emerging technologies. It also calls for experts from the US, China and Russia to “meet on an informal basis to discuss possible limits on the deployment of hypersonic missiles or on methods for reducing cyber threats to nuclear command-and-control systems.”

“Given the current state of international affairs, it will prove difficult for the U.S. and Russia or the U.S. and China—or all three meeting together—to agree on formal measures for the control of especially destabilizing technologies,” the report says. “It should, however, be possible for these states to adopt unilateral measures in the hope that they will induce parallel steps by their adversaries and eventually lead to binding bilateral and multilateral agreements.”

The framework also calls for a series of “strategic stability talks” among government officials, military officers and technical experts from opposing sides to better understand how adversaries view the risks of disruptive technologies, while noting that some preliminary efforts have already been made in this area without real results.

Lastly, the framework says bilateral and multilateral agreements between the US, China and Russia should be made to minimize the risks posed by the weaponization of emerging technologies. 

“As an example of a useful first step, the leaders of the major nuclear powers could jointly pledge to eschew cyberattacks against each other’s nuclear C3I systems,” according to the report. “This need not take the form of a binding treaty, but could be incorporated into a joint statement by leaders of the countries involved.”