Two nuclear policy experts wrote a controversial article advocating making it easier for the president to use nuclear weapons and calling for an AI-controlled "dead hand" switch.
The threat of nuclear weapons has hung over humanity like the Sword of Damocles. During the Cold War, the United States and the Soviet Union each built fearsome arsenals that could have wiped humanity off the planet several times over, which they then wielded in a dangerous game of brinkmanship. Endless cycles of aggression and appeasement led to a generation of people worried that the next political crisis would turn into an apocalypse.
An article published on the national security blog War on the Rocks, written by nuclear policy wonks turned college professors Adam Lowther and Curtis McGiffin, proposed making it easier for the president to launch a nuclear strike and creating a "dead hand" switch that would give control of American nuclear weapons to an AI system.
From the article:
While the psychology of deterrence has not changed, we believe that time compression is changing the risk-reward calculation of our adversaries. Nuclear deterrence creates stability and depends on an adversary’s perception that it cannot destroy the United States with a surprise attack, prevent a guaranteed retaliatory strike, or prevent the United States from effectively commanding and controlling its nuclear forces. That perception begins with an assured ability to detect, decide, and direct a second strike. In this area, the balance is shifting away from the United States.
While many opponents of nuclear modernization oppose the current plan to field the ground-based strategic deterrent and long-range stand-off cruise missile, we believe these programs, while necessary, do not fundamentally solve the attack-time compression challenge. Rather than simply replacing current systems with a new version, it is time to fundamentally rethink the U.S. approach to nuclear deterrence.
In short, the idea of a "dead hand" switch is that an AI system would be able to launch a counterattack if no one were left alive to order or orchestrate retaliation. This grim scenario, which is also the basic premise of the Terminator series, is meant to prevent an enemy nation from believing it could attack with such ferocity that the United States would be unable to strike back.
Vice covered the reaction to the piece, and many of the experts it spoke to were alarmed by the suggestion.
“It's, uh, quite the article,” Peter W. Singer, a Senior Fellow at the New America Foundation, said of the War on the Rocks post in an email. Singer admitted Lowther and McGiffin proposed some good ideas, such as increasing investment in reconnaissance. “Then some ideas cross into bad science fictionland.”
Singer said the use of artificial intelligence in America's nuclear command and control systems set off alarm bells, but it wasn't the worst thing the pair suggested. “For me the stand out was proposing a change in ‘first-strike policy that allowed the president to launch a nuclear attack based on strategic warning,’” Singer said. “We have a President who just anger-tweeted Grace from Will & Grace and pondered nuking hurricanes and you're proposing that we should LOWER the threshold for the use of nuclear weapons? Read the room.”
We are entering a new and frightening age. Russia is developing nuclear weapons that can hit anywhere in the United States, and our commander in chief is an impulsive goon with an easily wounded ego. Stripping away safeguards against nuclear war is not particularly comforting.