- Gen. Paul Selva advocated for keeping the ethical rules of war in place
- Selva said humans needed to remain in the decision-making process
Selva was responding to a question from Sen. Gary Peters, a Michigan Democrat, about his views on a Department of Defense directive that requires a human operator to be kept in the decision-making process when it comes to the taking of human life by autonomous weapons systems.
Peters said the restriction was "due to expire later this year."
"I don't think it's reasonable for us to put robots in charge of whether or not we take a human life," Selva told the Senate Armed Services Committee during a confirmation hearing for his reappointment as the vice chairman of the Joint Chiefs of Staff. The hearing covered a wide range of topics, including North Korea, Iran and defense budget issues.
He predicted that "there will be a raucous debate in the department about whether or not we take humans out of the decision to take lethal action," but added that he was "an advocate for keeping that restriction."
Selva said humans needed to remain in the decision-making process "because we take our values to war." He pointed to the laws of war and the need to consider issues like proportional and discriminate action against an enemy, something he suggested could only be done by a human.
His comments come as the US military has sought increasingly autonomous weapons systems.
In July 2016, a group of concerned scientists, researchers and academics, including theoretical physicist Stephen Hawking and billionaire entrepreneur Elon Musk, argued against the development of autonomous weapons systems. They warned of an artificial intelligence arms race and called for a "ban on offensive autonomous weapons beyond meaningful human control."
But Peters warned that America's adversaries may be less hesitant to adopt such lethal technology.
"Our adversaries often do not consider the same moral and ethical issues that we consider each and every day," the senator told Selva.
Selva acknowledged the possibility of US adversaries developing such technology, but said the decision not to pursue it for the US military "doesn't mean that we don't have to address the development of those kinds of technologies and potentially find their vulnerabilities and exploit those vulnerabilities."