Legal Uncertainty, Growing Concerns Show Urgent Need for Regulation
- Governments should open negotiations to adopt new international law on lethal autonomous weapons systems, also known as “killer robots.”
- Existing international law is not adequate to address the urgent threats posed by such weapons, which several countries are developing.
- Countries should consider options for moving the process ahead, including a stand-alone process or negotiations through the United Nations General Assembly.
(Washington, DC, December 1, 2021) – Governments should agree to open negotiations on a new treaty to retain meaningful human control over the use of force, Human Rights Watch said in a report released today. Countries will be meeting at the United Nations in Geneva in December 2021 to decide whether to begin negotiations to adopt new international law on lethal autonomous weapons systems, also known as “killer robots.”
The 23-page report, “Crunch Time on Killer Robots: Why New Law Is Needed and How It Can Be Achieved,” by Human Rights Watch and the Harvard Law School International Human Rights Clinic, finds that international law should be strengthened and clarified to protect humanity from the dangers posed by weapons systems that select and engage targets without meaningful human control.
“After eight years discussing the far-reaching consequences of removing human control from the use of force, countries now need to decide how to respond to those threats,” said Bonnie Docherty, senior arms researcher at Human Rights Watch and associate director of armed conflict and civilian protection at the Harvard Clinic. “There’s an urgent need for a dedicated treaty to address the shortcomings of international humanitarian law and update it to deal with the legal, ethical, and societal challenges of today’s artificial intelligence and emerging technologies.”
The Sixth Review Conference of the Convention on Conventional Weapons (CCW), to be held December 13-17, is a major juncture for international talks on killer robots. At the last CCW meeting on killer robots in September, most countries that spoke called for a new legally binding instrument on autonomous weapons systems. Chile, Mexico, and Brazil urged treaty members to agree to initiate negotiations on new international law. Other proponents included the ‘Group of Ten’ states (Argentina, Costa Rica, Ecuador, El Salvador, Palestine, Panama, Peru, Philippines, Sierra Leone, and Uruguay) and states of the Non-Aligned Movement.
There are various possible forums for negotiating a new treaty on autonomous weapons systems. Other than the CCW, options include a stand-alone process, as was used for the treaties banning antipersonnel landmines and cluster munitions, and the United Nations General Assembly, where the nuclear weapons ban treaty was negotiated.
Existing international humanitarian law is not adequate to address the problems posed by autonomous weapons systems, Human Rights Watch and the Harvard Clinic said. There is widespread support for developing new law, and any divergence of views reinforces the need to clarify existing law. A new treaty would address the concerns raised by these weapons systems under international humanitarian law, ethics, international human rights law, accountability, and security.
Such a treaty should cover weapons systems that select and engage targets on the basis of sensor, rather than human, inputs. Most treaty proponents have called for a prohibition on weapons systems that by their nature select and engage targets without meaningful human control, such as complex systems using machine-learning algorithms that produce unpredictable or inexplicable effects.
Some countries have also expressed an interest in banning weapons systems that rely on profiles derived from biometric and other data collected by sensors to identify, select, and attack individuals or categories of people.
Many countries propose complementing these prohibitions with regulations to ensure that all other autonomous weapons systems are only used with meaningful human control. “Meaningful human control” is widely understood to require that technology be understandable, predictable, and constrained in space and time.
Progress toward negotiations at the CCW seems unlikely given that the body operates by consensus and there is opposition from a small number of military powers, most notably India, Russia, and the United States, which regard existing international humanitarian law as sufficient to address any problems raised by these weapon systems. These countries, along with others such as Australia, China, Israel, South Korea, and Turkey, are investing heavily in military applications of artificial intelligence and related technologies to develop air-, land-, and sea-based autonomous weapons systems.
“An independent process to negotiate new law on killer robots would be more effective and inclusive than the current diplomatic talks and other alternatives,” Docherty said. “But moving to a fast-track process can only be done with the active support of political leaders.”
A broad and growing range of countries, institutions, private companies, and individuals have reiterated their desire for a ban on killer robots. In May, the International Committee of the Red Cross called for countries to negotiate an international treaty to prohibit autonomous weapons systems that are unpredictable or target people and establish regulations to ensure human control over other systems. Since 2018, United Nations Secretary-General António Guterres has urged states to prohibit weapons systems that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable.”
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, the coalition of more than 185 nongovernmental organizations in 67 countries that advocates for a treaty to maintain meaningful human control over the use of force and prohibit weapons systems that operate without such control.
“Much opposition to killer robots reflects moral repulsion to the idea of machines making life-and-death decisions,” Docherty said. “A new treaty would fill the gap in international treaty law and protect the principles of humanity and dictates of public conscience in the face of emerging weapons technology.”