News

A critical opportunity to ban killer robots – while we still can
by Amnesty, Campaign to Stop Killer Robots
1:21pm 8th Nov, 2021
Amnesty International and the Stop Killer Robots campaign have unveiled a social media filter which provides a terrifying glimpse of the future of war, policing and border control. Escape the Scan, a filter for Instagram and Facebook, is part of a major campaign calling for a new international law to ban autonomous weapons systems.
It uses augmented reality (AR) technology to depict aspects of weapons systems that are already in development, such as facial recognition, movement sensors, and the ability to launch attacks on ‘targets’ without meaningful human control.
Several countries are investing heavily in the development of autonomous weapons, despite the devastating human rights implications of giving machines control over the use of force.
In December, a group of UN experts will meet to decide whether to begin negotiating new international law on autonomy in weapons systems. Amnesty International and Stop Killer Robots have launched a petition calling on all governments to voice their support for negotiations.
“We are stumbling into a nightmare scenario, a world where drones and other advanced weapons can choose and attack targets without human control. This filter is designed to give people an idea of what killer robots could soon be capable of, and show why we must act urgently to maintain human control over the use of force,” said Verity Coyle, Amnesty International’s Senior Advisor on Military, Security and Policing.
“Allowing machines to make life-or-death decisions is an assault on human dignity, and will likely result in devastating violations of the laws of war and human rights. It will also intensify the digital dehumanisation of society, reducing people to data points to be processed. We need a robust, legally binding international treaty to stop the proliferation of killer robots – before it’s too late.”
“We have had a decade of talks on autonomous weapons at the United Nations, but these are being blocked by the same states that are developing the weapons,” said Ousman Noor of the Stop Killer Robots campaign.
“The UN Secretary General, the International Committee of the Red Cross, Nobel Prize Winners, thousands of scientists, roboticists and tech workers, are all calling for a legal treaty to prevent these weapons – governments need to draw a line against machines that can choose to kill.”
On 2 December 2021, the Group of Governmental Experts to the Convention on Conventional Weapons (CCW) will begin critical talks on whether to proceed with negotiations on a new treaty to address the threat posed by killer robots. So far 66 states have called for a new, legally binding framework on autonomy in weapons systems.
But progress has been stalled by a small number of powerful states, including Russia, Israel and the US, who regard the creation of a new international law as premature.
The replacement of troops with machines will make the decision to go to war easier. What’s more, machines cannot make complex ethical choices in unpredictable battlefield or real-world scenarios; there is no substitute for human decision-making.
We have already seen how technologies like facial, emotion, gait and vocal recognition fail to recognize women, people of colour and persons with disabilities; and how they cause immense human rights harms even when they “work”. Employing these technologies on the battlefield, in law enforcement or border control would be disastrous.
Despite these concerns, countries including the US, China, Israel, South Korea, Russia, Australia, India, Turkey and the UK are investing heavily in the development of autonomous systems.
For example, the UK is developing an unmanned drone which can fly in autonomous mode and identify a target within a programmed area. China is creating small drone “swarms” which could be programmed to attack anything that emits body heat, while Russia has built a robot tank which can be fitted with a machine gun or grenade launcher.
Stop Killer Robots is a global coalition of more than 180 international, regional, and national NGOs and academic partners working across 66 countries to ensure meaningful human control over the use of force through the development of new international law. Amnesty International is one of nine organizations on the coalition’s steering committee.
Aug. 2021
Killer Robots: Urgent need to fast-track talks
Governments should make up for lost time by moving urgently to begin negotiations on a new treaty to retain meaningful human control over the use of force, Human Rights Watch said in a report released this week.
Representatives from approximately 50 countries will convene on August 3, 2021 at the United Nations in Geneva for their first official diplomatic meeting on lethal autonomous weapons systems, or “killer robots,” in nearly a year.
The report, “Areas of Alignment: Common Visions for a Killer Robots Treaty,” co-published by Human Rights Watch and the Harvard Law School International Human Rights Clinic, describes the strong objections to delegating life-and-death decisions to machines expressed by governments at the last official Convention on Conventional Weapons (CCW) meeting on killer robots. That meeting, held in September 2020, featured proposals from many countries to negotiate a new international treaty to prohibit and restrict autonomous weapons.
“International law needs to be expanded to create new rules that ensure human control and accountability in the use of force,” said Bonnie Docherty, senior arms researcher at Human Rights Watch and associate director of armed conflict and civilian protection at the Harvard Law School International Human Rights Clinic.
“The fundamental moral, legal, and security concerns raised by autonomous weapons systems warrant a strong and urgent response in the form of a new international treaty.”
Nearly 100 countries have publicly expressed their views on killer robots since 2013. Most have repeatedly called for a new international treaty to retain meaningful human control over the use of force, including 31 that have explicitly called for a ban on lethal autonomous weapons systems.
Yet a small number of militarily advanced countries – most notably Israel, Russia, and the United States – regard any move to create new international law as premature. They are investing heavily in the military applications of artificial intelligence and developing air, land, and sea-based autonomous weapons systems.
Governments have expressed support for banning autonomous systems that are legally or morally unacceptable, the groups said.
There is strong interest in prohibiting weapons systems that by their nature select and engage targets without meaningful human control, including complex systems that use machine-learning algorithms to produce unpredictable or inexplicable effects.
There are further calls to ban antipersonnel weapons systems that rely on profiles derived from biometric and other data collected by sensors to identify, select, and attack individuals or categories of people.
“Killing or injuring people based on data collected by sensors and processed by machines would violate human dignity,” Docherty said. “Relying on algorithms to target people will dehumanize warfare and erode our humanity.”
Many countries have proposed complementing these prohibitions with regulations to ensure that all other autonomous weapons systems are only used with meaningful human control, the groups said.
“Meaningful human control” is widely understood to require that technology is understandable and predictable and that its operations are constrained in space and time.
An October 2020 report by Human Rights Watch and the International Human Rights Clinic recommended elements for a new treaty on killer robots that largely align with the proposals made by countries that participated in the September 2020 meeting.
Decisions at the Convention on Conventional Weapons are by consensus, which allows a few countries – or even a single country – to block an agreement sought by a majority. A new treaty, however, does not have to be negotiated under Convention on Conventional Weapons auspices, and there are signs that political leaders are anxious to move on and achieve a faster, more lasting result.
In July, New Zealand’s minister for disarmament and arms control, Phil Twyford, warned that the current diplomatic talks “are not delivering” and suggested those concerned by the prospect of autonomous weapons systems come together and “design something truly fit-for-purpose.”
He added, “For many of us, the idea that a computer could autonomously identify and attack a target will be unconscionable.”
A broad range and growing number of countries, institutions, private companies, and individuals have reiterated their desire for a ban on lethal autonomous weapons systems.
In May, the International Committee of the Red Cross (ICRC) called for countries to negotiate an international treaty to prohibit autonomous weapons systems that are unpredictable or target people and establish regulations to ensure human control over other systems.
Since 2018, United Nations Secretary-General António Guterres has urged states to prohibit weapons systems that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable.”
The 31 countries demanding a ban on killer robots are Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (on use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, El Salvador, Ghana, Guatemala, the Holy See, Iraq, Jordan, Kazakhstan, Mexico, Morocco, Namibia, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, the coalition of more than 180 nongovernmental organizations in 67 countries that advocates for a treaty to maintain meaningful human control over the use of force and prohibit weapons systems that operate without such control.
“It’s feasible and essential to draw the line now on problematic emerging technologies by negotiating a new international treaty to retain meaningful human control over the use of force,” Docherty said. “There should be no more delays.”
May 2021
International Committee of the Red Cross backs Killer Robot Ban, writes Mary Wareham, Advocacy Director, Arms Division at Human Rights Watch.
The Geneva-based International Committee of the Red Cross (ICRC) is calling on governments to ban fully autonomous weapons.
ICRC President Peter Maurer said yesterday that he hopes the humanitarian organization’s public backing for new legally binding rules to prohibit and regulate autonomous weapons will help spur “political action at the international level” and “collectively draw a line that is in the interest of people” and “ultimately, our shared humanity.”
As technology develops rapidly, the ICRC has found that the laws of war “do not provide all the answers” to ensure that commanders and weapon operators retain sufficient human control over weapons systems.
The increased use of weapons systems with autonomy in today’s armed conflicts underscores the importance of creating a new international legal standard now, before it is too late.
A United Nations report issued last year details how fighters in Libya “were subsequently hunted down and remotely engaged” by the Turkish-manufactured STM Kargu-2 loitering munition. During the recent conflict over Nagorno-Karabakh, Azerbaijan government forces used various loitering munitions, such as the Harop developed by Israel Aerospace Industries.
Once launched, this so-called “suicide drone” loiters in the air for a period, searching for a target, which it attacks once detected. Regarded by the ICRC as the only real form of “offensive” autonomous weapon deployed today, loitering munitions are configured to allow the human operator to monitor and intervene in their operation.
Since 2018, United Nations Secretary-General António Guterres has urged states to move to prohibit weapons systems that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable.” Last year, Pope Francis warned lethal autonomous weapons systems would “irreversibly alter the nature of warfare, detaching it further from human agency.”
Dozens of countries have expressed support for negotiating a new international law to prohibit and restrict autonomous weapons. But major military powers – most notably Russia and the United States – have repeatedly thwarted moves to begin negotiations, arguing it is “premature” to attempt regulation.
Frustration over the lack of progress in diplomatic talks suggests that the international treaty on killer robots that many countries seek may need to be negotiated through a new process.
The ICRC decision may mark a turning point, given that no international arms treaties have been adopted in recent decades without its support and participation. As the guardian of international humanitarian law, the ICRC is an indispensable partner for governments, UN agencies, and nongovernmental organizations working to protect civilians during armed conflict.
Nov. 2020
Given the urgent need for regulation, the Campaign to Stop Killer Robots is disappointed that diplomatic talks on lethal autonomous weapons systems have been postponed until 2021.
It is essential to abide by restrictions put in place by authorities to slow the spread of the Covid-19 pandemic. However, the Convention on Conventional Weapons (CCW) could have adapted to the circumstances by proceeding to hold the meeting virtually.
Highly militarized nations cannot be allowed to perpetuate hegemony through technology. The Campaign to Stop Killer Robots cannot tolerate the unrestrained development of weapons systems that use computer programming and sensors to identify and select targets. This brings the world closer to machines making decisions over whom to kill.
States must develop coherent and comprehensive national policy on killer robots that values humanity and promotes the principle of human control. They must cooperate to launch international treaty negotiations in 2021.
States should collaborate on the structure and key elements for a new ban treaty to retain meaningful human control over the use of force.
The last Convention on Conventional Weapons (CCW) meeting on killer robots took place at the United Nations in Geneva on 21-25 September 2020 under Group of Governmental Experts chair Ljupčo Jivan Gjorgjinski from North Macedonia. Russia did not attend the CCW meeting, but raised procedural concerns in the lead-up and strongly recommended that two meetings planned for 2020 be postponed until 2021.
On 29 October, the Campaign convened a virtual briefing to share new research on the concerns raised by removing human control from the use of force, the potential impact on marginalized groups, and the elements of, and precedent for, a new international ban treaty.
Nov. 2020
Prevention of an Arms Race in Outer Space
This statement was drafted on behalf of civil society by Project Ploughshares Senior Researcher Jessica West. Dr. West presented it to the United Nations General Assembly First Committee for Disarmament and International Security on October 13, 2020. Published in The Ploughshares Monitor
"We have just marked World Space Week, designated by the United Nations to celebrate the contributions of space to the betterment of humanity. This year’s theme is “Satellites Improve Life.” Never has this been more evident than during the Covid-19 pandemic, when satellite communications have become a universal lifeline in a time of physical separation.
Today, our dependence on space is matched by its growing vulnerability to the use of weapons and the conduct of warfare.
While the international community has struggled to preserve outer space as a peaceful domain free of weapons, an arms race has been bubbling beneath the surface.
We know that electronic warfare – the jamming of satellite communications – is rampant. We have witnessed three states demonstrate a hit-to-kill anti-satellite capability using ground-based weapons systems; this capability is not limited to these actors.
There is evidence that the development of other anti-satellite capabilities such as directed energy weapons is accelerating. And there are suggestions that satellites with weapons capabilities may already be in orbit.
These actions threaten war. No one wants it, yet multiple states are actively preparing for it. The risk of unintentional conflict through mishaps, misinterpretations, and miscommunications is great. Diplomatic action is needed.
At this Committee, support for the Prevention of an Arms Race in Outer Space – PAROS – remains strong. But the divides over how to implement this objective – whether through legal restrictions, political commitments or normative understandings of responsible behaviour – remain equally strong. These are not mutually exclusive options. None can progress without efforts to enhance trust and transparency.
It’s time to reset the conversation. A new initiative by the United Kingdom to support “a global discussion to avoid conflict in space” is welcomed. By asking what kind of behaviours or activities in space seem threatening, there is an opportunity to find common ground and to avoid slipping into unwanted military confrontations.
But success will depend on good-faith participation, as well as a willingness to listen. These are qualities that should be applied to all initiatives.
It is in this spirit that we also urge states to: oppose the use of any space-based or ground-based capabilities to deliberately disrupt, damage or destroy space assets; and indicate support for an agreement to prevent an arms race in outer space, and for the necessary transparency and confidence-building measures towards that end.
Beyond these political commitments, there is a clear need for states to lead through example: to refrain from testing weapons systems targeting space, to bring greater transparency to military activities, to demonstrate the type of behaviours in outer space that contribute to stability and peaceful uses, and to call out those who violate these principles.
Any use of force in outer space would be difficult to contain. There is no separate zone in outer space for warfighting: the whole domain would become the battlefield. It threatens thousands of satellites, connected to billions of people all around the world. It risks mass contamination of a fragile environment. And it has the potential to spill over into other domains. We cannot wait for a crisis to act."
* Signed: Project Ploughshares; Women’s International League for Peace and Freedom; Canadian Pugwash Group; Rideau Institute
