

Syria: Why we fear bloodshed in Idlib
by ICRC, Norwegian Refugee Council, agencies
Syria
 
12 Oct. 2018
 
Aid groups fear for civilians in Idlib as ceasefire deal deadline looms - Report from CARE, Mercy Corps, International Rescue Committee, Save the Children
 
Four international aid agencies working in Syria’s north-west region of Idlib have warned of dire consequences for millions of civilians if the Russia-Turkey deal, due to be implemented by October 15, doesn’t result in a sustained reduction of violence in this overcrowded province.
 
Local organizations that partner with CARE International, the International Rescue Committee (IRC), Mercy Corps and Save the Children, as well as civilians receiving aid have expressed fears that violence could spiral out of control in the next few days if either the deal collapses or fighting escalates in areas not covered by it. Almost 3 million people live in Idlib, and it is estimated that even a limited military offensive would displace hundreds of thousands of people.
 
“Idlib residents and aid workers hold their breath as the deadline for a political deal looms. While the terms of the agreement are known, we don’t know what the plan is if parties on the ground fail to implement it. Will it be all-out war? Over and over again, similar deals have simply ended in a bloodbath. Civilians caught in this stand-off must be spared at all costs,” said Wouter Schaap, Syria country director for CARE International.
 
“The people of Idlib need a deal that offers long-term protection to civilians and allows aid to reach all those in need. Aid efforts are already stretched to the maximum in Idlib, where the population has doubled in recent years as people relocated there from areas retaken by the government of Syria. Aid organizations are at full capacity responding to the current needs of both displaced people and local communities. Though we are prepared to respond to any emergency, if this deal falls short and military operations start, many hundreds of thousands will struggle to get the help they will so badly need,” said Lorraine Bramwell, IRC Syria Country Director.
 
In September, Russia and Turkey agreed to create a demilitarized area in Idlib, which armed groups must leave by a provisional deadline of October 15. Provided it is implemented in line with International Humanitarian Law and does not result in an increase in violence in areas outside the demilitarized zone, the agreement could offer a potential lifeline to the people of Idlib. Civilians have already lived through years of war, during which many families have been forced to flee their homes multiple times.
 
Now, reports that different parties to the conflict are refusing to engage with the terms of the deal and commit to it long-term threaten to undermine the hope of a reduction in violence in Idlib.
 
“We already see the impact of this nerve-wracking situation on children, who tell us they are terrified at the prospect of more violence. The school year has barely started, but the facilities we support are making contingency plans to suspend classes and training young children on how to evacuate in the event of an attack. Many children in Idlib have been forced to flee their homes up to a dozen times, forcing them to miss years of school and causing stress and upset. Renewed conflict would compound the suffering of more than a million children in Idlib,” said Sonia Khush, Syria Response Director at Save the Children.
 
“Some people we help have stocked up on food, expecting to be stuck at home for days on end if fighting resumes. Others have packed their bags and are ready to move at the first airstrike. In both cases, our aid workers might not be able to reach those people if the security situation doesn’t allow them to move. And everyone fears losing their life if bombs start raining from the sky. What we need is a deal that not only holds but is also extended to other parts of Idlib and guarantees full humanitarian access to people in need,” said Arnaud Quemin, Syria Country Director for Mercy Corps.
 
http://bit.ly/2yCvRy6 http://bit.ly/2Olxs6g
 
Sep. 2018
 
Syria: Why we fear bloodshed in Idlib
 
An escalation of violence in Idlib province could quickly become a humanitarian catastrophe, warns the Norwegian Refugee Council.
 
Idlib hosts the highest concentration of displaced people in Syria. Violence in and around Idlib over the last year has forced people to flee time and again. Half of Idlib’s population of three million has been displaced from other parts of Syria. Civilian lives would be at stake if hostilities intensify.
 
Idlib is the last remaining so-called “de-escalation” zone in Syria. The makeup of Idlib is similar to places like Eastern Ghouta, southern Syria and Raqqa: a diverse mix of armed groups – some designated terrorists – alongside a high number of civilians in the same areas. Every effort must be made to avoid targeting civilians, their homes, hospitals and schools.
 
Tens of thousands of civilians depend entirely on aid. Their greatest lifeline is assistance from humanitarian organisations. Many families lack access to clean water and sanitation facilities. Children who have been out of school for years have forgotten how to read. A battle over Idlib’s future risks cutting off lifesaving aid to countless people in need.
 
A staggering 700,000 people could be displaced overnight. Already people are on the move again out of fear of what might come next, and are struggling to find shelter. As NRC witnessed in Eastern Ghouta and Aleppo, routes to safety came too little, too late for thousands of civilians trapped in the violence.
 
The only way to truly ensure the safety of civilians in Idlib is to prevent an outbreak of violence altogether. We must see the world’s top powers commit to peaceful negotiations and agreements which protect civilians and grant them safety.
 
As we look ahead, the humanitarian community must nevertheless prepare for the worst-case scenario. We must draw on what we have learned from previous emergencies in Syria and do our best to help people as they flee. http://bit.ly/2CkQ8xj
 
04 Sep 2018
 
Syria: Hostilities in Idlib should not produce massive civilian suffering - International Committee of the Red Cross (ICRC)
 
Statement from Fabrizio Carboni, regional director for the Near and Middle East at the International Committee of the Red Cross.
 
As fears of renewed violence mount in Syria's Idlib, I am concerned that a further increase in hostilities is bound to turn desperation into misery for large numbers of civilians. Syria has experienced more than seven years of agony, and we fear that renewed fighting in Idlib could produce suffering to rival the human misery seen in Aleppo, Eastern Ghouta and Raqqa. Living conditions for displaced people and host families are already extremely difficult, particularly in makeshift camps, where access to the most basic necessities is insufficient.
 
Intensified fighting in the vast Idlib area will put tens of thousands, if not hundreds of thousands, of people on the move. Civilians who flee, just like civilians who stay, are protected from attack. Those who leave must be allowed access to health care, food, water and sanitation.
 
The ICRC calls on all parties to the conflict to protect the wounded and sick, health personnel, humanitarian workers and infrastructure essential for people's survival -- medical facilities, schools, water facilities, bakeries and agricultural lands -- at all times and in accordance with international humanitarian law. No distinction should ever be made among wounded and sick on any grounds other than medical ones.
 
Humanitarian agencies must be able to work. They must be able to reach the affected areas to provide lifesaving aid. Humanitarian aid should be allowed regularly and unconditionally to all those in need.
 
http://www.icrc.org/en/document/syria-hostilities-idlib-should-not-produce-massive-civilian-suffering
 
29 August 2018 (UN News)
 
United Nations Secretary-General António Guterres has expressed deep concern over the growing risk of a humanitarian catastrophe should a full-scale military operation take place in Syria’s war-battered Idlib province.
 
Mr. Guterres urgently appealed to the Government of Syria and all parties to exercise restraint and to prioritize the protection of civilians in the event of any escalation of the conflict.
 
The Secretary-General further called on all parties “to take all necessary measures to safeguard civilian lives, allow freedom of movement, and protect civilian infrastructure, including medical and educational facilities, in accordance with international humanitarian law and human rights law.”
 
The statement came one day after John Ging, Director of Operations with the UN Office for Coordination of Humanitarian Affairs (OCHA), told the UN Security Council that intense aerial bombardment and shelling in Idlib and three other governorates in north-west Syria have left death, damage and destruction in their wake, and placed an even greater strain on aid workers and communities hosting displaced people.
 
14 Aug. 2018
 
Statement by Panos Moumtzis, Regional Humanitarian Coordinator for the Syria Crisis, on Civilian Casualties in Northwest Syria. (OCHA)
 
I am appalled over the reported deaths of at least 116 civilians, many of them women and children, in Idleb and Aleppo governorates over the weekend due to ongoing violence and hostilities.
 
This extreme violence is completely unacceptable. I remain deeply concerned for the safety and protection of the millions of civilians living in this area, many of them displaced multiple times, and am alarmed such incidents are part of a further escalation of the conflict in the area.
 
A military operation in Idleb and surrounding areas similar to what was seen in other parts of Syria will not only endanger many of the more than 3 million civilians in this densely populated area, but will likely severely impact humanitarian partners’ ability to deliver life-saving assistance.
 
On Friday alone, heavy airstrikes on Big Orem town in western rural Aleppo Governorate reportedly killed at least 37 people, over half of whom were children, and injured dozens more.
 
Separately, at least nine people were reportedly killed and at least 40 people were injured, including women and children, after shelling on the town of Khan Shaykun in southern rural Idleb, while two people reportedly lost their lives after barrel bombs were dropped on the village of Tah and one person was killed in shelling on the village of Tahtay in southern rural Idleb.
 
On Sunday, at least 67 civilians lost their lives, 17 of them children, when a weapons and ammunition depot in a residential building near Sarmada town in northern rural Idleb Governorate exploded, injuring dozens more. Seventeen people were rescued from under the debris.
 
The UN condemns these horrific attacks directed against civilians and civilian infrastructure, including hospitals and schools. UN-sponsored peace talks should prioritize gaining the commitment of all parties to stop attacks on such infrastructure, which is essential to the civilian population.
 
It is imperative that all parties to the conflict, and those with influence over them, come to a genuine and inclusive agreement to settle the conflict in Syria peacefully and prevent further suffering of the Syrian people. Civilians should not and must not be a target.
 
The humanitarian community reminds all parties to abide by their obligations under international humanitarian and human rights law to protect civilians and to spare no effort to prevent civilian casualties. http://bit.ly/2OF7ZAa
 
9 Aug. 2018
 
Syria: UN warns of 'bloodbath' in Idlib. (BBC World Service)
 
The UN has warned that the destruction of Aleppo could be repeated in Idlib - the last part of Syria still held by rebel groups. Syrian government troops are reportedly preparing for a ground assault that could displace hundreds of thousands of people, many of them refugees and civilians from other nearby battles. Jan Egeland is the chair of the UN task force on humanitarian access in Syria - he tells us what the UN fears: http://bbc.in/2Bgqucu
 
http://news.un.org/en/tags/syria


 


Autonomous weapons systems that require no meaningful human control should be prohibited
by Campaign to Stop Killer Robots, agencies
 
August 21, 2018
 
Basic humanity and the public conscience support a ban on fully autonomous weapons, Human Rights Watch said in a report released today. Countries participating in an upcoming international meeting on such “killer robots” should agree to negotiate a prohibition on the weapons systems’ development, production, and use.
 
The 46-page report, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots, finds that fully autonomous weapons would violate what is known as the Martens Clause. This long-standing provision of international humanitarian law requires emerging technologies to be judged by the “principles of humanity” and the “dictates of public conscience” when they are not already covered by other treaty provisions.
 
“Permitting the development and use of killer robots would undermine established moral and legal standards,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. “Countries should work together to preemptively ban these weapons systems before they proliferate around the world.”
 
The 1995 preemptive ban on blinding lasers, which was motivated in large part by concerns under the Martens Clause, provides precedent for prohibiting fully autonomous weapons as they come closer to becoming reality.
 
The report was co-published with the Harvard Law School International Human Rights Clinic, for which Docherty is associate director of armed conflict and civilian protection.
 
More than 70 governments will convene at the United Nations in Geneva from August 27 to 31, 2018, for their sixth meeting since 2014 on the challenges raised by fully autonomous weapons, also called lethal autonomous weapons systems. The talks under the Convention on Conventional Weapons, a major disarmament treaty, were formalized in 2017, but they are not yet directed toward a specific goal.
 
Human Rights Watch and the Campaign to Stop Killer Robots urge states party to the convention to agree to begin negotiations in 2019 for a new treaty that would require meaningful human control over weapons systems and the use of force. Fully autonomous weapons would select and engage targets without meaningful human control.
 
To date, 26 countries have explicitly supported a prohibition on fully autonomous weapons. Thousands of scientists and artificial intelligence experts, more than 20 Nobel Peace Laureates, and more than 160 religious leaders and organizations of various denominations have also demanded a ban. In June, Google released a set of ethical principles that includes a pledge not to develop artificial intelligence for use in weapons.
 
At the Convention on Conventional Weapons meetings, almost all countries have called for retaining some form of human control over the use of force. The emerging consensus for preserving meaningful human control, which is effectively equivalent to a ban on weapons that lack such control, reflects the widespread opposition to fully autonomous weapons.
 
Human Rights Watch and the Harvard clinic assessed fully autonomous weapons under the core elements of the Martens Clause. The clause, which appears in the Geneva Conventions and is referenced by several disarmament treaties, is triggered by the absence of specific international treaty provisions on a topic. It sets a moral baseline for judging emerging weapons.
 
The groups found that fully autonomous weapons would undermine the principles of humanity, because they would be unable to apply either compassion or nuanced legal and ethical judgment to decisions to use lethal force. Without these human qualities, the weapons would face significant obstacles in ensuring the humane treatment of others and showing respect for human life and dignity.
 
Fully autonomous weapons would also run contrary to the dictates of public conscience. Governments, experts, and the broader public have widely condemned the loss of human control over the use of force.
 
Partial measures, such as regulations or political declarations short of a legally binding prohibition, would fail to eliminate the many dangers posed by fully autonomous weapons. In addition to violating the Martens Clause, the weapons raise other legal, accountability, security, and technological concerns.
 
In previous publications, Human Rights Watch and the Harvard clinic have elaborated on the challenges that fully autonomous weapons would present for compliance with international humanitarian law and international human rights law, analyzed the gap in accountability for the unlawful harm caused by such weapons, and responded to critics of a preemptive ban.
 
The 26 countries that have called for the ban are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.
 
The Campaign to Stop Killer Robots, which began in 2013, is a coalition of 75 nongovernmental organizations in 32 countries that is working to preemptively ban the development, production, and use of fully autonomous weapons. Docherty will present the report at a Campaign to Stop Killer Robots briefing for CCW delegates scheduled on August 28 at the United Nations in Geneva.
 
“The groundswell of opposition among scientists, faith leaders, tech companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold,” Docherty said. “Their concerns, shared by many governments, deserve an immediate response.”
 
http://www.hrw.org/news/2018/08/21/killer-robots-fail-key-moral-legal-test http://www.stopkillerrobots.org/2018/08/unsg/
 
Apr. 2018
 
Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to “killer robots”.
 
More than 50 leading academics signed the letter calling for a boycott of Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to “accelerate the arms race to develop” autonomous weapons.
 
“There are plenty of great things you can do with AI that will save lives, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern,” said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. “This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms.”
 
The boycott comes ahead of a United Nations meeting in Geneva next week on autonomous weapons, and more than 20 countries have already called for a total ban on killer robots. The use of AI in militaries around the world has sparked fears of a Terminator-like situation and questions have been raised about the accuracy of such weapons and their ability to distinguish friend from foe.
 
Hanwha is one of South Korea’s largest weapons manufacturers and makes cluster munitions, which are banned in 120 countries under an international treaty. South Korea, like the US, Russia and China, is not a signatory to the convention.
 
Walsh was concerned when a Korea Times article described KAIST as “joining the global competition to develop autonomous arms” and promptly wrote to the university asking questions but did not receive a response.
 
Subsequently, KAIST’s president, Sung-Chul Shin, said in a statement that he “would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots.”
 
“As an academic institution, we value human rights and ethical standards to a very high degree… I reaffirm once again that KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”
 
Toby Walsh said the response “sets a very clear precedent for the future. I hope now that KAIST will lobby the Republic of Korea to call for meaningful human control at next week’s UN meeting on the topic of killer robots. Such small steps, we hope, will eventually lead to an outright ban.”
 
http://bit.ly/2GC1A9l http://bit.ly/2yV0Nwv http://www.hrw.org/news/2018/09/11/were-running-out-time-stop-killer-robot-weapons http://futureoflife.org/lethal-autonomous-weapons-pledge/
 
Apr. 2018
 
April 2018 marks five years since the launch of Campaign to Stop Killer Robots. It is also the fifth time since 2014 that governments are convening at the Convention on Conventional Weapons (CCW) in Geneva to discuss concerns over lethal autonomous weapons systems, also known as fully autonomous weapons or “killer robots.”
 
The campaign urges states to participate in the CCW Group of Governmental Experts meeting, which opens at the United Nations (UN) on Monday, 9 April, and to commit to retain meaningful human control of weapons systems and over individual attacks.
 
Why the concern about killer robots?
 
Armed drones and other autonomous weapons systems with decreasing levels of human control are currently in use and development by high-tech militaries including the US, China, Israel, South Korea, Russia, and the UK. The concern is that a variety of available sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.
 
If the trend towards autonomy continues, humans may start to fade out of the decision-making loop for certain military actions, perhaps retaining only a limited oversight role, or simply setting broad mission parameters.
 
Several states, the Campaign to Stop Killer Robots, artificial intelligence experts, faith leaders, and Nobel Peace Laureates, among others, fundamentally object to permitting machines to determine who or what to target on the battlefield or in policing, border control, and other circumstances. Such a far-reaching development raises an array of profound ethical, human rights, legal, operational, proliferation, technical, and other concerns.
 
While the capabilities of future technology are uncertain, there are strong reasons to believe that devolving more decision making over targeting to weapons systems themselves will erode the fundamental obligation that rules of international humanitarian law (IHL) and international human rights law be applied by people, and with sufficient specificity to make them meaningful.
 
Furthermore, with an erosion of human responsibility to apply legal rules at an appropriate level of detail there will likely come an erosion of human accountability for the specific outcomes of such attacks. Taken together, such developments would produce a stark dehumanization of military or policing processes.
 
What is a “killer robot”?
 
A weapons system that identifies, selects and employs force against targets without meaningful human control should be considered a lethal autonomous weapons system. It would have no human in the decision-making loop when the system selects and engages the target of an attack. Applying human control only as a function of design and in an initial deployment stage would fail to fulfill the IHL obligations that apply to commanders in relation to each “attack.”
 
Why the need for “human control”?
 
Sufficient human control over the use of weapons, and of their effects, is essential to ensuring that the use of a weapon is morally justifiable and can be legal. Such control is also required as a basis for accountability over the consequences of the use of force. To demonstrate that such control can be exercised, states must show that they understand the process by which specific systems identify individual target objects and understand the context, in space and time, where the application of force may take place.
 
Given the development of greater autonomy in weapon systems, states should make it explicit that meaningful human control is required over individual attacks and that weapon systems that operate without meaningful human control should be prohibited. For human control to be meaningful, the technology must be predictable, the user must have relevant information, and there must be the potential for timely human judgement and intervention.
 
States should come to the CCW meeting prepared to provide their views on the key “touchpoints” of human/machine interaction in weapons systems. These include design aspects, such as how certain features may be encoded as target objects; how the area or boundary of operation may be fixed; the time period over which a system may operate; and any possibility of human intervention to terminate the operation and recall the weapon system.
 
Based on these touchpoints, states should be prepared to explain how control is applied over existing weapons systems, especially those with certain autonomous or automatic functions.
 
What does the Human Rights Council say about killer robots?
 
The first multilateral debate on killer robots took place at the Human Rights Council in May 2013, but states have not considered this topic at the Council since then. Countries such as Austria, Brazil, Ireland, Sierra Leone, and South Africa affirm the relevance of human rights considerations and the Council in the international debate over fully autonomous weapons.
 
In February 2016, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions and the Special Rapporteur on the rights to freedom of peaceful assembly and of association issued a report recommending that “autonomous weapons systems that require no meaningful human control should be prohibited.”
 
The UN Special Rapporteur on extrajudicial, summary or arbitrary executions last addressed a CCW meeting on lethal autonomous weapons in April 2016. Human rights considerations have since received little attention in the CCW talks, which raises the question of how to address human rights concerns with these weapons, particularly their use in law enforcement, border control and other circumstances outside of armed conflict.
 
http://www.stopkillerrobots.org/2018/03/fiveyears/
 
Aug. 2017
 
UN Secretary-General's High Representative for Disarmament Affairs, Izumi Nakamitsu: “There are currently no multilateral standards or regulations covering military Artificial Intelligence (AI) applications.”
 
The United Nations says it is “closely following developments related to the prospect of weapons systems that can autonomously select and engage targets, with concern that technological developments may outpace normative deliberations.” It expresses hope that UN member states “make meaningful progress toward a shared understanding on how to ensure the core values of the international community are safeguarded in this context.”
 
That’s according to a 22 May 2017 letter sent to the Campaign to Stop Killer Robots by the new Under Secretary-General High Representative for Disarmament Affairs, Izumi Nakamitsu, on behalf of the new UN Secretary-General, António Guterres. Guterres began his term on 1 January 2017, while Nakamitsu became the new UN disarmament chief on 1 May after working for the UN in humanitarian assistance and peacekeeping.
 
Nakamitsu first elaborated the UN’s “fundamental concerns” over killer robots in an address to the high-level “Artificial Intelligence for Good” summit convened by the International Telecommunication Union (ITU) in Geneva on 7 June. The six-page statement finds that fully autonomous weapon systems raise serious questions, including over their potential impact on international peace and security, the implications for global norms and mechanisms governing warfare, likely proliferation, and the possibility that they will be “sought after by unscrupulous actors with malicious intent.”
 
Under the heading of “What can we do?” the UN disarmament chief finds “there are currently no multilateral standards or regulations covering military AI applications.” She expresses the UN’s support for the process to discuss lethal autonomous weapons at the Convention on Conventional Weapons (CCW) in Geneva.
 
Nakamitsu says that states should decide “what they consider to be the acceptable degree of human control over the lethal functions of a weapon system, and whether a specific international treaty or instrument is required to ensure that control is maintained.”
 
However, the CCW process on killer robots is faltering and will not convene until November 2017 at the earliest, more than a year after the last substantive talks on the topic. A crucial week of formal discussions on killer robots that was due to take place in Geneva in April 2017 and then rescheduled to August has been cancelled because several states, most notably Brazil, failed to pay their dues for the convention’s meetings.
 
The Campaign to Stop Killer Robots strongly regrets this development and is working with Brazil and others to help resolve it so that the CCW process can continue. At this time, the campaign is intensifying its outreach in national capitals to check on the status of policy development and encourage legislative initiatives to ban fully autonomous weapons.
 
The campaign aims to engage at the regional level to build awareness and support for a collective response and it continues to explore other avenues that could lead states to adopt a new international instrument to retain meaningful human control over the critical functions of weapons systems.
 
Both the UN’s letter and statement call for “inclusive and comprehensive dialogue” on the concerns posed by lethal autonomous weapons systems. Nakamitsu recommends a “multi-sectoral and multi-stakeholder exchange” between governments and “civil society activists, the scientific community and the private sector.”
 
She views “human dignity and human security” as “essential” elements or principles to guide discussion, including the development of “human-centred norms.”
 
On 29 June 2017, Nakamitsu met with Campaign to Stop Killer Robots representatives Nobel Peace Laureate Jody Williams and Mary Wareham of Human Rights Watch in New York, where they discussed the revolutionary nature of fully autonomous weapons and the paradigm shift they would constitute for the future conduct of warfare.
 
The UN letter was in response to March correspondence from the campaign. Since its launch in 2013, the campaign has engaged in regular dialogue with the UN disarmament chief, including previous representatives Angela Kane (until 2015) and then Kim Won-soo (until 2017).
 
In his remarks to the AI for Good summit, the Secretary-General of campaign co-founder Amnesty International, Salil Shetty, reiterated the urgent need for a pre-emptive ban on fully autonomous weapons. http://bit.ly/2x6PA6I http://www.stopkillerrobots.org/
 
Dec. 2016
 
Formalize ‘Killer Robots’ Talks; Aim for Ban - Fully Autonomous Weapons on Disarmament Agenda
 
Governments should agree at the upcoming multilateral disarmament meeting in Geneva to formalize their talks on fully autonomous weapons, with an eye toward negotiating a preemptive ban, Human Rights Watch said in a report released today.
 
The 49-page report, Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban, rebuts 16 key arguments against a ban on fully autonomous weapons.
 
Fully autonomous weapons, also known as lethal autonomous weapons systems and ‘killer robots,’ would be able to select and attack targets without meaningful human control.
 
These weapons and others will be the subject of the five-year Review Conference of the Convention on Conventional Weapons (CCW) from December 12-16, 2016.
 
“It’s time for countries to move beyond the talk shop phase and pursue a preemptive ban,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, a co-founder of the Campaign to Stop Killer Robots. “Governments should ensure that humans retain control over whom to target with their weapons and when to fire.”
 
The report is co-published with Harvard Law School’s International Human Rights Clinic, for which Docherty is also a lecturer.
 
Human Rights Watch and the Harvard clinic examined the legal, moral, security, and other dangers of killer robots. They concluded that a ban is the only option for addressing all of the concerns. Other more incremental measures, such as adopting limited regulations on their use or codifying best practices for the development and acquisition of new weapons systems, have numerous shortcomings.
 
Countries participating in the Fifth Convention on Conventional Weapons Review Conference must decide by consensus on December 16 whether to continue deliberations on lethal autonomous weapons systems in 2017 and what shape the deliberations should take. Countries should establish a formal Group of Governmental Experts to delve more deeply into the problems of the weapons and to work toward new international law prohibiting them, said Human Rights Watch, which coordinates the Campaign to Stop Killer Robots.
 
Spurred to act by the efforts of the Campaign to Stop Killer Robots, countries that have joined the international treaty on conventional weapons have held three week-long informal meetings on lethal autonomous weapons systems since 2014. The formation of a Group of Governmental Experts at the review conference would compel countries to move beyond talk by formalizing the deliberations and creating the expectation of an outcome.
 
In past publications, Human Rights Watch has elaborated on the challenges that fully autonomous weapons would present for compliance with international humanitarian law and international human rights law and analyzed the lack of accountability that would exist for the unlawful harm caused by such weapons. The weapons would also cross a moral threshold, and their humanitarian and security risks would outweigh possible military benefits.
 
Several of the 121 countries that have joined the Convention on Conventional Weapons – including the United States, the United Kingdom, China, Israel, Russia, and South Korea – are developing weapons systems with increasing levels of autonomy. Critics who dismiss concerns about fully autonomous weapons depend on speculative arguments about the future of technology and the false presumption that technological developments could address all of the dangers posed by the weapons, Human Rights Watch and the Harvard clinic said.
 
Docherty will present the report at a Campaign to Stop Killer Robots briefing on December 14 at the United Nations in Geneva.
 
“The success of past disarmament treaties shows that an absolute prohibition on fully autonomous weapons would be achievable and effective,” Docherty said.
 
http://www.hrw.org/news/2017/01/15/banning-killer-robots-2017 http://www.hrw.org/news/2016/12/09/formalize-killer-robots-talks-aim-ban http://www.stopkillerrobots.org/2016/10/unga71/ http://www.buzzfeed.com/sarahatopol/how-to-save-mankind-from-the-new-breed-of-killer-robots http://www.stopkillerrobots.org/


 
