Autonomous weapon systems: Is it morally acceptable for a machine to make life and death decisions?
by International Committee of the Red Cross

13 April 2015

Convention on Certain Conventional Weapons (CCW), Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), 13-17 April 2015, Geneva. Statement of the ICRC.

The International Committee of the Red Cross (the ICRC) is pleased to contribute its views to this second CCW Meeting of Experts on "Lethal Autonomous Weapon Systems". The CCW, which is grounded in international humanitarian law (IHL), provides an important framework to further our understanding of the technical, legal, ethical and policy questions raised by the development and use of autonomous weapon systems in armed conflicts. This week will provide an opportunity to build on last year's meeting to develop a clearer understanding of the defining characteristics of autonomous weapon systems and their current state of development, so as to begin to identify the best approaches to addressing the legal and ethical concerns raised by this new technology of warfare.

At the outset we would like to highlight a few key points on which the ICRC believes attention should be focused this week. We are urging States to consider the fundamental legal and ethical issues raised by autonomy in the ‘critical functions’ of weapon systems before these weapons are further developed or deployed in armed conflicts.
We also wish to stress that our thinking about this complex subject continues to evolve as we gain a better understanding of current and potential technological capabilities, of the military purposes of autonomy in weapons, and of the resulting legal and ethical issues raised.

To ensure a focussed discussion, the ICRC believes that it will be important to have a clearer common understanding of the object of the discussion, and in particular of what constitutes an autonomous weapon system. Without engaging in a definition exercise, there is a need to set some boundaries for the discussion. As the ICRC proposed at last year’s CCW Meeting of Experts, an autonomous weapon system is one that has autonomy in its ‘critical functions’, meaning a weapon that can select (i.e. search for or detect, identify, track) and attack (i.e. intercept, use force against, neutralise, damage or destroy) targets without human intervention.

We have suggested that it would be useful to focus on how autonomy is developing in these ‘critical functions’ of weapon systems because these are the functions most relevant to ‘targeting decision-making’, and therefore to compliance with international humanitarian law, in particular its rules on distinction, proportionality and precautions in attack. Autonomy in the critical functions of selecting and attacking targets also raises significant ethical questions, notably when force is used autonomously against human targets. The ICRC believes that it would be most helpful to ground discussions on autonomous weapon systems in current and emerging weapon systems that are pushing the boundaries of human control over the critical functions.
Hypothetical scenarios about possible developments far off in the future may be inevitable when discussing a new and continuously evolving technology, but there is a risk that by focussing exclusively on such hypothetical scenarios, we will neglect autonomy in the critical functions of weapon systems that actually exist today, or that are currently in development and intended for deployment in the near future.

From what we understand, many of the existing autonomous weapon systems have autonomous ‘modes’, and therefore only operate autonomously for short periods. They also tend to be highly constrained in the tasks they are used for, the types of targets they attack, and the circumstances in which they are used. Most existing systems are also overseen in real-time by a human operator. However, future autonomous weapon systems could have more freedom of action to determine their targets, operate outside tightly constrained spatial and temporal limits, and encounter rapidly changing circumstances. The current pace of technological change and military interest in autonomy for weapon systems lend urgency to the international community’s consideration of the legal and ethical implications of these weapons.

As the ICRC has stressed in the past, closer examination of existing and emerging autonomous weapon systems may provide useful insights regarding what level of autonomy and human control may be considered acceptable or unacceptable, and under which circumstances, from a legal and ethical standpoint. In this respect, we encourage States to share as far as possible their legal reviews of existing weapon systems with autonomy in their critical functions. This would allow for more informed deliberations.

Based on discussions that took place last year in the CCW and elsewhere, there appears to be broad agreement among States on the need to retain human control over the critical functions of weapon systems.
States should now turn their attention to agreeing a framework for determining what makes human control of a weapon meaningful or adequate. Discussions should focus on the types of controls that are required, in which situations, and at which stages of the process – programming, deployment and/or targeting (selecting and attacking a target).

It is also not disputed that autonomous weapons intended for use in armed conflict must be capable of being used in accordance with international humanitarian law, in particular its rules of distinction, proportionality and precautions in attack. Indeed, weapons with autonomy in their critical functions are not being developed in a ‘legal vacuum’; they must comply with existing law.

Based on current and foreseeable robotics technology, it is clear that compliance with the core rules of IHL poses a formidable technological challenge, especially as weapons with autonomy in their critical functions are assigned more complex tasks and deployed in more dynamic environments than has been the case until now. There are therefore serious doubts about the ability of autonomous weapon systems to comply with IHL in all but the narrowest of scenarios and the simplest of environments. In this respect, it seems evident that overall human control over the selection of, and use of force against, targets will continue to be required.

In discussions this week, we encourage States that have deployed, or are currently developing, weapon systems with autonomy in their critical functions to share their experience of how they are ensuring that these weapons can be used in compliance with IHL, and in particular to share the limits and conditions imposed on the use of weapons with autonomous functions, including in terms of the required level of human control.
Lessons learned from the legal review of autonomy in the critical functions of existing and emerging weapon systems could help to provide a guiding framework for future discussions. In this respect, the ICRC welcomes the wide recognition of the obligation for States to carry out legal reviews of any new technologies of warfare they are developing or acquiring, including weapons with autonomy in some or all of their critical functions. CCW Meetings of States Parties and Review Conferences have in the past recalled the importance of legal reviews of new weapons, which are a legal requirement for States party to Additional Protocol I to the Geneva Conventions. The ICRC encourages States that have not yet done so to establish weapons review mechanisms and stands ready to advise States in this regard. States may wish to refer to the ICRC’s Guide to the Legal Review of New Weapons, Means and Methods of Warfare.

Finally, the ICRC wishes to again emphasise the concerns raised by autonomous weapon systems under the principles of humanity and the dictates of public conscience. As we have previously stated, there is a sense of deep discomfort with the idea of any weapon system that places the use of force beyond human control. In this respect, we would like this week to hear the views of delegations on the following crucial question for the future of warfare, and indeed for humanity: would it be morally acceptable, and if so under what circumstances, for a machine to make life and death decisions on the battlefield without human intervention?

http://www.icrc.org/en/document/licence-kill-autonomous-weapons

November 14, 2014

Countries warn of potential dangers of autonomous weapons systems they say are at risk of violating international and humanitarian law.
“Killer robots” – autonomous weapons systems that can identify and destroy targets in the absence of human control – should be strictly monitored to prevent violations of international or humanitarian law, nations from around the world demanded on Thursday.

The European Union, France, Spain, Austria, Ireland, the Netherlands, Croatia, Mexico and Sierra Leone, among other states, lined up at a special UN meeting in Geneva to warn of the potential dangers of this rapidly advancing technology. Several countries spoke of the need for ongoing scrutiny to ensure that the weapons conformed to the Geneva conventions’ rules on proportionality in war. The Spanish delegation went further, invoking the possibility of a new arms race as developed countries scrambled to get ahead.

Ireland, the Netherlands and other countries called for “meaningful human control” of lethal weapons to be enshrined in international law, although the meeting also acknowledged that the precise definition of that principle had yet to be clarified.

The Geneva meeting was the second major gathering of world powers this year to discuss the looming possibility of fully self-operating lethal weapons. As such, it was an indication of mounting global concern about the technology, as its adoption by military forces gathers apace.

The US, the leader in the field, has already switched most of its aerial surveillance capabilities to unmanned aircraft – though the drones are still controlled by human pilots. It is a natural next step for the US air force to develop systems that can both deliver and then operate missiles and bombs robotically, with only minimal human intervention. The New York Times reported this week that Lockheed Martin has developed a long-range anti-ship missile for the US air force and navy that can fly itself, with no human touch, for hundreds of miles, changing its flight-path autonomously to avoid radar detection.
Britain, Israel and Norway already carry out attacks on radar installations, tanks and ships using autonomous drones and missiles, the paper said.

At the previous Geneva meeting on killer robots, Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, called for an outright ban. “Machines lack morality and mortality, and as a result should not have life and death powers over humans,” he said.

Human Rights Watch, which is a co-founder of the Campaign to Stop Killer Robots, told Thursday’s plenary that a ban was the only practical solution. The group lamented the fact that the UN had spent only eight or nine days over the past two years focused on an area that was fast-moving and raised huge legal and ethical issues. “There is a sense of urgency about how we deal with killer robots. Technology is racing ahead,” it said.

Regulation of autonomous weapons falls under the so-called “convention on certain conventional weapons” or CCW – a part of the Geneva conventions that deals with the impact of the tools of war on civilian populations. Under the CCW, weapons that are deemed to affect civilians indiscriminately or to cause inhumane suffering to combatants can be banned or heavily restricted.

http://www.theguardian.com/world/2014/nov/13/killer-robots-strictly-monitored-un-meeting-geneva
http://www.stopkillerrobots.org/
http://www.icrc.org/en/document/lethal-autonomous-weapons-systems-LAWS
http://www.hrw.org/news/2015/04/08/killer-robots-accountability-gap
http://www.icrc.org/en/resource-centre