Autonomous weapons systems that require no meaningful human control should be prohibited
by Campaign to Stop Killer Robots, agencies
 
4 Apr. 2018
 
Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to “killer robots”.
 
More than 50 leading academics signed the letter calling for a boycott of Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to “accelerate the arms race to develop” autonomous weapons.
 
“There are plenty of great things you can do with AI that will save lives, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern,” said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. “This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms.”
 
The boycott comes ahead of a United Nations meeting in Geneva next week on autonomous weapons, and more than 20 countries have already called for a total ban on killer robots. The use of AI in militaries around the world has raised fears of a Terminator-like scenario, as well as questions about the accuracy of such weapons and their ability to distinguish friend from foe.
 
Hanwha is one of South Korea’s largest weapons manufacturers and makes cluster munitions, which are banned in 120 countries under an international treaty. South Korea, along with the US, Russia and China, is not a signatory to the convention.
 
Walsh became concerned when a Korea Times article described KAIST as “joining the global competition to develop autonomous arms”. He promptly wrote to the university asking questions but did not receive a response.
 
Subsequently, KAIST’s president, Sung-Chul Shin, said in a statement that he “would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots.”
 
“As an academic institution, we value human rights and ethical standards to a very high degree. I reaffirm once again that KAIST will not conduct any research activities counter to human dignity, including autonomous weapons lacking meaningful human control.”
 
Toby Walsh said the response “sets a very clear precedent for the future. I hope now that KAIST will lobby the Republic of Korea to call for meaningful human control at next week’s UN meeting on the topic of killer robots. Such small steps, we hope, will eventually lead to an outright ban.” http://bit.ly/2GC1A9l http://bit.ly/2yV0Nwv http://www.hrw.org/news/2018/09/11/were-running-out-time-stop-killer-robot-weapons
 
Apr. 2018
 
April 2018 marks five years since the launch of Campaign to Stop Killer Robots. It is also the fifth time since 2014 that governments are convening at the Convention on Conventional Weapons (CCW) in Geneva to discuss concerns over lethal autonomous weapons systems, also known as fully autonomous weapons or “killer robots.”
 
The campaign urges states to participate in the CCW Group of Governmental Experts meeting, which opens at the United Nations (UN) on Monday, 9 April, and to commit to retaining meaningful human control over weapons systems and over individual attacks.
 
Why the concern about killer robots?
 
Armed drones and other autonomous weapons systems with decreasing levels of human control are currently in use and development by high-tech militaries including the US, China, Israel, South Korea, Russia, and the UK. The concern is that a variety of available sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.
 
If the trend towards autonomy continues, humans may start to fade out of the decision-making loop for certain military actions, perhaps retaining only a limited oversight role, or simply setting broad mission parameters.
 
Several states, the Campaign to Stop Killer Robots, artificial intelligence experts, faith leaders, and Nobel Peace Laureates, among others, fundamentally object to permitting machines to determine who or what to target on the battlefield or in policing, border control, and other circumstances. Such a far-reaching development raises an array of profound ethical, human rights, legal, operational, proliferation, technical, and other concerns.
 
While the capabilities of future technology are uncertain, there are strong reasons to believe that devolving more decision making over targeting to weapons systems themselves will erode the fundamental obligation that rules of international humanitarian law (IHL) and international human rights law be applied by people, and with sufficient specificity to make them meaningful.
 
Furthermore, with an erosion of human responsibility to apply legal rules at an appropriate level of detail there will likely come an erosion of human accountability for the specific outcomes of such attacks. Taken together, such developments would produce a stark dehumanization of military or policing processes.
 
What is a “killer robot”?
 
A weapons system that identifies, selects and employs force against targets without meaningful human control should be considered a lethal autonomous weapons system. It would have no human in the decision-making loop when the system selects and engages the target of an attack. Applying human control only as a function of design and in an initial deployment stage would fail to fulfill the IHL obligations that apply to commanders in relation to each “attack.”
 
Why the need for “human control”?
 
Sufficient human control over the use of weapons, and over their effects, is essential to ensuring that the use of a weapon is morally justifiable and can be legal. Such control is also required as a basis for accountability for the consequences of the use of force. To demonstrate that such control can be exercised, states must show that they understand the process by which specific systems identify individual target objects and understand the context, in space and time, in which the application of force may take place.
 
Given the development of greater autonomy in weapon systems, states should make it explicit that meaningful human control is required over individual attacks and that weapon systems that operate without meaningful human control should be prohibited. For human control to be meaningful, the technology must be predictable, the user must have relevant information, and there must be the potential for timely human judgement and intervention.
 
States should come to the CCW meeting prepared to provide their views on the key “touchpoints” of human/machine interaction in weapons systems. These include design aspects, such as how certain features may be encoded as target objects; how the area or boundary of operation may be fixed; the time period over which a system may operate; and any possibility of human intervention to terminate the operation and recall the weapon system.
 
Based on these touchpoints, states should be prepared to explain how control is applied over existing weapons systems, especially those with certain autonomous or automatic functions.
 
What does the Human Rights Council say about killer robots?
 
The first multilateral debate on killer robots took place at the Human Rights Council in May 2013, but states have not considered this topic at the Council since then. Countries such as Austria, Brazil, Ireland, Sierra Leone, and South Africa affirm the relevance of human rights considerations and the Council in the international debate over fully autonomous weapons.
 
In February 2016, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions and the Special Rapporteur on the rights to freedom of peaceful assembly and of association issued a report recommending that “autonomous weapons systems that require no meaningful human control should be prohibited.”
 
The UN Special Rapporteur on extrajudicial, summary or arbitrary executions last addressed a CCW meeting on lethal autonomous weapons in April 2016. Human rights are no longer considered relevant in the CCW talks, which raises the question of how to address human rights concerns with these weapons, particularly their use in law enforcement, border control and other circumstances outside of armed conflict.
 
* 22 states support the call to ban fully autonomous weapons: Algeria, Argentina, Bolivia, Brazil, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Uganda, Venezuela, and Zimbabwe.
 
* For more details see: http://www.stopkillerrobots.org/2018/03/fiveyears/


 


The critical importance of respect for international humanitarian law
by Peter Maurer
International Committee of the Red Cross (ICRC)
 
Apr. 2018
 
Speech given by Mr Peter Maurer, President of the ICRC, during the Conference on International Security.
 
I would like to give you an overview of ICRC’s analysis of global trends in contemporary battlefields, rooted in its frontline knowledge and engagement with different belligerents; and to highlight two key areas for further dialogue between military-security and humanitarian actors.
 
For more than 150 years, through its neutral and impartial humanitarian action, the ICRC has amassed key insights and experience as a frontline actor in favour of humanitarian space and as a neutral intermediary between belligerents. These insights have informed our legal work, while our principles and policies guide our practical response in almost all active conflicts.
 
In recent times, we’ve seen a strong new trend emerging which defies the distinction between internal and international armed conflict as articulated in the Geneva Conventions of 1949 and their Additional Protocols: protracted situations of long-term violence continue to rise, with local, regional and global actors involved, with different types of support for partners and allies, and with opposing, often volatile, coalitions of State and non-state actors.
 
While each conflict has its particular dynamics, it is shocking to see the deep humanitarian impact of such fundamental transformations, which are often accompanied by a blurring of the lines between civilians and militaries and an unwillingness or inability to adequately protect those who are not participating in hostilities.
 
The exponential growth of needs, combined with limited response capacities, is leaving millions of people without hope for a dignified life.
 
Over the last two months alone I have visited, amongst other places, ICRC operations in Syria, Iraq, Libya, Sudan and the Central African Republic. What I have seen in these countries is confronting: increasingly fragmented actors; unrestrained strategies in the use of force and an obvious imbalance in weighing military necessity against the protection needs of civilians; easily available weapons as a result of irresponsible transfers to actors incapable or unwilling to implement the restraining rules of international humanitarian law; and, as a consequence, human suffering, social systems falling apart and massive displacement.
 
Whatever the motives by which present day warfare is legitimized, this cannot be an acceptable result for responsible leadership.
 
The figures give a good indication of the scale of the humanitarian needs of today’s world:
 
* 128 million people are in need of humanitarian assistance and protection worldwide;
* 65 million are displaced - the highest number since the Second World War;
* More than 1.5 billion people, including 350 million of the world’s extreme poor, live in an environment of continuous fragility, violence and conflict;
* The annual economic impact of conflict and violence is $14 trillion - or 14% of global GDP.
 
Today’s conflicts are increasingly protracted, causing compounding impacts on populations. While the ICRC was created as a humanitarian organisation to respond to emergencies, we find ourselves working for decades in protracted contexts. In our ten largest operations, we have been on the ground for an average of 36 years... and still the wars continue.
 
The urbanisation of warfare is one of the important factors contributing to this bleak picture of suffering. Around the world, it is estimated that some 50 million people suffer the effects of urban conflicts. Cities and urban areas are intrinsically more vulnerable, especially to the use of massive explosive force, and more exposed to the illegal tactic of using human shields.
 
Because of the significant likelihood of indiscriminate effects, we urge all parties to avoid the use of explosive weapons with a wide impact area in densely populated areas and to stop taking the civilian population hostage.
 
The consequences are devastating – not only in the immediate impacts of death and injury, but also in the erosion of basic infrastructure such as health, water, sanitation and education systems.
 
A sober analysis of our working environment tells us that our mitigating efforts through humanitarian assistance programs will have limited success if we do not make major efforts to shrink the needs through changes of behaviour on the battlefield. This will come first and foremost from respect for the rules of war, in particular the principles of distinction, proportionality and precaution.
 
Now I will turn to two critical areas of response – the dimensions of partnered warfare and of new technologies and cyber warfare.
 
Today no one fights alone. In many of the major conflicts in the Middle East, in Africa and beyond, coalitions pool their resources against common enemies.
 
We see wars that are fought by proxy, through both official and unofficial partnerships. This can create a climate in which political and military stakeholders see themselves as freed from the scrutiny of accountability processes. Partnered warfare comes in different forms: depending on circumstances, the focus can be on advice, training, equipment, surveillance, intelligence sharing, logistics, combined operations, kinetic support, detention operations and more.
 
In light of the global trends of conflict that we are witnessing, it has become increasingly urgent for States to look at how they can better leverage their partnerships and support to ensure civilians are better protected.
 
In partnerships, as in all other cases, the ICRC encourages all States to lead by example: to steadfastly respect their own obligations under international humanitarian law.
 
All States are obliged to ensure respect for IHL by the parties to armed conflicts, by refraining from encouraging or assisting violations of IHL, and by proactively influencing the parties to respect IHL.
 
The ICRC has developed a series of practical recommendations for States supporting parties to an armed conflict. To put it plainly, allied States have a responsibility to make sure their partners are not taking the cheap options.
 
Allied States can take a range of measures to ensure respect for international humanitarian law by their partners, such as: vetting potential partners to ensure they have the capacity and willingness to apply IHL; clarifying roles and responsibilities; and ensuring proper application of the rules governing the conduct of hostilities, detention and protection of civilians.
 
And a particular note on the arms trade: arms transfers are at their highest levels since the end of the Cold War, with a significant proportion going to those fighting in the most brutal of wars. States have a special responsibility; they must use their influence to ensure partners respect IHL, and cease transferring weapons where there is a substantial or clear risk that the weapons would be used to commit IHL violations.
 
I believe there is huge untapped potential for States to positively use their influence over those they partner with or support. I have seen the positive impact when allied States do take such steps and measures and I encourage all States to examine their responsibilities and actions. Indeed, we look forward to constructively furthering this discussion with States over the coming year.
 
In today’s world, while major conflicts are happening in the physical world with kinetic power, we can’t ignore the new battlefields. New technologies are rapidly giving rise to unprecedented methods of warfare.
 
Innovations that yesterday were science fiction could cause catastrophe tomorrow, including fully autonomous combat robots and laser weapons. Cyber-attacks are a growing issue of concern because of their potential for serious humanitarian consequences.
 
The ICRC is urging States to look at the humanitarian impact of conflict in the virtual world and to uphold the protections afforded by the law.
 
In the ICRC’s view, it is clear that the general rules of international humanitarian law apply to and restrict the use of cyber capabilities as means and methods of warfare during armed conflicts. IHL prohibits cyber-attacks against civilian objects or networks, and prohibits indiscriminate and disproportionate cyber-attacks.
 
At the same time, the ICRC is raising critical questions such as: What is a security incident versus an act of war? How can cyber-attacks distinguish between civilian objects and military objectives? How is their proportionality to be assessed? And what are States’ views on these questions?
 
The interconnectedness of military and civilian networks poses a significant practical and legal challenge in terms of protecting civilians from the dangers of cyber warfare. These challenges must be addressed. They also underscore the importance for States that develop or acquire cyber warfare capabilities – whether for offensive or defensive purposes – to assess their lawfulness under international humanitarian law.
 
To be clear, by asserting that IHL applies to cyber means and methods of warfare, the ICRC is not condoning cyber warfare or the militarization of cyberspace. Any resort to force by a State, whether physical or through cyberspace, remains constrained by the UN Charter.
 
The point is that - beyond the requirements of the UN Charter - IHL further restricts the use of cyber capabilities during armed conflicts.
 
I have focused my address today on the critical importance of international humanitarian law to prevent and mitigate the impacts of war. As seen in our recent history, when the law is respected, the cycle of violence can be broken, the impact of war contained and the foundations built for future peace and security, and much needed political solutions.
 
International humanitarian law is an inherently practical tool. It can shape behavior and influence those bound by it to exercise restraint. Each of its rules contains a balance between humanity and military necessity, allowing armies to exercise common decency. The law provides a basis, a shared language, for warring parties to eventually come to the table, and find common ground.
 
In today’s world, where protracted, urban wars are the norm and brutal conflicts are causing untold human suffering, we must use the tools at our disposal to break the cycle of violence and instability, and we must start work today. http://bit.ly/2v2v5M6

