Public opposition to killer robots grows while states continue to drag their feet
by Campaign to Stop Killer Robots, ICRC, agencies
25 Mar. 2019
Autonomous weapons that kill must be banned, insists UN chief. (UN News)
UN Secretary-General António Guterres has called on artificial intelligence (AI) experts meeting in Geneva on Monday to push ahead with their work to restrict the development of lethal autonomous weapons systems, or LAWS, as they are also known.
In a message to the Group of Governmental Experts, the UN chief said that “machines with the power and discretion to take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law”.
No country or armed force is in favour of such “fully autonomous” weapon systems that can take human life, Mr Guterres insisted, before welcoming the panel’s statement last year that “human responsibility for decisions on the use of weapons systems must be retained, since accountability cannot be transferred to machines”.
Although this 2018 announcement was an “important line in the sand” by the Group of Governmental Experts – which meets under the auspices of the Convention on Certain Conventional Weapons (CCW) – the UN chief noted in his statement that some Member States believe new legislation is required, while others would prefer less stringent political measures and guidelines that could be agreed on.
Nonetheless, it is time for the panel “to deliver” on LAWS, the UN chief said, adding that “it is your task now to narrow these differences and find the most effective way forward…The world is watching, the clock is ticking and others are less sanguine. I hope you prove them wrong.”
The LAWS meeting is one of two planned for this year, which follow earlier Governmental Expert meetings in 2017 and 2018 at the UN in Geneva.
The Group’s agenda covers technical issues related to the use of lethal autonomous weapons systems, including the challenges the technology poses to international humanitarian law, as well as human interaction in the development, deployment and use of emerging tech in LAWS.
In addition to the Governmental Experts, participation is expected from a wide array of international organizations, civil society, academia, and industry.
The CCW’s full name is the 1980 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, which entered into force on 2 December 1983.
The Convention currently has 125 States Parties. Its purpose is to prohibit or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.
In previous comments on AI, the Secretary-General likened the technology to “a new frontier” with “advances moving at warp speed”.
“Artificial Intelligence has the potential to accelerate progress towards a dignified life, in peace and prosperity, for all people,” he said at the AI for Good Global Summit in 2017, adding that there are also serious challenges and ethical issues which must be taken into account – including cybersecurity, human rights and privacy. http://bit.ly/2TxM5RB
24 Mar. 2019
Resistance to killer robots growing. (DW)
Activists from 35 countries met in Berlin this week to call for a ban on lethal autonomous weapons, ahead of new talks on such weapons in Geneva. They say that if Germany took the lead, other countries would follow.
"I can build you a killer robot in just two weeks," says Noel Sharkey as he leans forward with a warning gaze. The white-haired English professor is a renowned specialist in robotics and artificial intelligence (AI). He was in Berlin to participate in an international meeting of the Campaign to Stop Killer Robots that ended on Friday.
Sharkey objects to talking about "lethal autonomous weapons systems" (LAWS) as if they were something out of a science-fiction novel. Fully autonomous weapons systems are in fact a well-established reality, he says, adding that there is no need to argue about the definition thereof: These are weapons that seek, select and attack targets on their own.
That is also how the International Committee of the Red Cross (ICRC) defines them. Soldiers no longer push the firing button with such weapons; instead, the weapons themselves use built-in software to find and strike targets. Such weapons can come in the form of missiles, unmanned ground vehicles, submarines, or swarms of mini-drones.
The reality of fully autonomous weapons systems was on full display this February at IDEX in Abu Dhabi, the largest arms fair in the Middle East, where German arms manufacturers also enthusiastically hawked their new weapons with autonomous functions.
Violation of international law
The ICRC says that the use of such weapons is a clear breach of international law. "We all know that a machine is incapable of making moral decisions," emphasizes Sharkey, one of the leading figures in the Campaign to Stop Killer Robots.
He notes that a machine cannot differentiate between combatants and civilians as stipulated by international law, referring to failed attempts at facial recognition in which innocent civilians were identified as supposed criminals.
Facial recognition depends on artificial intelligence (AI) to autonomously find a person of interest. Once the machine has identified that person, it can attack on its own. A growing number of critics are horrified by such a scenario.
Meanwhile, some 100 non-governmental organizations (NGOs) have joined the global Campaign to Stop Killer Robots. At their Berlin meeting, those groups called on Germany to demand that autonomous weapons systems that violate international law be banned. The current German government affirmed such intentions in its coalition negotiations in 2018. Nevertheless, it has meekly pushed only for non-binding political declarations at the UN in Geneva.
"That isn't enough to establish a ban," says Thomas Küchenmeister of the NGO Facing Finance. He says the German government should join the 28 countries currently pushing for a ban on lethal autonomous weapons systems.
UN Secretary-General Antonio Guterres and the European Parliament are also in favor of a ban. Recently, the German Informatics Society (GI), an organization of computer researchers, as well as the influential Federal Association of German Industry (BDI), also called for a legally binding ban on LAWs.
Although countries such as the USA and China are leading the world in AI use, much of the research that such systems depend on comes from Europe. That lends great weight to European voices in the ongoing debate.
Noel Sharkey is convinced: "If Germany takes the lead, others will follow." Sharkey also warns that non-binding political declarations, like those the German government is currently championing, provide "perfect cover" for countries opposed to a ban. Such countries include Russia, Israel and the USA.
The German government has argued that it is essentially in favor of a ban, but that it has pushed the notably weaker political declaration for tactical reasons. The logic behind that approach is that it allows Germany to maintain a dialogue with countries such as the USA, rather than alienating them altogether.
Nobel Peace Laureate Jody Williams is wholly unconvinced by that argument and has called on German Foreign Minister Heiko Maas to reconsider his position. Williams argued that anyone waiting for the USA to come out in favor of a ban will be waiting forever.
International talks on how to regulate LAWs will be held in Geneva, Switzerland, from March 25 to March 29.
Public opposition to killer robots grows while states continue to drag their feet.
More than three in five people across 26 countries oppose the development of autonomous weapons that could select and kill targets without human intervention, according to a new poll commissioned by the Campaign to Stop Killer Robots.
The poll, which was carried out by Ipsos MORI, found that:
In the 26 countries surveyed in 2018, more than three in every five people (61%) oppose the development of lethal autonomous weapons systems.
Two-thirds (66%) of those opposed to lethal autonomous weapons systems were most concerned that they would “cross a moral line because machines should not be allowed to kill.”
More than half (54%) of those who opposed said they were concerned that the weapons would be “unaccountable.”
A near-identical survey in 23 countries in January 2017 found that 56% of respondents were opposed to lethal autonomous weapons systems, indicating that opposition is growing.
More than half of respondents opposed killer robots in China (60%); Russia (59%); the UK (54%); France (59%), and the USA (52%).
The Campaign to Stop Killer Robots is a growing global coalition of NGOs, including Amnesty International, that is working to ban fully autonomous weapons.
“This poll shows that the states blocking a ban on killer robots are totally out of step with public opinion. Governments should be protecting people from the myriad risks that killer robots pose, not rushing into a new arms race which could have terrifying consequences,” said Rasha Abdul Rahim, Acting Deputy Director of Amnesty Tech.
“We still have time to halt the development and proliferation of fully autonomous weapons, but we won’t have that luxury for long. Governments should take note of this poll and urgently begin negotiating a new treaty to prohibit these horrifying weapons. Only this can help ensure respect for international law and address ethical and security concerns regarding delegating the power to make life-and-death decisions to machines.”
Amnesty International is calling for a total ban on the development, production and use of fully autonomous weapon systems, in light of the serious human rights, humanitarian and security risks they pose. The use of autonomous weapons without meaningful and effective human control would undermine the right to life and other human rights and create an accountability gap if, once deployed, they are able to make their own determinations about the use of lethal force.
However, a minority of states at the November 2018 annual meeting of the Convention on Conventional Weapons used consensus rules to thwart meaningful diplomatic progress. Russia, Israel, South Korea, and the USA indicated at the meeting that they would not support negotiations for a new treaty, but the poll results show that more than half of respondents in Russia (59%) and the USA (52%) oppose autonomous weapons.
More than half of respondents opposed autonomous weapons in China (60%), South Korea (74%) and the UK (54%), which are among the leading states developing this technology.
Autonomous weapons: States must agree on what human control means in practice. (ICRC)
Should a weapon system be able to make its own “decision” about who to kill?
The International Committee of the Red Cross (ICRC) believes that the answer is no, and today is calling on States to agree to strong, practical and future-proof limits on autonomy in weapon systems.
During the annual meeting of the States party to the Convention on Certain Conventional Weapons in Geneva, November 21-23, the ICRC will urge that the new mandate of the Group of Governmental Experts focus on determining the type and degree of human control that would be necessary to comply with international humanitarian law and satisfy ethical concerns. Several questions need to be answered:
What is the level of human supervision, including the ability to intervene and deactivate, that would be required during the operation of a weapon that can autonomously select and attack targets? What is the level of predictability and reliability that would be required, also taking into account the weapon’s tasks and the environment of use?
What other operational constraints would be required, notably on the weapon system’s tasks, its targets, the environment in which it operates (e.g. populated or unpopulated area), the duration of its operation, and the scope of its movement?
"It is now widely accepted that human control must be maintained over weapon systems and the use of force, which means we need limits on autonomy," said ICRC President Peter Maurer. “Now is the moment for States to determine the level of human control that is needed to satisfy ethical and legal considerations."
Only humans can make context-specific judgements of distinction, proportionality and precautions in combat. Only humans can behave ethically, uphold moral responsibility and show mercy and compassion. Machines cannot exercise the complex and uniquely human judgements required on battlefields in order to comply with international humanitarian law. As inanimate objects, they will never be capable of embodying human conscience or ethical values.
Given militaries’ significant interest in increasingly autonomous weapons, there is a growing risk that humans will become so far removed from the choice to use force that life-and-death decision-making will effectively be left to sensors and software.
“Humans cannot delegate the decision to use force and violence to machines. Decisions to kill, injure and destroy must remain with humans. It is humans who apply the law and are obliged to respect it,” said Kathleen Lawand, the head of the ICRC’s arms unit.
Basic humanity and the public conscience support a ban on fully autonomous weapons, Human Rights Watch said in a report released today. Countries participating in an upcoming international meeting on such “killer robots” should agree to negotiate a prohibition on the weapons systems’ development, production, and use.
The 46-page report, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots, finds that fully autonomous weapons would violate what is known as the Martens Clause. This long-standing provision of international humanitarian law requires emerging technologies to be judged by the “principles of humanity” and the “dictates of public conscience” when they are not already covered by other treaty provisions.
“Permitting the development and use of killer robots would undermine established moral and legal standards,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. “Countries should work together to preemptively ban these weapons systems before they proliferate around the world.”
The 1995 preemptive ban on blinding lasers, which was motivated in large part by concerns under the Martens Clause, provides precedent for prohibiting fully autonomous weapons as they come closer to becoming reality.
The report was co-published with the Harvard Law School International Human Rights Clinic, for which Docherty is associate director of armed conflict and civilian protection.
More than 70 governments will convene at the United Nations in Geneva from August 27 to 31, 2018, for their sixth meeting since 2014 on the challenges raised by fully autonomous weapons, also called lethal autonomous weapons systems. The talks under the Convention on Conventional Weapons, a major disarmament treaty, were formalized in 2017, but they are not yet directed toward a specific goal.
Human Rights Watch and the Campaign to Stop Killer Robots urge states party to the convention to agree to begin negotiations in 2019 for a new treaty that would require meaningful human control over weapons systems and the use of force. Fully autonomous weapons would select and engage targets without meaningful human control.
To date, 26 countries have explicitly supported a prohibition on fully autonomous weapons. Thousands of scientists and artificial intelligence experts, more than 20 Nobel Peace Laureates, and more than 160 religious leaders and organizations of various denominations have also demanded a ban. In June, Google released a set of ethical principles that includes a pledge not to develop artificial intelligence for use in weapons.
At the Convention on Conventional Weapons meetings, almost all countries have called for retaining some form of human control over the use of force. The emerging consensus for preserving meaningful human control, which is effectively equivalent to a ban on weapons that lack such control, reflects the widespread opposition to fully autonomous weapons.
Human Rights Watch and the Harvard clinic assessed fully autonomous weapons under the core elements of the Martens Clause. The clause, which appears in the Geneva Conventions and is referenced by several disarmament treaties, is triggered by the absence of specific international treaty provisions on a topic. It sets a moral baseline for judging emerging weapons.
The groups found that fully autonomous weapons would undermine the principles of humanity, because they would be unable to apply either compassion or nuanced legal and ethical judgment to decisions to use lethal force. Without these human qualities, the weapons would face significant obstacles in ensuring the humane treatment of others and showing respect for human life and dignity.
Fully autonomous weapons would also run contrary to the dictates of public conscience. Governments, experts, and the broader public have widely condemned the loss of human control over the use of force.
Partial measures, such as regulations or political declarations short of a legally binding prohibition, would fail to eliminate the many dangers posed by fully autonomous weapons. In addition to violating the Martens Clause, the weapons raise other legal, accountability, security, and technological concerns.
In previous publications, Human Rights Watch and the Harvard clinic have elaborated on the challenges that fully autonomous weapons would present for compliance with international humanitarian law and international human rights law, analyzed the gap in accountability for the unlawful harm caused by such weapons, and responded to critics of a preemptive ban.
The 26 countries that have called for the ban are: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China (use only), Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, the Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.
The Campaign to Stop Killer Robots, which began in 2013, is a coalition of 75 nongovernmental organizations in 32 countries that is working to preemptively ban the development, production, and use of fully autonomous weapons. Docherty will present the report at a Campaign to Stop Killer Robots briefing for CCW delegates scheduled on August 28 at the United Nations in Geneva.
“The groundswell of opposition among scientists, faith leaders, tech companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold,” Docherty said. “Their concerns, shared by many governments, deserve an immediate response.”
April 2018 marks five years since the launch of the Campaign to Stop Killer Robots. It is also the fifth time since 2014 that governments are convening at the Convention on Conventional Weapons (CCW) in Geneva to discuss concerns over lethal autonomous weapons systems, also known as fully autonomous weapons or “killer robots.”
The campaign urges states to participate in the CCW Group of Governmental Experts meeting, which opens at the United Nations (UN) on Monday, 9 April, and to commit to retain meaningful human control of weapons systems and over individual attacks.
Why the concern about killer robots?
Armed drones and other autonomous weapons systems with decreasing levels of human control are currently in use and development by high-tech militaries including the US, China, Israel, South Korea, Russia, and the UK. The concern is that a variety of available sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control.
If the trend towards autonomy continues, humans may start to fade out of the decision-making loop for certain military actions, perhaps retaining only a limited oversight role, or simply setting broad mission parameters.
Several states, the Campaign to Stop Killer Robots, artificial intelligence experts, faith leaders, and Nobel Peace Laureates, among others, fundamentally object to permitting machines to determine who or what to target on the battlefield or in policing, border control, and other circumstances. Such a far-reaching development raises an array of profound ethical, human rights, legal, operational, proliferation, technical, and other concerns.
While the capabilities of future technology are uncertain, there are strong reasons to believe that devolving more decision making over targeting to weapons systems themselves will erode the fundamental obligation that rules of international humanitarian law (IHL) and international human rights law be applied by people, and with sufficient specificity to make them meaningful.
Furthermore, with an erosion of human responsibility to apply legal rules at an appropriate level of detail there will likely come an erosion of human accountability for the specific outcomes of such attacks. Taken together, such developments would produce a stark dehumanization of military or policing processes.
What is a “killer robot”?
A weapons system that identifies, selects and employs force against targets without meaningful human control should be considered a lethal autonomous weapons system. It would have no human in the decision-making loop when the system selects and engages the target of an attack. Applying human control only as a function of design and in an initial deployment stage would fail to fulfill the IHL obligations that apply to commanders in relation to each “attack.”
Why the need for “human control”?
Sufficient human control over the use of weapons, and of their effects, is essential to ensuring that the use of a weapon is morally justifiable and can be legal. Such control is also required as a basis for accountability over the consequences of the use of force. To demonstrate that such control can be exercised, states must show that they understand the process by which specific systems identify individual target objects and understand the context, in space and time, where the application of force may take place.
Given the development of greater autonomy in weapon systems, states should make it explicit that meaningful human control is required over individual attacks and that weapon systems that operate without meaningful human control should be prohibited. For human control to be meaningful, the technology must be predictable, the user must have relevant information, and there must be the potential for timely human judgement and intervention.
States should come to the CCW meeting prepared to provide their views on the key “touchpoints” of human/machine interaction in weapons systems. These include design aspects, such as how certain features may be encoded as target objects; how the area or boundary of operation may be fixed; the time period over which a system may operate; and any possibility of human intervention to terminate the operation and recall the weapon system.
Based on these touchpoints, states should be prepared to explain how control is applied over existing weapons systems, especially those with certain autonomous or automatic functions.
What does the Human Rights Council say about killer robots?
The first multilateral debate on killer robots took place at the Human Rights Council in May 2013, but states have not considered this topic at the Council since then. Countries such as Austria, Brazil, Ireland, Sierra Leone, and South Africa affirm the relevance of human rights considerations and the Council in the international debate over fully autonomous weapons.
In February 2016, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions and the Special Rapporteur on the rights to freedom of peaceful assembly and of association issued a report recommending that “autonomous weapons systems that require no meaningful human control should be prohibited.”
http://www.stopkillerrobots.org/2018/03/fiveyears/
http://www.hrw.org/news/2018/08/21/killer-robots-fail-key-moral-legal-test
http://www.stopkillerrobots.org/2018/08/unsg/
http://www.amnesty.org/en/latest/news/2019/01/public-opposition-to-killer-robots-grows-while-states-continue-to-drag-their-feet/
http://www.stopkillerrobots.org/2019/01/global-poll-61-oppose-killer-robots/
http://www.politico.eu/article/killer-robots-overran-united-nations-lethal-autonomous-weapons-systems/
http://www.dw.com/en/resistance-to-killer-robots-growing/a-48040866
Ravaged by Ebola and war, Congo named most neglected crisis of 2018
by IRIN News, IRC, ACAPS, agencies
Ten humanitarian crises and trends to watch in 2019. (IRIN News)
1. Climate displacement: Tomorrow’s emergencies today
From rising sea levels to withering drought and unpredictable weather: projections for what the world can expect if climate change remains unchecked are grave.
Yet extreme weather is already uprooting populations around the globe, and the aid sector and governments are struggling to cope. Vulnerable communities have long known what the aid sector is just beginning to articulate: climate change is a humanitarian issue, and its fingerprints are already evident in today’s most pressing emergencies.
2. Syria: It’s not over ‘til it’s over
A win by President Bashar al-Assad is increasingly seen as a fait accompli, but with large parts of the country still controlled by rebels and others seemingly up for grabs, the fighting isn’t finished, nor are attempts to influence the aid effort.
3. Outsourcing risk: Local responders shoulder the danger
In insecure areas with limited access, many international aid organisations subcontract donor-funded programmes to local groups – “remote management” in industry jargon. But aid analysts say this increasingly widespread strategy carries ethical and moral quandaries.
4. Ethiopia: Gambling on reforms
Loosening a political straitjacket on 105 million people and weakening central control at the same time: Prime Minister Abiy Ahmed’s moves could be the biggest relaxation of state control – and the least predictable humanitarian planning scenario – since the death of Ethiopia’s Emperor Menelik in 1913. In a country whose poorest have little room for error, his experiment is a high-stakes gamble that could backfire and cause less welcome upheavals.
5. Returning refugees: The meaning of ‘voluntary’
Pressure is building on millions of vulnerable people to return to dangerous homelands, with 2019 shaping up as a pivotal year for the world’s four largest refugee crises. Between them, Syrians, Afghans, South Sudanese, and Myanmar’s Rohingya account for well over half the world’s refugees, not to mention an almost equal number of internally displaced people.
6. Infectious diseases: Healthcare as a casualty of crisis
Countries experiencing humanitarian crises are seeing the re-emergence of previously forgotten diseases; for example, diphtheria, which took a toll on Yemenis, Venezuelans, and Rohingya refugees in Bangladesh in 2018. And political and structural challenges in some of the world’s least developed countries are fostering rich environments for many other diseases to thrive: cholera, Ebola, malaria, measles, MERS, yellow fever, and Zika.
7. South Sudan and Congo: Politics versus peace
2019 is a political year of promise for the Democratic Republic of Congo and South Sudan: the reason we’ve grouped them together. While the world watches to see if the DRC can achieve a first peaceful transfer of democratic power and if a fledgling peace deal in South Sudan will hold, how both situations develop also carries major implications for millions of people in need of assistance.
8. Anti-terror compliance: When aid falls foul of the law
It’s getting harder to stay on the right side of counter-terrorism legislation, NGOs say. That means more vulnerable people could be left without the aid they and their families depend on. And the penalties for the wrong type of engagement with sanctioned groups can be very costly, as the NGO Norwegian People’s Aid found.
9. Militancy in Africa: Weak governments struggle, civilians suffer
Violent jihadism continues to gain ground in Africa, representing a serious trial for weak and neglectful governments, and driving up humanitarian needs for civilians.
10. Yemen: Risk of fragmenting conflict
Yemen’s main warring parties are finally talking, and even shaking hands. But even if the 45-month war ends – and that’s a big if – the country could easily slide into a series of local conflicts, bringing little respite for the 24 million civilians the UN says need some sort of aid, be it food, clean water, or shelter.
http://www.irinnews.org/feature/2019/01/02/ten-humanitarian-crises-and-trends-watch-2019
http://www.irinnews.org/feature/2019/01/03/six-aid-policy-priorities-watch-2019
http://www.irinnews.org/opinion/2019/01/04/humanitarian-change-aid-sector-grassroots-accountability
Ravaged by Ebola and war, Congo named most neglected crisis of 2018. (Reuters)
With an Ebola epidemic raging and millions caught in a forgotten "catastrophe" of conflict and hunger, Democratic Republic of Congo (DRC) was the most neglected crisis of 2018, according to an annual Thomson Reuters Foundation poll of aid agencies.
This year's survey was unusual for the high number of "most forgotten crises", with experts also listing the Central African Republic, Lake Chad Basin, Yemen, Afghanistan, South Sudan, Burundi, Nigeria and, for the first time, Venezuela.
But Congo's "mega-crisis" barely made headlines, they said, even as the country gears up for landmark elections which some fear could stoke further unrest.
"The brutality of the conflict is shocking, the national and international neglect outrageous," said Jan Egeland, head of the Norwegian Refugee Council. "I visited Congo this year and have seldom witnessed such a gap between needs and assistance."
Congo, where 13 million people in a population of 82 million need help, also topped the annual Thomson Reuters Foundation poll in 2017, but agencies said the situation had deteriorated.
Six of 21 agencies polled named Congo as the most neglected crisis, including WFP, Norwegian Refugee Council, Oxfam, ActionAid, International Rescue Committee, and Christian Aid.
ActionAid's humanitarian advisor Rachid Boumnijel urged the international community to redouble efforts to end years of conflict characterised by sexual brutality.
"It's been a catastrophe for the country, and for women and girls particularly," Boumnijel said.
Christian Aid's head of humanitarian programmes Maurice Onyango said the violence had caused "large-scale trauma", with children witnessing parents and siblings being murdered.
An upsurge of fighting in the east of the mineral-rich country has also exacerbated the spread of the world's second largest Ebola outbreak, agencies said.
The Central African Republic, where armed groups control much of the country and 60 percent of the population needs assistance, came a close second in the poll.
Listed as the most neglected by OCHA, UNICEF, Mercy Corps, Plan International, and Caritas, the country has been racked by violence since mainly Muslim rebels ousted the president in 2013, provoking a backlash from Christian militias.
Armed groups are increasingly targeting schools, hospitals, mosques and churches, while attacks on aid workers have impacted a "chronically underfunded" humanitarian response, they said.
U.N. children's agency UNICEF said thousands of children had been trapped in armed groups or subjected to sexual violence.
"The crisis is growing increasingly desperate and resources are at breaking point," added UNICEF emergencies director Manuel Fontaine.
U.N. appeals for both DRC and CAR are less than 50 percent funded.
"Central African Republic is in a death spiral," said Caritas Secretary General Michel Roy. "While governments and the world's media have turned their backs, we must not. It's the only hope CAR has left."
Plan International said the media neglected complex crises like CAR and DRC because they lacked the shock factor of a sudden disaster such as Indonesia's huge earthquake in September.
Yemen, at risk of the world's worst famine in 100 years, was highlighted by Muslim Hands and World Vision.
"With three quarters of the population needing assistance, I can't see how Yemen isn't at the top of everyone's list," said World Vision emergencies chief Mark Smith.
International Medical Corps warned the disaster in Lake Chad basin, where climate change and a prolonged insurgency by Boko Haram and Islamic State have left 11 million needing help, was also set to worsen next year.
Action Against Hunger said millions caught up in the "almost invisible" crisis - affecting Nigeria, Niger, Chad, and Cameroon - faced poverty, hunger, sexual violence and child kidnapping.
The International Federation of Red Cross and Red Crescent Societies (IFRC), the world's biggest relief network, said hunger and disease following major flooding across Nigeria threatened to create a second protracted crisis in the country.
"I'm shocked by how little attention this has received. The figures are staggering," said IFRC Secretary General Elhadj As Sy, adding that nearly 2 million people were impacted, more than 200,000 uprooted and swathes of cropland destroyed. "This massive disaster has gone largely unnoticed by many donors and journalists," he added.
This year was the first time Venezuela featured in the poll. About 3.3 million people have fled political turmoil and economic meltdown in the Latin American country, many driven by hunger, and another 2 million could follow next year, according to U.N. estimates.
The United Nations has launched a $738 million appeal to help nearby countries cope with what one U.N. official called a "humanitarian earthquake".
CARE said evidence on the ground suggested the real number fleeing was far higher than the U.N. figure. "Given its scale, it's incredible how neglected the situation in Venezuela is," said CARE humanitarian expert Tom Newby. "The world needs to wake up to this crisis."
Afghanistan was ranked the most neglected crisis by Islamic Relief Worldwide, and South Sudan by Save the Children. UNHCR named Burundi, while migration was highlighted by the Danish Refugee Council.
* Aid agencies name their 3 priorities for 2019: http://tmsnrt.rs/2LCWDMU
Dec. 2018 (ACAPS)
The 2019 Global risk analysis outlines 18 contexts where a significant deterioration is expected to occur within the next six to nine months, leading to a spike in humanitarian needs. The report draws on ACAPS' daily monitoring and independent analysis of crises worldwide, carried out to support evidence-based decision-making in the humanitarian sector.
A risk is considered an event or series of events prompting a change from the status quo that leads to a significant deterioration in the humanitarian context and a higher number of people in need, or a higher severity of need. The crises identified in this report have been selected because there are certain triggers that may emerge over the coming six to nine months that point towards this potential shift.
Considering the diversity and complexity of the crises, combined with the number of contexts included in the report, it has not been possible to cover each crisis in detail. Instead, we have highlighted the broad evolution of the crises to flag potential deteriorations and inform operational, strategic, and policy decision-makers.
* Access the report (14pp): http://bit.ly/2EGQs9Q
* OCHA Global Humanitarian Overview 2019 (80pp): http://bit.ly/2QuohRd
* 2019 will be another year of crises, reports the Norwegian Refugee Council. In 2018, 68.5 million people were displaced by war and violent conflict. There is little evidence to suggest this number will decrease in 2019: http://www.nrc.no/shorthand/fr/2019-will-be-another-year-of-crises/index.html
* http://theirworld.org/news/education-under-attack-in-2018-conflicts-natural-disasters
* http://www.unicef.org/press-releases/world-has-failed-protect-children-conflict-2018-unicef
* http://www.passblue.com/2018/12/24/where-is-the-worst-place-for-a-child-to-be-born/
* http://data.unicef.org/resources/levels-and-trends-in-child-mortality/
* http://reliefweb.int/report/yemen/half-million-homeless-yemenis-brink-famine-face-winter-freeze-oxfam
* http://www.savethechildren.net/article/famine-or-not-120000-children-yemen-are-catastrophic-condition
* http://www.unicef.org/mena/press-releases/yemens-children-15-million-lives-scarred-and-voices-not-heard
* http://bit.ly/2PIS3NJ
* http://www.rescue.org/sites/default/files/document/3391/ircemergencywatchlist2019.pdf
* http://reliefweb.int/report/central-african-republic/breaking-vicious-cycle-between-hunger-conflict-central-african
* http://www.globalr2p.org/
* http://www.crisisgroup.org/global/10-conflicts-watch-2019
* http://www.hrw.org/news/2018/12/20/ten-good-news-stories-kids-2018
* http://www.msf.org/year-pictures-2018
* http://www.nrc.no/perspectives/2018/glimmers-of-hope/