The Forgotten Refugees
by Michelle Chen
The Nation
 
With tragic photos of bodies on beaches surfacing in the news, the refugee crisis seems to be concentrated at the borders of Europe and the United States. But the desperate exiles who have fled to the West actually represent a minority who have managed to make it across borders. Most of the world’s refugees are not even officially labeled as such by international standards.
 
From New Orleans to Bangladesh, millions of people have been forced out of their homes without crossing an international border. They’re labeled “internally displaced people,” and international humanitarian laws grant them virtually no protection, despite the fact that they often live under worse conditions than global refugees.
 
Globally and internally displaced populations overlap heavily, but the plight of “IDPs” is often overlooked because they are rarely counted. Fresh research by the Internal Displacement Monitoring Centre (IDMC) attempts to track this massive, floating demographic by curating international and national data sets: More than 40 million people were internally displaced as of late 2016, with three-quarters of them having fled home in the past year. The chief causes were environmental disasters such as storms and floods, while conflicts and violence triggered about 7 million forced migrations.
 
Violence-driven internal migration is concentrated in war-torn countries like the Democratic Republic of the Congo, with about 920,000 displacements, followed by Afghanistan, with 652,000. But below the media radar, internal communal violence has displaced many communities—even in relatively “stable” countries like India and the Philippines, where conflict displaced more than 800,000 people combined. (Nonetheless, the overall rate of displacement is more intense when viewed in the context of DRC and Afghanistan’s relatively smaller populations.)
 
The emerging data on IDP flows expose how international humanitarian law offers only a partial solution to a global displacement crisis that transcends nation-state boundaries.
 
Refugee law emerged as a political response to modern warfare, as a way to provide temporary relief to victims of turmoil and political repression. While international refugees are afforded certain legal protections (like the right to claim asylum in the court system of their host country, or to access humanitarian aid like emergency shelter and medical care), IDPs are generally subject to domestic laws, and often live under oppressive regimes with policies that forced them to migrate in the first place.
 
A lack of relief at home is a key factor that drives international refugee migration, whether inside a given region or from Global South to North. Moreover, the IDMC notes, “the exact push and pull factors that explain how someone who is an IDP one day can become a refugee, an asylum seeker or an international migrant the next are still unclear.”
 
The crisis will deepen as climate change destabilizes environments and livelihoods, as the IDMC acknowledges: “Increased frequency and intensity of extreme weather events and environmental degradation will increase displacement risk further.” Intersecting with spiking violence and entrenched inequality, environment-related social stressors and public-health crises will aggravate insecurity and resource competition. They could also breed even more conflict, as appears to be happening in the rapidly desertifying Middle East.
 
The link between violence and scarcity is illustrated today in the Horn of Africa, where intersecting stressors are spurring “recurring droughts, poor access to basic services and infrastructure, lack of livelihood options and ongoing conflict and insecurity,” leaving “highly vulnerable and exposed people with no other option but to move.” Famines in Somalia and territorial battles in rural Brazil show the impossibility of disentangling “man-made” and “natural” violence in regions that become simultaneously ecologically and socially uninhabitable.
 
The needs of specific subgroups of IDPs exacerbate the humanitarian challenge. Women may be especially disadvantaged in mass evictions, because of political disenfranchisement and discriminatory property laws. Indigenous and minority groups face heightened risk in land-rights clashes, and the poverty linked to their social marginalization leaves them more exposed to environmental disasters.
 
In Colombia, a country plagued by climate change and civil war, Afro-Colombian and indigenous people make up three-quarters of victims of mass-displacement events, even though they constitute less than a fifth of the total population. As with environmental disaster, discrimination exacerbates the effects of displacement.
 
In reality, the chaotic refugee camps dotting the Mediterranean are the far edge of the refugee crisis. In many cases, the poorest and most traumatized cannot afford or survive the farthest-ranging smuggling routes, and the worst-off are forced to stay close to home. For example, the bulk of Syrian refugees now languish in camps just outside the border or in Turkey.
 
In extremely violent contexts, IDPs are forced to bounce across regions, surrounded by chaos. In Afghanistan, which shares porous (if hostile) borders with Iran and Pakistan, decades of global and civil war have meant that displacement is often permanent. Whether populations migrate to Europe or stay in the next province, much of the country has endured displacements across generations, constraining the government’s capacity to rebuild. A 2016 survey of Afghan refugees who entered through Greece revealed that one in four “were first or second generation refugees who had never lived in Afghanistan.”
 
Despite the fact that the internal-displacement crisis is even larger than the refugee crisis in numerical and humanitarian terms, the plight of IDPs remains shamefully neglected.
 
Yet every year this nation-sized diaspora grows, while borders harden and suffering deepens both within disaster-stricken societies and in refugee-host countries. It is up to international institutions to prevent refugee crises by developing protections for the people adrift in the purgatory of humanitarian law. Whether inside or outside their home country’s borders, the displaced are a global responsibility.
 
http://www.internal-displacement.org/global-report/grid2017/


Facebook developing censorship tool to enter Chinese market
by New York Times, BBC, EJN, agencies
 
Mar. 2017
 
YouTube, Facebook failing to effectively monitor fake news, extremist views, hate speech. (BBC, Guardian News, agencies)
 
Google's European boss has apologised after adverts from major firms and government agencies appeared next to extremist content on its YouTube site.
 
It came after Marks and Spencer became the latest firm to pull its online ads over the issue, joining others such as Audi, RBS and L'Oreal.
 
A recent investigation by the Times found adverts from a range of well-known firms and organisations had appeared alongside content from supporters of extremist groups on YouTube's video site. The Times said that rape apologists, anti-Semites and hate preachers were among those receiving payouts.
 
The company, which insists it's a technology platform not a media business, is finding it ever harder to hold that line. Media firms face tight regulation - and that is exactly what may be needed at Google.
 
Last week, British Government ministers summoned Google for talks at the Cabinet Office after imposing a temporary restriction on the government's own adverts on the platform.
 
And on Monday M&S joined a growing list of firms to suspend their advertising from both Google's search engine and YouTube site. Others include McDonald's, HSBC, Lloyds, the BBC, Channel 4 and the Guardian.
 
Mar. 2017
 
Major brands pulling millions of dollars in advertising amid rows over extremist content on YouTube.
 
In the US, the telecom companies AT&T and Verizon, as well as the pharmaceutical company GSK, Pepsi, Walmart, Johnson & Johnson and the car rental firm Enterprise, have all pulled advertising from Google’s video-sharing platform, a contagion spreading from Europe, where a number of high-profile advertisers pulled out of YouTube following an investigation by the Times.
 
Major brands’ content was found to be appearing next to videos promoting extremist views or hate speech, with a cut of the advertising spend going to the creators.
 
Verizon’s ads were featured alongside videos made by Egyptian cleric Wagdi Ghoneim, who was banned from the US over extremism, and the hate preacher Hanif Qureshi, whose preachings were said to have inspired murder in Pakistan.
 
“We are deeply concerned that our ads may have appeared alongside YouTube content promoting terrorism and hate,” an AT&T spokesman said in a statement. “Until Google can ensure this won’t happen again, we are removing our ads from Google’s non-search platforms.”
 
“This marks a turning point for YouTube. For the first time, it’s dealing not only with reputation damage but revenue damage,” said Alex Krasodomski-Jones, a researcher at the thinktank Demos.
 
YouTube might purport to be a video-sharing service, but as with Google’s search engine and Facebook’s social network, the platform is really about one thing: advertising. “So when there’s a problem with advertising like this, it’s a big problem,” Krasodomski-Jones said.
 
The dispute adds weight to demands for companies such as Google to take more responsibility for what is on their websites, as Facebook was forced to confront in the wake of the “fake news” scandal.
 
22 March 2017
 
'Sex assault' streamed on Facebook Live. (BBC)
 
The alleged sexual assault of a 15-year-old girl by five or six males was streamed on Facebook Live, according to Chicago Police.
 
Around 40 people were said to have been watching the stream at one point but nobody reported the incident to police.
 
A police spokesman said authorities first learned of what happened after the girl's mother approached police. Detectives have questioned several people but no arrests have been made to date. The girl, who had been missing, has now been found by detectives and reunited with her family.
 
Her mother, whom the BBC is not naming, said her daughter appeared to be scared in the footage, adding "it's so disgusting". A relative of the girl says he was the last to see her before the alleged attack, after the two attended church together on Sunday.
 
"Nobody deserves that. No human being deserves for that to happen to them," he told local media. After the girl was found, she was taken to hospital, he told the Chicago Tribune.
 
In January, Chicago police arrested four people following a separate incident in which a man's alleged assault was live streamed, also on Facebook Live.
 
Nov. 2016
 
The proliferation of politically biased, fake news stories on Facebook has become widespread, writes Olivia Solon in San Francisco. (Guardian News)
 
The company is being accused of abdicating its responsibility to clamp down on fake news stories and counter the echo chamber that defined the U.S. election.
 
Fake news and misinformation plagued the 2016 election on an unprecedented scale. Rather than connecting people – as Facebook’s euphoric mission statement claims – the bitter polarization of the social network over the last eighteen months suggests Facebook is actually doing more to divide the world.
 
“People have unfriended friends and family members because the style of discourse is so harsh,” said Claire Wardle, research director at the Tow Center for Digital Journalism. “Facebook stumbled into the news business without systems, editorial frameworks and editorial guidelines.”
 
Currently on Facebook, the truth of a piece of content is less important than whether it is shared, liked and monetized. These “engagement” metrics distort the media landscape, allowing clickbait, hyperbole and misinformation to proliferate. And on Facebook’s voracious news feed, the emphasis is on the quantity of posts, not spending time on powerful, authoritative, well-researched journalism.
 
The more we click, like and share stuff that resonates with our own world views, the more Facebook feeds us similar posts.
 
These information bubbles didn’t burst on 8 November, but the election result has highlighted how mainstream media and polling systems underestimated the power of alt-right news sources and smaller conservative sites that largely rely on Facebook to reach an audience. The Pew Research Center found that 44% of Americans get their news from Facebook.
 
A uniquely Republican problem is the validation given to fake news by the now president-elect. Trump has routinely repeated false news stories and whipped up conspiracy theories – whether that’s questioning Obama’s heritage, calling climate change a hoax or questioning “crooked” Hillary Clinton’s health – during high-profile rallies, while urging his followers not to trust corrupt traditional media.
 
The conspiracy theories are amplified by a network of highly partisan media outlets with questionable editorial policies, including a website called the Denver Guardian peddling stories about Clinton murdering people and a cluster of pro-Trump sites founded by teenagers in Veles, Macedonia, motivated only by the advertising dollars they can accrue if enough people click on their links.
 
The situation is so dire that this week President Obama spoke about the “crazy conspiracy theorizing” that spreads on Facebook, creating a “dust cloud of nonsense”.
 
“There is a cottage industry of websites that just fabricate fake news designed to make one group or another group particularly riled up,” said Fil Menczer, a professor at Indiana University who studies the spread of misinformation. “If you like Donald Trump and hate Hillary Clinton it’s easy for you to believe a fake piece of news about some terrible thing Hillary has done. These fake news websites often generate the same news just changing the name to get people on either side to be outraged.”
 
The misinformation being spread doesn’t always involve outlandish conspiracy theories. There’s a long tail of insidious half truths and misleading interpretations that fall squarely in the grey area, particularly when dealing with complex issues like immigration, climate change or the economy.
 
“Not everything is true or false, and in the gaps between what we can check and what is missing from our control we can create a narrative,” said Italian computer scientist Walter Quattrociocchi, who has studied the spread of false information. “Trump won at this. He was able to gather all the distrust in institutional power by providing an option for people looking for a change.”
 
According to Menczer’s research there’s a lag of around 13 hours between the publication of a false report and the subsequent debunking. That’s enough time for a story to be read by hundreds of thousands if not millions of people. Within Facebook’s digital echo chamber, misinformation that aligns with our beliefs spreads like wildfire, thanks to confirmation bias.
 
“People are more prone to accept false information and ignore dissenting information,” said Quattrociocchi. “We are just looking for what we want to hear.”
 
It’s a quirk of human psychology that the UK Independence party (Ukip) used during the campaign for Britain to leave the EU. Arron Banks, Ukip’s largest donor, told the Guardian that facts weren’t necessary for winning. “It was taking an American-style media approach. What they said early on was ‘facts don’t work’ and that’s it. You have got to connect with people emotionally. It’s the Trump success.”
 
While it might be human nature to believe what we want to hear, Facebook’s algorithms reinforce political polarization. “You are being manipulated by the system [for falling for the fake news] and you become the perpetrator because you share it to your friends who trust you and so the outbreak continues,” said Menczer. It’s a perfect feedback loop. So how do you break it?
 
Menczer says the solution is to create a filter. Before social media, the filter was provided by media companies, who acted as gatekeepers to the news and had staff trained in fact-checking and verifying information. In an age of budget cuts in traditional media, and the rise of clickbait and race-to-the-bottom journalism, standards have slipped across the board.
 
“Now the filter is us. But that’s not our job so we’re not good at it. Then the Facebook algorithm leverages that and amplifies the effect,” said Menczer.
 
In a separate study, the social networking site worked out how to make people feel happier or sadder by manipulating the information posted on 689,000 users’ news feeds. It found it could make people feel more positive or negative through a process of “emotional contagion”.
 
“Instead of hiring more editors to check the facts, they got rid of the editors and now they are even more likely to spread misinformation,” said Menczer. “They don’t see themselves as a media company and they run the risk of being told they are picking sides. They are in a tough spot, but they are also making a lot of money.”
 
Facebook’s continued rejection of the idea that it is a media company doesn’t sit well with some critics. “It sounds like bullshit,” said high-profile investor Dave McClure, speaking from the Web Summit in Lisbon. “It’s clearly a source of news and information for billions of people. If that’s not a media organization then I don’t know what is.”
 
He added that technology entrepreneurs have a responsibility to enable a “more well-rounded experience” for their audiences. “A lot of them are only thinking about how to make money. Maybe we need to mix in having ethics and principles and caring about the fact that people have a reasonable and rational experience of the information they process.”
 
http://www.theguardian.com/technology/2017/mar/25/google-youtube-advertising-extremist-content-att-verizon
http://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-election-conspiracy-theories
http://www.bbc.com/news/business-39325916
http://www.bbc.com/news/technology-39351075
 
Cracking the Code (ABC Four Corners)
 
"What's on your mind?" It's the Facebook question which lets you share what you're thinking and what you've been up to. It's also the question that unlocks the details of your life and helps turn your thoughts into Facebook's profits.
 
Four Corners explores the world of Facebook and how your data is being mined to drive the huge financial success of the social media giant. Reporter Peter Greste examines the Facebook business model and shows how your private life is making them billions.
 
http://www.abc.net.au/4corners/stories/2017/04/10/4649443.htm
 
Nov 2016
 
Facebook developing censorship tool to enter Chinese market, by Mike Isaac. (NYT)
 
Mark Zuckerberg, Facebook’s chief executive, has been cultivating relationships with China’s leaders. He has paid multiple visits to the country to meet its top internet executives. He has made an effort to learn Mandarin. Inside Facebook, the work to enter China runs far deeper.
 
The social network has quietly developed software to suppress posts from appearing in people’s news feeds in specific geographic areas, according to three current and former Facebook employees, who asked for anonymity because the tool is confidential.
 
The feature was created to help Facebook get into China, a market where the social network has been blocked, these people said. Mr. Zuckerberg has supported and defended the effort, the people added.
 
Facebook has restricted content in other countries before, such as Pakistan, Russia and Turkey, in keeping with the typical practice of American internet companies that generally comply with government requests to block certain content after it is posted.
 
Facebook blocked roughly 55,000 pieces of content in about 20 countries between July 2015 and December 2015, for example. But the new feature takes that a step further by preventing content from appearing in feeds in China in the first place.
 
Facebook would offer the software to enable a third party — in this case, most likely a partner Chinese company — to monitor popular stories and topics that bubble up as users share them across the social network, the people said. Facebook’s partner would then have full control to decide whether those posts should show up in users’ feeds.
 
The current and former Facebook employees caution that the software is one of many ideas the company has discussed with respect to entering China and, like many experiments inside Facebook, it may never see the light of day. The feature, whose code is visible to engineers inside the company, has so far gone unused, and there is no indication as yet that Facebook has offered it to the authorities in China.
 
But the project illustrates the extent to which Facebook may be willing to compromise one of its purported mission statements, “to make the world more open and connected,” to gain access to a market of 1.4 billion Chinese people.
 
China has been cordoned off to the social network since 2009 because of the government’s strict rules around censorship of user content.
 
The suppression software has been contentious within Facebook, which is separately grappling with what should or should not be shown to its users after the American presidential election’s unexpected outcome spurred widespread questioning over fake news on the social network.
 
Several employees who were working on the project have left Facebook after expressing misgivings about it, according to the current and former employees.
 
A Facebook spokeswoman said in a statement, “We have long said that we are interested in China, and are spending time understanding and learning more about the country.”
 
Facebook’s position underscores the difficulties that many American internet companies have had gaining access to China. For years, companies like Google and Twitter have been blocked there for refusing to yield to the government’s demands around censorship. In 2010, Google said it was directing users of its search engine in China to its service in Hong Kong, because of censorship and intrusion from hackers. Other companies, like the professional social networking service LinkedIn, agreed to censor content on their platforms in China.
 
China maintains strict controls over the internet. Still, some officials responsible for China’s tech policy have been willing to entertain the idea of Facebook’s operating in the country.
 
It would legitimize China’s strict style of internet governance, and if done according to official standards, would enable easy tracking of political opinions deemed problematic. Even so, resistance remains at the top levels of Chinese leadership.
 
Some analysts have said Facebook’s best option is to follow a model laid out by other internet companies and cooperate with a local company or investor. Finding a partner would transfer the censorship and surveillance operations to that local partner. It would also let Facebook rely on a local company’s government connections and experience to deal with the difficult task of communicating with Beijing.
 
Facebook and Chinese officials have had intermittent talks in the last few years about the social network’s entering the market, according to employees who were involved in the discussions.
 
Facebook currently sells advertising for some Chinese businesses from its Hong Kong office. Among its customers are state-media sites that act as the propaganda arm of the Chinese government, and that operate official accounts where they post articles. Chinese citizens who wish to gain access to Facebook must tunnel in using a technology known as a virtual private network, or VPN.
 
It’s unclear when the suppression tool originated, but the project picked up momentum in the last year, as engineers were plucked from other parts of Facebook to work on the effort, the current and former employees said.
 
Unveiling a new censorship tool in China could lead to more demands to suppress content from other countries. The fake-news problem, which has hit countries across the globe, has already led some governments to use the issue as an excuse to target sites of political rivals, or shut down social media sites altogether.
 
Nov. 23, 2016
 
A major trade body for big publishers sends letter imploring CEOs of Google and Facebook to tackle fake news, writes Lara O'Reilly for Business Insider UK.
 
Digital Content Next, a US trade body that represents premium online publishers, has sent a letter to Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai, calling on the two companies to do more to combat the fake news being discovered and shared across their sites.
 
In the letter, obtained by Business Insider, DCN CEO Jason Kint says the two companies "bear a special responsibility, one that you sometimes appear naïve to," to clean up the "garbage littering the digital media ecosystem."
 
Both Google and Facebook have faced increased scrutiny since the US presidential election about their inadequate efforts to tackle fake news.
 
False stories claiming that Pope Francis had endorsed Donald Trump and that Hillary Clinton had sold weapons to ISIS were widely shared on Facebook.
 
And the top Google News result for "final election count" was at one point surfacing a fake story from a WordPress blog that incorrectly said Trump had won the popular vote by a margin of almost 700,000. (Clinton is currently 2 million votes ahead in the popular vote.)
 
After first dismissing the notion that fake news on Facebook could have swung the US election, Zuckerberg acknowledged that the company had more work to do to rid the site of fake news. Pichai, on the other hand, was immediately more open to the suggestion that fake news might have affected the election, and he said Google had been looking at how to fact-check articles and promote stories from trusted sources.
 
But Kint of DCN says these efforts, while "encouraging and constructive," are not enough. He said the companies should apply themselves to ridding their services of fake news with the same "excitement, investment, and vigilance" as Google's parent company, Alphabet, does with its forward-thinking technology projects, often referred to as "moonshots."
 
Kint writes: "However, to paraphrase a recent New York Times editorial, we believe you owe your users, and democracy itself, far more. Your companies make it a point to celebrate 'moonshots' that require vision, resources and engineering prowess. Your capacity to pursue these projects is built on your extraordinary dominance over the digital media landscape.
 
"Wouldn't it make sense for you to pursue cleaning-up the garbage littering the digital media ecosystem with the same excitement, investment and vigilance with which you pursue these huge projects? We don't see that in your public statements or actions. Over the years, you have claimed repeatedly that you are not media companies; instead, the word 'utility' has been used, occasionally by your own executives. But if even 1% of the water in our local utility was polluted, wouldn't it be right to move heaven and earth to clean it up?"
 
DCN represents more than 70 media brands including The New York Times, The Washington Post, Viacom, Business Insider, the Financial Times, Time Inc., Hearst, Gannett, Bloomberg, ESPN, the Associated Press, and BBC.com.
 
Kint says in the letter that these publishers — who do sometimes get things wrong but are trusted by consumers to provide accurate and fair coverage and acknowledge mistakes — are willing to "devote time, resources, and energy to help clean up this mess."
 
Representatives for Google and Facebook were not immediately available for comment.
 
* External link: Ethical Journalism Network: Will media learn the lessons from the US election: http://storify.com/EJN/media-introspection-after-trump-win

