Facebook to curb livestreaming amid pressure over Christchurch massacre
Facebook boss Mark Zuckerberg has been under intense pressure since March when a self-described white supremacist used Facebook Live to stream his rampage at two mosques in the New Zealand city of Christchurch, which left 51 people dead.
The California-based platform said it would ban Facebook Live users who shared extremist content and seek to reinforce its own internal controls to stop the spread of offensive videos.
“Following the horrific recent terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate,” Facebook vice-president of integrity Guy Rosen said in a statement.
Along with their counterparts from Britain, Canada, Norway, Jordan and Senegal, who will also be in Paris, Ardern and Macron will later issue the Christchurch Call to fight the spread of hateful and terror-related content.
The largely symbolic initiative is intended to keep up the pressure on social media companies, which face growing calls from politicians across the world to restrict the spread of extremism and disinformation on their platforms.
Many countries have already tightened legislation to introduce fines for companies that fail to block offensive content, but experts say a new wave of regulation — championed by France in particular — could be looming.
The political meeting in Paris will run in parallel to an initiative launched by Macron called “Tech for Good” which will bring together 80 tech chiefs to discuss how to harness technologies for the common good.
The heads of Wikipedia, Uber, Twitter and Google will attend, but not Zuckerberg, who held private one-to-one talks with Macron last week.
The social network giant will instead be represented by its vice-president for global affairs and communications, Nick Clegg, the former British deputy premier.
The Christchurch Call meeting is to get under way around 1400 GMT and finish with a press conference by Ardern and Macron at 1600 GMT.
‘Horrifying new trend’
In an opinion piece in The New York Times over the weekend, Ardern said the Christchurch massacre underlined “a horrifying new trend” in extremist atrocities.
“It was designed to be broadcast on the internet. The entire event was live streamed… the scale of this horrific video’s reach was staggering,” she wrote.
Ardern said Facebook removed 1.5 million copies of the video within 24 hours of the attack, but she still found herself among those who inadvertently saw the footage when it auto-played on their social media feeds.
“(We’re) asking both nations and private corporations to make changes to prevent the posting of terrorist content online, to ensure its efficient and fast removal and to prevent the use of live streaming as a tool for broadcasting terrorist attacks,” she wrote in The New York Times.
In Wednesday’s statement, Facebook acknowledged the inadequacy of its own systems.
“One of the challenges we faced in the days after the attack was a proliferation of many different variants of the video of the attack,” Rosen said.
“People — not always intentionally — shared edited versions of the video which made it hard for our systems to detect.”
New Zealand officials said Ardern found a natural partner for the fight against online extremism in Macron, who has repeatedly stated that the status quo is unacceptable.
“Macron was one of the first leaders to call the prime minister after the attack, and he has long made removing hateful online content a priority,” New Zealand’s ambassador to France, Jane Coombs, told journalists on Monday.
“It’s a global problem that requires a global response,” she said.
A French presidential source said it was time for tech companies to “anticipate how their features will be exploited.”
Firms themselves will be urged to come up with concrete measures, the source said, for example by reserving live broadcasting to social media accounts whose owners have been identified.