Facebook tightens live-streaming in crackdown on violence after Christchurch massacre

Social media giant Facebook is tightening access to live-streaming to prevent the sharing of graphic videos following the Christchurch mosque massacre in New Zealand.
Facebook announced on Wednesday it would tighten access to its live-streaming feature as New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron prepared to launch the global "Christchurch Call" initiative to tackle the spread of extremism online.

People who have broken certain rules, including those against "dangerous organizations and individuals," will be restricted from using Facebook Live, said the company's vice president of integrity, Guy Rosen.

"Following the horrific recent terrorist attacks in New Zealand, we've been reviewing what more we can do to limit our services from being used to cause harm or spread hate," he said in a statement.

A self-described white supremacist gunned down 51 people at two Christchurch mosques in March, and broadcast live footage of the violence on Facebook from a head-mounted camera.

A "one-strike" policy at Facebook Live will be applied to a broader range of offenses, with those who violate serious policies suspended from using the feature after a single offense.

Such violations would include sharing a link to a statement from a terrorist group with no context, according to Rosen.

"We plan on extending these restrictions to other areas over the coming weeks, beginning with preventing those same people from creating ads on Facebook," Rosen said.

He added that technical innovation is needed to get ahead of the kind of "adversarial media manipulation" seen after the New Zealand mosque massacre, such as users modifying videos in order to slip past filters.

"One of the challenges we faced in the days after the attack was a proliferation of many different variants of the video of the attack," Rosen said.

"People - not always intentionally - shared edited versions of the video which made it hard for our systems to detect."

Facebook also announced it was putting $7.5 million into research partnerships with three US universities to improve image and video analysis technology.

"This work will be critical for our broader efforts against manipulated media, including deepfakes," Rosen said, a reference to videos altered using artificial intelligence.

"We hope it will also help us to more effectively fight organized bad actors who try to outwit our systems as we saw happen after the Christchurch attack."

New Zealand Prime Minister Jacinda Ardern welcomed the move as "a good first step".

"The March 15 terrorist highlighted just how easily livestreaming can be misused for hate. Facebook has made a tangible first step to stop that act being repeated on their platform," she said.

Ardern was set to join other world leaders in launching the "Christchurch Call" to curb online extremism at an international meeting in Paris on Wednesday.

Top executives from Amazon, Google, Microsoft and Twitter were also expected to attend, though Facebook's Mark Zuckerberg was to be represented by another executive from the social media giant.
