Social media giants 'shamefully failing' to tackle online hate

A new report from a UK parliamentary committee has found that social media companies are not doing enough to fight online terrorism, hate speech or abuse, and that greater penalties are required.
02 May, 2017
The report found Facebook, YouTube and Twitter were not doing enough to fight extremism [Getty]

Social media companies should be forced to pay the police to monitor their content if they won't do it themselves, a new parliamentary report has said.

The government should also consider bigger fines for companies that fail to tackle "illegal and dangerous content", Parliament's Home Affairs Committee said.

"Social media companies currently face almost no penalties for failing to remove illegal content," the report concludes.

"We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe."

The report cites neo-Nazi and jihadist recruitment videos that remain online even after being flagged for removal.

"The biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal and dangerous content," the committee reported.

The inquiry was launched in the summer of 2016 after the murder of the Labour MP Jo Cox, who was shot dead by a far-right extremist shortly before the referendum on leaving the EU.

Executives from Facebook, Twitter and Google were quizzed by MPs on the committee on this issue in March.

An executive from Twitter said it was "not doing a good enough job" at responding to reports of extremist content or cyber-bullying.

"We don't communicate with the users enough when they report something, we don't keep people updated enough and we don't communicate back enough when we do take action," said Nick Pickles, Twitter's head of public policy in the UK.

A Facebook executive was confronted with evidence that the social media platform did not do enough to monitor sexualised images of children.

Simon Milner said that evidence from a BBC investigation – in which 82 out of 100 reported indecent images of children were not removed – showed the platform's system was not working.


The committee gives numerous examples of explicit content that was not removed despite repeated warnings, and said the social media companies were "big enough, rich enough and clever enough" to solve the problem.

"We found titles [on YouTube] that included 'White Genocide Europe – Britain is waking up', 'Diversity is a code word for white genocide' and 'Jews admit organising White Genocide'" the committee said in its report.

"Anti-Semitic holocaust denial videos included 'The Greatest Lie Ever Told', 'The Great Jewish Lie' and 'The Sick Lies of a Holocaust 'Survivor''."

MPs said that Google has "profited from hatred" and "allowed itself to be a platform from which extremists have generated revenue."

"Mainstream reputable companies, charity donors and taxpayers were inadvertently funding terrorists and their sympathisers."


A number of major companies – including Johnson & Johnson, Verizon, AT&T, Enterprise and GSK – announced a boycott of YouTube advertising in March after a report found their adverts were funding extremist content and supporting its creators.

"It is shocking that Google failed to perform basic due diligence regarding advertising on YouTube paid for by reputable companies and organisations," the MPs concluded.

The report notes that UK football clubs are required to pay for policing outside their matches, and asks whether the same principle should apply to policing online.

"Social media companies that fail to proactively search for and remove illegal material should pay towards costs of the police doing so instead."

Nick Lowles, chief executive of Hope Not Hate, the UK's largest anti-racism organisation, said there was a "connection between race hate and anti-women language".

"Those things all contributed to an atmosphere where fact and reality in a way did not matter. It was all about emotion," he said.