Advertisers should pull their ads from the likes of YouTube and Facebook to put pressure on them to remove extremist and terror-related content, a committee of MPs has warned.
The UK parliament’s intelligence and security committee said in a report yesterday that tech companies need to be threatened with action that "affects their profits" rather than appealed to through their sense of "doing the right thing".
The Manchester bomber, Salman Abedi, had watched YouTube clips that contained information about how to build explosive devices.
The committee heard evidence from YouTube owner Google, Facebook and Apple, which said they do not routinely monitor content on their systems and therefore cannot block all extremist material automatically. Instead, they rely on user feedback.
The report looks at what needs to change in response to last year’s series of terror attacks in the UK. It was widely reported yesterday amid MI5’s admission that it had made mistakes in not preventing the Manchester bombing in May 2017.
But social media companies were also heavily criticised by MPs for "failing to assist the authorities by removing extremist material from their platforms" over the past four years.
"We question whether this is because efforts to persuade the CSPs [communications service providers] have sought to appeal to their sense of corporate and social responsibility, instead of concentrating on financial levers," the report said.
"When there was a social media backlash against companies whose ads appeared alongside extremist videos on YouTube, those companies had little choice but temporarily to stop advertising on YouTube.
"More recently, Unilever announced that it is considering withdrawing its business from companies that are not doing more to provide 'responsible digital infrastructure'."
The committee is now recommending that the government lobbies advertisers to take action, following Unilever’s example.
Phil Smith, director general of ISBA, said it and its members have "applied consistent pressure" when it comes to unacceptable content appearing alongside advertising.
But he warned that the problem was not confined to Google and Facebook, as smaller platforms and publishers also reproduce terrorist material masquerading as news.
He said: "The major platforms have made significant progress in identifying and taking down terrorist content quickly through the use of AI, machine learning and human review.
"The recent proposals from Facebook for an independent content appeals board go some way towards meeting ISBA’s calls for an independent oversight body, funded by industry. ISBA believes such a body should set principles and codes, certify policies and processes, audit reporting and create a route for recourse. In essence, a self- and co-regulatory system, based on today’s well-proven advertising regulation system in the UK."