
UK lawmakers to social media: “You’re accessories to radicalization, accessories to crimes”, hearing on spread of extremist content

  • 10 min read
  • 29 Apr 2019


Representatives from Facebook, YouTube, and Twitter were grilled and admonished by UK lawmakers on Tuesday, 23 April over the spread of extremist and criminal content on their platforms.

Facebook’s Public Policy Officer, Neil Potts; Twitter’s Head of UK Government, Public Policy and Philanthropy, Katy Minshall; and YouTube’s Public Policy Director, Marco Pancini represented their companies before the UK Home Affairs Committee, which wanted to know why social media companies “actively” push their users toward extremist content in order to drive up profits.

The hearing was chaired by Yvette Cooper and attended by committee members including MP Stephen Doughty, MP Tim Loughton, and MP Stuart McDonald.

The hearing was spurred by the spread of the graphic Christchurch shooting video, which the platforms struggled to contain. The shooter, who killed 50 people and injured 50 more at two mosques in New Zealand, live-streamed the attack on Facebook. Copies of the video then spread across multiple platforms faster than the companies could take them down. All three companies said in their responses that they had reacted quickly to remove the original upload, but had struggled with the re-uploaded versions.

The committee also noted the preceding weekend’s social media ban in Sri Lanka, imposed in the wake of coordinated terrorist attacks that claimed more than 250 lives and left many more seriously injured.

On extremist content takedown rates


The committee members slammed the companies for allowing hateful content to proliferate and, in YouTube’s case, for actively amplifying its visibility through recommendation algorithms.

“You are making these crimes possible, you are facilitating these crimes,” Chairwoman Yvette Cooper said. “Surely that is a serious issue.” “What on Earth are you doing!? You’re accessories to radicalization, accessories to crimes,” MP Stephen Doughty charged.

https://twitter.com/MarkDiStef/status/1121009074307465216

https://twitter.com/MarkDiStef/status/1121003221227646976

Neil Potts, Facebook’s representative, repeated the same defense on nearly every question: Facebook now has 30,000 staff working on safety and security, including engineers building best-in-class AI algorithms, language and subject-matter experts, and 15,000 content moderators. But when asked whether people spreading terrorist propaganda had been reported to the police, Potts said proactive referrals were made to law enforcement only when there was an “imminent threat”.

In addition to removing the original live-streamed video, Facebook said it removed 1.5 million copies of it within 24 hours of the attack, 1.2 million of which were blocked at upload.

Katy Minshall, Twitter’s representative, said 1.4 million tweets had been removed for promoting terrorism and that the social network actively enforces its rules rather than relying on reports. Twitter has 1,500 people working on policy enforcement and moderation around the world and is removing more content, but is “never going to get a 100% success rate”, she said. She added: “There is a likely risk in the next few years that the better our tools get, the more users are removed, the more they will migrate to parts of the internet where nobody is looking.”

Facebook’s Neil Potts said he could not rule out that versions of the Christchurch shooting video were still on the platform. And YouTube’s Marco Pancini acknowledged that the platform’s recommendation algorithms were driving people towards more extremist content, even if that is not what they “intended.”

On reporting crimes to law enforcement


Chairwoman Cooper was particularly upset after Facebook said it doesn't report all crimes to the police. Potts said that Facebook reports crimes when there is a threat to life, and assesses crimes committed on the platform on a "case by case basis." Twitter and YouTube said they had similar policies.

"There are different scales of crimes," Potts said. To which Cooper responded. "A crime is a crime... who are you to decide what’s a crime that should be reported, and what crime shouldn’t be reported?"

On algorithms recommending extremist or hateful content


MPs also took it upon themselves to test how YouTube's algorithm promotes extremist content. Prior to the hearing, they had searched for terms like "British news", and in each case the recommendation engine directed them to far-right, inflammatory content.

“You are maybe being gamed by extremists, you are effectively providing a platform for extremists, you are enabling extremism on your platforms,” Cooper said. "Yet you are continuing to provide platforms for this extremism, you are continuing to show that you are not keeping up with it, and frankly, in the case of YouTube, you are continuing to promote it. To promote radicalization that has huge damaging consequences to families, lives and to communities right across the country."

One committee member also accused YouTube, Facebook and Twitter of “not giving a damn” about fuelling radicalisation in the wake of the massacres in Sri Lanka and New Zealand.

MPs took particular aim at YouTube over the way its algorithms promote videos and create playlists for viewers, which they accused of becoming increasingly extreme. The site has been repeatedly criticised for showing a variety of inflammatory content in the recommendations pane next to videos. MPs said this could easily radicalize young people who begin by watching innocuous videos.

On promoting radicalization being embedded into platform success


MP Tim Loughton said tests showed a benign search could end with “being signposted to a Nazi sympathiser group”. He added: “There seems to be a systemic problem that you are actively signposting and promoting extremist sites."

YouTube’s representative responded that the site uses an algorithm to surface related, engaging content so that users keep clicking through videos. He did not reveal the details of that algorithm, but acknowledged that it allows YouTube to generate profit by showing more advertising the longer users stay on the site.

MPs described how that chain of related videos would become more and more extreme, even if the first video had been relatively innocuous. Ms Cooper described clicking through videos and finding that “with each one the next one being recommended for me was more extreme”, going from right-wing sites to racist and radical accounts.

“The algorithms that you profit from are being used to poison debate,” she said.


Can prioritizing authoritative content for breaking news offset the effects of radicalization?


Marco Pancini explained that the logic behind YouTube’s algorithms “works for 90 per cent of experience of users on the platform”. He also said the company was “aware of the challenge this represents for breaking news and political speech”, and was working to prioritise authoritative content and reduce the visibility of extremist content.

He pointed to the work YouTube has done to prioritise authoritative sources when people search for political speech or breaking news. Some of that has led to controversies of its own, such as when YouTube accidentally linked video of the Notre Dame fire to video of the 9/11 attacks.

Mr Doughty accused YouTube of becoming “accessories to radicalisation” and crime, but Mr Pancini replied: “That is not our intention … we have changed our policies.”

He said the company was working with non-governmental organisations in 27 European countries to improve detection of offensive content.

On continuing to platform known extremist accounts and sites


MP Stephen Doughty said he found links to the websites of “well-known international organisations” and videos calling for the stoning of gay people on YouTube and other platforms.

“Your systems are simply not working and, quite frankly, it's a cesspit,” he added. “It feels like your companies really don't give a damn.

“You give a lot of words, you give a lot of rhetoric, you don't actually take action … all three of you are not doing your jobs.”

Representatives of Facebook, Twitter and YouTube said they had increased efforts against all kinds of extremism, using both automated technology and human moderators.

MPs further pointed out that the Islamist militant group that carried out the church and hotel bombings that left more than 300 people dead in Sri Lanka still had a Twitter account, and that its YouTube channel was not deleted until two days after one of the world’s deadliest terror attacks.

Ms Cooper also cited reports that clips of the Christchurch shooter’s Facebook Live video were still circulating, and said she had been recommended Tommy Robinson videos on YouTube after a supposed crackdown.

MP Stephen Doughty also revealed that overnight he had been alerted to weeks-old posts in a closed Facebook group with 30,000 members that said she and her family should be shot as “criminals”. “Kill them all, every f***ing one, military coup National Socialism year one – I don’t care as long as they are eradicated,” read another post that remained online.

Ms Cooper accused the social media giants of “providing safe places to hide for individuals and organisations who spread hate”.

On inaction, delay and deflection tactics employed by social media


Yvette Cooper said the committee was “raising the same issues again and again over several years and it feels like now the government has lost trust and confidence in you that you are doing anything to sort this in any way.”

When representatives of YouTube, Facebook and Twitter outlined action taken against extremist content earlier this month, MPs countered with fresh examples of neo-Nazi, Islamist and far-right posts found on their platforms during the hearing.

“We have taken evidence from your representatives several times over several years, and we feel like we are raising the same issues again and again,” the former shadow home secretary added. “We recognise you have done some additional work but we are coming up time and again with so many examples of where you are failing, where you may be being gamed by extremists or where you are effectively providing a platform for extremism ... very little has changed.”

Ms Cooper said material on Facebook, Twitter and YouTube “leads to people being hurt and killed”.

The UK government has proposed the creation of an independent regulator to create a code of practice for tech companies, and enforce it with fines and blocks.

Representatives of Facebook, Twitter and YouTube said they supported measures proposed in the Online Harms white paper, which is currently under consultation.

They repeatedly insisted that they are working to increase both automated and human content moderation, building new tools and hiring thousands of employees. But lawmakers asserted that these are band-aids on systemic problems, and that extremists are using the services exactly as they were meant to be used: to spread and share content, ignite passions, and give everyone a platform.
