
Is the YouTube algorithm’s promoting of #AlternativeFacts like Flat Earth having a real-world impact?

  • 3 min read
  • 21 Nov 2018


It has not been long since the Logan Paul controversy hit the internet, when people criticized YouTube's algorithms and complained that they were still seeing recommendations for Logan Paul's videos even after his content was taken down. Earlier this week, a "Flat Earth Conference" was held in Denver, Colorado, where some attendees described how YouTube persuaded them to believe the flat Earth theory. In fact, Logan Paul was one of the conference's keynote speakers, despite not believing that the Earth is flat.

Many attendees at the conference told The Daily Beast that they came to believe in the Flat Earth theory through YouTube videos. "It came on autoplay," said Joshua Swift, a conference attendee. "So I didn't actively search for Flat Earth. Even months before, I was listening to Alex Jones."

Recently, NBA star Kyrie Irving also spoke about his obsession with the flat Earth theory, blaming YouTube videos for it. Irving described having wandered deep down a "rabbit hole" on YouTube.

This has brought renewed attention to the recommendation system that YouTube uses. In a blog post, Guillaume Chaslot, an ex-Googler who helped build the YouTube algorithm, explains: "Flat Earth is not a 'small bug'. It reveals that there is a structural problem in Google's AIs and they exploit weaknesses of the most vulnerable people, to make them believe the darnedest things."

He lists several Flat Earth videos that were promoted on YouTube.

https://www.youtube.com/watch?v=1McqA9ChCnA

 

https://www.youtube.com/watch?v=XFSH5fnqda4

This raises the question: is the YouTube algorithm evil? The algorithm recommends videos based on watch time, since more watch time means more revenue and more scope for targeted ads. What this changes is the fundamental concept of choice and the exercise of user discretion. Once the algorithm treats watch time as the most important metric for recommendations, less weight goes to organic interactions on YouTube, such as liking, commenting, and subscribing to videos and channels.
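To make the distinction concrete, here is a toy sketch in Python contrasting a ranking driven purely by predicted watch time with one driven by organic engagement signals. This is an illustration only, not YouTube's actual system; all field names, weights, and numbers are invented for the example.

```python
# Toy illustration (NOT YouTube's real ranker): compare ranking candidates
# by predicted watch time versus by organic engagement signals.

def rank_by_watch_time(videos):
    """Sort candidates by predicted watch time, longest first."""
    return sorted(videos, key=lambda v: v["predicted_watch_minutes"], reverse=True)

def rank_by_engagement(videos):
    """Alternative: weight organic signals (likes, comments, subscribes).
    The weights here are arbitrary, chosen only for illustration."""
    def score(v):
        return 2.0 * v["likes"] + 3.0 * v["comments"] + 5.0 * v["subscribes"]
    return sorted(videos, key=score, reverse=True)

candidates = [
    {"title": "Cat compilation", "predicted_watch_minutes": 4,
     "likes": 900, "comments": 120, "subscribes": 40},
    {"title": "3-hour conspiracy deep dive", "predicted_watch_minutes": 95,
     "likes": 50, "comments": 30, "subscribes": 5},
]

by_time = rank_by_watch_time(candidates)
by_engagement = rank_by_engagement(candidates)
print(by_time[0]["title"])        # long video wins when only watch time counts
print(by_engagement[0]["title"])  # well-liked video wins under engagement
```

The point of the sketch is that the two objectives can invert the ordering: a long, lightly engaged video outranks a short, well-liked one the moment watch time becomes the sole metric.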

Chaslot was fired by Google in 2013 over performance issues. He says he had wanted to change the YouTube algorithm's approach, making it more aligned with democratic values rather than devoted solely to increasing watch time.
Chaslot has since created Algotransparency, a site that scans and monitors YouTube recommendations daily.

Several Twitter users have voiced support for Chaslot's argument.

https://twitter.com/tristanharris/status/1064973499540869121

https://twitter.com/technollama/status/1064573492329365504

https://twitter.com/sivavaid/status/1064527872667369473


Is YouTube’s AI Algorithm evil?

YouTube has a $25 million plan to counter fake news and misinformation

YouTube went down, Twitter flooded with deep questions, YouTube back and everyone is back to watching cat videos
