
YouTube promises to reduce recommendations of ‘conspiracy theory’ videos. Ex-Googler explains why this is a ‘historic victory’

  • 4 min read
  • 12 Feb 2019

Talk of the harms caused by AI algorithms, including addiction, radicalization, political abuse, conspiracies, disturbing kids’ videos, and the danger of AI propaganda, is everywhere. Last month, YouTube announced an update to its recommendations aimed at reducing recommendations of videos that promote misinformation (e.g., conspiracy videos, false claims about historical events, flat earth videos). In a historic move, YouTube changed its artificial intelligence algorithm itself rather than opting for another solution that might have cost it fewer resources, less time, and less money.

Last Friday, Guillaume Chaslot, an ex-Googler who helped build the YouTube algorithm, praised this change in the AI, calling it “a great victory” that will help keep thousands of viewers from falling down the rabbit hole of misinformation and false conspiracy theories. In a Twitter thread, he presented his views as someone with experience working on YouTube’s AI.

Recently, there has been a trend of YouTube promoting conspiracy videos such as ‘Flat Earth theories’. In a blog post, Guillaume Chaslot explains, “Flat Earth is not a ’small bug’. It reveals that there is a structural problem in Google’s AIs and they exploit weaknesses of the most vulnerable people, to make them believe the darnedest things.” YouTube has acknowledged this problem and amended its algorithm.

“It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube. To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with our Community Guidelines,” states the YouTube team in a blog post.

Chaslot welcomed this in his Twitter thread, noting that although YouTube had the option to ‘make people spend more time on round earth videos’, it chose the harder path of tweaking its AI algorithm.

AI algorithms also often get biased by tiny groups of hyperactive users. As Chaslot notes, people who spend their lives on YouTube affect recommendations more: the content they watch gets more views, YouTubers notice and create more of it, and people then spend even more time on that content. This is because YouTube optimizes for what you might watch, not what you might like. As a Hacker News user observed, “The problem was that pathological/excessive users were overly skewing the recommendations algorithms. These users tend to watch things that might be unhealthy in various ways, which then tend to get over-promoted and lead to the creation of more content in that vein. Not a good cycle to encourage.”
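To make the skew concrete, here is a toy Python sketch with made-up viewing logs (it is not YouTube’s actual system; the user names, video names, and the 60-minute cap are all hypothetical). It compares ranking videos by total watch time, where two heavy users dominate, against a variant that caps how much any one user can contribute per video.

```python
from collections import defaultdict

# Hypothetical viewing logs: (user_id, video_id, minutes_watched)
logs = [
    ("casual_1", "science_doc", 30),
    ("casual_2", "science_doc", 25),
    ("casual_3", "science_doc", 40),
    ("casual_4", "science_doc", 30),
    ("casual_5", "cooking_show", 35),
    ("heavy_1", "flat_earth", 400),   # two hyperactive users...
    ("heavy_2", "flat_earth", 500),   # ...binge one kind of content
]

def rank_by_total_watch_time(logs):
    """Naive objective: rank by total minutes watched, so heavy users dominate."""
    totals = defaultdict(float)
    for _, video, minutes in logs:
        totals[video] += minutes
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

def rank_by_capped_watch_time(logs, cap=60):
    """Cap each user's contribution per video, limiting any one user's influence."""
    totals = defaultdict(float)
    for _, video, minutes in logs:
        totals[video] += min(minutes, cap)
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(rank_by_total_watch_time(logs))   # flat_earth ranks first (900 minutes)
print(rank_by_capped_watch_time(logs))  # science_doc ranks first (125 vs 120)
```

With the naive objective, the two heavy users are enough to push their content to the top; once each user’s influence is bounded, the ranking reflects what the broader audience watches, which is the kind of feedback loop Chaslot describes.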

The new change to YouTube’s AI uses machine learning along with human evaluators and experts from all over the United States to train the machine learning systems responsible for generating recommendations. Evaluators are trained using public guidelines and provide input on the quality of a video. For now, the change applies only to a small set of videos in the US, as the machine learning systems are not yet very accurate. The update will roll out to other countries once the systems become more reliable.
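As a rough illustration of that general approach, the sketch below trains a small text classifier on evaluator-style labels and uses it to demote candidate recommendations. Everything here is hypothetical: the example titles, labels, `demote_borderline` helper, and 0.5 threshold are invented for illustration, and this is not YouTube’s real pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Evaluator-labelled examples: 1 = borderline/misinforming, 0 = acceptable
titles = [
    "PROOF the earth is flat and NASA is hiding it",
    "Miracle cure doctors don't want you to know about",
    "How vaccines work: an immunologist explains",
    "Documentary: the Apollo 11 moon landing, hour by hour",
]
labels = [1, 1, 0, 0]

# Train a simple text classifier on the human evaluators' judgments
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(titles, labels)

def demote_borderline(candidates, threshold=0.5):
    """Filter out recommendation candidates the model scores as likely borderline content."""
    scores = model.predict_proba(candidates)[:, 1]
    return [video for video, score in zip(candidates, scores) if score < threshold]

print(demote_borderline([
    "Flat earth: the evidence 'they' suppress",
    "Baking sourdough bread at home",
]))
```

In practice such a system would need far more labelled data and careful evaluation before being trusted, which is consistent with YouTube’s decision to start with a small set of videos in the US and expand only as accuracy improves.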

However, there is another problem lurking that is probably even bigger than conspiracy videos: addiction to spending more and more time online. The AI engines behind major platforms, including but not limited to YouTube, Netflix, and Facebook, all want people to spend as much time on them as possible. A Hacker News user commented, “This is just addiction peddling. Nothing more. I think we have no idea how much damage this is doing to us. It’s as if someone invented cocaine for the first time and we have no social norms or legal framework to confront it.”

Nevertheless, YouTube updating its AI engine was received generally positively by netizens. As Chaslot concluded in his Twitter thread, “YouTube's announcement is a great victory which will save thousands. It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.” It is now up to YouTube to strike a balance between maintaining a platform for free speech and living up to its responsibility to users.

Is the YouTube algorithm’s promoting of #AlternativeFacts like Flat Earth having a real-world impact?

YouTube to reduce recommendations of ‘conspiracy theory’ videos that misinform users in the US.

YouTube bans dangerous pranks and challenges

Is YouTube’s AI Algorithm evil?