
Nestle, Disney, Fortnite pull out their YouTube ads from paedophilic videos as YouTube’s content regulation woes continue

  • 3 min read
  • 21 Feb 2019


YouTube faced backlash over another content moderation problem after videos of young children with exposed private parts began surfacing on the platform. These videos also displayed advertising from major brands alongside the content, leading companies like Nestle, Disney, and Epic Games (publisher of Fortnite) to pull their YouTube ads from the identified videos. The issue was first brought to light on Sunday, when Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify videos in which young girls engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics.

YouTube received heavy criticism from companies and individuals alike for recommending videos of minors and allowing pedophiles to comment on these posts, often with a timestamp marking the moment in the video when a child's exposed private parts were visible. YouTube was also condemned for monetizing these videos, allowing advertisements from major brands like Alfa Romeo, Fiat, Fortnite, Grammarly, L'Oreal, Maybelline, Metro: Exodus, Peloton, and SingleMuslims.com to be displayed alongside them.

Companies pull out ads from YouTube
Following this news, a large number of companies pulled their advertising spending from YouTube. Grammarly told Wired, “We’re absolutely horrified and have reached out to YouTube to rectify this immediately, we have a strict policy against advertising alongside harmful or offensive content. We would never knowingly associate ourselves with channels like this.”

A spokesperson for Fortnite publisher Epic Games told Wired that it had paused all pre-roll advertising on YouTube. “Through our advertising agency, we have reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service,” the spokesperson added.

Disney and Nestle have also paused advertising on YouTube.

Responding to these accusations, a YouTube spokesperson said in an email, “Any content, including comments, that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments.”

People on Twitter have strongly condemned YouTube’s handling of the issue.

https://twitter.com/gossip_garden/status/1097396580691234816

https://twitter.com/tsosnierz/status/1097412787603759104

https://twitter.com/justin_ksu/status/1098419470253596679

https://twitter.com/rep_turd/status/1097984363948457984

YouTube also recently updated its community guidelines enforcement, introducing a new strikes system to make its policies more transparent and consistent. The update brings more opportunities for everyone to understand YouTube’s policies, a consistent penalty for each strike, and better notifications. Last month, YouTube announced an update to its recommendation system aimed at reducing recommendations of videos that promote misinformation and conspiracy theories.

YouTube bans dangerous pranks and challenges

YouTube promises to reduce recommendations of ‘conspiracy theory’ videos. Ex-Googler explains why this is a ‘historic victory’.

Is YouTube’s AI Algorithm evil?
