Several large companies, including Nestle and Disney, have pulled ads from YouTube after a popular blogger posted a video over the weekend claiming the Google-owned platform facilitates a “soft-core pedophilia ring.”
In a 20-minute video viewed more than 2 million times since being uploaded Sunday, blogger Matt Watson explains how the video-hosting website features a bug that operates as a “wormhole” for pedophilic content.
Watson demonstrated through screenshots and recordings that when users click on one of the videos, most of which show underage girls doing gymnastics or posing in compromising positions, YouTube’s algorithm then floods their screens with recommendations for similar clips.
“I can consistently get access to [the videos in question] from vanilla, never-before-used YouTube accounts via innocuous videos in less than 10 minutes, in sometimes less than five clicks,” he explained.
In the wake of Watson’s exposé, the Walt Disney Co., Nestle SA, and video game maker Epic Games, Inc. have paused ad spending on YouTube, according to Bloomberg.
How is YouTube responding?
In a statement released after Watson’s video went viral, a YouTube spokesperson described the content in question as “abhorrent,” noting the site took “immediate action” to remedy the issue.
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson told Bloomberg. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments.”
On Tuesday, YouTube announced an update to its community guidelines, streamlining its enforcement system and making it easier to report videos and to have offending content removed or restricted.