TikTok has a major problem with child pornography.
An extensive report from Forbes chronicles a terrifying reality: child pornography — legally known as child sexual abuse material — is easy to come by on TikTok, the short-form video sharing app with one billion monthly active users, making it the sixth most popular social media platform in the world.
The posts tied to the criminal handles, Forbes writer Alexandra Levine reported, "typically read like advertisements and come from seemingly innocuous accounts."
“But often,” she continued, “they’re portals to illegal child sexual abuse material quite literally hidden in plain sight — posted in private accounts using a setting that makes it visible only to the person logged in.”
Holders of these CSAM-filled accounts purportedly share illicit content using "post-in-private" settings, meaning anyone accessing the photos and videos must have the account's login information or use specified phrases. This tactic bypasses algorithms that might otherwise flag the content as violating the app's terms of use.
Seara Adair, a survivor of child sexual abuse and an advocate for children's safety, told Forbes she has reached out to TikTok employees, but to no avail. She has tried to alert them to this trend, explaining she believes users have discovered ways to evade automated content-moderation algorithms by posting black-screen videos that last only a few seconds and contain brief instructions for predators.
“There’s quite literally accounts that are full of child abuse and exploitation material on their platform,” she told the outlet. “Not only does it happen on their platform, but quite often, it leads to other platforms — where it becomes even more dangerous.”
Adair said she has seen videos depicting “a child completely naked and doing indecent things.”
For her part, Levine corroborated Adair's comments, reporting that some "post-in-private" accounts could be accessed without any hurdles, while others required potential predators to contribute their own images before gaining access to the account information. Some account users were reportedly recruiting girls as young as 13 years old.
The issue is hardly unique to TikTok, according to Haley McNamara, director of the International Centre on Sexual Exploitation. She told Forbes all social media platforms are plagued with CSAM.
“There is this trend of either closed spaces or semi-closed spaces that become easy avenues for networking of child abusers, people wanting to trade child sexual abuse materials,” she said. “Those kinds of spaces have also historically been used for grooming and even selling or advertising people for sex trafficking.”
In a statement to Forbes, TikTok spokesperson Mahsau Cullinane said the social media behemoth has "zero tolerance for child sexual abuse material and this abhorrent behavior, which is strictly prohibited on our platform." The representative further noted that every video posted to TikTok — both publicly and privately — goes through the app's artificial intelligence-operated content moderation and, as needed, additional human review.
“When we become aware of any content, we immediately remove it, ban accounts, and make reports to [the National Center for Missing & Exploited Children],” said Cullinane.