To sexual abuse survivor and advocate Eliza Bleu, the case of John Doe v. Twitter is the preeminent “modern-day example of David and Goliath.”
Attorneys for Twitter filed a motion last week to dismiss a lawsuit brought by a minor who claims the social media platform failed to remove pornographic videos of him and another 13-year-old child from the website.
Twitter’s lawyers are arguing the platform is immune to legal consequences under Section 230 of the Communications Decency Act of 1996.
What’s the background?
According to the lawsuit, the now-17-year-old boy — who goes by John Doe and lives in Florida — was taken advantage of as a young teenager by a predator on Snapchat pretending to be a 16-year-old female classmate, the New York Post reported. He was eventually blackmailed into sending more explicit photos and videos before he ultimately blocked the account.
The videos resurfaced on Twitter in 2019, and, in January of last year, Doe’s classmates found the pornographic clips and began teasing and bullying him about them. Shortly thereafter, Doe contacted Twitter, asking the platform’s staffers to take down the illicit and abusive content.
Twitter reportedly responded to Doe, telling him, “Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”
“If you believe there’s a potential copyright infringement, please start a new report,” the site’s statement continued. “If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”
***As the number of voices facing big-tech censorship continues to grow, please sign up for Faithwire’s daily newsletter and download the CBN News app, developed by our parent company, to stay up-to-date with the latest news from a distinctly Christian perspective.***
In response, Doe wrote, “What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos at all and they need to be taken down.”
The teenager even included his case number from his interactions with a local law enforcement agency. Nevertheless, Twitter failed to remove the content. Two days later, thanks to a mutual connection, Doe’s mother was put in contact with an agent from the U.S. Department of Homeland Security who had the videos removed from Twitter Jan. 30.
Before the videos were finally taken down, they had amassed around 167,000 views and 2,223 retweets.
The teenager filed a lawsuit against the social media platform in January, seeking to “shine a light on how Twitter has enabled and profited from CSAM [child sexual abuse material] on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity.”
In an email to Faithwire, Bleu, a survivor of sex trafficking on Twitter, said Doe “represents the voice of many children around the world that have not only been sexually exploited but have had images or video of their abuse spread around the world through social media.”
“Twitter knowingly profited off of his sexual exploitation by allowing the content to remain live on its platform after being informed multiple times and verifying his identity and age,” she explained.
What’s happening now?
Attorneys for Twitter are now arguing the platform is not responsible for the presence of child pornography on the website, citing Section 230 of the Communications Decency Act of 1996.
In a motion to dismiss the case, Twitter’s legal team stated, in part:
Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform’s alleged “knowledge” of offensive content if it chose to try to screen out that material but was unable to root out all of it.
Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996, Congress enacted Section 230 of the Communications Decency Act (“CDA § 230”), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content.
“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the complaint with prejudice is warranted on this ground alone,” Twitter’s attorneys claimed.
In its motion filed last Wednesday, Twitter admitted it “appears” Doe “suffered appallingly” at the hands of digital sex traffickers, but argued the platform itself bears no culpability for the illegal content hosted on its site.
The tech company’s lawyers argued Doe’s case “does not seek to hold those perpetrators [the sex traffickers] accountable for the suffering they inflicted on” him. Instead, “this case seeks to hold Twitter liable because a compilation of that explicit video content (the ‘Videos’) was — years later — posted by others on Twitter’s platform and, although Twitter did remove the content, it allegedly did not act quickly enough,” the legal team said.
“Twitter recognizes that, regrettably, [Doe] is not alone in suffering this kind of exploitation by such perpetrators on the internet,” the claim from Twitter continued. “For this reason, Twitter is deeply committed to combating child sexual exploitation (‘CSE’) content on its platform. And while Twitter strives to prevent the proliferation of CSE, it is not infallible.”
Ultimately, the platform’s lawyers said, “mistakes or delays do not make Twitter a knowing participant in a sex trafficking venture, as [Doe] here has alleged.”
To Twitter’s critics, that answer isn’t enough.
Though Bleu doesn’t blame the social media site for the grooming or the assault itself, she does hold Twitter accountable “from the second they are made aware of child sexual abuse material on their platform and they refuse to take it down.”
“This is a repeated pattern of negligence from the social media giant,” Bleu asserted. “If John Doe has an opportunity to receive justice, it will set a precedent that will send the message to Twitter and other platforms that they can no longer profit off of child exploitation. I am appalled that Twitter is choosing to fight this case and filed to have the lawsuit dropped. If you are moved by the plight of John Doe and the countless victims Twitter is profiting off of, please take time to encourage Twitter to do the right thing, and help spread John Doe’s story.”
“Changes,” she added, “are long overdue.”
If you or someone you know has been sexually exploited on Twitter as a child or as an adult, please reach out to the National Center On Sexual Exploitation Law Center.