The plaintiff in the lawsuit is a 17-year-old whose videos, depicting him engaging in sexual activity with another minor, were distributed on Twitter in 2019 by sex predators after he refused their demands. He filed a lawsuit accusing Twitter of refusing to remove the pornographic videos.
At the time, Twitter claimed that the videos did not violate its policies. The plaintiff’s mother had to contact a Homeland Security agent to convince Twitter to remove the videos.
“Congress recognized the inherent challenges of large-scale, global content moderation for platforms, including the potential for liability based on a platform’s alleged ‘knowledge’ of offensive content if it chose to try to screen out that material but was unable to root out all of it,” Twitter argued in the motion to dismiss.
“Hoping to encourage platforms to engage in moderation of offensive content without risking incurring potentially ruinous legal costs, in 1996 Congress enacted Section 230 of the Communications Decency Act (‘CDA § 230’), granting platforms like Twitter broad immunity from legal claims arising out of failure to remove content.
“Given that Twitter’s alleged liability here rests on its failure to remove content from its platform, dismissal of the Complaint with prejudice is warranted on this ground alone,” Twitter argued.
Section 230 protects websites from liability for their users' posts, although there are exceptions for copyright violations, prostitution material, and violations of federal criminal law.
We obtained a copy of Twitter’s filing for you here.
The lawsuit, filed in January, further accuses Twitter of intentionally spreading child abuse and non-consensually shared porn, possessing child porn, benefiting from child abuse, and refusing to report child sex abuse.
Twitter insists the suffering is not its fault but the fault of the predators who posted the videos.
“This case ultimately does not seek to hold those Perpetrators accountable for the suffering they inflicted on Plaintiff. Rather, this case seeks to hold Twitter liable because a compilation of that explicit video content (the “Videos”) was — years later — posted by others on Twitter’s platform and although Twitter did remove the content, it allegedly did not act quickly enough.
“Twitter recognizes that, regrettably, Plaintiff is not alone in suffering this kind of exploitation by such perpetrators on the Internet. For this reason, Twitter is deeply committed to combating child sexual exploitation (“CSE”) content on its platform. And while Twitter strives to prevent the proliferation of CSE, it is not infallible.
“But, mistakes or delays do not make Twitter a knowing participant in a sex trafficking venture as Plaintiff here has alleged.
“Plaintiff does not (and cannot) allege, as he must, that Twitter ever had any actual connection to these Perpetrators or took any part in their crimes. Thus, even accepting all of Plaintiff’s allegations as true, there is no legal basis for holding Twitter liable for the Perpetrators’ despicable acts,” Twitter said in the motion to dismiss.