
Court Rules TikTok Must Face Lawsuit, Signaling Shift in Platform Accountability and Raising Free Speech Concerns

Algorithmic recommendations face new scrutiny.


The US Court of Appeals for the Third Circuit has issued an opinion in Tawainna Anderson v. TikTok allowing the lawsuit to proceed, on the grounds that Section 230 does not shield platforms when it comes to algorithmic curation of content.

We obtained a copy of the opinion for you here.

That interpretation of the CDA’s much-debated section may well come back to haunt many major (US) social platforms, including where issues of free speech and censorship are concerned.

But for now, the story concerns China’s TikTok and one of the “viral challenges” posted on the social app by a third party, which resulted in a tragedy.

Namely, the lawsuit was brought by the mother of a 10-year-old who died while attempting a TikTok “blackout challenge.” The platform is accused of bearing liability for the death because the content is alleged to have been recommended to the child algorithmically.

Appeals court Judge Patty Shwartz ruled that while the “challenge” itself (which invites users to choke themselves to the point of passing out) was protected user-generated content, TikTok’s alleged algorithmic recommendation of it was not.

The decision breaks with previous US rulings, which held that Section 230 stretches far enough to cover a platform’s failure to prevent the posting of messages considered harmful that then reach other users, a departure the judge herself acknowledged.

But Shwartz justified the decision by citing the US Supreme Court’s ruling in Moody v. NetChoice, which she interpreted to mean that promoting or downranking content through algorithms (programmed by a platform) is a form of editorializing. That, in turn, makes TikTok’s recommendations the platform’s own “first-party speech,” which is not protected by Section 230, and so TikTok can be sued.

Observers keeping an eye on the bigger picture here (i.e., how the entire social media “ecosystem” might be affected) insist, however, that “moderation” and algorithmic ranking are protected by Section 230, and that their simultaneously being “platform speech” is not in contention.

They focus on tech companies’ First Amendment rights, saying that “the whole point of 230 was to encourage and immunize moderation.”

Some others might say that “the whole point” was to allow platforms to host third-party content without liability while allowing users to express themselves and exercise their own First Amendment rights.

