
Supreme Court Demands Deeper Look at Social Media Anti-Censorship Laws

If you’re tired of censorship and surveillance, subscribe to Reclaim The Net.

The US Supreme Court has unanimously remanded two crucial cases involving social media regulation laws from Florida and Texas back to the lower courts. The primary question in both cases was whether laws that restrict certain websites from making editorial censorship decisions violate the First Amendment.

On May 24, 2021, Florida Governor Ron DeSantis signed into law SB 7072, which aims to regulate social media platforms by prohibiting the deplatforming of political candidates and requiring platforms to provide explanations when censoring content, among other stipulations. SB 7072 places several specific restrictions and requirements on social media platforms, including:

  • Prohibiting the willful deplatforming of political candidates,
  • Banning the censorship or deplatforming of journalistic enterprises based on content,
  • Imposing hefty fines on social media platforms that deplatform candidates for political office—up to $250,000 per day for statewide candidates and $25,000 per day for other candidates,
  • Requiring platforms to notify users and provide explanations before taking actions like censoring or deplatforming,
  • Granting Floridians the right to sue platforms for violations and seek monetary damages,
  • Empowering the Florida Attorney General to sue technology companies under the state’s Unfair and Deceptive Trade Practices Act.

That same year, Texas Governor Greg Abbott signed HB 20, a law regulating social media platforms by prohibiting them from censoring content based on viewpoint and imposing several obligations related to content moderation processes.

The key provisions of HB 20 stipulate that social media platforms with over 50 million monthly active users in the US cannot censor content based on viewpoint. Additionally, the law mandates that these platforms must notify users and provide explanations when content is removed, enable users to submit and track complaints about content removal decisions or instances where illegal content was not removed, and allow users to appeal content removal decisions.

Both laws were challenged as unconstitutional under the First Amendment. The key legal questions were:

  • Whether the First Amendment prevents a state from mandating that social media companies host third-party communications, and from controlling the manner in which they do so,
  • Whether the First Amendment stops a state from requiring social media companies to inform and explain to users when their content is censored.

After a back-and-forth legal challenge, the cases ended up at the Supreme Court.

During oral arguments, the Supreme Court expressed considerable doubt about the laws, suggesting that they might infringe upon the First Amendment rights of companies like Facebook and YouTube. The justices spent nearly four hours discussing the implications of these regulations.

In court, Florida Solicitor General Henry Whitaker argued that social media companies, which he described as mere “transmitters” of user speech, do not have a constitutional right to inconsistently apply censorship policies.

In contrast, trade group representative Paul Clement emphasized the necessity of editorial discretion to filter the vast content online, making platforms functional for users and advertisers alike.

Justice Elena Kagan questioned the constitutionality of the laws, particularly how they prevent platforms from making independent editorial decisions, a critical aspect of First Amendment rights.

Justice Brett Kavanaugh highlighted the core issue—whether the government is impermissibly suppressing speech, noting the court’s precedent of protecting editorial control.

Meanwhile, Justice Amy Coney Barrett, as she did during oral arguments for Murthy v. Missouri (another free speech case in which she sided with the Biden administration), compared the content moderation role of social media platforms to that of newspapers rather than to venues like law schools, which can be compelled to host military recruiters under certain federal conditions.

Justice Clarence Thomas and Justice Samuel Alito showed more openness to the state laws, with Thomas questioning the extent of First Amendment protections for platform moderation decisions and Alito scrutinizing the terminology of “content moderation.”

On Monday, the Supreme Court instructed the appellate courts to revisit their rulings on the 2021 statutes that permit state oversight of content moderation by major social media companies.

We obtained a copy of the decision for you here.

Justice Elena Kagan, in her opinion for the court, stated that, even though the lower courts’ decisions were vacated on grounds unrelated to the First Amendment, the First Amendment argument stands. Justice Kagan took the time to explain how the court views First Amendment principles regarding this issue, taking a swipe at the Fifth Circuit’s initial ruling: “It is necessary to say more about how the First Amendment relates to the laws’ content-moderation provisions, to ensure that the facial analysis proceeds on the right path in the courts below. That need is especially stark for the Fifth Circuit, whose decision rested on a serious misunderstanding of First Amendment precedent and principle.”

The court made clear that the central provisions of the Florida and Texas laws would not be expected to survive a First Amendment challenge.

“The Fifth Circuit was wrong in concluding that Texas’s restrictions on the platforms’ selection, ordering, and labeling of third-party posts do not interfere with expression. And the court was wrong to treat as valid Texas’s interest in changing the content of the platforms’ feeds,” the opinion reads.

The First Amendment argument could be summarized with this line: “However imperfect the private marketplace of ideas, here was a worse proposal – the government itself deciding when speech was imbalanced, and then coercing speakers to provide more of some views or less of others.”

Justice Kagan wrote that the appellate courts had not adequately considered the broad challenge presented by NetChoice, focusing instead on narrower issues brought forth by the parties.

“Today, we vacate both decisions for reasons separate from the First Amendment merits, because neither Court of Appeals properly considered the facial nature of NetChoice’s challenge. The courts mainly addressed what the parties had focused on. And the parties mainly argued these cases as if the laws applied only to the curated feeds offered by the largest and most paradigmatic social-media platforms—as if, say, each case presented an as-applied challenge brought by Facebook protesting its loss of control over the content of its News Feed,” the court wrote.

Here are the key points of the court’s argument:

Facial Challenge and Scope of the Laws: The Court emphasized the importance of understanding the full scope of the statutes in question. It instructed that the lower courts should assess not just the Big Tech applications (like those affecting major social media feeds) but all potential applications and ramifications of the laws, including less obvious ones, to determine whether a substantial number of the law’s applications are unconstitutional.

First Amendment Protections: The Court argued that the editorial discretion of social media platforms is protected under the First Amendment. This includes their decisions to filter, prioritize, label, or exclude certain content. By compelling platforms to alter their expressive content, the laws potentially infringe on their First Amendment rights. “The Court has repeatedly held that ordering a party to provide a forum for someone else’s views implicates the First Amendment if, though only if, the regulated party is engaged in its own expressive activity, which the mandated access would alter or disrupt,” Justice Kagan stated.

Erroneous Lower Court Analyses: The Court noted that the previous appellate decisions did not conduct a proper facial analysis of the First Amendment challenges. The Fifth Circuit, in particular, erred by not recognizing the expressive activity involved in content moderation and by inadequately addressing the platforms’ First Amendment rights.

Importance of Editorial Discretion: The Court underscored that just like traditional media entities, social media platforms exercise editorial control that shapes their public communications. Compelled changes to this editorial content, such as requiring platforms to carry messages they wish to exclude, interfere with their expressive freedom.

Standard for Reviewing Facial Challenges: The Court clarified the standards for facial challenges in the context of the First Amendment, stating that challengers must demonstrate that the unconstitutional applications of the law substantially outweigh the constitutional ones.

The Supreme Court’s decision to remand the social media regulation cases back to the lower courts signals a cautious approach to resolving the tensions between state regulations and First Amendment rights. As the legal battles continue to unfold, the implications for how social media platforms operate within the United States could be profound, particularly in light of the First Amendment arguments made by TikTok, which is facing a ban.

