New York City has filed a major federal lawsuit against Meta, Google, Snap, and TikTok’s parent company, ByteDance, placing the spotlight on the companies’ failure to meaningfully restrict access to their platforms by minors.
At the heart of the complaint is a core allegation: that the social media giants have not implemented effective age verification, leaving children exposed to environments designed for addiction and emotional manipulation.
The city argues that the absence of genuine age checks has created conditions where harm to youth is not only possible, but predictable.
The lawsuit alleges that the platforms have knowingly allowed children under 13 to access their services while collecting their data, in violation of longstanding federal protections such as the Children's Online Privacy Protection Act (COPPA).
“None of the Defendants conduct proper age verification or authentication. Instead, each Defendant relies on users to self-report their age. This unenforceable and facially inadequate system allows children under 13 to easily create accounts on Defendants’ platforms,” the complaint states.
New York City points out that the companies already possess the technological capacity to estimate user age, yet refuse to apply those tools toward safety.
“Given that Defendants have developed and utilized age-estimation algorithms for the purpose of selling user data and targeted advertisements, Defendants could readily use these algorithms to prevent children under 13 from accessing their platforms, but choose not to do so.”
The complaint goes further, targeting individual companies for their inaction. “Meta has failed to implement effective age-verification measures to keep children off Facebook and Instagram,” it states.
TikTok comes under similar scrutiny: “TikTok’s age-verification measures are dangerously deficient.” And in its section on YouTube, the lawsuit claims, “Google’s age-verification measures and parental controls are ineffective.”
Though these statements were made in the context of a lawsuit aimed at exposing the impact of platform design on youth mental health, they have broader implications.
Across the United States, lawmakers are increasingly turning to mandatory age verification requirements as a policy response to concerns about minors online.
But such proposals almost always rely on users submitting government-issued ID or undergoing biometric scans, steps that raise profound concerns about privacy, anonymity, and surveillance.
Tying access to digital services to government identification paves the way for a centralized identity layer on the internet.
Once companies are required to verify age with official documents, they inevitably retain that information or route it through third-party vendors, multiplying the risks of exposure.
Users may never be aware that their most sensitive data is stored elsewhere. And when breaches occur, the consequences can be severe.
That danger has already become reality. In a separate incident involving Discord, attackers breached a third-party support platform and reportedly accessed over 1.6 terabytes of data.
Among the compromised information were government-issued IDs submitted through the company’s age verification process.
Discord acknowledged that roughly 70,000 users had their IDs exposed, though the attackers claim the number is far higher. The breach was not caused by a software vulnerability but rather a compromised vendor account.
As governments and platforms accelerate the push for more stringent access controls, New York City's complaint argues that the systems currently responsible for age verification are themselves deeply flawed.
But the remedy proposed by lawmakers in other jurisdictions, mandatory digital ID for access, threatens to trade one crisis for another. It may reduce the number of minors on these platforms, but it does so by embedding a permanent identity requirement into digital life. The risks to free expression, anonymous access, and personal data security would expand significantly.
New York’s lawsuit does not call for ID-based restrictions, but by emphasizing the platforms’ failures on age verification, it feeds directly into that legislative narrative.