

UK Unveils Online Rules, Mandating Age Verification and Algorithmic Content Suppression

Mandatory age checks under Ofcom’s new rules edge the world closer to a de facto digital ID system for internet users.



The UK’s online safety regulator, Ofcom, has presented a sweeping new set of upcoming rules stemming from the censorship law known as the Online Safety Act – introducing age verification and algorithmic suppression among a total of 40 measures.

The draft codes of practice were made available to members of parliament on Thursday, and if the approval process there is completed as expected, the new rules will be enforced from July 25.

Before that, July 24 is the deadline for companies to complete and record assessments of the risks their platforms may pose to minors. (This applies to most online services with over seven million average monthly active users in the UK.)

According to Ofcom, the measures apply to apps and sites used by children, like social media, gaming, and search services. The goal is to prevent minors from accessing content “relating to” suicide, self-harm, eating disorders, pornography, misogynistic material, violent content, and online bullying.

Even though the measures are already expansive in scope, a number of critics who favor an even more restrictive approach – among them the children’s charity the NSPCC – want Ofcom to expand this push, particularly in the direction of weakening end-to-end encryption in messaging apps.

As things stand, the new rules treat age verification as a tool to prevent minors from accessing services that generate or share content in Ofcom’s listed categories, achieved by implementing “highly effective age assurance.” This applies both to services where “most” of the content falls under the harm categories and to those where only “some” of it does.

Platforms that use algorithmic recommendations and are considered to pose a “medium or high risk” of serving harmful content (as defined) must deploy filters that hide this content.

UK authorities are also requiring platforms to act quickly to remove whatever is found to be harmful once it has been “reviewed and assessed.”

Penalties for noncompliance run up to £18 million ($24 million) or 10% of global revenue. If violations persist, courts could block or limit access to offending platforms in the UK.

Ofcom Chief Executive Melanie Dawes promoted the new rules in the media by invoking Netflix’s fictional drama Adolescence to justify the restrictions.

“I think in the end what’s happening here and it’s not just in the UK, it is just a change in how people are seeing all of this. Whether it’s the drama Adolescence and that’s brought to life some of these problems of misogyny, pornography, violent content on our kids’ internet feeds,” Dawes told BBC’s Today program.
