The UK’s Online Safety Act has come into force, and the regulator, the Office of Communications (Ofcom), has quickly set out to enforce it, with noncompliance punishable by heavy fines.
What opponents of the legislation consider a censorship law, and a sweeping one at that, is, according to Ofcom, a way to “protect” online users in the UK from illegal harms by legally requiring tech companies to “start taking action to tackle criminal activity on their platforms” and to “make them safer by design.”
But what the law’s provisions actually do, critics say, is usher in even more censorship while at the same time opening the way to undermining encryption via backdoors.
Then there are those who don’t think the Online Safety Act goes far enough, and who are particularly unhappy with the gradual, boil-the-frog manner in which it has been designed to take effect.
Right now, tech companies face a March 15, 2025 deadline to produce risk assessments of the harm illegal content poses to their users; starting two days later, they must begin putting measures in place to mitigate those risks.
But going forward, Ofcom, which says the current requirements are “just the beginning,” plans to introduce more measures, including “crisis response protocols for emergency events (such as last summer’s riots).”
Here, the fear is that newsworthy content about various forms of protest could end up censored as well.
Citing crimes like child abuse and terrorism as the reason, Ofcom also reserves the right to force tech firms to build and implement what are effectively encryption backdoors.
Ofcom says the Online Safety Act allows it to, “where we decide it is necessary and proportionate, make a provider use (or in some cases develop) a specific technology to tackle child sexual abuse or terrorism content on their sites and apps.”
Coupled with this, another provision – hash-matching – starts to take on sinister overtones, whatever its stated purpose, namely, preventing the sharing of “non-consensual intimate imagery and terrorist content.”
Ofcom is for now short on details, but the two requirements combined could readily serve as the basis for encryption backdoors: on an end-to-end encrypted service, hash-matching uploads is only feasible by scanning content on the user’s device before it is encrypted – in effect, a backdoor.
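To make the mechanism concrete, here is a minimal sketch of hash-matching in principle (the blocklist contents and function names are hypothetical; production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, not the exact cryptographic hash shown here):

```python
import hashlib

# Hypothetical blocklist: SHA-256 digests of known prohibited files,
# distributed to the platform by a coordinating body.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(upload: bytes) -> bool:
    """Check an upload against the blocklist before it is stored or sent."""
    return sha256_of(upload) in KNOWN_HASHES

if __name__ == "__main__":
    print(is_flagged(b"test"))   # True: the blocklist holds SHA-256(b"test")
    print(is_flagged(b"other"))  # False: digest not in the blocklist
```

The contentious part is not the lookup itself but where it runs: server-side matching only works on content the platform can read, so extending the duty to encrypted services pushes the scan onto the user’s device, before encryption – which is why critics describe mandated hash-matching as a backdoor in effect.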
Privacy is the victim of weakened encryption that immediately comes to mind; often overlooked, however, is the harm to online security and to the economy.
“Creating an encryption ‘backdoor’ for law enforcement would effectively be a blackmailer’s charter, allowing criminals and hostile foreign actors to exploit security flaws,” notes the Adam Smith Institute, and adds:
“Such measures would undermine the growth and competitiveness of the UK technology sector, potentially resulting in large companies withdrawing from the market entirely.”