
UK regulators plan fines and restrictions on internet video providers over “hatred” and more

UK-based video sharing platforms will be required to "take appropriate measures to protect users from harm."

“Hate” and “harm” are vague, subjective terms that are often used by the mainstream media and politicians to promote the idea that internet companies need to do more to censor speech and content on their platforms.

The statistics show that very few internet users ever encounter “hate speech” and that attempts to regulate hate speech are relatively ineffective.

But this has done little to stop the constant push to regulate the internet on the basis of hate and harm.

Starting this fall, the UK government will introduce new regulations that give the UK’s communications regulator, the Office of Communications (Ofcom), sweeping new powers over UK-established online video-sharing platforms (VSPs). The rules will force these platforms to implement measures that protect their users from certain content, defined by vague, subjective terms such as “potentially harmful” and “hatred.”

By April 2022, these regulations will require VSPs to pay a regulatory fee to Ofcom, which will have the power to suspend or restrict a platform’s entitlement to provide a VSP service if it falls foul of the rules.

But before these regulations come into force, Ofcom is seeking a broad range of evidence and information from interested stakeholders, with submissions open until 5pm on Thursday, September 24, 2020.

The information Ofcom is seeking includes the specific functionality VSPs have for identifying users, how VSPs implement and enforce their terms of service, the availability of parental control mechanisms and media literacy tools on VSPs, and the indicators of “potential harm” Ofcom should be aware of.

Regulations timeline

The initial regulations will be introduced from this fall as part of the European Union’s (EU’s) Audiovisual Media Services Directive (AVMSD), which the UK is required to implement during the Brexit transition period.

However, Ofcom notes that these regulations are an interim measure until the UK’s new Online Harms framework comes into force.

This framework is based on the controversial Online Harms White Paper, which initially proposed internet controls to combat what regulators deem to be “fake news” and “trolling.” After facing mass backlash, the government confirmed that under these regulations it “will not prevent adults from accessing or posting legal content, nor require companies to remove specific pieces of legal content.”

The UK government hasn’t announced when this Online Harms framework will come into force, but Ofcom has provided information on when different parts of the regulations under the AVMSD will be introduced.

Ofcom notes that when the regulations start coming into force from fall 2020, it’s “mindful of the need for an implementation period for the industry” and plans to “have regulatory guidance documents finalized by summer 2021.”

Between fall 2020 and summer 2021, Ofcom does “not generally expect to take formal enforcement action” but “serious instances of egregious or illegal harm from UK-based VSPs” could result in “robust enforcement actions.”

VSPs that fall under the scope of these regulations will be required to implement “appropriate measures” that protect young users from “potentially harmful” content and protect all users from “illegal content and incitement to hatred and violence.”

The recommended protective measures under these regulations include “having in place and applying terms and conditions, flagging and reporting mechanisms, age verification systems, systems to rate the content by the uploaders or users, parental control systems, easy-to-access complaints functions, and the provision of media literacy measures and tools.”

However, Ofcom notes that “it will be for VSPs to decide which measures are appropriate and proportionate based on their own assessment of the risk of harm.”

Affected companies

The regulations will initially affect VSPs that are headquartered in the UK, with Ofcom stating that the UK government’s Department for Digital, Culture, Media and Sport (DCMS) has identified “six potential VSPs” that are likely to come under UK jurisdiction: Twitch, TikTok, LiveLeak, Imgur, Vimeo, and Snapchat.

In fall 2020, the UK government intends to provide clarity on which VSPs will fall within the scope of Ofcom’s regulations; any other VSPs that meet the notification requirements will also be subject to these regulations.

Additionally, Ofcom states that the Online Harms regulations are “likely to extend to services that are not headquartered in the UK” and apply to other online service providers, not just VSPs.

Enforcement and sanctions

The UK government intends to grant Ofcom the power to request information from VSPs to assess their compliance. Ofcom will also track complaints, and a spike in complaints is one of the metrics that could trigger enforcement action.

When it comes to enforcement measures, Ofcom will have the power to suspend or restrict the entitlement to provide a VSP, impose financial penalties of up to 5% of “applicable qualifying revenue,” and issue legally binding decisions.

Ofcom plans to set out its enforcement approach, procedures, and guidance but hasn’t provided a date for when this information will be released.
