London police partner with Facebook to train AI on how to block terrorist livestreams

The idea is to improve the platform’s capabilities to detect and instantly flag streamed shootings.

In the UK, a unit of the Metropolitan Police has been collaborating with Facebook to improve the platform’s ability to detect and instantly flag streamed shootings. The goal is to develop technology capable of identifying this type of situation, informing the police and, of course, cutting off the broadcast in time.

Facebook provided body cameras to the Metropolitan Police, which then carried out several firearms training exercises to provide footage for later analysis.

Facebook wants to improve its current AI technology so it can accurately and quickly identify firearms incidents, and the Metropolitan Police is helping the platform do so.

The test broadcast was viewed 200 times while it was live, which shows that Facebook’s system still needs improvement when it comes to detecting ongoing violent events. AI and machine learning techniques require a large number of samples – firearms footage, in this case – to detect and flag broadcasts like these effectively.
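
To give a rough sense of what “training on firearms footage” involves at the most basic level, here is a purely illustrative sketch of fine-tuning an off-the-shelf image classifier on labelled video frames, written with PyTorch and torchvision. This is not Facebook’s actual system; the dataset layout, class names, and hyperparameters are all assumptions made for the example.

```python
# Illustrative only: fine-tune a pretrained image classifier on labelled
# video frames ("firearm" vs. "benign"). Paths, labels and settings are
# assumptions for this sketch, not details of Facebook's detection system.
import torch
from torch import nn
from torchvision import datasets, models, transforms

# Assumed layout: frames/firearm/*.jpg and frames/benign/*.jpg
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("frames", transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the final layer for 2 classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for frames, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(frames), labels)
        loss.backward()
        optimizer.step()
```

The point of the sketch is simply that a classifier like this is only as good as its training set, which is why the police-supplied footage matters: without many varied examples of real firearms incidents, the model has little to learn from.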

Facebook seeks to prevent another Christchurch incident

This initiative is a response to the proliferation of violent streams, most notably the Christchurch attack in New Zealand, which was broadcast on social media and left 51 dead. The aim is for Facebook to be able to alert authorities as quickly as possible when it detects a live shooting.

The Head of the Met’s Counter Terrorism Command, Richard Smith, expressed his support for the initiative: “The live-streaming of terrorist attacks is an incredibly distressing method of spreading toxic propaganda, so I am encouraged by Facebook’s efforts to prevent such broadcasts”.

The Metropolitan Police is not working exclusively with Facebook; it stated that the footage will also be shared with other tech firms and social networks so they can develop similar technology: “Crucially, we will make this technology available to the wider tech industry so collectively, we can prevent the spread of harmful content.”

UK police are also known for having founded the Counter Terrorism Internet Referral Unit, which carries out counter-terrorism investigations online.

The most recent incident of this type was the shooting at the Halle synagogue in Germany on October 9th. This hideous act left two people dead when an antisemitic attacker opened fire on the holiest day of the Jewish calendar.

The perpetrator broadcast the horrific event on Twitch using a helmet camera, and it took Twitch 35 minutes to remove the content. By that time, 2,200 people had watched the replay.

Digital rights group EFF has, however, cautioned law enforcement and governments that a rush to censor content on the internet, however violent, could have unintended consequences, such as covering up evidence of human rights violations.

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
