
Deleting video content about violent events can often put more people in danger, EFF warns

Deleting violent content can often suppress the freedom of speech and expression of those most in need of such rights.


In the wake of the mosque shootings in Christchurch, New Zealand, the Electronic Frontier Foundation (EFF) has urged caution among those calling for stricter enforcement of online platforms’ terms of service aimed at curbing so-called “hate speech” and the promotion of violent content.

The EFF – a San Francisco, California-based non-profit dedicated to promoting digital rights – recognized that extreme violence spurs understandable calls for more stringent moderation, but stressed that this must not be allowed to have the opposite effect of stifling the freedom of speech and expression of those most in need of such rights.

This comes after 49 people lost their lives when a gunman opened fire at two mosques in Christchurch, New Zealand. The massacre, announced in a manifesto posted on the web by the suspect, reportedly a 28-year-old Australian citizen, and then streamed live on Facebook, has been described by the New York Times as “feeling like the first internet-native mass shooting.”

In a tweet posted on its Newsroom account, Facebook announced that it had been alerted to the video by police and that it had acted swiftly to remove the shooter’s accounts on Facebook and Facebook-owned Instagram, as well as the video itself.

Others, including Google’s YouTube, scrambled to remove the footage of the shooting, while Reddit shut down two long-standing communities specializing in graphic and violent content – /r/watchpeopledie and /r/gore – whose users had shared the video. Many media outlets around the world that picked up the content from social media posts have also been urged to remove it, and many have complied.

But this was not enough to convince everyone reacting to the shocking crime. Many users posted messages questioning the moderation practices currently in place on social media sites. Politicians such as Britain’s Home Secretary Sajid Javid took to Twitter to call on the platform, as well as other giants such as YouTube, Google, and Facebook, “to do more” to prevent the promotion of violent extremism.

Source: Twitter, @sajidjavid

The performance of both the algorithms and the human moderators that social platforms use to identify and remove content violating their terms of service has also come under renewed scrutiny.

However, in its post entitled “Our Thoughts on the New Zealand Massacre”, the EFF – an organization set up to promote internet civil liberties – sounded a note of caution, remarking that most web platforms appear to have followed their own rules and moved to delete the video of the shooting and other content connected to the crime.

Furthermore, the group warned against falling into the trap of “over-censoring” content on the web.

This, the EFF argued, will be impossible to avoid if governments pressure online platforms to introduce more stringent rules for policing speech.

The digital rights organization acknowledged that content moderation is among the internet’s most challenging issues, as lines get blurred between extremists’ speech and the voices of their intended victims who use the same platforms in an attempt to showcase crimes and defend themselves. However, EFF urged joint activity to make sure that the rules meant to prevent glorification and promotion of violence are not “wielded against the most vulnerable members of society.”

The EFF expressed concern that calls for stricter speech policing could end up suppressing those who use the platforms legitimately to document police brutality, war crimes, and other human rights violations.

In this context, the EFF cited the example of Egyptian journalist and human rights activist Wael Abbas, whose YouTube account was removed after he uploaded videos showing police brutality, and whose Twitter account was suspended over his posts and uploads.

Furthermore, the post recalled that tough speech restrictions are routinely abused by “8chan-style trolls” and state actors alike, who use them “in order to censor innocent people – often the members of society who are most targeted by organized hate groups.”

The organization also said it was joining the American Civil Liberties Union, the Center for Democracy and Technology, and others in supporting the Santa Clara Principles on Transparency and Accountability in Content Moderation.

These principles call for web platforms to publish the numbers of posts and accounts they remove, to notify affected users of removals and explain which rules guided the decision, and to provide a mechanism for appealing content restrictions.
