
When it comes to content moderation a little transparency goes a long way, study finds

The study was conducted by researchers at the Georgia Institute of Technology and the University of Michigan.

A new study by Georgia Institute of Technology and University of Michigan researchers, which examined how Reddit users behave after receiving explanations for content removals, has found that social media platforms could make moderation easier by being more transparent about their decisions.

The researchers, who note that current moderation practices are so opaque that they are difficult to examine properly, focus on how providing explanations for content removals shapes the way users interact on a platform afterward.

The study, which drew on a sample of 32 million Reddit posts, concludes that social media platforms should invest more in content removal explanations and make better use of them as a moderation tool.

Referring to the "black-box nature of content moderation" on most platforms, the study says this opacity makes the moderation process difficult to understand, and makes it hard for users to understand its end product: the decision to remove their posts.
