Minds rolls out its innovative public jury moderation system, letting users decide what’s fair and what’s not

Decentralizing who gets to decide the appropriateness of content is a novel idea.

As anyone who has had the misfortune of dealing with online moderation can attest, it is a necessary evil and a notoriously hard problem to get right.

Now Minds, an open-source, distributed social network with a strong emphasis on internet freedom and free speech, is bringing some fresh ideas to the table, with its “Minds Jury System.” It’s a moderation process that aims to involve users and give them a voice in deciding on the appropriateness of content.

This stands in stark contrast to what the company describes as the ills of moderation as it is practiced today: AI and machine learning, a blunt instrument prone to mistakes and the removal of legitimate content; and one-sided human moderation that tends to reflect the biases of those conducting it.

Minds lays out the plan in a detailed blog post, putting the emphasis on giving users control while making the moderation process as transparent as possible – in other words, ensuring “digital democracy” on the platform.

The Minds Jury System relies on the Santa Clara Principles, which call for letting users know why their content has been removed or their account suspended, giving them an opportunity to appeal such decisions, and being transparent about the number of posts and accounts removed.

The jury consists of 12 randomly chosen users of the Minds network, who are asked to review appeals of moderation decisions filed by channels they are not subscribed to. An appeal succeeds only if at least 75 percent of the jurors vote in its favor. Participation in the jury is opt-in, and the process also builds jurors’ reputation by awarding them a “confidence score” – or, alternatively, disqualifying them from future juries.
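
To make those mechanics concrete, here is a minimal sketch of the decision rule as it might be expressed in Python. Only the 12-member jury, the requirement that jurors not be subscribed to the appealing channel, and the 75 percent threshold come from Minds’ description; the function names and data structures are hypothetical and are not taken from Minds’ actual code.

import random

JURY_SIZE = 12          # jurors drawn per appeal, per Minds' description
PASS_THRESHOLD = 0.75   # share of votes needed for an appeal to succeed

def select_jury(opted_in_users, appealing_channel_subscribers):
    # Hypothetical helper: pick 12 opted-in users who are not
    # subscribed to the channel whose content is under appeal.
    eligible = [u for u in opted_in_users if u not in appealing_channel_subscribers]
    return random.sample(eligible, JURY_SIZE)

def appeal_succeeds(votes_in_favor, jury_size=JURY_SIZE):
    # An appeal passes only when at least 75 percent of jurors vote to overturn.
    return votes_in_favor / jury_size >= PASS_THRESHOLD

Under this rule, 9 of 12 votes (exactly 75 percent) would be enough to overturn a moderation decision, while 8 of 12 would not.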

However, if a user receives an immediate ban for content such as terrorism, pedophilia, doxxing, malware, or threats of true violence, their appeal will be reviewed by Minds staff rather than by a jury of peers.

The social platform acknowledges that the new system is unlikely to be a complete solution at this stage, but believes it is a step in the right direction.
