WhatsApp is end-to-end encrypted, which means there's no easy way for the app's owner, Facebook, to see what everyone is chatting about and step in to censor it in real time, as it can with the main Facebook app and Instagram.
But, still wanting to cut down on the spread of "conspiracy theories", WhatsApp recently announced a blanket restriction on sharing content with too many people at once, in an effort to stop any conspiracies from going "viral" on the app.
Recognizing what it sees as the perils of "forwarding" and how it leads to the spread of "conspiracy theories", WhatsApp imposed new limits on message forwards.
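To illustrate how such a cap can work in practice, here is a minimal, purely illustrative sketch of a forward-count rule. WhatsApp has not published its implementation, so the threshold, names, and logic below are assumptions, not the company's actual code:

# Illustrative sketch only; WhatsApp has not published how its limit works.
# The threshold value and names below are assumptions for illustration.
HIGHLY_FORWARDED_THRESHOLD = 5   # assumed number of prior forwards that marks a message "highly forwarded"
MAX_CHATS_WHEN_HIGHLY_FORWARDED = 1  # assumed cap: one chat at a time

def allowed_forward_targets(forward_count: int, requested_chats: int) -> int:
    """Return how many chats a single forward action may reach."""
    if forward_count >= HIGHLY_FORWARDED_THRESHOLD:
        # A "highly forwarded" message can only be passed on to one chat at a time.
        return min(requested_chats, MAX_CHATS_WHEN_HIGHLY_FORWARDED)
    return requested_chats

# Example: a message already forwarded six times, aimed at ten chats, reaches only one.
print(allowed_forward_targets(6, 10))  # prints 1

The point of a rule like this is friction rather than a hard block: nothing stops a user from forwarding repeatedly, but each "highly forwarded" message can only fan out to one chat per action.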
The new limits on message forwards seem to be working for the company, as a report revealed that the spread of "highly forwarded" messages has dropped by at least 70% on the Facebook-owned messaging platform.
It is, however, unclear whether the controlled spread of messages helped curb "misinformation" or simply prevented users from sharing useful information. That distinction is likely left to the Facebook gods to decide.
For a long time now, WhatsApp has faced scrutiny and mounting pressure from governments over how message forwards on the platform led to the rampant spread of misinformation and conspiracies. WhatsApp was therefore compelled to make changes that reduced message forwards.
Countries such as India have asked WhatsApp to ensure that such harmful forwards are controlled on the platform.
“We’ve seen a significant increase in the amount of forwarding which users have told us can feel overwhelming and can contribute to the spread of misinformation,” the company said when it announced the new measures. “We believe it’s important to slow the spread of these messages down to keep WhatsApp a place for personal conversation.”