
Australia is concerned it can’t stop “misinformation” in private conversations

Fresh attempts to undermine the right to privacy.


Australia’s “misinformation code” has been updated, but the authorities remain unhappy that large end-to-end encrypted apps are still not “regulated” in a way they would find satisfactory.

That’s despite the fact that the “update” does what various governments like most: it leaves plenty of room to interpret the rules however best suits them. “Harm” is now defined as communication that represents a “serious and credible” threat; the previous definition also required that threat to be imminent, but that requirement is no longer included in the wording.

The code in question, published late last month, is described as “voluntary” and concerns combating whatever is flagged as “disinformation and misinformation.” But now the Australian Communications and Media Authority (ACMA) is making it clear that it considers the code not nearly enough.

Currently, the “voluntary” label refers to the Digital Industry Group Inc (DIGI) and its members, such as Apple, Google, Facebook (Meta), TikTok, and Twitter, “self-regulating” in a bid to find common ground with Australia’s government and avoid negative consequences for their businesses.

But the regulator now says that while the update is welcome, it will continue working to gain the powers necessary to force social media platforms to turn over data.

At stake is whether these companies must reveal to state authorities how they fight “misinformation” and “how they respond to complaints.” Specifically, the push is to make those behind social media hand over information about “posts and audience.”

That, in turn, is something the government claims it needs for its decision-making as it works to make laws dealing with “misinformation” ever stricter.

ACMA is particularly concerned by what it sees as the lack of a “robust” framework that would expand the code to “cover the propagation of mis- and disinformation on messaging services that facilitate large-scale group messaging,” the regulator told Guardian Australia.

The article mentions WhatsApp and Facebook Messenger in particular in this context, and tries to back up the case for access to data from these apps by citing “false rumors about child abduction spreading in India through WhatsApp” and “the death tax scare campaign at the 2019 election” in Australia.

In addition to wanting the “voluntary” code to carry more stringent consequences even as its wording grows looser, ACMA wants “reserve powers” for itself that would allow it to introduce future codes that are binding.
